WorldWideScience

Sample records for high-throughput quantum chemistry

  1. Turbocharged molecular discovery of OLED emitters: from high-throughput quantum simulation to highly efficient TADF devices

    Science.gov (United States)

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Ha, Dong-Gwang; Einzinger, Markus; Wu, Tony; Baldo, Marc A.; Aspuru-Guzik, Alán.

    2016-09-01

    Discovering new OLED emitters requires many experiments to synthesize candidates and test their performance in devices. Large-scale computer simulation can greatly accelerate this search, but the problem remains challenging enough that brute-force application of massive computing power cannot by itself identify novel structures. We report a successful high-throughput virtual screening study that leveraged a range of methods to optimize the search process. The generation of candidate structures was constrained to contain combinatorial explosion. Simulations were tuned to the specific problem and calibrated against experimental results. Experimentalists and theorists collaborated actively, with experimental feedback regularly used to update and shape the computational search. Supervised machine learning prioritized candidate structures before quantum chemistry simulation, so that compute was not wasted on likely poor performers. With this combination of techniques, each multiplying the strength of the search, the effort navigated a large area of molecular space and identified hundreds of promising OLED candidate structures. An experimentally validated selection of this set shows emitters with external quantum efficiencies as high as 22%.
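    The funnel described in this abstract — a cheap, learned surrogate ranks candidates so the expensive quantum-chemistry step runs only on the most promising subset — can be sketched as below. All names, scores, and the "simulation" itself are invented placeholders, not the authors' actual models.

```python
# Hypothetical sketch of an ML-prioritized virtual screening loop.
# A cheap surrogate score ranks candidates; only the top-ranked ones
# reach the expensive quantum-chemistry stage.

def surrogate_score(mol):
    # Stand-in for a supervised ML model trained on past results;
    # here, score peaks near an arbitrary "good" feature value of 0.7.
    return -abs(mol - 0.7)

def expensive_simulation(mol):
    # Stand-in for a quantum-chemistry calculation of emission
    # properties (mock external quantum efficiency, EQE).
    return 0.25 * (1 - abs(mol - 0.7))

candidates = [i / 100 for i in range(100)]           # enumerated library
ranked = sorted(candidates, key=surrogate_score, reverse=True)
top = ranked[:10]                                    # prioritized subset
results = {m: expensive_simulation(m) for m in top}  # simulate only top 10

best = max(results, key=results.get)
print(f"best candidate {best:.2f}, predicted EQE {results[best]:.2%}")
```

    The point of the design is that `expensive_simulation` is called 10 times instead of 100; in a real campaign the ratio between library size and simulated subset is far larger.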

  2. Exploiting Locality in Quantum Computation for Quantum Chemistry.

    Science.gov (United States)

    McClean, Jarrod R; Babbush, Ryan; Love, Peter J; Aspuru-Guzik, Alán

    2014-12-18

    Accurate prediction of chemical and material properties from first-principles quantum chemistry is a challenging task on traditional computers. Recent developments in quantum computation offer a route toward highly accurate solutions with polynomial cost; however, this solution still carries a large overhead. In this Perspective, we aim to bring together known results about the locality of physical interactions from quantum chemistry with ideas from quantum computation. We show that the utilization of spatial locality combined with the Bravyi-Kitaev transformation improves the scaling of known quantum algorithms for quantum chemistry, and we provide numerical examples to help illustrate this point. We combine these developments to improve the outlook for the future of quantum chemistry on quantum computers.

  3. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high-throughput facility provides highly automated and parallel approaches to materials and materials chemistry development. The facility allows scientists...

  4. Quantum chemistry

    CERN Document Server

    Lowe, John P

    1993-01-01

    Praised for its appealing writing style and clear pedagogy, Lowe's Quantum Chemistry is now available in its Second Edition as a text for senior undergraduate- and graduate-level chemistry students. The book assumes little mathematical or physical sophistication and emphasizes an understanding of the techniques and results of quantum chemistry, thus enabling students to comprehend much of the current chemical literature in which quantum chemical methods or concepts are used as tools. The book begins with a six-chapter introduction of standard one-dimensional systems, the hydrogen atom,

  5. Life in the fast lane: high-throughput chemistry for lead generation and optimisation.

    Science.gov (United States)

    Hunter, D

    2001-01-01

    The pharmaceutical industry has come under increasing pressure from regulatory restrictions on the marketing and pricing of drugs, from competition, and from the escalating costs of developing new drugs. These forces can be addressed by the identification of novel targets, reductions in the development time of new drugs, and increased productivity. Emphasis has been placed on identifying and validating new targets and on lead generation: the response from industry has been very evident in genomics and high-throughput screening, where new technologies have been applied, usually coupled with a high degree of automation. The combination of numerous new potential biological targets and the ability to screen large numbers of compounds against many of these targets has generated the need for large, diverse compound collections. To address this requirement, high-throughput chemistry has become an integral part of the drug discovery process. Copyright 2002 Wiley-Liss, Inc.

  6. Enzyme-Initiated Quinone-Chitosan Conjugation Chemistry: Toward A General in Situ Strategy for High-Throughput Photoelectrochemical Enzymatic Bioanalysis.

    Science.gov (United States)

    Wang, Guang-Li; Yuan, Fang; Gu, Tiantian; Dong, Yuming; Wang, Qian; Zhao, Wei-Wei

    2018-02-06

    Herein we report a general and novel strategy for high-throughput photoelectrochemical (PEC) enzymatic bioanalysis based on enzyme-initiated quinone-chitosan conjugation chemistry (QCCC). Specifically, the strategy was illustrated using a model quinone-generating oxidase, tyrosinase (Tyr), to catalytically produce 1,2-benzoquinone or its derivative, which can easily and selectively be conjugated onto the surface of the chitosan-deposited PbS/NiO/FTO photocathode via QCCC. Upon illumination, the covalently attached quinones could act as electron acceptors of PbS quantum dots (QDs), improving the photocurrent generation and thus allowing the elegant probing of Tyr activity. Enzyme cascades, such as alkaline phosphatase (ALP)/Tyr and β-galactosidase (Gal)/Tyr, were further introduced into the system for the successful probing of the corresponding targets. This work features not only the first use of QCCC in PEC bioanalysis but also the separation of the enzymatic reaction from the photoelectrode, as well as direct signal recording in a split-type protocol, which enables more convenient and high-throughput detection than previous formats. More importantly, by using numerous other oxidoreductases that involve quinones as reactants/products, this protocol could serve as a common basis for the development of a new class of QCCC-based PEC enzymatic bioanalysis and could be further extended to general enzyme-labeled PEC bioanalysis of versatile targets.

  7. Handbook of relativistic quantum chemistry

    International Nuclear Information System (INIS)

    Liu, Wenjian

    2017-01-01

    This handbook focuses on the foundations of relativistic quantum mechanics and addresses a number of fundamental issues never covered before in a book. For instance: How can many-body theory be combined with quantum electrodynamics? How can quantum electrodynamics be interfaced with relativistic quantum chemistry? What is the most appropriate relativistic many-electron Hamiltonian? How can we achieve relativistic explicit correlation? How can we formulate relativistic properties? - just to name a few. Since relativistic quantum chemistry is an integral component of computational chemistry, this handbook also supplements the "Handbook of Computational Chemistry". Generally speaking, it aims to establish the 'big picture' of relativistic molecular quantum mechanics as the union of quantum electrodynamics and relativistic quantum chemistry. Accordingly, it provides an accessible introduction for readers new to the field, presents advanced methodologies for experts, and discusses possible future perspectives, helping readers understand when/how to apply/develop the methodologies.

  8. Handbook of relativistic quantum chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Wenjian (ed.) [Peking Univ., Beijing (China). Center for Computational Science and Engineering]

    2017-03-01

    This handbook focuses on the foundations of relativistic quantum mechanics and addresses a number of fundamental issues never covered before in a book. For instance: How can many-body theory be combined with quantum electrodynamics? How can quantum electrodynamics be interfaced with relativistic quantum chemistry? What is the most appropriate relativistic many-electron Hamiltonian? How can we achieve relativistic explicit correlation? How can we formulate relativistic properties? - just to name a few. Since relativistic quantum chemistry is an integral component of computational chemistry, this handbook also supplements the "Handbook of Computational Chemistry". Generally speaking, it aims to establish the 'big picture' of relativistic molecular quantum mechanics as the union of quantum electrodynamics and relativistic quantum chemistry. Accordingly, it provides an accessible introduction for readers new to the field, presents advanced methodologies for experts, and discusses possible future perspectives, helping readers understand when/how to apply/develop the methodologies.

  9. Second quantized approach to quantum chemistry

    International Nuclear Information System (INIS)

    Surjan, P.R.

    1989-01-01

    The subject of this book is the application of the second quantized approach to quantum chemistry. Second quantization is an alternative tool for dealing with many-electron theory, and the vast majority of quantum chemical problems are more easily treated using second quantization as a language. This book offers a simple and pedagogical presentation of the theory and some applications. The reader is not assumed to be trained in higher mathematics, though familiarity with elementary quantum mechanics and quantum chemistry is expected. Besides the basic formalism and standard illustrative applications, several recent topics of quantum chemistry are reviewed in detail. This book bridges the gap between sophisticated quantum theory and practical quantum chemistry. (orig.)

  10. Introductory quantum chemistry

    International Nuclear Information System (INIS)

    Chandra, A.K.

    1974-01-01

    This book on quantum chemistry is primarily intended for university students at the senior undergraduate level. It serves as an aid to the basic understanding of the important concepts of quantum mechanics introduced in the field of chemistry. The chapters of the book are devoted to the following: (i) Waves and quanta, (ii) Operator concept in quantum chemistry, (iii) Wave mechanics of some simple systems, (iv) Perturbation theory, (v) Many-electron atoms and angular momenta, (vi) Molecular orbital theory and its application to the electronic structure of diatomic molecules, (vii) Chemical bonding in polyatomic molecules, and (viii) Chemical applications of the Hellmann-Feynman theorem. At the end of each chapter, a set of problems is given, and the answers to these problems are given at the end of the book. (A.K.)

  11. Advances in quantum chemistry

    CERN Document Server

    Sabin, John R

    2013-01-01

    Advances in Quantum Chemistry presents surveys of current topics in this rapidly developing field, which has emerged at the intersection of the historically established areas of mathematics, physics, chemistry, and biology. It features detailed reviews written by leading international researchers. This volume focuses on the theory of heavy-ion physics in medicine.

  12. Relativistic quantum chemistry on quantum computers

    DEFF Research Database (Denmark)

    Veis, L.; Visnak, J.; Fleig, T.

    2012-01-01

    The past few years have witnessed a remarkable interest in the application of quantum computing for solving problems in quantum chemistry more efficiently than classical computers allow. Very recently, proof-of-principle experimental realizations have been reported. However, so far only the nonrelativistic regime (i.e., the Schrodinger equation) has been explored, while it is well known that relativistic effects can be very important in chemistry. We present a quantum algorithm for relativistic computations of molecular energies. We show how to efficiently solve the eigenproblem of the Dirac-Coulomb Hamiltonian on a quantum computer and demonstrate the functionality of the proposed procedure by numerical simulations of computations of the spin-orbit splitting in the SbH molecule. Finally, we propose quantum circuits with three qubits and nine or ten controlled-NOT (CNOT) gates, which implement a proof...

  13. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, at polynomial cost. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with an exponential growth of required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation applied to quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations on a small quantum computer, including the evaluation of static molecular eigenenergies and the simulation of chemical reaction dynamics. Although experimental development still lags behind theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry beyond classical computation.

  14. Computational quantum chemistry website

    International Nuclear Information System (INIS)

    1997-01-01

    This report contains the contents of a web page related to research on the development of quantum chemistry methods for computational thermochemistry and the application of quantum chemistry methods to problems in material chemistry and chemical sciences. Research programs highlighted include: Gaussian-2 theory; Density functional theory; Molecular sieve materials; Diamond thin-film growth from buckyball precursors; Electronic structure calculations on lithium polymer electrolytes; Long-distance electronic coupling in donor/acceptor molecules; and Computational studies of NOx reactions in radioactive waste storage

  15. Computing protein infrared spectroscopy with quantum chemistry.

    Science.gov (United States)

    Besley, Nicholas A

    2007-12-15

    Quantum chemistry is a field of science that has undergone unprecedented advances in the last 50 years. From the pioneering work of Boys in the 1950s, quantum chemistry has evolved from being regarded as a specialized and esoteric discipline to a widely used tool that underpins much of the current research in chemistry today. This achievement was recognized with the award of the 1998 Nobel Prize in Chemistry to John Pople and Walter Kohn. As the new millennium unfolds, quantum chemistry stands at the forefront of an exciting new era. Quantitative calculations on systems of the magnitude of proteins are becoming a realistic possibility, an achievement that would have been unimaginable to the early pioneers of quantum chemistry. In this article we will describe ongoing work towards this goal, focusing on the calculation of protein infrared amide bands directly with quantum chemical methods.

  16. Quantum chemistry on a superconducting quantum processor

    Energy Technology Data Exchange (ETDEWEB)

    Kaicher, Michael P.; Wilhelm, Frank K. [Theoretical Physics, Saarland University, 66123 Saarbruecken (Germany); Love, Peter J. [Department of Physics and Astronomy, Tufts University, Medford, MA 02155 (United States)

    2016-07-01

    Quantum chemistry is the most promising civilian application for quantum processors to date. We study its adaptation to superconducting (sc) quantum systems, computing the ground-state energy of LiH through a variational hybrid quantum-classical algorithm. We demonstrate how interactions native to sc qubits further reduce the amount of quantum resources needed, positioning sc architectures as a near-term candidate for simulations of more complex atoms and molecules.
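    The variational principle behind hybrid quantum-classical algorithms of this kind can be illustrated classically: a parameterized trial state is scored by its energy expectation value, and a classical optimizer adjusts the parameter. The sketch below uses an invented 2x2 model Hamiltonian as a stand-in (on real hardware the energy evaluation is delegated to the quantum processor, and the ansatz for LiH involves many qubits and parameters).

```python
import math

# Toy one-parameter variational minimization over a 2x2 model Hamiltonian.
H = [[-1.0, 0.5],
     [0.5, -0.3]]   # illustrative symmetric Hamiltonian, not LiH

def energy(theta):
    # Trial state |psi> = (cos t, sin t); returns <psi|H|psi>.
    c, s = math.cos(theta), math.sin(theta)
    return H[0][0] * c * c + 2 * H[0][1] * c * s + H[1][1] * s * s

# Crude classical "optimizer": dense grid scan over the single parameter.
thetas = [i * math.pi / 2000 for i in range(2000)]
e_min = min(energy(t) for t in thetas)

# Exact ground-state energy of a symmetric 2x2 matrix, for comparison.
a, b, c = H[0][0], H[0][1], H[1][1]
e_exact = (a + c) / 2 - math.sqrt(((a - c) / 2) ** 2 + b ** 2)
print(f"variational minimum {e_min:.6f}, exact {e_exact:.6f}")
```

    By the variational principle, `e_min` can never fall below `e_exact`; a finer grid (or a real optimizer) closes the remaining gap.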

  17. Principles of quantum chemistry

    CERN Document Server

    George, David V

    2013-01-01

    Principles of Quantum Chemistry focuses on the application of quantum mechanics in physical models and experiments of chemical systems.This book describes chemical bonding and its two specific problems - bonding in complexes and in conjugated organic molecules. The very basic theory of spectroscopy is also considered. Other topics include the early development of quantum theory; particle-in-a-box; general formulation of the theory of quantum mechanics; and treatment of angular momentum in quantum mechanics. The examples of solutions of Schroedinger equations; approximation methods in quantum c

  18. High Throughput, High Yield Fabrication of High Quantum Efficiency Back-Illuminated Photon Counting, Far UV, UV, and Visible Detector Arrays

    Science.gov (United States)

    Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.

    2013-01-01

    In this paper we discuss the high-throughput, end-to-end post-fabrication processing of high-performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far-ultraviolet and ultraviolet quantum efficiency (QE) in a photon-counting detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs), the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering via Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).

  19. Per-Olov Löwdin - father of quantum chemistry

    Science.gov (United States)

    Brändas, Erkki J.

    2017-09-01

    During 2016, we celebrate the 100th anniversary of the birth of Per-Olov Löwdin. He was appointed to the first Lehrstuhl in quantum chemistry at Uppsala University in 1960. Löwdin established quantum chemistry as a field in its own right by formulating its goals and founding its fundamental concepts: the correlation energy, the method of configuration interaction, reduced density matrices, natural spin orbitals, charge and bond order matrices, symmetric orthogonalisation, and generalised self-consistent fields. His exposition of partitioning technique and perturbation theory, wave and reaction operators, and associated non-linear summation techniques introduced mathematical rigour and deductive order into the interpretative organisation of the new field. He brought the first computer to Uppsala University, pioneering the use of 'electronic brains', and anticipated their significance for quantum chemistry. Perhaps his single most influential contribution to the field was his education of two generations of future faculty in quantum chemistry through Summer Schools in the Scandinavian mountains and Winter Institutes at Sanibel Island in the Gulf of Mexico. Per-Olov Löwdin founded the book series Advances in Quantum Chemistry and the International Journal of Quantum Chemistry. The evolution of quantum chemistry is appraised, starting from a collection of cross-disciplinary applications of quantum mechanics to the technologically advanced and predominant field of today, used in virtually all branches of chemistry. The scientific work of Per-Olov Löwdin has been crucial for the development of this important new province of science.

  20. Development of tight-binding, chemical-reaction-dynamics simulator for combinatorial computational chemistry

    International Nuclear Information System (INIS)

    Kubo, Momoji; Ando, Minako; Sakahara, Satoshi; Jung, Changho; Seki, Kotaro; Kusagaya, Tomonori; Endou, Akira; Takami, Seiichi; Imamura, Akira; Miyamoto, Akira

    2004-01-01

    Recently, we proposed a new concept called 'combinatorial computational chemistry' to realize theoretical, high-throughput screening of catalysts and materials. We have already applied our combinatorial computational-chemistry approach, mainly based on static first-principles calculations, to various catalyst and material systems, and its applicability to catalyst and material design was strongly confirmed. To realize more effective and efficient combinatorial computational-chemistry screening, a high-speed chemical-reaction-dynamics simulator based on a quantum-chemical molecular-dynamics method is essential. However, to the best of our knowledge, no existing chemical-reaction-dynamics simulator is fast enough to support high-throughput screening. In the present study, we have developed a chemical-reaction-dynamics simulator based on our original tight-binding, quantum-chemical, molecular-dynamics method, which is more than 5000 times faster than regular first-principles molecular dynamics. Moreover, its applicability and effectiveness for the atomistic clarification of methanol-synthesis dynamics at reaction temperature were demonstrated.

  1. Quantum chemistry an introduction

    CERN Document Server

    Kauzmann, Walter

    2013-01-01

    Quantum Chemistry: An Introduction provides information pertinent to the fundamental aspects of quantum mechanics. This book presents the theory of partial differentiation equations by using the classical theory of vibrations as a means of developing physical insight into this essential branch of mathematics.Organized into five parts encompassing 16 chapters, this book begins with an overview of how quantum mechanical deductions are made. This text then describes the achievements and limitations of the application of quantum mechanics to chemical problems. Other chapters provide a brief survey

  2. Quantum mechanics in chemistry

    CERN Document Server

    Schatz, George C

    2002-01-01

    Intended for graduate and advanced undergraduate students, this text explores quantum mechanical techniques from the viewpoint of chemistry and materials science. Dynamics, symmetry, and formalism are emphasized. An initial review of basic concepts from introductory quantum mechanics is followed by chapters examining symmetry, rotations, and angular momentum addition. Chapter 4 introduces the basic formalism of time-dependent quantum mechanics, emphasizing time-dependent perturbation theory and Fermi's golden rule. Chapter 5 sees this formalism applied to the interaction of radiation and matt

  3. High-throughput computational search for strengthening precipitates in alloys

    International Nuclear Information System (INIS)

    Kirklin, S.; Saal, James E.; Hegde, Vinay I.; Wolverton, C.

    2016-01-01

    The search for high-strength alloys and precipitation-hardened systems has largely been accomplished through Edisonian trial-and-error experimentation. Here, we present a novel strategy using high-throughput computational approaches to search for promising precipitate/alloy systems. We perform density functional theory (DFT) calculations over an extremely large space of ∼200,000 potential compounds in search of effective strengthening precipitates for a variety of different alloy matrices, e.g., Fe, Al, Mg, Ni, Co, and Ti. Our search strategy involves screening for phases that are likely to produce coherent precipitates (based on small lattice mismatch) and are composed of relatively common alloying elements. When combined with the Open Quantum Materials Database (OQMD), we can computationally screen for precipitates that either have a stable two-phase equilibrium with the host matrix or are likely to precipitate as metastable phases. Our search recovers (for the structure types considered) nearly all currently known high-strength precipitates in a variety of fcc, bcc, and hcp matrices, giving us confidence in the strategy. In addition, we predict a number of new, currently unknown precipitate systems that should be explored experimentally as promising high-strength alloy chemistries.
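    The coherency screen mentioned in the abstract — keep only candidate precipitates whose lattice parameter lies within a small mismatch of the host matrix — reduces to a simple filter. The sketch below uses invented compound names and lattice parameters, not OQMD data, and an assumed 5% cutoff.

```python
# Hypothetical lattice-mismatch screen for coherent precipitates.
HOST_A = 4.05        # assumed fcc host lattice parameter (angstrom)
MAX_MISMATCH = 0.05  # assumed 5% coherency cutoff

candidates = {
    "X3Y-L12": 4.10,  # fictitious compound: small mismatch -> kept
    "XZ2-C15": 4.90,  # large mismatch -> rejected
    "XW-B2":   3.98,
}

def mismatch(a_precip, a_host=HOST_A):
    # Relative lattice mismatch between precipitate and host.
    return abs(a_precip - a_host) / a_host

coherent = {name: a for name, a in candidates.items()
            if mismatch(a) <= MAX_MISMATCH}
print(sorted(coherent))
```

    In the actual workflow this filter sits in front of the DFT stability analysis, so that only geometrically plausible precipitates consume expensive calculations.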

  4. Remedial mathematics for quantum chemistry

    NARCIS (Netherlands)

    Koopman, L.; Brouwer, N.; Heck, A.; Buma, W.J.

    2008-01-01

    Proper mathematical skills are important for every science course and mathematics-intensive chemistry courses rely on a sound mathematical pre-knowledge. In the first-year quantum chemistry course at this university, it was noticed that many students lack basic mathematical knowledge. To tackle the

  5. From wave mechanics to quantum chemistry

    International Nuclear Information System (INIS)

    Daudel, R.

    1996-01-01

    The origin of wave mechanics, which is now called quantum mechanics, is evoked. The main stages of the birth of quantum chemistry are related as resulting from the application of quantum mechanics to the study of molecular properties and chemical reactions. (author). 14 refs

  6. A Quantum Chemistry Concept Inventory for Physical Chemistry Classes

    Science.gov (United States)

    Dick-Perez, Marilu; Luxford, Cynthia J.; Windus, Theresa L.; Holme, Thomas

    2016-01-01

    A 14-item, multiple-choice diagnostic assessment tool, the quantum chemistry concept inventory or QCCI, is presented. Items were developed based on published student misconceptions and content coverage and then piloted and used in advanced physical chemistry undergraduate courses. In addition to the instrument itself, data from both a pretest,…

  7. Using high throughput experimental data and in silico models to discover alternatives to toxic chromate corrosion inhibitors

    International Nuclear Information System (INIS)

    Winkler, D.A.; Breedon, M.; White, P.; Hughes, A.E.; Sapper, E.D.; Cole, I.

    2016-01-01

    Highlights: • We screened a large library of organic compounds as replacements for toxic chromates. • High throughput automated corrosion testing was used to assess inhibitor performance. • Robust, predictive machine learning models of corrosion inhibition were developed. • Models indicated molecular features contributing to performance of organic inhibitors. • We also showed that quantum chemistry descriptors do not correlate with performance. - Abstract: Restrictions on the use of toxic chromate-based corrosion inhibitors have created important issues for the aerospace and other industries. Benign alternatives that offer similar or superior performance are needed. We used high throughput experiments to assess 100 small organic molecules as potential inhibitors of corrosion in aerospace aluminium alloys AA2024 and AA7075. We generated robust, predictive, quantitative computational models of inhibitor efficiency at two pH values using these data. The models identified molecular features of inhibitor molecules that had the greatest impact on corrosion inhibition. Models can be used to discover better corrosion inhibitors by screening libraries of organic compounds for candidates with high corrosion inhibition.
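    The modeling step described above — fit a quantitative model of inhibition efficiency to high-throughput corrosion data, then use it to rank untested compounds — can be caricatured with a one-descriptor linear fit. The data, descriptor, and candidate names below are invented for illustration; the actual study used richer descriptor sets and nonlinear machine-learning models.

```python
# Hypothetical one-descriptor linear model of corrosion-inhibition
# efficiency, fitted by ordinary least squares and used to rank a library.
train = [(0.2, 15.0), (0.4, 32.0), (0.6, 44.0), (0.8, 61.0)]  # (descriptor, % inhibition)

n = len(train)
mx = sum(x for x, _ in train) / n
my = sum(y for _, y in train) / n
slope = (sum((x - mx) * (y - my) for x, y in train)
         / sum((x - mx) ** 2 for x, _ in train))
intercept = my - slope * mx

def predict(x):
    # Predicted inhibition efficiency for a descriptor value x.
    return slope * x + intercept

library = {"cand_A": 0.35, "cand_B": 0.75, "cand_C": 0.10}
ranked = sorted(library, key=lambda m: predict(library[m]), reverse=True)
print(ranked)  # best-predicted inhibitor first
```

    The ranking, not the absolute prediction, is what drives the screening decision: top-ranked candidates go back to the high-throughput corrosion test for validation.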

  8. Introducing Relativity into Quantum Chemistry

    Science.gov (United States)

    Li, Wai-Kee; Blinder, S. M.

    2011-01-01

    It is not often realized by chemists that the special theory of relativity is behind several aspects of quantum chemistry. The Schrödinger equation itself is based on relations between space-time and energy-momentum four vectors. Electron spin is, of course, the most obvious manifestation of relativity. The chemistry of some heavy elements is…

  9. High-throughput quantum chemistry and virtual screening for OLED material components

    Science.gov (United States)

    Halls, Mathew D.; Giesen, David J.; Hughes, Thomas F.; Goldberg, Alexander; Cao, Yixiang

    2013-09-01

    Computational structure enumeration, analysis using an automated simulation workflow, and filtering of large chemical structure libraries to identify lead systems have become a central paradigm in drug discovery research. Transferring this paradigm to challenges in materials science is now possible due to advances in the speed of computational resources and the efficiency and stability of chemical simulation packages. State-of-the-art software tools developed for drug discovery can be applied to efficiently explore the chemical design space and identify solutions for problems such as organic light-emitting diode material components. In this work, virtual screening for OLED materials based on intrinsic quantum mechanical properties is illustrated. Also, a new approach to more reliably identify candidate systems is introduced, based on the chemical reaction energetics of defect pathways for OLED materials.

  10. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  11. Fundamentals of quantum chemistry

    CERN Document Server

    House, J E

    2004-01-01

    An introduction to the principles of quantum mechanics needed in physical chemistry. Mathematical tools are presented and developed as needed, and only basic calculus, chemistry, and physics are assumed. Applications include atomic and molecular structure, spectroscopy, alpha decay, tunneling, and superconductivity. The new edition includes sections on perturbation theory, orbital symmetry of diatomic molecules, the Hückel MO method and Woodward-Hoffmann rules, as well as a new chapter on SCF and Hartree-Fock methods. * This revised text clearly presents basic q

  12. From transistor to trapped-ion computers for quantum chemistry.

    Science.gov (United States)

    Yung, M-H; Casanova, J; Mezzacapo, A; McClean, J; Lamata, L; Aspuru-Guzik, A; Solano, E

    2014-01-07

    Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulations. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions for solving problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from the current transistor-based to a near-future trapped-ion-based technology.

  13. Virtually going green: The role of quantum computational chemistry in reducing pollution and toxicity in chemistry

    Science.gov (United States)

    Stevens, Jonathan

    2017-07-01

    Continuing advances in computational chemistry have permitted quantum mechanical calculation to assist research in green chemistry and to contribute to the greening of chemical practice. Presented here are recent examples illustrating the contribution of computational quantum chemistry to green chemistry, including the possibility of using computation as a green alternative to experiments, as well as contributions to greener catalysis and the search for greener solvents. Examples of applications of computation to ambitious projects for green synthetic chemistry using carbon dioxide are also presented.

  14. Learning Quantum Chemistry via a Visual-Conceptual Approach: Students' Bidirectional Textual and Visual Understanding

    Science.gov (United States)

    Dangur, Vered; Avargil, Shirly; Peskin, Uri; Dori, Yehudit Judy

    2014-01-01

    Most undergraduate chemistry courses and a few high school honors courses, which focus on physical chemistry and quantum mechanics, are highly mathematically-oriented. At the Technion, Israel Institute of Technology, we developed a new module for high school students, titled "Chemistry--From 'the Hole' to 'the Whole': From the Nanoscale to…

  15. Complex Chemical Reaction Networks from Heuristics-Aided Quantum Chemistry.

    Science.gov (United States)

    Rappoport, Dmitrij; Galvin, Cooper J; Zubarev, Dmitry Yu; Aspuru-Guzik, Alán

    2014-03-11

    While structures and reactivities of many small molecules can be computed efficiently and accurately using quantum chemical methods, heuristic approaches remain essential for modeling complex structures and large-scale chemical systems. Here, we present a heuristics-aided quantum chemical methodology applicable to complex chemical reaction networks such as those arising in cell metabolism and prebiotic chemistry. Chemical heuristics offer an expedient way of traversing high-dimensional reactive potential energy surfaces and are combined here with quantum chemical structure optimizations, which yield the structures and energies of the reaction intermediates and products. Application of heuristics-aided quantum chemical methodology to the formose reaction reproduces the experimentally observed reaction products, major reaction pathways, and autocatalytic cycles.
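The division of labor described in this abstract — cheap heuristic rules pruning candidates before an expensive quantum chemical optimization — can be sketched schematically. Everything below (the atom-count rule, the scoring stub, the string-based "reactions") is an invented toy for illustration, not the authors' methodology:

```python
from collections import deque

def heuristic_ok(species):
    """Cheap rule (invented): discard species with more than 6 heavy atoms."""
    return species.count("C") + species.count("O") <= 6

def toy_energy(species):
    """Stand-in for an expensive quantum chemical optimization."""
    return -1.0 * len(species)

def expand(species):
    """Toy 'reaction': grow the species by one CO unit (formose-like step)."""
    return [species + "CO"]

def explore(seed, max_depth=4):
    """Breadth-first network growth with heuristic pruning before scoring."""
    network = {seed: toy_energy(seed)}
    frontier = deque([(seed, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_depth:
            continue
        for product in expand(node):
            if product in network or not heuristic_ok(product):
                continue  # heuristic pruning skips the expensive step entirely
            network[product] = toy_energy(product)
            frontier.append((product, depth + 1))
    return network

net = explore("CO")
```

The only point of the sketch is the control flow: the heuristic test runs before, and gates, the expensive evaluation.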

  16. Quantum chemistry in environmental pesticide risk assessment.

    Science.gov (United States)

    Villaverde, Juan J; López-Goti, Carmen; Alcamí, Manuel; Lamsabhi, Al Mokhtar; Alonso-Prados, José L; Sandín-España, Pilar

    2017-11-01

    The scientific community and regulatory bodies worldwide currently promote the development of non-experimental tests that produce reliable data for pesticide risk assessment. The use of standard quantum chemistry methods could allow the development of tools to perform a first screening of compounds to be considered for the experimental studies, improving the risk assessment. This results in a better distribution of resources and in better planning, allowing a more exhaustive study of the pesticides and their metabolic products. The current paper explores the potential of quantum chemistry in modelling toxicity and environmental behaviour of pesticides and their by-products by using electronic descriptors obtained computationally. Quantum chemistry has potential to estimate the physico-chemical properties of pesticides, including certain chemical reaction mechanisms and their degradation pathways, allowing modelling of the environmental behaviour of both pesticides and their by-products. In this sense, theoretical methods can contribute to performing a more focused risk assessment of pesticides used in the market, and may lead to higher quality and safer agricultural products. © 2017 Society of Chemical Industry.
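As a deliberately minimal example of an electronic descriptor of the kind mentioned here, the HOMO-LUMO gap of butadiene can be computed at the Hückel level from the eigenvalues of the π-system's adjacency matrix (orbital energies E = α + xβ). Real screening work would use ab initio or DFT descriptors; this toy only illustrates the idea:

```python
import numpy as np

# Adjacency matrix of the 4-atom pi chain of butadiene.
adjacency = np.zeros((4, 4))
for i in range(3):
    adjacency[i, i + 1] = adjacency[i + 1, i] = 1.0

# Hückel orbital energies are E = alpha + x*beta, with x the eigenvalues
# of the adjacency matrix; beta < 0, so larger x means lower (bonding) energy.
x = np.sort(np.linalg.eigvalsh(adjacency))[::-1]  # descending: bonding first
n_electrons = 4                                   # one pi electron per carbon
homo, lumo = x[n_electrons // 2 - 1], x[n_electrons // 2]
gap_in_beta = homo - lumo                         # gap in units of |beta|
```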

  17. Handbook of computational quantum chemistry

    CERN Document Server

    Cook, David B

    2005-01-01

    Quantum chemistry forms the basis of molecular modeling, a tool widely used to obtain important chemical information and visual images of molecular systems. Recent advances in computing have resulted in considerable developments in molecular modeling, and these developments have led to significant achievements in the design and synthesis of drugs and catalysts. This comprehensive text provides upper-level undergraduates and graduate students with an introduction to the implementation of quantum ideas in molecular modeling, exploring practical applications alongside theoretical explanations.

  18. A microliter-scale high-throughput screening system with quantum-dot nanoprobes for amyloid-β aggregation inhibitors.

    Directory of Open Access Journals (Sweden)

    Yukako Ishigaki

    Full Text Available The aggregation of amyloid β protein (Aβ) is a key step in the pathogenesis of Alzheimer's disease (AD), and therefore inhibitory substances for Aβ aggregation may have preventive and/or therapeutic potential for AD. Here we report a novel microliter-scale high-throughput screening system for Aβ aggregation inhibitors based on fluorescence microscopy-imaging technology with quantum-dot nanoprobes. This screening system could perform an analysis with a 5-µl sample volume when a 1536-well plate was used, and the inhibitory activity could be estimated as half-maximal effective concentrations (EC50). We attempted to comprehensively screen Aβ aggregation inhibitors from 52 spices using this system to assess whether this novel screening system is actually useful for screening inhibitors. Screening results indicate that approximately 90% of the ethanolic extracts from the spices showed inhibitory activity for Aβ aggregation. Interestingly, spices belonging to the Lamiaceae, the mint family, showed significantly higher activity than the average of tested spices. Furthermore, we tried to isolate the main inhibitory compound from Satureja hortensis, summer savory, a member of the Lamiaceae, using this system, and revealed that the main active compound was rosmarinic acid. These results demonstrate that this novel microliter-scale high-throughput screening system could be applied to the actual screening of Aβ aggregation inhibitors. Since this system can analyze at a microscopic scale, further miniaturization of the system, for example toward protein microarray technology, should be readily possible.
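A half-maximal effective concentration of the kind reported by such a screen can be estimated from a dose-response series as sketched below. The concentrations and responses are synthetic (a Hill curve with EC50 = 10 µM), not data from this study:

```python
import numpy as np

conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0])   # µM, synthetic dilution series
inhibition = conc / (conc + 10.0)                 # synthetic Hill response, n = 1

def ec50(conc, response, level=0.5):
    """Estimate the concentration giving `level` response by log-linear
    interpolation between the two bracketing points (response must increase)."""
    logc = np.log10(conc)
    i = np.searchsorted(response, level)
    f = (level - response[i - 1]) / (response[i] - response[i - 1])
    return 10 ** (logc[i - 1] + f * (logc[i] - logc[i - 1]))

estimate = ec50(conc, inhibition)   # recovers the synthetic EC50 of 10 µM
```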

  19. Patterning cell using Si-stencil for high-throughput assay

    KAUST Repository

    Wu, Jinbo

    2011-01-01

    In this communication, we report a newly developed cell patterning methodology using a silicon-based stencil, which exhibited advantages such as easy handling, reusability, a hydrophilic surface and mature fabrication technologies. Cell arrays obtained by this method were used to investigate cell growth under a temperature gradient, which demonstrated the possibility of studying cell behavior in a high-throughput assay. This journal is © The Royal Society of Chemistry 2011.

  20. Quantum Chemistry; A concise introduction for students of physics, chemistry, biochemistry and materials science

    Science.gov (United States)

    Thakkar, Ajit J.

    2017-09-01

    This book provides non-specialists with a basic understanding of the underlying concepts of quantum chemistry. It is both a text for second- or third-year undergraduates and a reference for researchers who need a quick introduction or refresher. All chemists and many biochemists, materials scientists, engineers, and physicists routinely use spectroscopic measurements and electronic structure computations in their work. The emphasis of Quantum Chemistry on explaining ideas rather than enumerating facts or presenting procedural details makes this an excellent foundation text/reference.

  1. Density functional theory in quantum chemistry

    CERN Document Server

    Tsuneda, Takao

    2014-01-01

    This book examines density functional theory based on the foundation of quantum chemistry. Unconventional in approach, it reviews basic concepts, then describes the physical meanings of state-of-the-art exchange-correlation functionals and their corrections.

  2. Solutions to selected exercise problems in quantum chemistry and spectroscopy

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2016-01-01

    Suggested solutions to a number of problems from the collection "Exercise Problems in Quantum Chemistry and Spectroscopy", previously published on ResearchGate (DOI: 10.13140/RG.2.1.4024.8162).

  3. Big Data Meets Quantum Chemistry Approximations: The Δ-Machine Learning Approach.

    Science.gov (United States)

    Ramakrishnan, Raghunathan; Dral, Pavlo O; Rupp, Matthias; von Lilienfeld, O Anatole

    2015-05-12

    Chemically accurate and comprehensive studies of the virtual space of all possible molecules are severely limited by the computational cost of quantum chemistry. We introduce a composite strategy that adds machine learning corrections to computationally inexpensive approximate legacy quantum methods. After training, highly accurate predictions of enthalpies, free energies, entropies, and electron correlation energies are possible, for significantly larger molecular sets than used for training. For thermochemical properties of up to 16k isomers of C7H10O2 we present numerical evidence that chemical accuracy can be reached. We also predict electron correlation energy in post-Hartree-Fock methods, at the computational cost of Hartree-Fock, and we establish a qualitative relationship between molecular entropy and electron correlation. The transferability of our approach is demonstrated, using semiempirical quantum chemistry and machine learning models trained on 1 and 10% of 134k organic molecules, to reproduce enthalpies of all remaining molecules at the density functional theory level of accuracy.
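The Δ-learning recipe — learn the difference between an expensive reference and a cheap baseline, then predict at baseline cost — can be illustrated with synthetic data and a plain ridge regression. The paper itself uses quantum chemical data and kernel models; everything below is a toy:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # toy molecular descriptors
expensive = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 7.0   # "reference" level
baseline = expensive - (0.8 * X[:, 0] - 1.5)   # cheap method, systematic error

idx_train, idx_test = slice(0, 150), slice(150, 200)
delta = expensive[idx_train] - baseline[idx_train]   # the quantity we learn

# Ridge regression on the correction (linear features + intercept).
A = np.c_[X[idx_train], np.ones(150)]
w = np.linalg.solve(A.T @ A + 1e-6 * np.eye(6), A.T @ delta)

# Delta-corrected prediction = cheap baseline + learned correction.
pred = baseline[idx_test] + np.c_[X[idx_test], np.ones(50)] @ w
rmse_delta = np.sqrt(np.mean((pred - expensive[idx_test]) ** 2))
rmse_baseline = np.sqrt(np.mean((baseline[idx_test] - expensive[idx_test]) ** 2))
```

Because the synthetic error is exactly linear in the descriptors, the Δ-model removes it almost completely; with real data the correction is only approximate, but the same division of labor applies.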

  4. Using quantum chemistry muscle to flex massive systems: How to respond to something perturbing

    Energy Technology Data Exchange (ETDEWEB)

    Bertoni, Colleen [Iowa State Univ., Ames, IA (United States)

    2016-12-17

    Computational chemistry uses the theoretical advances of quantum mechanics and the algorithmic and hardware advances of computer science to give insight into chemical problems. It is currently possible to do highly accurate quantum chemistry calculations, but the most accurate methods are very computationally expensive. Thus it is only feasible to do highly accurate calculations on small molecules, since typically more computationally efficient methods are also less accurate. The overall goal of my dissertation work has been to try to decrease the computational expense of calculations without decreasing the accuracy. In particular, my dissertation work focuses on fragmentation methods, intermolecular interactions methods, analytic gradients, and taking advantage of new hardware.

  5. Faster quantum chemistry simulation on fault-tolerant quantum computers

    International Nuclear Information System (INIS)

    Cody Jones, N; McMahon, Peter L; Yamamoto, Yoshihisa; Whitfield, James D; Yung, Man-Hong; Aspuru-Guzik, Alán; Van Meter, Rodney

    2012-01-01

    Quantum computers can in principle simulate quantum physics exponentially faster than their classical counterparts, but some technical hurdles remain. We propose methods which substantially improve the performance of a particular form of simulation, ab initio quantum chemistry, on fault-tolerant quantum computers; these methods generalize readily to other quantum simulation problems. Quantum teleportation plays a key role in these improvements and is used extensively as a computing resource. To improve execution time, we examine techniques for constructing arbitrary gates which perform substantially faster than circuits based on the conventional Solovay–Kitaev algorithm (Dawson and Nielsen 2006 Quantum Inform. Comput. 6 81). For a given approximation error ϵ, arbitrary single-qubit gates can be produced fault-tolerantly and using a restricted set of gates in time which is O(log(1/ϵ)) or O(log(log(1/ϵ))); with sufficient parallel preparation of ancillas, constant average depth is possible using a method we call programmable ancilla rotations. Moreover, we construct and analyze efficient implementations of first- and second-quantized simulation algorithms using the fault-tolerant arbitrary gates and other techniques, such as implementing various subroutines in constant time. A specific example we analyze is the ground-state energy calculation for lithium hydride. (paper)
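The trade-off between sequence length and approximation error for a restricted gate set can be seen even with a naive brute-force search, sketched below for Rz(π/7) and the {H, T} set. Solovay–Kitaev and the methods proposed in the paper are vastly more efficient; this is only a conceptual illustration:

```python
import numpy as np
from itertools import product

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard
T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])   # T gate
target = np.array([[np.exp(-1j * np.pi / 14), 0],
                   [0, np.exp(1j * np.pi / 14)]])     # Rz(pi/7)

def phase_distance(U, V):
    # Distance up to global phase, from |Tr(U^dag V)| / 2 in [0, 1].
    return np.sqrt(max(0.0, 1.0 - abs(np.trace(U.conj().T @ V)) / 2.0))

def best_error(max_len):
    """Exhaustively try all H/T products up to max_len; return best error."""
    best = phase_distance(np.eye(2), target)
    for n in range(1, max_len + 1):
        for seq in product((H, T), repeat=n):
            U = np.eye(2)
            for g in seq:
                U = g @ U
            best = min(best, phase_distance(U, target))
    return best

short_err = best_error(4)
long_err = best_error(10)   # longer sequences can only do as well or better
```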

  6. Enabling new capabilities and insights from quantum chemistry by using component architectures

    International Nuclear Information System (INIS)

    Janssen, C L; Kenny, J P; Nielsen, I M B; Krishnan, M; Gurumoorthi, V; Valeev, E F; Windus, T L

    2006-01-01

    Steady performance gains in computing power, as well as improvements in scientific computing algorithms, are making possible the study of coupled physical phenomena of great extent and complexity. The software required for such studies is also very complex and requires contributions from experts in multiple disciplines. We have investigated the use of the Common Component Architecture (CCA) as a mechanism to tackle some of the resulting software engineering challenges in quantum chemistry, focusing on three specific application areas. In our first application, we have developed interfaces permitting solvers and quantum chemistry packages to be readily exchanged. This enables our quantum chemistry packages to be used with alternative solvers developed by specialists, remedying deficiencies we discovered in the native solvers provided in each of the quantum chemistry packages. The second application involves development of a set of components designed to improve utilization of parallel machines by allowing multiple components to execute concurrently on subsets of the available processors. This was found to give substantial improvements in parallel scalability. Our final application is a set of components permitting different quantum chemistry packages to interchange intermediate data. These components enabled the investigation of promising new methods for obtaining accurate thermochemical data for reactions involving heavy elements.

  7. Development of combinatorial chemistry methods for coatings: high-throughput adhesion evaluation and scale-up of combinatorial leads.

    Science.gov (United States)

    Potyrailo, Radislav A; Chisholm, Bret J; Morris, William G; Cawse, James N; Flanagan, William P; Hassib, Lamyaa; Molaison, Chris A; Ezbiansky, Karin; Medford, George; Reitz, Hariklia

    2003-01-01

    Coupling of combinatorial chemistry methods with high-throughput (HT) performance testing and measurements of resulting properties has provided a powerful set of tools for the 10-fold accelerated discovery of new high-performance coating materials for automotive applications. Our approach replaces labor-intensive steps with automated systems for evaluation of adhesion of 8 x 6 arrays of coating elements that are discretely deposited on a single 9 x 12 cm plastic substrate. Performance of coatings is evaluated with respect to their resistance to adhesion loss, because this parameter is one of the primary considerations in end-use automotive applications. Our HT adhesion evaluation provides previously unavailable capabilities of high speed and reproducibility of testing by using a robotic automation, an expanded range of types of tested coatings by using the coating tagging strategy, and an improved quantitation by using high signal-to-noise automatic imaging. Upon testing, the coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Using our HT methodology, we have developed several coatings leads. These HT screening results for the best coating compositions have been validated on the traditional scales of coating formulation and adhesion loss testing. These validation results have confirmed the superb performance of combinatorially developed coatings over conventional coatings on the traditional scale.

  8. Quantum chemistry literature data base

    International Nuclear Information System (INIS)

    Ohno, Kimio; Morokuma, Keiji

    1982-01-01

    Ab initio computations of atomic and molecular electronic structure now appear in so many journals that it is very difficult for interested scientists to locate proper and comprehensive references. This book is designed to help them and contains more than 2500 references to the literature published in the years 1978-1980. These have been gathered from nineteen well-known international core journals by quantum chemists themselves, and the result is a thorough bibliography. Each entry is a full reference consisting of the following items: (1) authors, (2) journal name, volume, page and year, (3) compounds, (4) methods of calculation, (5) basis sets, (6) calculated properties, and (7) comments. For easy access to the references, the reader can consult the compound and author indexes. A short article on the reliability of ab initio calculations is included as an appendix; this gives a rough idea about the accuracy of the calculated results reported. As the book has been compiled using the resources of a computer data base of quantum chemistry literature, it is particularly up to date and the authors will be able to provide supplements regularly. This bibliography will be an asset to large departments of chemistry and all university libraries. (orig.)

  9. High throughput experimentation for the discovery of new catalysts

    International Nuclear Information System (INIS)

    Thomson, S.; Hoffmann, C.; Johann, T.; Wolf, A.; Schmidt, H.-W.; Farrusseng, D.; Schueth, F.

    2002-01-01

    Full text: The use of combinatorial chemistry to obtain new materials has been developed extensively by the pharmaceutical and biochemical industries, but such approaches have been slow to impact the field of heterogeneous catalysis. The reasons for this lie with difficulties associated with the synthesis, characterisation and determination of catalytic properties of such materials. In many synthetic and catalytic reactions, the conditions used are difficult to emulate using High Throughput Experimentation (HTE). Furthermore, the ability to screen these catalysts simultaneously in real time requires the development and/or modification of characterisation methods. Clearly, there is a need for both high throughput synthesis and screening of new and novel reactions, and we describe several new concepts that help to achieve these goals. Although such problems have impeded the development of combinatorial catalysis, the fact remains that many highly attractive processes still exist for which no suitable catalysts have been developed. The ability to decrease the time needed to evaluate catalysts is therefore essential, and this makes the use of high throughput techniques highly desirable. In this presentation we will describe the synthesis, catalytic testing and novel screening methods developed at the Max Planck Institute. Automated synthesis procedures, performed by the use of a modified Gilson pipette robot, will be described, as will the development of two fixed bed reactors (16 and 49 samples) and two three-phase reactors (25 and 29 samples) for catalytic testing. We will also present new techniques for the characterisation of catalysts and catalytic products using standard IR microscopy and infrared focal plane array detection, respectively.

  10. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought-stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale-up. In this work, we investigate a potentially scalable, high throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  11. Meeting Report: High-Throughput Technologies for In Vivo Imaging Agents

    Directory of Open Access Journals (Sweden)

    Robert J. Gillies

    2005-04-01

    Full Text Available Combinatorial chemistry and high-throughput screening have become standard tools for discovering new drug candidates with suitable pharmacological properties. Now, those same technologies are starting to be applied to the problem of discovering novel in vivo imaging agents. Important differences in the biological and pharmacological properties needed for imaging agents, compared to those for a therapeutic agent, require new screening methods that emphasize those characteristics, such as optimized residence time and tissue specificity, that make for a good imaging agent candidate.

  12. Cold molecules: Progress in quantum engineering of chemistry and quantum matter

    Science.gov (United States)

    Bohn, John L.; Rey, Ana Maria; Ye, Jun

    2017-09-01

    Cooling atoms to ultralow temperatures has produced a wealth of opportunities in fundamental physics, precision metrology, and quantum science. The more recent application of sophisticated cooling techniques to molecules, which has been more challenging to implement owing to the complexity of molecular structures, has now opened the door to the longstanding goal of precisely controlling molecular internal and external degrees of freedom and the resulting interaction processes. This line of research can leverage fundamental insights into how molecules interact and evolve to enable the control of reaction chemistry and the design and realization of a range of advanced quantum materials.

  13. Quantum dots for a high-throughput Pfu polymerase based multi-round polymerase chain reaction (PCR).

    Science.gov (United States)

    Sang, Fuming; Zhang, Zhizhou; Yuan, Lin; Liu, Deli

    2018-02-26

    Multi-round PCR is an important technique for obtaining enough target DNA from rare DNA resources, and is commonly used in many fields including forensic science, ancient DNA analysis and cancer research. However, multi-round PCR is often aborted, largely due to the accumulation of non-specific amplification during repeated amplifications. Here, we developed a Pfu polymerase based multi-round PCR technique assisted by quantum dots (QDs). Different PCR assays, DNA polymerases (Pfu and Taq), DNA sizes and GC contents were compared in this study. In the presence of QDs, PCR specificity could be retained even in the ninth round of amplification. Moreover, the longer and more complex the targets were, the earlier the abortion happened in multi-round PCR. However, no obvious enhancement of specificity was found in multi-round PCR using Taq DNA polymerase. Significantly, the fidelity of Pfu polymerase based multi-round PCR was not sacrificed in the presence of QDs. Besides, pre-incubation at 50 °C for an hour had no impact on multi-round PCR performance, which further confirmed the hot-start effect of QDs in multi-round PCR. The findings of this study demonstrate that a cost-effective and promising multi-round PCR technique for large-scale and high-throughput sample analysis can be established with high specificity, sensitivity and accuracy.

  14. High-throughput continuous cryopump

    International Nuclear Information System (INIS)

    Foster, C.A.

    1986-01-01

    A cryopump with a unique method of regeneration which allows continuous operation at high throughput has been constructed and tested. Deuterium was pumped continuously at a throughput of 30 Torr.L/s at a speed of 2000 L/s and a compression ratio of 200. Argon was pumped at a throughput of 60 Torr.L/s at a speed of 1275 L/s. To produce continuous operation of the pump, a method of regeneration that does not thermally cycle the pump is employed. A small chamber (the ''snail'') passes over the pumping surface and removes the frost from it either by mechanical action with a scraper or by local heating. The material removed is topologically in a secondary vacuum system with low conductance into the primary vacuum; thus, the exhaust can be pumped at pressures up to an effective compression ratio determined by the ratio of the pumping speed to the leakage conductance of the snail. The pump, which is all-metal-sealed and dry and which regenerates every 60 s, would be an ideal system for pumping tritium. Potential fusion applications are for pump limiters, for repeating pneumatic pellet injection lines, and for the centrifuge pellet injector spin tank, all of which will require pumping tritium at high throughput. Industrial applications requiring ultraclean pumping of corrosive gases at high throughput, such as the reactive ion etch semiconductor process, may also be feasible.
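The deuterium figures quoted above are mutually consistent under the standard throughput relation Q = S·p (throughput equals pumping speed times inlet pressure); a quick check:

```python
# Throughput relation for a vacuum pump: Q = S * p.
throughput = 30.0        # Torr·L/s, deuterium figure quoted above
speed = 2000.0           # L/s
inlet_pressure = throughput / speed                     # 0.015 Torr
compression_ratio = 200.0
exhaust_pressure = inlet_pressure * compression_ratio   # 3.0 Torr at the snail
```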

  15. High throughput octal alpha/gamma spectrometer for low level bioassay estimations

    International Nuclear Information System (INIS)

    Bhasin, B.D.; Shirke, S.H.; Suri, M.M.; Vaidya, P.P.; Ghodgaonkar, M.D.

    1995-01-01

    The present paper describes the development of a high-throughput octal alpha spectrometry system specially developed for the estimation of low levels of actinides in bioassay and environmental samples. The system simultaneously processes the outputs coming from eight independent detectors. It can be configured to simultaneously record low level alpha and gamma spectra. The high throughput is achieved by using a prioritised multiplexer router. The prioritised multiplexing and routing, coupled with a fast 8K ADC (conversion time 20 μs), allow simultaneous acquisition of multiple spectra without any significant loss in counts. The dual-port memory (8K, 24-bit) facilitates easy online viewing of spectrum buildup. Menu-driven, user-friendly software makes the system convenient to use. Specially developed software provides built-in routines for processing the spectra and estimating the isotopic activity. The interactive mode of the software provides easy identification of isotopes compatible with the separation chemistry of different actinides. (author). 6 refs., 2 figs

  16. Quantum chemistry and scientific calculus

    International Nuclear Information System (INIS)

    Gervais, H.P.

    1988-01-01

    The 1988 progress report of the Polytechnic School research team concerning quantum chemistry and scientific computing. The research program involves the following topics: transition metal - carbon monoxide systems, which are a suitable model for chemisorption phenomena; the introduction of vibronic perturbations in the magnetic screening constants; and the gauge invariance method (used in the calculation of the magnetic perturbations), extended to the case of static or dynamic electrical polarizabilities. The published papers, the conference communications and the theses are listed. [fr]

  17. High-throughput experimentation in synthetic polymer chemistry: From RAFT and anionic polymerizations to process development

    NARCIS (Netherlands)

    Guerrero-Sanchez, C.A.; Paulus, R.M.; Fijten, M.W.M.; Mar, de la M.J.; Hoogenboom, R.; Schubert, U.S.

    2006-01-01

    The application of combinatorial and high-throughput approaches in polymer research is described. An overview of the utilized synthesis robots is given, including different parallel synthesizers and a process development robot. In addition, the application of the parallel synthesis robots to

  18. High-throughput characterization methods for lithium batteries

    Directory of Open Access Journals (Sweden)

    Yingchun Lyu

    2017-09-01

    Full Text Available The development of high-performance lithium-ion batteries requires the discovery of new materials and the optimization of key components. In contrast with the traditional one-by-one method, high-throughput methods can synthesize and characterize a large number of compositionally varying samples, accelerating the discovery, development and optimization of materials. Because of rapid progress in thin-film and automatic-control technologies, thousands of compounds with different compositions can now be synthesized rapidly, even in a single experiment. However, the lack of rapid or combinatorial characterization technologies to match high-throughput synthesis methods limits the application of high-throughput technology. Here, we review a series of representative high-throughput characterization methods used in lithium batteries, including high-throughput structural and electrochemical characterization methods and rapid measuring technologies based on synchrotron light sources.

  19. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    International Nuclear Information System (INIS)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-01-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi–Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources. (paper)
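The block-diagonality argument can be made concrete with a toy model (not the paper's operator-space scheme): a number-conserving Hamiltonian for four spinless-fermion modes, restricted to the two-particle sector, acts on a space of dimension C(4,2) = 6 and therefore fits in 3 qubits rather than the 4 a Jordan-Wigner encoding of the full Fock space would need:

```python
import numpy as np
from itertools import combinations
from math import comb, ceil, log2

# Four spinless-fermion modes, nearest-neighbor hopping of amplitude -1,
# restricted to the N = 2 number sector. Jordan-Wigner signs are +1 for
# adjacent hops, so plain occupation bookkeeping suffices here.
n_modes, n_particles = 4, 2
basis = list(combinations(range(n_modes), n_particles))   # occupied-mode sets
index = {occ: i for i, occ in enumerate(basis)}

H = np.zeros((len(basis), len(basis)))
for occ in basis:
    for i in range(n_modes - 1):          # c_i^dag c_{i+1} + h.c., t = 1
        if i + 1 in occ and i not in occ:
            new = tuple(sorted((set(occ) - {i + 1}) | {i}))
            H[index[new], index[occ]] += -1.0
            H[index[occ], index[new]] += -1.0

ground = np.linalg.eigvalsh(H)[0]                       # lowest 2-fermion energy
full_qubits = n_modes                                   # full Fock space
sector_qubits = ceil(log2(comb(n_modes, n_particles)))  # one block only
```

For this free-fermion chain the sector ground energy equals the sum of the two lowest single-particle levels, which provides an easy consistency check on the block construction.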

  20. New high-throughput material-exploration system based on combinatorial chemistry and electrostatic atomization

    International Nuclear Information System (INIS)

    Fujimoto, K.; Takahashi, H.; Ito, S.; Inoue, S.; Watanabe, M.

    2006-01-01

    As a tool to facilitate future material explorations, our group has developed a new combinatorial system for the high-throughput preparation of compounds made up of more than three components. The system works in two steps: the atomization of a liquid by a high electric field, followed by deposition onto a grounded substrate. The combinatorial system based on this method has multiple syringe pumps. The starting materials are fed through the syringe pumps into a manifold, thoroughly mixed as they pass through the manifold, and atomized from the tip of a stainless steel nozzle onto a grounded substrate.

  1. Automatic Differentiation in Quantum Chemistry with Applications to Fully Variational Hartree-Fock.

    Science.gov (United States)

    Tamayo-Mendoza, Teresa; Kreisbeck, Christoph; Lindh, Roland; Aspuru-Guzik, Alán

    2018-05-23

    Automatic differentiation (AD) is a powerful tool that allows calculating derivatives of implemented algorithms with respect to all of their parameters up to machine precision, without the need to explicitly add any additional functions. AD therefore has great potential in quantum chemistry, where gradients are omnipresent but difficult to obtain, and researchers typically spend considerable time finding suitable analytical forms when implementing derivatives. Here, we demonstrate that AD can be used to compute gradients with respect to any parameter throughout a complete quantum chemistry method. We present DiffiQult, a Hartree-Fock implementation differentiated entirely with AD tools. DiffiQult is a software package written in plain Python with minimal deviation from standard code, illustrating the capability of AD to save human effort and time when implementing exact gradients in quantum chemistry. We leverage the obtained gradients to optimize the parameters of one-particle basis sets in the context of the floating Gaussian framework.
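
The core idea, differentiating straight through the code that evaluates the energy, can be sketched with forward-mode dual numbers. The example below is not DiffiQult; it is a stdlib-only toy that optimizes the exponent of a single 1s Gaussian basis function for the hydrogen atom, whose variational energy E(α) = 3α/2 − 2√(2α/π) (in hartree) has the textbook optimum α = 8/(9π) ≈ 0.2829:

```python
import math

class Dual:
    """Minimal forward-mode AD value: x + eps*dx."""
    def __init__(self, x, dx=0.0):
        self.x, self.dx = x, dx
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.x + o.x, self.dx + o.dx)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.x - o.x, self.dx - o.dx)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.x * o.x, self.x * o.dx + self.dx * o.x)
    __rmul__ = __mul__

def dsqrt(d):
    """sqrt with the chain rule applied to the dual part."""
    r = math.sqrt(d.x)
    return Dual(r, d.dx / (2.0 * r))

def energy(alpha):
    # Variational energy of the H atom with one 1s Gaussian of exponent alpha:
    # E(alpha) = 3*alpha/2 - 2*sqrt(2*alpha/pi)   (hartree)
    return 1.5 * alpha - 2.0 * dsqrt((2.0 / math.pi) * alpha)

def grad(f, x):
    return f(Dual(x, 1.0)).dx   # seed the derivative with 1

# Optimize the basis exponent by gradient descent on the AD gradient.
alpha = 1.0
for _ in range(200):
    alpha -= 0.5 * grad(energy, alpha)
print(round(alpha, 4))           # -> close to 8/(9*pi) ≈ 0.2829
```

DiffiQult applies the same principle, via mature AD tooling, to every parameter of a full Hartree-Fock calculation rather than to a one-parameter toy.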

  2. Quantum Monte Carlo tunneling from quantum chemistry to quantum annealing

    Science.gov (United States)

    Mazzola, Guglielmo; Smelyanskiy, Vadim N.; Troyer, Matthias

    2017-10-01

    Quantum tunneling is ubiquitous across different fields, from quantum chemical reactions and magnetic materials to quantum simulators and quantum computers. While simulating the real-time quantum dynamics of tunneling is infeasible for high-dimensional systems, quantum tunneling also shows up in quantum Monte Carlo (QMC) simulations, which aim to simulate quantum statistics with resources growing only polynomially with the system size. Here we extend the recent results obtained for quantum spin models [Phys. Rev. Lett. 117, 180402 (2016), 10.1103/PhysRevLett.117.180402], and we study continuous-variable models for proton transfer reactions. We demonstrate that QMC simulations efficiently recover the scaling of ground-state tunneling rates due to the existence of an instanton path, which always connects the reactant state with the product. We discuss the implications of our results in the context of quantum chemical reactions and quantum annealing, where quantum tunneling is expected to be a valuable resource for solving combinatorial optimization problems.
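
The tunneling-rate scaling referred to above is governed by the instanton (WKB) action, S = ∫ √(2mV(x)) dx across the barrier, with the rate suppressed as exp(−S/ħ). As a deterministic toy check (not the paper's QMC; an assumed quartic double well with m = ħ = 1), the action can be computed by simple quadrature and compared against its closed form:

```python
import math

# Symmetric double well V(x) = V0 * (x**2 - a**2)**2 / a**4
# (illustrative parameters, with m = hbar = 1)
V0, a = 4.0, 1.0

def V(x):
    return V0 * (x * x - a * a) ** 2 / a ** 4

def instanton_action(n=20000):
    """Trapezoid rule for S = integral of sqrt(2*V(x)) dx between the minima."""
    h = 2.0 * a / n
    s = 0.0
    for i in range(n + 1):
        x = -a + i * h
        w = 0.5 if i in (0, n) else 1.0   # trapezoid end-point weights
        s += w * math.sqrt(2.0 * V(x))
    return s * h

S = instanton_action()
# For this potential the integral is exactly (4/3)*a*sqrt(2*V0)
print(S, (4.0 / 3.0) * a * math.sqrt(2.0 * V0))
```

For this well the integrand is √(2V0)(a² − x²)/a², so S = (4/3)a√(2V0); the quadrature reproduces it to high accuracy.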

  3. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    Science.gov (United States)

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative for biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, throughput is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved the turnover of calorimetry experiments in a high-throughput format, where 25 or more runs can be conducted per day. The cost and effort required to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State, including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering, are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has increased data throughput, which has been leveraged into grant support, new faculty hires, and several exciting publications. © 2016 Elsevier Inc. All rights reserved.

  4. Density functional representation of quantum chemistry. II. Local quantum field theories of molecular matter in terms of the charge density operator do not work

    International Nuclear Information System (INIS)

    Primas, H.; Schleicher, M.

    1975-01-01

    A comprehensive review of the attempts to rephrase molecular quantum mechanics in terms of the particle density operator and the current density or phase density operator is given. All pertinent investigations which have come to attention suffer from severe mathematical inconsistencies and are not adequate to the few-body problem of quantum chemistry. The origin of the failure of these attempts is investigated, and it is shown that a realization of a local quantum field theory of molecular matter in terms of observables would presuppose the solution of many highly nontrivial mathematical problems

  5. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    Science.gov (United States)

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland (FIMM) was established in 2010 to serve as a national and international academic screening unit providing access to state-of-the-art instrumentation for chemical and RNAi-based high-throughput screening. The initial focus of the unit was multiwell-plate-based chemical screening and high-content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screens are performed at different scales, primarily in multiwell-plate-based assays with a wide range of readout possibilities and a focus on ultraminiaturization to allow affordable screening for academic users. In addition to high-throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed, and high-throughput applications for other types of research, such as genomics, sequencing, and biobanking operations. Importantly, in line with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is focused on high-throughput systems biology platforms for functional profiling of patient cells in personalized and precision medicine projects.

  6. High-Throughput Synthetic Chemistry Enabled by Organic Solvent Disintegrating Tablet.

    Science.gov (United States)

    Li, Tingting; Xu, Lei; Xing, Yanjun; Xu, Bo

    2017-01-17

    Synthetic chemistry remains a time- and labor-intensive process of an inherently hazardous nature. Our organic solvent disintegrating tablet (O-Tab) technology has shown potential to make industrial/synthetic chemistry more efficient. As is the case with pharmaceutical tablets, our reagent-containing O-Tabs are mechanically strong but disintegrate rapidly when in contact with reaction media (organic solvents). O-Tabs containing sensitive chemicals can be further coated to insulate them from air and moisture. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Human development VIII: a theory of "deep" quantum chemistry and cell consciousness: quantum chemistry controls genes and biochemistry to give cells and higher organisms consciousness and complex behavior.

    Science.gov (United States)

    Ventegodt, Søren; Hermansen, Tyge Dahl; Flensborg-Madsen, Trine; Nielsen, Maj Lyck; Merrick, Joav

    2006-11-14

    Deep quantum chemistry is a theory of deeply structured quantum fields carrying the biological information of the cell, making it able to remember, intend, represent the inner and outer world for comparison, understand what it "sees", and make choices about its structure, form, behavior, and division. We suggest that deep quantum chemistry gives the cell consciousness and all the qualities and abilities related to consciousness. We use geometric symbolism, a pre-mathematical and philosophical approach to problems that cannot yet be handled mathematically. Using Occam's razor, we have started with the simplest model that works; we presume this to be a many-dimensional, spiral fractal. We suggest that all the electrons of the large biological molecules' orbitals make one huge "cell orbital", which is structured according to the spiral fractal nature of quantum fields. Consciousness of single cells, multicellular structures such as organs, multicellular organisms, and multi-individual colonies (like ants) and human societies can thus be explained by deep quantum chemistry. When biochemical activity is strictly controlled by the quantum-mechanical super-orbital of the cell, this orbital can deliver energetic quanta as biological information, distributed through many fractal levels of the cell to guide the form and behavior of an individual single- or multi-cellular organism. The top level of information is the consciousness of the cell or organism, which controls all the biochemical processes. By this speculative work, inspired by Penrose and Hameroff, we hope to inspire other researchers to formulate stricter and more mathematically rigorous hypotheses on the complex and coherent nature of matter, life, and consciousness.

  8. Assessing Advanced High School and Undergraduate Students' Thinking Skills: The Chemistry--From the Nanoscale to Microelectronics Module

    Science.gov (United States)

    Dori, Yehudit Judy; Dangur, Vered; Avargil, Shirly; Peskin, Uri

    2014-01-01

    Chemistry students in Israel have two options for studying chemistry: basic or honors (advanced placement). For instruction in high school honors chemistry courses, we developed a module focusing on abstract topics in quantum mechanics: Chemistry--From the Nanoscale to Microelectronics. The module adopts a visual-conceptual approach, which…

  9. Microscale High-Throughput Experimentation as an Enabling Technology in Drug Discovery: Application in the Discovery of (Piperidinyl)pyridinyl-1H-benzimidazole Diacylglycerol Acyltransferase 1 Inhibitors.

    Science.gov (United States)

    Cernak, Tim; Gesmundo, Nathan J; Dykstra, Kevin; Yu, Yang; Wu, Zhicai; Shi, Zhi-Cai; Vachal, Petr; Sperbeck, Donald; He, Shuwen; Murphy, Beth Ann; Sonatore, Lisa; Williams, Steven; Madeira, Maria; Verras, Andreas; Reiter, Maud; Lee, Claire Heechoon; Cuff, James; Sherer, Edward C; Kuethe, Jeffrey; Goble, Stephen; Perrotto, Nicholas; Pinto, Shirly; Shen, Dong-Ming; Nargund, Ravi; Balkovec, James; DeVita, Robert J; Dreher, Spencer D

    2017-05-11

    Miniaturization and parallel processing play an important role in the evolution of many technologies. We demonstrate the application of miniaturized high-throughput experimentation methods to resolve synthetic chemistry challenges on the frontlines of a lead optimization effort to develop diacylglycerol acyltransferase (DGAT1) inhibitors. Reactions were performed on ∼1 mg scale using glass microvials, providing a miniaturized high-throughput experimentation capability that was used to study a challenging SNAr reaction. The availability of robust synthetic chemistry conditions discovered in these miniaturized investigations enabled the development of structure-activity relationships that ultimately led to the discovery of soluble, selective, and potent inhibitors of DGAT1.

  10. The Evolution of Chemical High-Throughput Experimentation To Address Challenging Problems in Pharmaceutical Synthesis.

    Science.gov (United States)

    Krska, Shane W; DiRocco, Daniel A; Dreher, Spencer D; Shevlin, Michael

    2017-12-19

    The structural complexity of pharmaceuticals presents a significant challenge to modern catalysis. Many published methods that work well on simple substrates often fail when attempts are made to apply them to complex drug intermediates. The use of high-throughput experimentation (HTE) techniques offers a means to overcome this fundamental challenge by facilitating the rational exploration of large arrays of catalysts and reaction conditions in a time- and material-efficient manner. Initial forays into the use of HTE in our laboratories for solving chemistry problems centered on screening chiral precious-metal catalysts for homogeneous asymmetric hydrogenation. The success of these early efforts in developing efficient catalytic steps for late-stage development programs motivated the desire to extend this approach to other high-value catalytic chemistries. Doing so, however, required significant advances in reactor and workflow design and automation to enable the effective assembly and agitation of arrays of heterogeneous reaction mixtures and the retention of volatile solvents over a wide range of temperatures. Associated innovations in high-throughput analytical chemistry techniques greatly increased the efficiency and reliability of these methods. These evolved HTE techniques have been utilized extensively to develop highly innovative catalysis solutions to the most challenging problems in large-scale pharmaceutical synthesis. Starting with Pd- and Cu-catalyzed cross-coupling chemistry, subsequent efforts expanded to other valuable modern synthetic transformations such as chiral phase-transfer catalysis, photoredox catalysis, and C-H functionalization. As our experience and confidence in HTE techniques matured, we envisioned their application beyond problems in process chemistry to address the needs of medicinal chemists. Here the problem of reaction generality is felt most acutely, and HTE approaches should prove broadly enabling.

  11. Chlorophyll fluorescence is a rigorous, high throughput tool to analyze the impacts of genotype, species, and stress on plant and ecosystem productivity

    Science.gov (United States)

    Ewers, B. E.; Pleban, J. R.; Aston, T.; Beverly, D.; Speckman, H. N.; Hosseini, A.; Bretfeld, M.; Edwards, C.; Yarkhunova, Y.; Weinig, C.; Mackay, D. S.

    2017-12-01

    Abiotic and biotic stresses reduce plant productivity, yet high-throughput characterization of plant responses across genotypes, species, and stress conditions is limited by both instrumentation and data analysis techniques. Recent developments in chlorophyll a fluorescence measurement at leaf to landscape scales could improve our predictive understanding of plant responses to stressors. We analyzed the interaction of species and stress across two crop types; five gymnosperm and two angiosperm tree species from boreal and montane forests; grasses, forbs, and shrubs from sagebrush steppe; and 30 tree species from seasonally wet tropical forest. We also analyzed chlorophyll fluorescence and gas exchange data from twelve Brassica rapa crop accessions and 120 recombinant inbred lines to investigate phenotypic responses to drought. These data represent more than 10,000 measurements of fluorescence and allow us to answer two questions: (1) are the measurements from high-throughput, handheld and drone-mounted instruments quantitatively similar to those from lower-throughput camera-mounted and gas-exchange instruments, and (2) do the measurements detect differences in genotypic, species, and environmental stress effects on plants? We found through regression that the high- and low-throughput instruments agreed across both individual chlorophyll fluorescence components and calculated ratios, and were not different from a 1:1 relationship, with correlation greater than 0.9. We used hierarchical Bayesian modeling to test the second question. We found a linear relationship between the fluorescence-derived quantum yield of PSII and the quantum yield of CO2 assimilation from gas exchange, with a slope of ca. 0.1, indicating that the efficiency of the entire photosynthetic process was about 10% of that of PSII across genotypes, species, and drought stress. Posterior estimates of quantum yield revealed that drought-treatment, genotype, and species differences were preserved when accounting for measurement uncertainty.
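
The slope-of-ca.-0.1 relationship above is a through-origin regression of one quantum yield on another. The snippet below illustrates the calculation with made-up data points (not values from the study), using the least-squares slope b = Σxy / Σx²:

```python
# Illustrative through-origin least-squares fit of the kind of relationship
# reported above: quantum yield of CO2 assimilation (Phi_CO2) regressed on
# fluorescence-derived PSII yield (Phi_PSII).  Data are fabricated for the
# demonstration, not taken from the study.
phi_psii = [0.2, 0.3, 0.4, 0.5, 0.6, 0.7]
phi_co2  = [0.021, 0.029, 0.041, 0.049, 0.062, 0.069]

# slope through the origin: b = sum(x*y) / sum(x*x)
b = sum(x * y for x, y in zip(phi_psii, phi_co2)) / sum(x * x for x in phi_psii)
print(round(b, 3))   # ≈ 0.1, i.e. CO2 fixation runs at ~10% of PSII yield
```

The study itself estimates this slope within a hierarchical Bayesian model so that measurement uncertainty propagates into the posterior, but the point estimate is the same kind of quantity.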

  12. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High-throughput modifications have been developed during recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are the cultivation and manipulation of cells or tissues before and after exposure, and the freezing of treated samples until comet analysis and scoring. High-throughput methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high-throughput modifications now available vary widely in their versatility, capacity, complexity, and costs. The bottleneck for further increases in throughput appears to be the scoring.

  13. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines, also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress

  14. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into high-dimensional neuroinformatic representations containing O(10³–10⁴) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science, including genomics and social networks. Such high-throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high-throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  15. Human Development VIII: A Theory of “Deep” Quantum Chemistry and Cell Consciousness: Quantum Chemistry Controls Genes and Biochemistry to Give Cells and Higher Organisms Consciousness and Complex Behavior

    Directory of Open Access Journals (Sweden)

    Søren Ventegodt

    2006-01-01

    Deep quantum chemistry is a theory of deeply structured quantum fields carrying the biological information of the cell, making it able to remember, intend, represent the inner and outer world for comparison, understand what it “sees”, and make choices about its structure, form, behavior, and division. We suggest that deep quantum chemistry gives the cell consciousness and all the qualities and abilities related to consciousness. We use geometric symbolism, a pre-mathematical and philosophical approach to problems that cannot yet be handled mathematically. Using Occam’s razor, we have started with the simplest model that works; we presume this to be a many-dimensional, spiral fractal. We suggest that all the electrons of the large biological molecules’ orbitals make one huge “cell orbital”, which is structured according to the spiral fractal nature of quantum fields. Consciousness of single cells, multicellular structures such as organs, multicellular organisms, and multi-individual colonies (like ants) and human societies can thus be explained by deep quantum chemistry. When biochemical activity is strictly controlled by the quantum-mechanical super-orbital of the cell, this orbital can deliver energetic quanta as biological information, distributed through many fractal levels of the cell to guide the form and behavior of an individual single- or multi-cellular organism. The top level of information is the consciousness of the cell or organism, which controls all the biochemical processes. By this speculative work, inspired by Penrose and Hameroff, we hope to inspire other researchers to formulate stricter and more mathematically rigorous hypotheses on the complex and coherent nature of matter, life, and consciousness.

  16. High-Throughput Particle Manipulation Based on Hydrodynamic Effects in Microchannels

    Directory of Open Access Journals (Sweden)

    Chao Liu

    2017-03-01

    Microfluidic techniques are effective tools for precise manipulation of particles and cells, whose enrichment and separation are crucial for a wide range of applications in biology, medicine, and chemistry. Recently, lateral particle migration induced by the intrinsic hydrodynamic effects in microchannels, such as inertia and elasticity, has shown its promise for high-throughput and label-free particle manipulation. The particle migration can be engineered to realize the controllable focusing and separation of particles based on a difference in size. The widespread use of inertial and viscoelastic microfluidics depends on the understanding of hydrodynamic effects on particle motion. This review will summarize the progress in the fundamental mechanisms and key applications of inertial and viscoelastic particle manipulation.

  17. A high-throughput microtiter plate based method for the determination of peracetic acid and hydrogen peroxide.

    Science.gov (United States)

    Putt, Karson S; Pugh, Randall B

    2013-01-01

    Peracetic acid is gaining usage in numerous industries that have found a myriad of uses for its antimicrobial activity. However, rapid high-throughput quantitation methods for peracetic acid and hydrogen peroxide are lacking. Herein, we describe the development of a high-throughput microtiter-plate assay built upon well-known and trusted titration reactions. The adaptation of these titration chemistries to rapid plate-based absorbance methods for the sequential determination of hydrogen peroxide specifically and of the total amount of peroxides present in solution is described. The results of these methods were compared to those of a standard titration and found to be in good agreement. Additionally, the utility of the developed method is demonstrated through the generation of degradation curves of both peracetic acid and hydrogen peroxide in a mixed solution.
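
The sequential-determination logic can be sketched numerically: one absorbance reading responds to hydrogen peroxide alone, a second to total peroxide (H2O2 plus peracetic acid), and PAA follows by difference. The constants and readings below are illustrative placeholders, not values from the paper:

```python
# Hypothetical worked example of the two-step determination described above.
# Molar absorptivities (eps, in 1/(M*cm)) and absorbances are made up.

def beer_lambert(absorbance, epsilon, path_cm=1.0):
    """Concentration (M) from the Beer–Lambert law A = epsilon * c * l."""
    return absorbance / (epsilon * path_cm)

def paa_by_difference(a_h2o2, a_total, eps_h2o2, eps_total):
    """Return (c_PAA, c_H2O2): total peroxide minus the H2O2-specific reading."""
    c_h2o2 = beer_lambert(a_h2o2, eps_h2o2)
    c_total = beer_lambert(a_total, eps_total)
    return c_total - c_h2o2, c_h2o2

c_paa, c_h2o2 = paa_by_difference(a_h2o2=0.30, a_total=0.90,
                                  eps_h2o2=1000.0, eps_total=1000.0)
print(c_paa, c_h2o2)   # ≈ 6e-4 M PAA, 3e-4 M H2O2
```

In a 96- or 384-well format the same arithmetic is applied column-wise to the plate reader output, which is what makes the difference method amenable to high throughput.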

  18. Alternative algebraic approaches in quantum chemistry

    International Nuclear Information System (INIS)

    Mezey, Paul G.

    2015-01-01

    Various algebraic approaches in quantum chemistry follow a common principle: the fundamental properties and interrelations providing the most essential features of a quantum chemical representation of a molecule or a chemical process, such as a reaction, can always be described by algebraic methods. Whereas such algebraic methods often provide precise, even numerical, answers, their main role is to give a framework that can be elaborated and converted into computational methods by involving alternative mathematical techniques, subject to the constraints and directions provided by algebra. In general, algebra describes sets of interrelations, often phrased in terms of algebraic operations, without much concern for the actual entities exhibiting these interrelations. However, in many instances, the very realizations of two seemingly unrelated algebraic structures by actual quantum chemical entities or properties play additional roles, and unexpected connections between different algebraic structures often give new insight. Here we shall be concerned with two alternative algebraic structures: the fundamental group of reaction mechanisms, based on the energy-dependent topology of potential energy surfaces, and the interrelations among point symmetry groups for various distorted nuclear arrangements of molecules. These two distinct algebraic structures provide interesting interrelations, which can be exploited in actual studies of molecular conformational and reaction processes. Two relevant theorems will be discussed.

  19. Alternative algebraic approaches in quantum chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Mezey, Paul G., E-mail: paul.mezey@gmail.com [Canada Research Chair in Scientific Modeling and Simulation, Department of Chemistry and Department of Physics and Physical Oceanography, Memorial University of Newfoundland, 283 Prince Philip Drive, St. John's, NL A1B 3X7 (Canada)

    2015-01-22

    Various algebraic approaches in quantum chemistry follow a common principle: the fundamental properties and interrelations providing the most essential features of a quantum chemical representation of a molecule or a chemical process, such as a reaction, can always be described by algebraic methods. Whereas such algebraic methods often provide precise, even numerical, answers, their main role is to give a framework that can be elaborated and converted into computational methods by involving alternative mathematical techniques, subject to the constraints and directions provided by algebra. In general, algebra describes sets of interrelations, often phrased in terms of algebraic operations, without much concern for the actual entities exhibiting these interrelations. However, in many instances, the very realizations of two seemingly unrelated algebraic structures by actual quantum chemical entities or properties play additional roles, and unexpected connections between different algebraic structures often give new insight. Here we shall be concerned with two alternative algebraic structures: the fundamental group of reaction mechanisms, based on the energy-dependent topology of potential energy surfaces, and the interrelations among point symmetry groups for various distorted nuclear arrangements of molecules. These two distinct algebraic structures provide interesting interrelations, which can be exploited in actual studies of molecular conformational and reaction processes. Two relevant theorems will be discussed.

  20. Development and operation of a high-throughput accurate-wavelength lens-based spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Bell, Ronald E., E-mail: rbell@pppl.gov [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543 (United States)

    2014-11-15

    A high-throughput spectrometer for the 400–820 nm wavelength range has been developed for charge exchange recombination spectroscopy or general spectroscopy. A large 2160 mm⁻¹ grating is matched with fast f/1.8 200 mm lenses, which provide stigmatic imaging. A precision optical encoder measures the grating angle with an accuracy ≤0.075 arc sec. A high-quantum-efficiency, low-etaloning CCD detector allows operation at longer wavelengths. A patch panel allows input fibers to interface with interchangeable fiber holders that attach to a kinematic mount at the entrance slit. Computer-controlled hardware allows automated control of wavelength, timing, and f-number, as well as automated data collection and wavelength calibration.
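
The grating angle set by the encoder follows from the grating equation mλ = d(sin α + sin β). The sketch below assumes a Littrow-like geometry (α = β = θ, an assumption for illustration; the instrument's actual mount geometry is not given in the abstract) with the stated 2160 mm⁻¹ groove density:

```python
import math

# Grating equation in Littrow configuration: 2*d*sin(theta) = m*lambda.
# The 2160 mm^-1 groove density is from the abstract; the Littrow geometry
# and first-order operation are assumptions made for this illustration.
GROOVES_PER_MM = 2160.0
d_nm = 1e6 / GROOVES_PER_MM          # groove spacing in nm (~462.96 nm)

def littrow_angle_deg(wavelength_nm, order=1):
    """Grating angle (degrees) that retro-diffracts the given wavelength."""
    s = order * wavelength_nm / (2.0 * d_nm)
    if not -1.0 <= s <= 1.0:
        raise ValueError("wavelength not diffracted in this order")
    return math.degrees(math.asin(s))

for wl in (400.0, 600.0, 820.0):     # the spectrometer's stated range
    print(wl, round(littrow_angle_deg(wl), 2))
```

Note that the first-order cutoff 2d ≈ 926 nm comfortably covers the instrument's 400–820 nm range, consistent with the single large grating described above.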

  1. Quantum Chemistry of Solids LCAO Treatment of Crystals and Nanostructures

    CERN Document Server

    Evarestov, Robert A

    2012-01-01

    Quantum Chemistry of Solids delivers a comprehensive account of the main features and possibilities of LCAO methods for first-principles calculations of the electronic structure of periodic systems. The first part describes the basic theory underlying the LCAO methods applied to periodic systems and the use of Hartree-Fock (HF), density functional theory (DFT), and hybrid Hamiltonians. Translation and site symmetry considerations are included to establish the connection between k-space solid-state physics and real-space quantum chemistry. The inclusion of electron correlation effects for periodic systems is considered on the basis of localized crystalline orbitals. The possibilities of LCAO methods for chemical bonding analysis in periodic systems are discussed. The second part deals with the applications of LCAO methods to calculations of bulk crystal properties, including magnetic ordering and crystal structure optimization. In the second edition two new chapters are added in the application part II of t...

  2. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode of action (with dose dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  3. Application of ToxCast High-Throughput Screening and ...

    Science.gov (United States)

    Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.

  4. A high-throughput microtiter plate based method for the determination of peracetic acid and hydrogen peroxide.

    Directory of Open Access Journals (Sweden)

    Karson S Putt

    Full Text Available Peracetic acid is gaining usage in numerous industries that have found a myriad of uses for its antimicrobial activity. However, rapid high-throughput quantitation methods for peracetic acid and hydrogen peroxide are lacking. Herein, we describe the development of a high-throughput microtiter-plate-based assay built upon well-known and trusted titration chemistries. The adaptation of these titration chemistries to rapid plate-based absorbance methods for the sequential determination of hydrogen peroxide specifically and of the total amount of peroxides present in solution is described. The results of these methods were compared to those of a standard titration and found to be in good agreement. Additionally, the utility of the developed method is demonstrated through the generation of degradation curves of both peracetic acid and hydrogen peroxide in a mixed solution.

  5. A high throughput DNA extraction method with high yield and quality

    Directory of Open Access Journals (Sweden)

    Xin Zhanguo

    2012-07-01

    Full Text Available Abstract Background: Preparation of large quantities of high-quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as genetic mapping, TILLING (Targeting Induced Local Lesions IN Genomes), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available. However, they are either low throughput, low yield, or costly. Here, we describe a method for high-throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. Results: We developed a high-throughput DNA isolation method by combining a high-yield CTAB extraction with an improved cleanup procedure based on the MagAttract kit. The method yielded large quantities of high-quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30-fold with 4 times less consumption of MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. Conclusion: A high-throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample for magnetic beads plus other consumables that other methods also need.

  6. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong; Xu, Chao [Department of Modern Physics, University of Science and Technology of China, Hefei 230026 (China)

    2016-04-15

    This paper proposes a new entropy extraction mechanism based on sampling phase jitter in ring oscillators, making a high-throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in the FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with maximum entropy and a fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are exploited to realize precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method, in a purely digital fashion, can provide high-speed, high-quality random bit sequences for a variety of embedded applications.
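    The throughput scaling described above is simple arithmetic: the aggregate rate is the per-channel rate times the number of parallel generation channels. A minimal sketch, with illustrative names not taken from the paper, using the figures quoted in the abstract (64 units, 7.68 Gbps total):

```python
def aggregate_throughput_gbps(channels: int, per_channel_mbps: float) -> float:
    """Total TRNG output rate in Gbps for `channels` parallel generator units."""
    return channels * per_channel_mbps / 1000.0

# 7.68 Gbps over 64 units implies 120 Mbps per generation channel.
per_channel = 7680 / 64
print(per_channel)                                # 120.0
print(aggregate_throughput_gbps(64, per_channel))  # 7.68
```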

  7. Spiers Memorial Lecture. Quantum chemistry: the first seventy years.

    Science.gov (United States)

    McWeeny, Roy

    2007-01-01

    Present-day theoretical chemistry is rooted in Quantum Mechanics. The aim of the opening lecture is to trace the evolution of Quantum Chemistry from the Heitler-London paper of 1927 up to the end of the last century, emphasizing concepts rather than calculations. The importance of symmetry concepts became evident in the early years: one thinks of the necessary anti-symmetry of the wave function under electron permutations, the Pauli principle, the aufbau scheme, and the classification of spectroscopic states. But for chemists perhaps the key concept is embodied in the Hellmann-Feynman theorem, which provides a pictorial interpretation of chemical bonding in terms of classical electrostatic forces exerted on the nuclei by the electron distribution. Much of the lecture is concerned with various electron distribution functions--the electron density, the current density, the spin density, and other 'property densities'--and with their use in interpreting both molecular structure and molecular properties. Other topics touched upon include Response theory and propagators; Chemical groups in molecules and the group function approach; Atoms in molecules and Bader's theory; Electron correlation and the 'pair function'. Finally, some long-standing controversies, in particular the EPR paradox, are re-examined in the context of molecular dissociation. By admitting the concept of symmetry breaking, along with the use of the von Neumann-Dirac statistical ensemble, orthodox quantum mechanics can lead to a convincing picture of the dissociation mechanism.

  8. Quantum Chemistry of Solids The LCAO First Principles Treatment of Crystals

    CERN Document Server

    Evarestov, Robert A

    2007-01-01

    Quantum Chemistry of Solids delivers a comprehensive account of the main features and possibilities of LCAO methods for the first principles calculations of electronic structure of periodic systems. The first part describes the basic theory underlying the LCAO methods applied to periodic systems and the use of wave-function-based (Hartree-Fock), density-based (DFT) and hybrid hamiltonians. The translation and site symmetry consideration is included to establish connection between k-space solid-state physics and real-space quantum chemistry methods in the framework of cyclic model of an infinite crystal. The inclusion of electron correlation effects for periodic systems is considered on the basis of localized crystalline orbitals. The possibilities of LCAO methods for chemical bonding analysis in periodic systems are discussed. The second part deals with the applications of LCAO methods for calculations of bulk crystal properties, including magnetic ordering and crystal structure optimization. The discussion o...

  9. The quantum gamble

    CERN Document Server

    Boeyens, Jan C A

    2016-01-01

    This volume, written by a highly cited author, presents the history of quantum theory together with open questions and remaining problems in terms of the plausibility of quantum chemistry and physics. It also provides insights into the theory of matter-wave mechanics. The content is aimed at students and lecturers in chemistry, physics and the philosophy of science.

  10. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.

  11. Development of a high-throughput real time PCR based on a hot-start alternative for Pfu mediated by quantum dots

    Science.gov (United States)

    Sang, Fuming; Yang, Yang; Yuan, Lin; Ren, Jicun; Zhang, Zhizhou

    2015-09-01

    Hot start (HS) PCR is an excellent alternative for high-throughput real-time PCR due to its ability to prevent nonspecific amplification at low temperature. Development of a cost-effective and simple HS PCR technique to guarantee high-throughput PCR specificity and consistency still remains a great challenge. In this study, we systematically investigated the HS characteristics of QD-triggered real-time PCR with EvaGreen and SYBR Green I dyes by analysis of amplification curves, standard curves and melting curves. Two different kinds of DNA polymerases, Pfu and Taq, were employed. Here we show that high specificity and efficiency of real-time PCR were obtained in a plasmid DNA assay and an error-prone two-round PCR assay using QD-based HS PCR, even after an hour of preincubation at 50 °C before real-time PCR. Moreover, the results obtained by QD-based HS PCR were comparable to those of a commercial Taq antibody DNA polymerase. However, no obvious HS effect of QDs was found in real-time PCR using Taq DNA polymerase. The findings of this study demonstrate that a cost-effective, high-throughput real-time PCR based on QD-triggered HS PCR can be established with high consistency, sensitivity and accuracy.

  12. Quantum Nanobiology and Biophysical Chemistry

    DEFF Research Database (Denmark)

    2013-01-01

    An introduction was provided in the first issue by way of an Editorial to this special two-issue volume of Current Physical Chemistry – "Quantum Nanobiology and Biophysical Chemistry" [1]. The Guest Editors would like to thank all the authors and referees who have contributed to this second issue. Wu et al. use density functional theory to explore the use of Ni/Fe bimetallic nanotechnology in the bioremediation of decabromo-diphenyl esters. Araújo-Chaves et al. explore the binding and reactivity of Mn(III) porphyrins in the membrane-mimetic setting of model liposomal systems. Claussen et al. demonstrate detection of acyl-homoserine lactone at extremely low levels in a biologically relevant system using surface-enhanced Raman spectroscopy. Sugihara and Bondar evaluate the influence of methyl groups and the protein environment on retinal geometries in rhodopsin and bacteriorhodopsin, two...

  13. A multi-endpoint, high-throughput study of nanomaterial toxicity in Caenorhabditis elegans

    Science.gov (United States)

    Jung, Sang-Kyu; Qu, Xiaolei; Aleman-Meza, Boanerges; Wang, Tianxiao; Riepe, Celeste; Liu, Zheng; Li, Qilin; Zhong, Weiwei

    2015-01-01

    The booming nanotech industry has raised public concerns about the environmental health and safety impact of engineered nanomaterials (ENMs). High-throughput assays are needed to obtain toxicity data for the rapidly increasing number of ENMs. Here we present a suite of high-throughput methods to study nanotoxicity in intact animals using Caenorhabditis elegans as a model. At the population level, our system measures food consumption of thousands of animals to evaluate population fitness. At the organism level, our automated system analyzes hundreds of individual animals for body length, locomotion speed, and lifespan. To demonstrate the utility of our system, we applied this technology to test the toxicity of 20 nanomaterials under four concentrations. Only fullerene nanoparticles (nC60), fullerol, TiO2, and CeO2 showed little or no toxicity. Various degrees of toxicity were detected from different forms of carbon nanotubes, graphene, carbon black, Ag, and fumed SiO2 nanoparticles. Aminofullerene and UV irradiated nC60 also showed small but significant toxicity. We further investigated the effects of nanomaterial size, shape, surface chemistry, and exposure conditions on toxicity. Our data are publicly available at the open-access nanotoxicity database www.QuantWorm.org/nano. PMID:25611253

  14. Development of a high throughput single-particle screening for inorganic semiconductor nanorods as neural voltage sensor

    Science.gov (United States)

    Kuo, Yung; Park, Kyoungwon; Li, Jack; Ingargiola, Antonino; Park, Joonhyuck; Shvadchak, Volodymyr; Weiss, Shimon

    2017-08-01

    Monitoring membrane potential in neurons requires sensors with minimal invasiveness, high spatial and temporal (sub-ms) resolution, and sensitivity large enough to enable detection of sub-threshold activities. While organic dyes and fluorescent proteins have been developed to possess voltage-sensing properties, photobleaching, cytotoxicity, low sensitivity, and low spatial resolution have obstructed further studies. Semiconductor nanoparticles (NPs), as prospective voltage sensors, have shown excellent sensitivity based on the quantum-confined Stark effect (QCSE) at room temperature and at the single-particle level. Both theory and experiment have shown that their voltage sensitivity can be increased significantly via material, bandgap, and structural engineering. Based on theoretical calculations, we synthesized one of the optimal candidates for voltage sensors: 12 nm type-II ZnSe/CdS nanorods (NRs) with an asymmetrically located seed. The voltage sensitivity and spectral shift were characterized in vitro by spectrally resolved microscopy, using electrodes grown by thin-film deposition that "sandwich" the NRs. We characterized multiple batches of such NRs and iteratively modified the synthesis to achieve higher voltage sensitivity (ΔF/F > 10%), a larger spectral shift (>5 nm), better homogeneity, and better colloidal stability. Using a high-throughput screening method, we were able to compare the voltage sensitivity of our NRs with that of commercial spherical quantum dots (QDs) with single-particle statistics. Our high-throughput screening method with a spectrally resolved microscope also provides a versatile tool for studying single-particle spectroscopy under field modulation.

  15. High throughput screening method for assessing heterogeneity of microorganisms

    NARCIS (Netherlands)

    Ingham, C.J.; Sprenkels, A.J.; van Hylckama Vlieg, J.E.T.; Bomer, Johan G.; de Vos, W.M.; van den Berg, Albert

    2006-01-01

    The invention relates to the field of microbiology. Provided is a method which is particularly powerful for High Throughput Screening (HTS) purposes. More specific a high throughput method for determining heterogeneity or interactions of microorganisms is provided.

  16. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    Science.gov (United States)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the High Efficiency Video Coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate-estimation method for rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate-estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture reaches a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.

  17. Development of a Rapid Fluorescence-Based High-Throughput Screening Assay to Identify Novel Kynurenine 3-Monooxygenase Inhibitor Scaffolds.

    Science.gov (United States)

    Jacobs, K R; Guillemin, G J; Lovejoy, D B

    2018-02-01

    Kynurenine 3-monooxygenase (KMO) is a well-validated therapeutic target for the treatment of neurodegenerative diseases, including Alzheimer's disease (AD) and Huntington's disease (HD). This work reports a facile fluorescence-based KMO assay optimized for high-throughput screening (HTS) that achieves a throughput approximately 20-fold higher than the fastest KMO assay currently reported. The screen was run with excellent performance (average Z' value of 0.80) on 110,000 compounds across 341 plates and exceeded all statistical parameters used to describe a robust HTS assay. A subset of molecules was selected for validation by ultra-high-performance liquid chromatography, resulting in the confirmation of a novel hit with an IC50 comparable to that of the well-described KMO inhibitor Ro-61-8048. A medicinal chemistry program is currently underway to further develop our novel KMO inhibitor scaffolds.
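    The Z' value quoted above is the standard Z'-factor of Zhang, Chung and Oldenburg, computed from the means and standard deviations of the positive and negative controls on a plate; values above 0.5 indicate a robust HTS assay. A minimal sketch, with made-up control signals (not data from this screen):

```python
import statistics

def z_prime(pos_controls, neg_controls):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    mu_p, mu_n = statistics.mean(pos_controls), statistics.mean(neg_controls)
    sd_p, sd_n = statistics.stdev(pos_controls), statistics.stdev(neg_controls)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

pos = [100, 98, 102, 101, 99]  # hypothetical positive-control signals
neg = [10, 12, 9, 11, 10]      # hypothetical negative-control signals
print(round(z_prime(pos, neg), 3))
```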

  18. Determination of Quantum Chemistry Based Force Fields for Molecular Dynamics Simulations of Aromatic Polymers

    Science.gov (United States)

    Jaffe, Richard; Langhoff, Stephen R. (Technical Monitor)

    1995-01-01

    Ab initio quantum chemistry calculations for model molecules can be used to parameterize force fields for molecular dynamics simulations of polymers. Emphasis in our research group is on using quantum chemistry-based force fields for molecular dynamics simulations of organic polymers in the melt and glassy states, but the methodology is applicable to simulations of small molecules, multicomponent systems and solutions. Special attention is paid to deriving reliable descriptions of the non-bonded and electrostatic interactions. Several procedures have been developed for deriving and calibrating these parameters. Our force fields for aromatic polyimide simulations will be described. In this application, the intermolecular interactions are the critical factor in determining many properties of the polymer (including its color).

  19. Elementary quantum chemistry

    CERN Document Server

    Pilar, Frank L

    2003-01-01

    Useful introductory course and reference covers origins of quantum theory, Schrödinger wave equation, quantum mechanics of simple systems, electron spin, quantum states of atoms, Hartree-Fock self-consistent field method, more. 1990 edition.

  20. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    Science.gov (United States)

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...
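    The reverse-dosimetry step described above amounts to dividing an in vitro bioactive concentration by the steady-state plasma concentration predicted for a unit daily dose. A minimal sketch of that arithmetic, with hypothetical values and names (not EPA data or the httk API):

```python
def oral_equivalent_dose(ac50_uM: float, css_uM_per_unit_dose: float) -> float:
    """Dose (mg/kg/day) whose steady-state plasma concentration equals the AC50.

    css_uM_per_unit_dose is the predicted steady-state concentration (uM)
    resulting from a 1 mg/kg/day exposure.
    """
    return ac50_uM / css_uM_per_unit_dose

# e.g. an AC50 of 3 uM and a Css of 1.5 uM per 1 mg/kg/day
print(oral_equivalent_dose(3.0, 1.5))  # 2.0 mg/kg/day
```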

  1. Laboratory study of nitrate photolysis in Antarctic snow. I. Observed quantum yield, domain of photolysis, and secondary chemistry

    DEFF Research Database (Denmark)

    Meusinger, Carl; Berhanu, Tesfaye A.; Erbland, Joseph

    2014-01-01

    ... are understood. It has been shown that photolysis of nitrate in the snowpack plays a major role in nitrate loss and that the photolysis products have a significant influence on the local troposphere as well as on other species in the snow. Reported quantum yields for the main reaction span orders of magnitude ... undergoing secondary (recombination) chemistry. The apparent quantum yield in the 200 nm band was found to be ∼ 1%, much lower than reported for aqueous chemistry. Modeled NOx emissions may increase significantly above measured values due to the observed quantum yield in this study. A companion paper ...

  2. Development of massively parallel quantum chemistry program SMASH

    International Nuclear Information System (INIS)

    Ishimura, Kazuya

    2015-01-01

    A massively parallel program for quantum chemistry calculations, SMASH, was released under the Apache License 2.0 in September 2014. The SMASH program is written in the Fortran90/95 language with MPI and OpenMP standards for parallelization. Frequently used routines, such as one- and two-electron integral calculations, are modularized to make program development simple. The speed-up of the B3LYP energy calculation for (C150H30)2 with the cc-pVDZ basis set (4500 basis functions) was 50,499 on 98,304 cores of the K computer.
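    The speed-up figure quoted above implies a parallel efficiency (speed-up divided by core count) of roughly 51%. A minimal sketch of that check; the function name is illustrative and not part of SMASH:

```python
def parallel_efficiency(speedup: float, cores: int) -> float:
    """Parallel efficiency = observed speed-up / number of cores."""
    return speedup / cores

eff = parallel_efficiency(50_499, 98_304)
print(f"Parallel efficiency: {eff:.1%}")  # prints "Parallel efficiency: 51.4%"
```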

  3. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have progressed rapidly. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  4. Lateral Temperature-Gradient Method for High-Throughput Characterization of Material Processing by Millisecond Laser Annealing.

    Science.gov (United States)

    Bell, Robert T; Jacobs, Alan G; Sorg, Victoria C; Jung, Byungki; Hill, Megan O; Treml, Benjamin E; Thompson, Michael O

    2016-09-12

    A high-throughput method for characterizing the temperature dependence of material properties following microsecond to millisecond thermal annealing, exploiting the temperature gradients created by a lateral gradient laser spike anneal (lgLSA), is presented. Laser scans generate spatial thermal gradients of up to 5 °C/μm with peak temperatures ranging from ambient to in excess of 1400 °C, limited only by laser power and materials thermal limits. Discrete spatial property measurements across the temperature gradient are then equivalent to independent measurements after varying temperature anneals. Accurate temperature calibrations, essential to quantitative analysis, are critical and methods for both peak temperature and spatial/temporal temperature profile characterization are presented. These include absolute temperature calibrations based on melting and thermal decomposition, and time-resolved profiles measured using platinum thermistors. A variety of spatially resolved measurement probes, ranging from point-like continuous profiling to large area sampling, are discussed. Examples from annealing of III-V semiconductors, CdSe quantum dots, low-κ dielectrics, and block copolymers are included to demonstrate the flexibility, high throughput, and precision of this technique.
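    With a spatial gradient of up to 5 °C/µm, each position across the lgLSA scan corresponds to a different effective anneal temperature, which is what makes a single scan equivalent to many independent anneals. A minimal sketch of that mapping, assuming an idealized linear gradient away from the peak (the real profile requires the calibration methods described in the abstract):

```python
def anneal_temperature(peak_temp_c: float, gradient_c_per_um: float,
                       distance_um: float) -> float:
    """Effective anneal temperature at a lateral distance from the peak,
    assuming an idealized linear thermal gradient."""
    return peak_temp_c - gradient_c_per_um * distance_um

# e.g. a 1400 degC peak with a 5 degC/um gradient: 900 degC at 100 um away
print(anneal_temperature(1400, 5.0, 100))  # 900.0
```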

  5. Adaptation of quantum chemistry software for the electronic structure calculations on GPU for solid-state systems

    International Nuclear Information System (INIS)

    Gusakov, V.E.; Bel'ko, V.I.; Dorozhkin, N.N.

    2015-01-01

    We report on adaptation of quantum chemistry software - Quantum Espresso and LASTO - for the electronic structure calculations for the complex solid-state systems on the GeForce series GPUs using the nVIDIA CUDA technology. Specifically, protective covering based on transition metal nitrides are considered. (authors)

  6. High-throughput screening (HTS) and modeling of the retinoid ...

    Science.gov (United States)

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.

  7. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    Science.gov (United States)

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In Quantum Chemistry, many tasks recur frequently, e.g. geometry optimizations, benchmarking series, etc. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, which makes it almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined, research-domain-specific function. Publishing workflows in repositories enables workflow sharing within and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.

  8. Quantum chemistry calculation and experimental study on coal ash fusion characteristics of coal blend

    Energy Technology Data Exchange (ETDEWEB)

    Chen Yushuang; Zhang Zhong-xiao; Wu Xiao-jiang; Li Jie; Guang Rong-qing; Yan Bo [University of Shanghai for Science and Technology, Shanghai (China). Department of Power Engineering

    2009-07-01

    The coal ash fusion characteristics of a high-fusibility coal blended with two low-fusibility coals were studied. The data were analyzed using quantum chemistry methods and experiments, from micro- to macro-molecular structures. The results show that Ca{sup 2+}, as an electron acceptor, easily enters the lattice of mullite, causing a transition from mullite to anorthite. Mullite is much more stable than anorthite. The Ca{sup 2+} of anorthite occupies the larger cavities within the (SiO{sub 4}){sup 4-} or (AlO{sub 4}){sup 5-} tetrahedral rings. Ca-linked O weakens the Si-O bond, effectively lowering the ash fusion point. The chemistry, reactive sites and bond-formation characteristics of the minerals explain the reaction mechanism of refractory minerals and fluxes in the ash melting process at high temperature. The experimental results agree with the theoretical analysis using ternary phase diagrams and quantitative calculation. 27 refs., 9 figs., 3 tabs.

  9. High-throughput on-chip in vivo neural regeneration studies using femtosecond laser nano-surgery and microfluidics

    Science.gov (United States)

    Rohde, Christopher B.; Zeng, Fei; Gilleland, Cody; Samara, Chrysanthi; Yanik, Mehmet F.

    2009-02-01

    In recent years, the advantages of using small invertebrate animals as model systems for human disease have become increasingly apparent and have resulted in three Nobel Prizes in medicine or chemistry during the last six years for studies conducted on the nematode Caenorhabditis elegans (C. elegans). The availability of a wide array of species-specific genetic techniques, along with the transparency of the worm and its ability to grow in minute volumes make C. elegans an extremely powerful model organism. We present a suite of technologies for complex high-throughput whole-animal genetic and drug screens. We demonstrate a high-speed microfluidic sorter that can isolate and immobilize C. elegans in a well-defined geometry, an integrated chip containing individually addressable screening chambers for incubation and exposure of individual animals to biochemical compounds, and a device for delivery of compound libraries in standard multiwell plates to microfluidic devices. The immobilization stability obtained by these devices is comparable to that of chemical anesthesia and the immobilization process does not affect lifespan, progeny production, or other aspects of animal health. The high-stability enables the use of a variety of key optical techniques. We use this to demonstrate femtosecond-laser nanosurgery and three-dimensional multiphoton microscopy. Used alone or in various combinations these devices facilitate a variety of high-throughput assays using whole animals, including mutagenesis and RNAi and drug screens at subcellular resolution, as well as high-throughput high-precision manipulations such as femtosecond-laser nanosurgery for large-scale in vivo neural degeneration and regeneration studies.

  10. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  11. Development of massively parallel quantum chemistry program SMASH

    Energy Technology Data Exchange (ETDEWEB)

    Ishimura, Kazuya [Department of Theoretical and Computational Molecular Science, Institute for Molecular Science 38 Nishigo-Naka, Myodaiji, Okazaki, Aichi 444-8585 (Japan)

    2015-12-31

    A massively parallel program for quantum chemistry calculations SMASH was released under the Apache License 2.0 in September 2014. The SMASH program is written in the Fortran90/95 language with MPI and OpenMP standards for parallelization. Frequently used routines, such as one- and two-electron integral calculations, are modularized to make program developments simple. The speed-up of the B3LYP energy calculation for (C{sub 150}H{sub 30}){sub 2} with the cc-pVDZ basis set (4500 basis functions) was 50,499 on 98,304 cores of the K computer.

  12. Optimization and high-throughput screening of antimicrobial peptides.

    Science.gov (United States)

    Blondelle, Sylvie E; Lohner, Karl

    2010-01-01

    While a well-established process for lead compound discovery in for-profit companies, high-throughput screening is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries combined with easy and less expensive access to new technologies have greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole cell-based assays. In particular, the urgent need for new antimicrobial compounds that would overcome the rapid rise of drug-resistant microorganisms, where multiple target assays or cell-based assays are often required, has forced scientists to focus onto high-throughput technologies. Based on their existence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope in discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed a rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability nowadays of high-quantity and high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences as well as the latest structural knowledge and optimization processes aimed at improving the peptides selectivity.

  13. Quantum confinement and surface chemistry of 0.8–1.6 nm hydrosilylated silicon nanocrystals

    International Nuclear Information System (INIS)

    Pi Xiao-Dong; Wang Rong; Yang De-Ren

    2014-01-01

    In the framework of density functional theory (DFT), we have studied the electronic properties of alkene/alkyne-hydrosilylated silicon nanocrystals (Si NCs) in the size range from 0.8 nm to 1.6 nm. Among the alkenes with all kinds of functional groups considered in this work, only those containing —NH2 and —C4H3S lead to significant hydrosilylation-induced changes in the gap between the highest occupied molecular orbital (HOMO) and the lowest unoccupied molecular orbital (LUMO) of an Si NC at the ground state. The quantum confinement effect is dominant for all of the alkene-hydrosilylated Si NCs at the ground state. At the excited state, the prevailing effect of surface chemistry only occurs in the smallest (0.8 nm) Si NCs hydrosilylated with alkenes containing —NH2 and —C4H3S. Although alkyne hydrosilylation gives rise to a more significant surface chemistry effect than alkene hydrosilylation, the quantum confinement effect remains dominant for alkyne-hydrosilylated Si NCs at the ground state. However, at the excited state, the effect of surface chemistry induced by hydrosilylation with conjugated alkynes is strong enough to prevail over that of quantum confinement. (condensed matter: structural, mechanical, and thermal properties)

  14. The use of quantum chemistry in pharmaceutical research as illustrated by case studies of indometacin and carbamazepine

    DEFF Research Database (Denmark)

    Gordon, Keith C; McGoverin, Cushla M; Strachan, Clare J

    2007-01-01

    A number of case studies that illustrate how quantum chemistry may be used in studying pharmaceutical systems are reviewed. A brief introduction to quantum methods is provided and the use of these methods in understanding the structure and properties of indometacin and carbamazepine is discussed...

  15. Use of combinatorial chemistry to speed drug discovery.

    Science.gov (United States)

    Rádl, S

    1998-10-01

    IBC's International Conference on Integrating Combinatorial Chemistry into the Discovery Pipeline was held September 14-15, 1998. The program started with a pre-conference workshop on High-Throughput Compound Characterization and Purification. The agenda of the main conference was divided into sessions of Synthesis, Automation and Unique Chemistries; Integrating Combinatorial Chemistry, Medicinal Chemistry and Screening; Combinatorial Chemistry Applications for Drug Discovery; and Information and Data Management. This meeting was an excellent opportunity to see how big pharma, biotech and service companies are addressing the current bottlenecks in combinatorial chemistry to speed drug discovery. (c) 1998 Prous Science. All rights reserved.

  16. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    Science.gov (United States)

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  17. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  18. Quantum chemistry-assisted synthesis route development

    International Nuclear Information System (INIS)

    Hori, Kenji; Sumimoto, Michinori; Murafuji, Toshihiro

    2015-01-01

    We have been investigating “quantum chemistry-assisted synthesis route development” using in silico screenings and have applied the method to several targets. One example was conducted to develop synthesis routes for a urea derivative, namely 1-(4-(trifluoromethyl)-2-oxo-2H-chromen-7-yl)urea. While five synthesis routes were examined, only three routes passed the second in silico screening. Among them, the reaction of 7-amino-4-(trifluoromethyl)-2H-chromen-2-one and O-methyl carbamate with BF3 as an additive was ranked as the first choice for synthetic work. We were able to experimentally obtain the target compound, even though its yield was as low as 21%. The theoretical result was thus consistent with experiment. A summary of the transition state database (TSDB) is also provided; TSDB is the key to reducing the time required for in silico screenings.

  19. High-throughput screening of chemicals as functional ...

    Science.gov (United States)

    Identifying chemicals that provide a specific function within a product, yet have minimal impact on the human body or environment, is the goal of most formulation chemists and engineers practicing green chemistry. We present a methodology to identify potential chemical functional substitutes from large libraries of chemicals using machine learning based models. We collect and analyze publicly available information on the function of chemicals in consumer products or industrial processes to identify a suite of harmonized function categories suitable for modeling. We use structural and physicochemical descriptors for these chemicals to build 41 quantitative structure–use relationship (QSUR) models for harmonized function categories using random forest classification. We apply these models to screen a library of nearly 6400 chemicals with available structure information for potential functional substitutes. Using our Functional Use database (FUse), we could identify uses for 3121 chemicals; 4412 predicted functional uses had a probability of 80% or greater. We demonstrate the potential application of the models to high-throughput (HT) screening for “candidate alternatives” by merging the valid functional substitute classifications with hazard metrics developed from HT screening assays for bioactivity. A descriptor set could be obtained for 6356 Tox21 chemicals that have undergone a battery of HT in vitro bioactivity screening assays. By applying QSURs, we wer
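The QSUR models above classify chemicals into function categories from structural and physicochemical descriptors using random forests. As a hedged, toy-scale sketch of the idea (a bagged ensemble of one-feature decision stumps standing in for a real random forest; the descriptors, values, and labels below are hypothetical, not from FUse):

```python
import random

def train_stump(X, y):
    """Pick the (feature, threshold, polarity) with best training accuracy."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            for pol in (1, -1):
                pred = [1 if pol * (row[f] - t) >= 0 else 0 for row in X]
                acc = sum(p == yy for p, yy in zip(pred, y)) / len(y)
                if best is None or acc > best[0]:
                    best = (acc, f, t, pol)
    return best[1:]

def train_forest(X, y, n_trees=25, seed=0):
    """Bagged ensemble of stumps -- a toy stand-in for random forest."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]  # bootstrap sample
        forest.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def predict_proba(forest, row):
    """Fraction of stumps voting 'performs the target function'."""
    votes = sum(1 for f, t, pol in forest if pol * (row[f] - t) >= 0)
    return votes / len(forest)

# Hypothetical descriptors per chemical: [logP, molecular weight / 100]
X = [[0.5, 1.2], [0.7, 1.0], [3.1, 3.5], [2.8, 4.0]]
y = [0, 0, 1, 1]  # 1 = known to perform the function (e.g. surfactant)
forest = train_forest(X, y)
print(predict_proba(forest, [3.2, 3.8]))  # screen a new candidate
```

In the real workflow, candidates whose predicted probability exceeds a cutoff (the abstract uses 80%) are carried forward as potential functional substitutes.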

  20. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Background: The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results: This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high-end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion: MUMmerGPU is a low-cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
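MUMmerGPU stores the reference as a suffix tree and matches many reads against it in parallel. The core indexing idea can be sketched serially with the closely related suffix array (this is an illustrative miniature, not MUMmerGPU's actual data structure or code):

```python
def build_suffix_array(text):
    """Start positions of all suffixes of `text`, sorted lexicographically."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def find_matches(text, sa, query):
    """All start positions of `query` in `text`, via binary search on the suffix array."""
    m = len(query)
    lo, hi = 0, len(sa)
    while lo < hi:                       # leftmost suffix >= query
        mid = (lo + hi) // 2
        if text[sa[mid]:sa[mid] + m] < query:
            lo = mid + 1
        else:
            hi = mid
    hits = []
    while lo < len(sa) and text[sa[lo]:sa[lo] + m] == query:
        hits.append(sa[lo])              # every suffix sharing the prefix is a hit
        lo += 1
    return sorted(hits)

reference = "ACGTACGTGACG"
sa = build_suffix_array(reference)       # built once, queried by every read
for read in ["ACG", "GTA", "TTT"]:
    print(read, find_matches(reference, sa, read))
```

The GPU version exploits the fact that each query is independent: thousands of such lookups run concurrently against the shared index.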

  1. High throughput imaging cytometer with acoustic focussing.

    Science.gov (United States)

    Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter

    2015-10-31

    We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint.

  2. High-throughput GPU-based LDPC decoding

    Science.gov (United States)

    Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin

    2010-08-01

    Low-density parity-check (LDPC) code is a linear block code known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems such as DVB-S2, WiMAX, Wi-Fi and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphic-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize this practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared to its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.

  3. Relativistic quantum chemistry the fundamental theory of molecular science

    CERN Document Server

    Reiher, Markus

    2014-01-01

    Einstein proposed his theory of special relativity in 1905. For a long time it was believed that this theory has no significant impact on chemistry. This view changed in the 1970s when it was realized that (nonrelativistic) Schrödinger quantum mechanics yields results on molecular properties that depart significantly from experimental results. Especially when heavy elements are involved, these quantitative deviations can be so large that qualitative chemical reasoning and understanding is affected. To grasp this, the appropriate many-electron theory has rapidly evolved. Nowadays relativist

  4. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    Science.gov (United States)

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...

  5. Introducing Discrete Frequency Infrared Technology for High-Throughput Biofluid Screening

    Science.gov (United States)

    Hughes, Caryn; Clemens, Graeme; Bird, Benjamin; Dawson, Timothy; Ashton, Katherine M.; Jenkinson, Michael D.; Brodbelt, Andrew; Weida, Miles; Fotheringham, Edeline; Barre, Matthew; Rowlette, Jeremy; Baker, Matthew J.

    2016-02-01

    Accurate early diagnosis is critical to patient survival, management and quality of life. Biofluids are key to early diagnosis due to their ease of collection and intimate involvement in human function. Large-scale mid-IR imaging of dried fluid deposits offers a high-throughput molecular analysis paradigm for the biomedical laboratory. The exciting advent of tuneable quantum cascade lasers allows for the collection of discrete frequency infrared data on clinically relevant timescales. By scanning targeted frequencies, spectral quality, reproducibility and diagnostic potential can be maintained while significantly reducing acquisition time and processing requirements, sampling 16 serum spots with 0.6, 5.1 and 15% relative standard deviation (RSD) for 199, 14 and 9 discrete frequencies, respectively. We use this reproducible methodology to show proof-of-concept rapid diagnostics; 40 unique dried liquid biopsies from brain, breast, lung and skin cancer patients were classified in 2.4 cumulative seconds against 10 non-cancer controls with accuracies of up to 90%.
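The reproducibility figures above are percent relative standard deviations across replicate serum spots. The metric itself is simple to compute (the absorbance values below are hypothetical):

```python
from statistics import mean, stdev

def relative_std_dev(values):
    """Relative standard deviation (%RSD) of replicate measurements."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical integrated absorbance from six replicate serum spots
spots = [0.982, 0.987, 0.979, 0.985, 0.990, 0.981]
print(f"{relative_std_dev(spots):.2f}% RSD")
```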

  6. Photodissociation of quantum state-selected diatomic molecules yields new insight into ultracold chemistry

    Science.gov (United States)

    McDonald, Mickey; McGuyer, Bart H.; Lee, Chih-Hsi; Apfelbeck, Florian; Zelevinsky, Tanya

    2016-05-01

    When a molecule is subjected to a sufficiently energetic photon it can break apart into fragments through a process called "photodissociation". For over 70 years this simple chemical reaction has served as a vital experimental tool for acquiring information about molecular structure, since the character of the photodissociative transition can be inferred by measuring the 3D photofragment angular distribution (PAD). While theoretical understanding of this process has gradually evolved from classical considerations to a fully quantum approach, experiments to date have not yet revealed the full quantum nature of this process. In my talk I will describe recent experiments involving the photodissociation of ultracold, optical lattice-trapped, and fully quantum state-resolved 88Sr2 molecules. Optical absorption images of the PADs produced in these experiments reveal features which are inherently quantum mechanical in nature, such as matter-wave interference between output channels, and are sensitive to the quantum statistics of the molecular wavefunctions. The results of these experiments cannot be predicted using quasiclassical methods. Instead, we describe our results with a fully quantum mechanical model yielding new intuition about ultracold chemistry.

  7. Affinity selection-mass spectrometry and its emerging application to the high throughput screening of G protein-coupled receptors.

    Science.gov (United States)

    Whitehurst, Charles E; Annis, D Allen

    2008-07-01

    Advances in combinatorial chemistry and genomics have inspired the development of novel affinity selection-based screening techniques that rely on mass spectrometry to identify compounds that preferentially bind to a protein target. Of the many affinity selection-mass spectrometry techniques so far documented, only a few solution-based implementations that separate target-ligand complexes away from unbound ligands persist today as routine high throughput screening platforms. Because affinity selection-mass spectrometry techniques do not rely on radioactive or fluorescent reporters or enzyme activities, they can complement traditional biochemical and cell-based screening assays and enable scientists to screen targets that may not be easily amenable to other methods. In addition, by employing mass spectrometry for ligand detection, these techniques enable high throughput screening of massive library collections of pooled compound mixtures, vastly increasing the chemical space that a target can encounter during screening. Of all drug targets, G protein-coupled receptors yield the highest percentage of therapeutically effective drugs. In this manuscript, we present the emerging application of affinity selection-mass spectrometry to the high throughput screening of G protein-coupled receptors. We also review how affinity selection-mass spectrometry can be used as an analytical tool to guide receptor purification, and further used after screening to characterize target-ligand binding interactions, enabling the classification of orthosteric and allosteric binders.

  8. AutoClickChem: click chemistry in silico.

    Directory of Open Access Journals (Sweden)

    Jacob D Durrant

    Academic researchers and many in industry often lack the financial resources available to scientists working in "big pharma." High costs include those associated with high-throughput screening and chemical synthesis. In order to address these challenges, many researchers have in part turned to alternate methodologies. Virtual screening, for example, often substitutes for high-throughput screening, and click chemistry ensures that chemical synthesis is fast, cheap, and comparatively easy. Though both in silico screening and click chemistry seek to make drug discovery more feasible, it is not yet routine to couple these two methodologies. We here present a novel computer algorithm, called AutoClickChem, capable of performing many click-chemistry reactions in silico. AutoClickChem can be used to produce large combinatorial libraries of compound models for use in virtual screens. As the compounds of these libraries are constructed according to the reactions of click chemistry, they can be easily synthesized for subsequent testing in biochemical assays. Additionally, in silico modeling of click-chemistry products may prove useful in rational drug design and drug optimization. AutoClickChem is based on the pymolecule toolbox, a framework that may facilitate the development of future python-based programs that require the manipulation of molecular models. Both the pymolecule toolbox and AutoClickChem are released under the GNU General Public License version 3 and are available for download from http://autoclickchem.ucsd.edu.

  9. AutoClickChem: click chemistry in silico.

    Science.gov (United States)

    Durrant, Jacob D; McCammon, J Andrew

    2012-01-01

    Academic researchers and many in industry often lack the financial resources available to scientists working in "big pharma." High costs include those associated with high-throughput screening and chemical synthesis. In order to address these challenges, many researchers have in part turned to alternate methodologies. Virtual screening, for example, often substitutes for high-throughput screening, and click chemistry ensures that chemical synthesis is fast, cheap, and comparatively easy. Though both in silico screening and click chemistry seek to make drug discovery more feasible, it is not yet routine to couple these two methodologies. We here present a novel computer algorithm, called AutoClickChem, capable of performing many click-chemistry reactions in silico. AutoClickChem can be used to produce large combinatorial libraries of compound models for use in virtual screens. As the compounds of these libraries are constructed according to the reactions of click chemistry, they can be easily synthesized for subsequent testing in biochemical assays. Additionally, in silico modeling of click-chemistry products may prove useful in rational drug design and drug optimization. AutoClickChem is based on the pymolecule toolbox, a framework that may facilitate the development of future python-based programs that require the manipulation of molecular models. Both the pymolecule toolbox and AutoClickChem are released under the GNU General Public License version 3 and are available for download from http://autoclickchem.ucsd.edu.
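AutoClickChem applies click-chemistry reaction rules to pairs of reactants to enumerate virtual products. A heavily simplified sketch of the enumeration step (this is not AutoClickChem's code or API; the crude substring tests and the small SMILES library are illustrative only, and a real implementation would use proper substructure matching and generate the triazole products):

```python
def has_azide(smiles):
    # crude substring test for an organic azide written as N=[N+]=[N-]
    return "N=[N+]=[N-]" in smiles

def has_alkyne(smiles):
    # crude: matches any C#C alkyne, not strictly terminal alkynes
    return "C#C" in smiles

library = [
    "CCN=[N+]=[N-]",        # ethyl azide
    "c1ccccc1N=[N+]=[N-]",  # phenyl azide
    "C#CCO",                # propargyl alcohol
    "CC#C",                 # propyne
    "CCO",                  # ethanol (no click handle)
]

azides = [s for s in library if has_azide(s)]
alkynes = [s for s in library if has_alkyne(s)]
# Each azide/alkyne pair is a candidate azide-alkyne cycloaddition
pairs = [(a, b) for a in azides for b in alkynes]
print(len(pairs), "candidate azide-alkyne pairs")
```

The combinatorial product of functional-group-matched reactant sets is what makes the virtual libraries grow so quickly.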

  10. The unitary-group formulation of quantum chemistry

    International Nuclear Information System (INIS)

    Campbell, L.L.

    1990-01-01

    The major part of this dissertation establishes group theoretical techniques that are applicable to the quantum-mechanical many-body atomic and molecular problems. Several matrix element evaluation methods for many-body states are developed. The generator commutation method using generator states is presented for the first time as a complete algorithm, and a computer implementation of the method is developed. A major result of this work is the development of a new method of calculation called the freeon tensor product (FTP) method. This method is much simpler and for many purposes superior to the GUGA procedure (graphical unitary group approach), widely used in configuration interaction calculations. This dissertation is also concerned with the prediction of atomic spectra. In principle, spectra can be computed by the methods of ab initio quantum chemistry. In practice these computations are difficult, expensive, time consuming, and not uniformly successful. In this dissertation, the author employs a semi-empirical group theoretical analysis of discrete spectra that is the exact analog of the Fourier analysis of continuous functions. In particular, he focuses on the spectra of atoms with incomplete p, d, and f shells. The formulas and techniques are derived in a fashion that applies equally well to more complex systems, as well as the isofreeon model of spherical nuclei.

  11. High-throughput optical system for HDES hyperspectral imager

    Science.gov (United States)

    Václavík, Jan; Melich, Radek; Pintr, Pavel; Pleštil, Jan

    2015-01-01

    Affordable, long-wave infrared hyperspectral imaging calls for the use of an uncooled FPA with high-throughput optics. This paper describes the design of the optical part of a stationary hyperspectral imager operating in a spectral range of 7–14 μm with a field of view of 20°×10°. The imager employs a push-broom method implemented with a scanning mirror. High throughput and a demand for simplicity and rigidity led to a fully refractive design with highly aspheric surfaces and off-axis positioning of the detector array. The design was optimized to exploit the machinability of infrared materials by the SPDT method and simple assembly.

  12. Students' Levels of Explanations, Models, and Misconceptions in Basic Quantum Chemistry: A Phenomenographic Study

    Science.gov (United States)

    Stefani, Christina; Tsaparlis, Georgios

    2009-01-01

    We investigated students' knowledge constructions of basic quantum chemistry concepts, namely atomic orbitals, the Schrodinger equation, molecular orbitals, hybridization, and chemical bonding. Ausubel's theory of meaningful learning provided the theoretical framework and phenomenography the method of analysis. The semi-structured interview with…

  13. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751. High Performance Computing Modernization Program Kerberos Throughput Test Report. Daniel G. Gdula* and

  14. Ab initio quantum chemistry for combustion

    International Nuclear Information System (INIS)

    Page, M.; Lengsfield, B.H.

    1991-01-01

    Advances in theoretical and computational methods, coupled with the rapid development of powerful and inexpensive computers, fuel the current rapid development in computational quantum chemistry (QC). Nowhere is this more evident than in the areas of QC most relevant to combustion: the description of bond breaking and rate phenomena. Although the development of faster computers with larger memories has had a major impact on the scope of problems that can be addressed with QC, the development of new theoretical techniques and capabilities is responsible for adding new dimensions in QC and has paved the way for the unification of QC electronic structure calculations with statistical and dynamical models of chemical reactions. These advances will be stressed in this chapter. This paper describes past accomplishments selectively to set the stage for discussion of ideas or techniques that we believe will have significant impact on combustion research. Thus, the focus of the chapter is as much on the future as it is on the past.

  15. High-throughput theoretical design of lithium battery materials

    International Nuclear Information System (INIS)

    Ling Shi-Gang; Gao Jian; Xiao Rui-Juan; Chen Li-Quan

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next-generation lithium batteries are of great significance for achieving better performance, and some representative criteria are: higher energy density, better safety, and faster charge/discharge speed. (topical review)
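The screening step described above amounts to filtering a database of computed properties against search criteria. A minimal sketch (the property values below are illustrative, not from any actual database):

```python
# Hypothetical computed properties for candidate cathode materials
candidates = [
    {"formula": "LiFePO4", "capacity_mAh_g": 170, "avg_voltage_V": 3.4, "volume_change_pct": 6.8},
    {"formula": "LiCoO2",  "capacity_mAh_g": 140, "avg_voltage_V": 3.9, "volume_change_pct": 2.0},
    {"formula": "LiMn2O4", "capacity_mAh_g": 120, "avg_voltage_V": 4.0, "volume_change_pct": 7.3},
]

def screen(materials, min_capacity, max_volume_change):
    """Keep materials meeting the capacity and low-strain criteria."""
    return [m for m in materials
            if m["capacity_mAh_g"] >= min_capacity
            and m["volume_change_pct"] <= max_volume_change]

hits = screen(candidates, min_capacity=130, max_volume_change=7.0)
print([m["formula"] for m in hits])
```

Real pipelines chain such filters after automated first-principles calculations, so the criteria operate on computed rather than tabulated properties.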

  16. High-throughput materials discovery and development: breakthroughs and challenges in the mapping of the materials genome

    Science.gov (United States)

    Buongiorno Nardelli, Marco

    High-Throughput Quantum-Mechanics computation of materials properties by ab initio methods has become the foundation of an effective approach to materials design, discovery and characterization. This data driven approach to materials science currently presents the most promising path to the development of advanced technological materials that could solve or mitigate important social and economic challenges of the 21st century. In particular, the rapid proliferation of computational data on materials properties presents the possibility to complement and extend materials property databases where the experimental data is lacking and difficult to obtain. Enhanced repositories such as AFLOWLIB open novel opportunities for structure discovery and optimization, including uncovering of unsuspected compounds, metastable structures and correlations between various properties. The practical realization of these opportunities depends almost exclusively on the design of efficient algorithms for electronic structure simulations of realistic material systems beyond the limitations of the current standard theories. In this talk, I will review recent progress in theoretical and computational tools, and in particular, discuss the development and validation of novel functionals within Density Functional Theory and of local basis representations for effective ab-initio tight-binding schemes. Marco Buongiorno Nardelli is a pioneer in the development of computational platforms for theory/data/applications integration rooted in his profound and extensive expertise in the design of electronic structure codes and in his vision for sustainable and innovative software development for high-performance materials simulations. His research activities range from the design and discovery of novel materials for 21st century applications in renewable energy, environment, nano-electronics and devices, the development of advanced electronic structure theories and high-throughput techniques in

  17. High-Throughput Block Optical DNA Sequence Identification.

    Science.gov (United States)

    Sagar, Dodderi Manjunatha; Korshoj, Lee Erik; Hanson, Katrina Bethany; Chowdhury, Partha Pratim; Otoupal, Peter Britton; Chatterjee, Anushree; Nagpal, Prashant

    2018-01-01

    Optical techniques for molecular diagnostics or DNA sequencing generally rely on small molecule fluorescent labels, which utilize light with a wavelength of several hundred nanometers for detection. Developing a label-free optical DNA sequencing technique will require nanoscale focusing of light, a high-throughput and multiplexed identification method, and a data compression technique to rapidly identify sequences and analyze genomic heterogeneity for big datasets. Such a method should identify characteristic molecular vibrations using optical spectroscopy, especially in the "fingerprinting region" from ≈400 to 1400 cm−1. Here, surface-enhanced Raman spectroscopy is used to demonstrate label-free identification of DNA nucleobases with multiplexed 3D plasmonic nanofocusing. While nanometer-scale mode volumes prevent identification of single nucleobases within a DNA sequence, the block optical technique can identify A, T, G, and C content in DNA k-mers. The content of each nucleotide in a DNA block can be a unique and high-throughput method for identifying sequences, genes, and other biomarkers as an alternative to single-letter sequencing. Additionally, coupling two complementary vibrational spectroscopy techniques (infrared and Raman) can improve block characterization. These results pave the way for developing a novel, high-throughput block optical sequencing method with lossy genomic data compression using k-mer identification from multiplexed optical data acquisition. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
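The "block" signature described above is just the per-base composition of a k-mer rather than its letter-by-letter sequence. A minimal sketch of that lossy compression:

```python
from collections import Counter

def block_content(kmer):
    """A/T/G/C counts of a k-mer: the 'block' signature used in place of
    reading out the sequence one letter at a time."""
    counts = Counter(kmer.upper())
    return tuple(counts.get(base, 0) for base in "ATGC")

print(block_content("GATTACA"))
```

Note the compression is lossy by design: k-mers that are anagrams of each other (e.g. "ACGT" and "TGCA") share the same signature, which matches the abstract's point that single nucleobases within a block are not individually resolved.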

  18. High-Throughput Scoring of Seed Germination.

    Science.gov (United States)

    Ligterink, Wilco; Hilhorst, Henk W M

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very informative as it lacks information about start, rate, and uniformity of germination, which are highly indicative of such traits as dormancy, stress tolerance, and seed longevity. The calculation of cumulative germination curves requires information about germination percentage at various time points. We developed the GERMINATOR package: a simple, highly cost-efficient, and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The GERMINATOR package contains three modules: (I) design of experimental setup with various options to replicate and randomize samples; (II) automatic scoring of germination based on the color contrast between the protruding radicle and seed coat on a single image; and (III) curve fitting of cumulative germination data and the extraction, recap, and visualization of the various germination parameters. GERMINATOR is a freely available package that allows the monitoring and analysis of several thousand germination tests, several times a day, by a single person.
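One of the germination parameters extracted from a cumulative curve is t50, the time at which half the seeds have germinated. A minimal sketch extracting it by linear interpolation (GERMINATOR itself fits a smooth curve first; the times and percentages below are hypothetical):

```python
def interpolate_t50(times, cumulative_pct, target=50.0):
    """Time at which cumulative germination crosses `target` percent,
    by linear interpolation between the bracketing observations."""
    for (t0, g0), (t1, g1) in zip(zip(times, cumulative_pct),
                                  zip(times[1:], cumulative_pct[1:])):
        if g0 <= target <= g1 and g1 > g0:
            return t0 + (target - g0) * (t1 - t0) / (g1 - g0)
    return None  # target never reached

# Hypothetical scoring times (hours) and cumulative germination (%)
times = [0, 24, 36, 48, 60, 72]
germ  = [0,  5, 30, 65, 85, 90]
print(interpolate_t50(times, germ))
```

Analogous interpolations at other targets (e.g. t10 and t90) quantify the start and uniformity of germination mentioned above.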

  19. High throughput salt separation from uranium deposits

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, S.W.; Park, K.M.; Kim, J.G.; Kim, I.T.; Park, S.B., E-mail: swkwon@kaeri.re.kr [Korea Atomic Energy Research Inst. (Korea, Republic of)

    2014-07-01

    It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites in pyroprocessing. A multilayer porous crucible system was proposed in this study to increase the throughput of the salt distiller. An integrated sieve-crucible assembly was also investigated for the practical use of the porous crucible system. The salt evaporation behaviors were compared between the conventional nonporous crucible and the porous crucible. Two step weight reductions took place in the porous crucible, whereas the salt weight reduced only at high temperature by distillation in a nonporous crucible. The first weight reduction in the porous crucible was caused by the liquid salt penetrating out through the perforated crucible as the temperature was raised to the distillation temperature. Multilayer porous crucibles have the benefit of expanding the evaporation surface area. (author)

  20. High throughput, low set-up time reconfigurable linear feedback shift registers

    NARCIS (Netherlands)

    Nas, R.J.M.; Berkel, van C.H.

    2010-01-01

    This paper presents a hardware design for a scalable, high throughput, configurable LFSR. High throughput is achieved by producing L consecutive outputs per clock cycle with a clock cycle period that, for practical cases, increases only logarithmically with the block size L and the length of the
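    The logarithmic growth of the clock period with the block size L is characteristic of parallel (block) LFSR designs: each of the L outputs per clock is a fixed XOR function of the current state, and the state advances L steps at once through the precomputed GF(2) matrix A^L, so the hardware critical path is an XOR tree of logarithmic depth. A software sketch of that technique (the tap polynomial and sizes are illustrative, not the paper's design):

```python
def parity(x):
    return bin(x).count("1") & 1

def mat_vec(m, v):
    """Apply a GF(2) matrix (list of row bit-masks) to a state integer."""
    out = 0
    for i, row in enumerate(m):
        out |= parity(row & v) << i
    return out

def xor_rows(m, mask):
    """XOR together the rows of m selected by the bits of mask."""
    acc, k = 0, 0
    while mask:
        if mask & 1:
            acc ^= m[k]
        mask >>= 1
        k += 1
    return acc

def mat_mul(a, b):
    """GF(2) matrix product: applying mat_mul(a, b) equals b then a."""
    return [xor_rows(b, row) for row in a]

def lfsr_matrix(n, taps):
    """Next-state matrix of an n-bit Fibonacci LFSR with the given tap mask."""
    return [taps] + [1 << (i - 1) for i in range(1, n)]

def serial_bits(n, taps, state, count):
    """Reference bit-serial LFSR: one output bit per clock."""
    a = lfsr_matrix(n, taps)
    out = []
    for _ in range(count):
        out.append((state >> (n - 1)) & 1)
        state = mat_vec(a, state)
    return out

def parallel_lfsr(n, taps, state, block):
    """Yield `block` output bits per 'clock'. Each output is a fixed XOR
    mask over the current state; the state jumps `block` steps at once
    through the precomputed matrix A^block."""
    a = lfsr_matrix(n, taps)
    ak = [1 << i for i in range(n)]   # identity, becomes A^k
    masks = []
    for _ in range(block):
        masks.append(ak[n - 1])       # output bit k is row n-1 of A^k
        ak = mat_mul(a, ak)
    a_block = ak                      # A^block
    while True:
        yield [parity(m & state) for m in masks]
        state = mat_vec(a_block, state)
```

    Eight "clocks" of the parallel generator with L = 8 reproduce 64 bits of the serial reference exactly.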

  1. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

    Insight into the epitopic recognition pattern for polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families.

  2. The molecular electron density distribution: meeting place of X-ray diffraction and quantum chemistry, intermediate between theory and experiment

    NARCIS (Netherlands)

    Feil, Dirk

    1992-01-01

    Quantum chemistry and the concepts used daily in chemistry are increasingly growing apart. Among the concepts that are able to bridge the gap between theory and experimental practice, electron density distribution has an important place. The study of this distribution has led to new developments in

  3. On the applicability of one- and many-electron quantum chemistry models for hydrated electron clusters

    Science.gov (United States)

    Turi, László

    2016-04-01

    We evaluate the applicability of a hierarchy of quantum models in characterizing the binding energy of excess electrons to water clusters. In particular, we calculate the vertical detachment energy of an excess electron from water cluster anions with methods that include one-electron pseudopotential calculations, density functional theory (DFT) based calculations, and ab initio quantum chemistry using MP2 and eom-EA-CCSD levels of theory. The examined clusters range from the smallest cluster size (n = 2) up to nearly nanosize clusters with n = 1000 molecules. The examined cluster configurations are extracted from mixed quantum-classical molecular dynamics trajectories of cluster anions with n = 1000 water molecules using two different one-electron pseudopotential models. We find that while MP2 calculations with a large diffuse basis set provide a reasonable description of the hydrated electron system, DFT methods should be used with caution and only after careful benchmarking. Strictly tested one-electron pseudopotentials can still be considered reasonable alternatives to DFT methods, especially for large systems. The results of quantum chemistry calculations performed on configurations that represent possible excess-electron binding motifs in the clusters appear to be consistent with the results of a cavity-structure-preferring one-electron pseudopotential for the hydrated electron, while they are in sharp disagreement with the structural predictions of a non-cavity model.

  4. On the applicability of one- and many-electron quantum chemistry models for hydrated electron clusters

    Energy Technology Data Exchange (ETDEWEB)

    Turi, László, E-mail: turi@chem.elte.hu [Department of Physical Chemistry, Eötvös Loránd University, P.O. Box 32, H-1518 Budapest 112 (Hungary)

    2016-04-21

    We evaluate the applicability of a hierarchy of quantum models in characterizing the binding energy of excess electrons to water clusters. In particular, we calculate the vertical detachment energy of an excess electron from water cluster anions with methods that include one-electron pseudopotential calculations, density functional theory (DFT) based calculations, and ab initio quantum chemistry using MP2 and eom-EA-CCSD levels of theory. The examined clusters range from the smallest cluster size (n = 2) up to nearly nanosize clusters with n = 1000 molecules. The examined cluster configurations are extracted from mixed quantum-classical molecular dynamics trajectories of cluster anions with n = 1000 water molecules using two different one-electron pseudopotential models. We find that while MP2 calculations with a large diffuse basis set provide a reasonable description of the hydrated electron system, DFT methods should be used with caution and only after careful benchmarking. Strictly tested one-electron pseudopotentials can still be considered reasonable alternatives to DFT methods, especially for large systems. The results of quantum chemistry calculations performed on configurations that represent possible excess-electron binding motifs in the clusters appear to be consistent with the results of a cavity-structure-preferring one-electron pseudopotential for the hydrated electron, while they are in sharp disagreement with the structural predictions of a non-cavity model.

  5. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas; Castro, David; Foulds, Ian G.

    2013-01-01

    This work demonstrates a detection method for a high-throughput droplet-based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads, we avoid the need for magnetic assistance or mixing structures. The concentration of our target molecules was estimated from the agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed at flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card-based agglutination assay.

  6. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates a detection method for a high-throughput droplet-based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads, we avoid the need for magnetic assistance or mixing structures. The concentration of our target molecules was estimated from the agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed at flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card-based agglutination assay.

  7. High-throughput screening of small molecule libraries using SAMDI mass spectrometry.

    Science.gov (United States)

    Gurard-Levin, Zachary A; Scholle, Michael D; Eisenberg, Adam H; Mrksich, Milan

    2011-07-11

    High-throughput screening is a common strategy used to identify compounds that modulate biochemical activities, but many approaches depend on cumbersome fluorescent reporters or antibodies and often produce false-positive hits. The development of "label-free" assays addresses many of these limitations, but current approaches still lack the throughput needed for applications in drug discovery. This paper describes a high-throughput, label-free assay that combines self-assembled monolayers with mass spectrometry, in a technique called SAMDI, as a tool for screening libraries of 100,000 compounds in one day. This method is fast, has high discrimination, and is amenable to a broad range of chemical and biological applications.

  8. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or which fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  9. Simulating chemistry using quantum computers.

    Science.gov (United States)

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.
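    The core idea the review describes, mapping a quantum system onto a more controllable one, is usually realized by approximating the time evolution exp(-iHt) of a Hamiltonian H = A + B with non-commuting parts as a Trotter product of the separately easy evolutions of A and B. A minimal numerical sketch with a toy two-qubit Hamiltonian (not a molecular one; the coefficients are arbitrary):

```python
import numpy as np

# Single-qubit Pauli operators
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron(*ops):
    out = np.eye(1, dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

def evolve(h, t):
    """exp(-i h t) for a Hermitian matrix h, via eigendecomposition."""
    w, v = np.linalg.eigh(h)
    return v @ np.diag(np.exp(-1j * w * t)) @ v.conj().T

# Toy two-qubit Hamiltonian split into non-commuting parts A and B
A = 0.5 * kron(Z, Z)
B = 0.3 * (kron(X, I2) + kron(I2, X))
H = A + B

t, steps = 1.0, 100
exact = evolve(H, t)
dt = t / steps
trotter = np.linalg.matrix_power(evolve(A, dt) @ evolve(B, dt), steps)

# First-order Trotter error on a reference state shrinks as steps grow
psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0
err = np.linalg.norm(exact @ psi0 - trotter @ psi0)
```

    On a quantum computer the factors exp(-iA dt) and exp(-iB dt) become short gate sequences; the classical matrices here only illustrate the splitting.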

  10. A High-Throughput Mass Spectrometry Assay Coupled with Redox Activity Testing Reduces Artifacts and False Positives in Lysine Demethylase Screening.

    Science.gov (United States)

    Wigle, Tim J; Swinger, Kerren K; Campbell, John E; Scholle, Michael D; Sherrill, John; Admirand, Elizabeth A; Boriack-Sjodin, P Ann; Kuntz, Kevin W; Chesworth, Richard; Moyer, Mikel P; Scott, Margaret Porter; Copeland, Robert A

    2015-07-01

    Demethylation of histones by lysine demethylases (KDMs) plays a critical role in controlling gene transcription. Aberrant demethylation may play a causal role in diseases such as cancer. Despite the biological significance of these enzymes, there are limited assay technologies for study of KDMs and few quality chemical probes available to interrogate their biology. In this report, we demonstrate the utility of self-assembled monolayer desorption/ionization (SAMDI) mass spectrometry for the investigation of quantitative KDM enzyme kinetics and for high-throughput screening for KDM inhibitors. SAMDI can be performed in 384-well format and rapidly allows reaction components to be purified prior to injection into a mass spectrometer, without a throughput-limiting liquid chromatography step. We developed sensitive and robust assays for KDM1A (LSD1, AOF2) and KDM4C (JMJD2C, GASC1) and screened 13,824 compounds against each enzyme. Hits were rapidly triaged using a redox assay to identify compounds that interfered with the catalytic oxidation chemistry used by the KDMs for the demethylation reaction. We find that overall this high-throughput mass spectrometry platform coupled with the elimination of redox active compounds leads to a hit rate that is manageable for follow-up work. © 2015 Society for Laboratory Automation and Screening.

  11. Expression of results in quantum chemistry physical chemistry division commission on physicochemical symbols, terminology and units

    CERN Document Server

    Whiffen, D H

    2013-01-01

    Expression of Results in Quantum Chemistry recommends the appropriate insertion of physical constants in the output information of a theoretical paper in order to make the numerical end results of theoretical work easily transformed to SI units by the reader. The acceptance of this recommendation would circumvent the need for a set of atomic units each with its own symbol and name. It is the traditional use of the phrase "atomic units" in this area which has obscured the real problem. The four SI dimensions of length, mass, time, and current require four physical constants to be permitte
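    In practice the transformation from atomic units to SI (or to eV and ångström) amounts to multiplying by the appropriate physical constants, as the recommendation suggests. A minimal sketch (the constant values are the CODATA 2018 recommendations; verify against the current adjustment before relying on them):

```python
# CODATA 2018 recommended values (assumed; check the current adjustment)
HARTREE_J = 4.3597447222071e-18   # 1 hartree in joules
EV_J = 1.602176634e-19            # 1 electronvolt in joules (exact by definition)
BOHR_M = 5.29177210903e-11        # 1 bohr in metres

def hartree_to_ev(energy_hartree):
    """Convert an energy in hartree to electronvolts."""
    return energy_hartree * HARTREE_J / EV_J

def bohr_to_angstrom(length_bohr):
    """Convert a length in bohr to ångström."""
    return length_bohr * BOHR_M * 1e10
```

    For example, 1 hartree is about 27.2114 eV and 1 bohr about 0.5292 Å.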

  12. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    Directory of Open Access Journals (Sweden)

    Guangbo Liu

    Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long-established method. However, a reliable and optimized high-throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid-handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high-throughput molecular and/or genetic analysis of yeast.

  13. A Synthesis of Fluid Dynamics and Quantum Chemistry for the Design of Nanoelectronics

    Science.gov (United States)

    MacDougall, Preston J.

    1998-01-01

    In 1959, during a famous lecture entitled "There's Plenty of Room at the Bottom", Richard Feynman focused on the startling technical possibilities that would exist at the limit of miniaturization, that being atomically precise devices with dimensions in the nanometer range. A nanometer is both a convenient unit of length for medium to large sized molecules, and the root of the name of the new interdisciplinary field of "nanotechnology". Essentially, "nanoelectronics" denotes the goal of shrinking electronic devices, such as diodes and transistors, as well as integrated circuits of such devices that can perform logical operations, down to dimensions in the range of 100 nanometers. The thirty-year hiatus in the development of nanotechnology can figuratively be seen as a period of waiting for the bottom-up and atomically precise construction skills of synthetic chemistry to meet the top-down reductionist aspirations of device physics. The sub-nanometer domain of nineteenth-century classical chemistry has steadily grown, and state-of-the-art supramolecular chemistry can achieve atomic precision in non-repeating molecular assemblies of the size desired for nanotechnology. For nanoelectronics in particular, a basic understanding of the electron transport properties of molecules must also be developed. Quantum chemistry provides powerful computational methods that can accurately predict the properties of small to medium sized molecules on a desktop workstation, and those of large molecules if one has access to a supercomputer. Of the many properties of a molecule that quantum chemistry routinely predicts, the ability to carry a current is one that had not even been considered until recently. "Currently", there is a controversy over just how to define this key property. Reminiscent of the situation in high-Tc superconductivity, much of the difficulty arises from the different models that are used to simplify the complex electronic structure of real materials. A model

  14. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from endogenous taxon reference genes, from GMO constructs (screening targets, construct-specific, and event-specific targets), and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  15. Towards quantum chemistry on a quantum computer.

    Science.gov (United States)

    Lanyon, B P; Whitfield, J D; Gillett, G G; Goggin, M E; Almeida, M P; Kassal, I; Biamonte, J D; Mohseni, M; Powell, B J; Barbieri, M; Aspuru-Guzik, A; White, A G

    2010-02-01

    Exact first-principles calculations of molecular properties are currently intractable because their computational cost grows exponentially with both the number of atoms and basis set size. A solution is to move to a radically different model of computing by building a quantum computer, which is a device that uses quantum systems themselves to store and process data. Here we report the application of the latest photonic quantum computer technology to calculate properties of the smallest molecular system: the hydrogen molecule in a minimal basis. We calculate the complete energy spectrum to 20 bits of precision and discuss how the technique can be expanded to solve large-scale chemical problems that lie beyond the reach of modern supercomputers. These results represent an early practical step toward a powerful tool with a broad range of quantum-chemical applications.

  16. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Sobral Romao, J.I.; Baiao Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for

  17. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale L

    2005-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  18. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale

    2004-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  19. A CRISPR CASe for High-Throughput Silencing

    Directory of Open Access Journals (Sweden)

    Jacob Heintze

    2013-10-01

    Manipulation of gene expression on a genome-wide level is one of the most important systematic tools in the post-genome era. Such manipulations have largely been enabled by expression cloning approaches using sequence-verified cDNA libraries, large-scale RNA interference libraries (shRNA or siRNA) and zinc finger nuclease technologies. More recently, the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) and CRISPR-associated (Cas9)-mediated gene editing technology has been described that holds great promise for future use of this technology in genomic manipulation. It was suggested that the CRISPR system has the potential to be used in high-throughput, large-scale loss-of-function screening. Here we discuss some of the challenges in engineering of CRISPR/Cas genomic libraries and some of the aspects that need to be addressed in order to use this technology on a high-throughput scale.

  20. High throughput route selection in multi-rate wireless mesh networks

    Institute of Scientific and Technical Information of China (English)

    WEI Yi-fei; GUO Xiang-li; SONG Mei; SONG Jun-de

    2008-01-01

    Most existing ad-hoc routing protocols use the shortest-path algorithm with a hop-count metric to select paths. This is appropriate in single-rate wireless networks, but in multi-rate networks it tends to select paths containing long-distance links with low data rates and reduced reliability. This article introduces a high-throughput routing algorithm that exploits the multi-rate capability and some mesh characteristics of wireless fidelity (WiFi) mesh networks. It uses the medium access control (MAC) transmission time as the routing metric, estimated from information passed up from the physical layer. When the proposed algorithm is adopted, Ad-hoc on-demand distance vector (AODV) routing can be extended to high-throughput AODV (HT-AODV). Simulation results show that HT-AODV is capable of establishing routes with high data rates, short end-to-end delay and high network throughput.
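    The effect of replacing a hop-count metric with a transmission-time metric can be seen with a standard shortest-path search: a two-hop path over fast links beats a slow direct link. The topology, link rates, and frame size below are hypothetical, and the per-link weight is a simplification of the MAC-transmission-time estimate described in the article:

```python
import heapq

def dijkstra(adj, src, dst):
    """Least-cost path in a weighted graph; adj[u] = {v: cost}."""
    dist, prev, seen = {src: 0.0}, {}, set()
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            break
        for v, w in adj[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

PACKET_BITS = 8 * 1024 * 8   # one 8 KB frame (hypothetical)

# Link rates in bit/s: a slow long direct link vs. two fast short hops
rates = {("A", "B"): 1e6, ("A", "C"): 11e6, ("C", "B"): 11e6}

def build(metric):
    adj = {n: {} for n in "ABC"}
    for (u, v), r in rates.items():
        w = metric(r)
        adj[u][v] = w
        adj[v][u] = w
    return adj

hop_path = dijkstra(build(lambda r: 1.0), "A", "B")            # hop count
ett_path = dijkstra(build(lambda r: PACKET_BITS / r), "A", "B")  # tx time
```

    With the hop-count metric the direct 1 Mbps link wins; with the transmission-time metric the route goes through the two 11 Mbps hops, which deliver the frame several times faster end to end.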

  1. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    Science.gov (United States)

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays. DE DeGroot, RS Thomas, and SO Simmons. National Center for Computational Toxicology, US EPA, Research Triangle Park, NC, USA. The EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  2. High-throughput phenotyping and genomic selection: the frontiers of crop breeding converge.

    Science.gov (United States)

    Cabrera-Bosquet, Llorenç; Crossa, José; von Zitzewitz, Jarislav; Serret, María Dolors; Araus, José Luis

    2012-05-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, the most powerful techniques for high-throughput field phenotyping are empirical rather than analytical, and in that respect comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them into a set of secondary (i.e. physiological) traits putatively related to the target trait. GS and high-throughput phenotyping share an empirical approach, enabling breeders to use a genome profile or phenotype without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits, and on near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield. © 2012 Institute of Botany, Chinese Academy of Sciences.

  3. High-throughput single nucleotide polymorphism genotyping using nanofluidic Dynamic Arrays

    Directory of Open Access Journals (Sweden)

    Crenshaw Andrew

    2009-01-01

    Background: Single nucleotide polymorphisms (SNPs) have emerged as the genetic marker of choice for mapping disease loci and candidate gene association studies, because of their high density and relatively even distribution in the human genomes. There is a need for systems allowing medium multiplexing (ten to hundreds of SNPs) with high throughput, which can efficiently and cost-effectively generate genotypes for a very large sample set (thousands of individuals). Methods that are flexible, fast, accurate and cost-effective are urgently needed. This is also important for those who work on high-throughput genotyping in non-model systems where off-the-shelf assays are not available and a flexible platform is needed. Results: We demonstrate the use of a nanofluidic Integrated Fluidic Circuit (IFC)-based genotyping system for medium-throughput multiplexing known as the Dynamic Array, by genotyping 994 individual human DNA samples on 47 different SNP assays, using nanoliter volumes of reagents. Call rates of greater than 99.5% and call accuracies of greater than 99.8% were achieved in our study, which demonstrates that this is a formidable genotyping platform. The experimental setup is very simple, with a time-to-result for each sample of about 3 hours. Conclusion: Our results demonstrate that the Dynamic Array is an excellent genotyping system for medium-throughput multiplexing (30-300 SNPs), which is simple to use and combines rapid throughput with excellent call rates, high concordance and low cost. The exceptional call rates and call accuracy obtained may be of particular interest to those working on validation and replication of genome-wide association (GWA) studies.
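    Call rate and concordance, the two headline metrics of genotyping studies like this one, are straightforward to compute from genotype calls. A minimal sketch with hypothetical data (the genotype encoding and sample values are illustrative, not from the study):

```python
def call_rate(calls):
    """Fraction of assays that returned a genotype call (non-None)."""
    return sum(c is not None for c in calls) / len(calls)

def concordance(calls, reference):
    """Agreement with reference genotypes, computed over called sites only."""
    pairs = [(c, r) for c, r in zip(calls, reference) if c is not None]
    return sum(c == r for c, r in pairs) / len(pairs)

# Hypothetical calls for one SNP assay across 8 samples (None = no call)
calls = ["AA", "AG", None, "GG", "AA", "AG", "GG", "AA"]
ref   = ["AA", "AG", "AG", "GG", "AA", "AG", "GG", "AG"]
```

    Here the call rate is 7/8 and the concordance 6/7; the study's reported figures (>99.5% and >99.8%) are computed the same way over 994 samples × 47 assays.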

  4. HTTK: R Package for High-Throughput Toxicokinetics

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concent...

  5. Quantum Chemistry, an Eclectic Mix: From Silicon Carbide to Size Consistency

    Energy Technology Data Exchange (ETDEWEB)

    Rintelman, Jamie Marie [Iowa State Univ., Ames, IA (United States)

    2004-12-19

    Chemistry is a field of great breadth and variety. It is this diversity that makes for both an interesting and challenging field. My interests have spanned three major areas of theoretical chemistry: applications, method development, and method evaluation. The topics presented in this thesis are as follows: (1) a multi-reference study of the geometries and relative energies of four atom silicon carbide clusters in the gas phase; (2) the reaction of acetylene on the Si(100)-(2x1) surface; (3) an improvement to the Effective Fragment Potential (EFP) solvent model to enable the study of reactions in both aqueous and nonaqueous solution; and (4) an evaluation of the size consistency of Multireference Perturbation Theory (MRPT). In the following section, the author briefly discusses two topics central to, and present throughout, this thesis: Multi-reference methods and Quantum Mechanics/Molecular Mechanics (QM/MM) methods.

  6. High-throughput flow injection analysis mass spectroscopy with networked delivery of color-rendered results. 2. Three-dimensional spectral mapping of 96-well combinatorial chemistry racks.

    Science.gov (United States)

    Görlach, E; Richmond, R; Lewis, I

    1998-08-01

    For the last two years, the mass spectroscopy section of the Novartis Pharma Research Core Technology group has analyzed tens of thousands of multiple parallel synthesis samples from the Novartis Pharma Combinatorial Chemistry program, using an in-house developed automated high-throughput flow injection analysis electrospray ionization mass spectroscopy system. The electrospray spectra of these samples reflect the many structures present after the cleavage step from the solid support. The overall success of the sequential synthesis is mirrored in the purity of the expected end product, but the partial success of individual synthesis steps is evident in the impurities in the mass spectrum. However this latter reaction information, which is of considerable utility to the combinatorial chemist, is effectively hidden from view by the very large number of analyzed samples. This information is now revealed at the workbench of the combinatorial chemist by a novel three-dimensional display of each rack's complete mass spectral ion current using the in-house RackViewer Visual Basic application. Colorization of "forbidden loss" and "forbidden gas-adduct" zones, normalization to expected monoisotopic molecular weight, colorization of ionization intensity, and sorting by row or column were used in combination to highlight systematic patterns in the mass spectroscopy data.

  7. Industrial medicinal chemistry insights: neuroscience hit generation at Janssen.

    Science.gov (United States)

    Tresadern, Gary; Rombouts, Frederik J R; Oehlrich, Daniel; Macdonald, Gregor; Trabanco, Andres A

    2017-10-01

    The role of medicinal chemistry has changed over the past 10 years. Chemistry had become one step in a process, funneling the output of high-throughput screening (HTS) on to the next stage. The goal to identify the ideal clinical compound remains, but the means to achieve this have changed. Modern medicinal chemistry is responsible for integrating innovation throughout early drug discovery, including new screening paradigms, computational approaches, novel synthetic chemistry, gene-family screening, investigating routes of delivery, and so on. In this Foundation Review, we show how a successful medicinal chemistry team has a broad impact and requires multidisciplinary expertise in these areas. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchanne...

  9. Space Link Extension Protocol Emulation for High-Throughput, High-Latency Network Connections

    Science.gov (United States)

    Tchorowski, Nicole; Murawski, Robert

    2014-01-01

    New space missions require higher data rates and new protocols to meet these requirements. These high data rate space communication links push the limitations not only of the space communication links themselves, but of the ground communication networks and protocols which forward user data to remote ground stations (GS) for transmission. The Consultative Committee for Space Data Systems (CCSDS) Space Link Extension (SLE) standard protocol is one protocol that has been proposed for use by the NASA Space Network (SN) Ground Segment Sustainment (SGSS) program. New protocol implementations must be carefully tested to ensure that they provide the required functionality, especially because of the remote nature of spacecraft. The SLE protocol standard has been tested in the NASA Glenn Research Center's SCENIC Emulation Lab in order to observe its operation under realistic network delay conditions. More specifically, the delay between the NASA Integrated Services Network (NISN) and spacecraft has been emulated. The round trip time (RTT) delay for the continental NISN network has been shown to be up to 120 ms; as such, the SLE protocol was tested with network delays ranging from 0 ms to 200 ms. Both a base network condition and an SLE connection were tested with these RTT delays, and the reaction of both network tests to the delay conditions was recorded. Throughput for both of these links was set at 1.2 Gbps. The results will show that, in the presence of realistic network delay, the SLE link throughput is significantly reduced, while the base network throughput remained at the 1.2 Gbps specification. The decrease in SLE throughput has been attributed to the implementation's use of blocking calls. The decrease in throughput is not acceptable for high data rate links, as the link requires constant data flow in order for spacecraft and ground radios to stay synchronized, unless significant data is queued at the ground station. In cases where queuing the data is not an option
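    The throughput collapse attributed to blocking calls above can be illustrated with a simple stop-and-wait model (this is an illustrative sketch, not the SCENIC test code; the block size is an assumed parameter):

    ```python
    # Illustrative model: when each transfer call blocks until acknowledged,
    # every block pays the serialization time plus one round-trip time (RTT).
    def blocking_throughput(link_rate_bps, rtt_s, block_bytes):
        """Effective bit rate of a stop-and-wait style transfer."""
        serialize = block_bytes * 8 / link_rate_bps
        return block_bytes * 8 / (serialize + rtt_s)

    LINK = 1.2e9  # 1.2 Gbps link, as in the emulation
    for rtt_ms in (0, 120, 200):
        tput = blocking_throughput(LINK, rtt_ms / 1000, block_bytes=64_000)
        print(f"RTT {rtt_ms:3d} ms -> {tput / 1e6:8.2f} Mbit/s")
    ```

    With zero delay the full 1.2 Gbps is achievable, but even a 120 ms RTT throttles a blocking sender to a few Mbit/s, consistent with the qualitative result reported above.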

  10. High-throughput density functional calculations to optimize properties and interfacial chemistry of piezoelectric materials

    Science.gov (United States)

    Barr, Jordan A.; Lin, Fang-Yin; Ashton, Michael; Hennig, Richard G.; Sinnott, Susan B.

    2018-02-01

    High-throughput density functional theory calculations are conducted to search through 1572 ABO3 compounds to find a potential replacement material for lead zirconate titanate (PZT) that exhibits the same excellent piezoelectric properties as PZT while avoiding both its use of the toxic element lead (Pb) and its formation of secondary alloy phases with platinum (Pt) electrodes. The first screening criterion employed a search through the Materials Project database to find A-B combinations that do not form ternary compounds with Pt. The second screening criterion aimed to eliminate potential candidates through first-principles calculations of their electronic structure, in which compounds with a band gap of 0.25 eV or higher were retained. Third, thermodynamic stability calculations were used to compare the candidates in a Pt environment to compounds already calculated to be stable within the Materials Project. Formation energies at or below 100 meV/atom were considered thermodynamically stable. The fourth screening criterion employed lattice misfit to identify those candidate perovskites that have low misfit with the Pt electrode and high misfit of potential secondary phases that can form when Pt alloys with the different A and B components. To aid in the final analysis, dynamic stability calculations were used to determine those perovskites whose dynamic instabilities favor the ferroelectric distortion. Analysis of the data finds three perovskites warranting further investigation: CsNbO3, RbNbO3, and CsTaO3.
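    The staged funnel described above can be sketched as a sequence of filters. The field names and the candidate entries below are hypothetical placeholders, not data from the study; only the two numeric thresholds come from the abstract:

    ```python
    # Hypothetical sketch of a staged high-throughput screening funnel.
    BAND_GAP_MIN_EV = 0.25   # retain compounds with band gap >= 0.25 eV
    STABILITY_MAX_MEV = 100  # retain formation energies <= 100 meV/atom

    candidates = [
        {"formula": "CsNbO3", "forms_pt_ternary": False, "gap_ev": 2.1, "e_form_mev": 40},
        {"formula": "PbTiO3", "forms_pt_ternary": True,  "gap_ev": 1.8, "e_form_mev": 10},
        {"formula": "XYO3",   "forms_pt_ternary": False, "gap_ev": 0.0, "e_form_mev": 5},
    ]

    def screen(cands):
        # Stage 1: drop A-B combinations that form ternary compounds with Pt.
        survivors = [c for c in cands if not c["forms_pt_ternary"]]
        # Stage 2: keep semiconductors/insulators (band gap >= 0.25 eV).
        survivors = [c for c in survivors if c["gap_ev"] >= BAND_GAP_MIN_EV]
        # Stage 3: keep thermodynamically stable compounds.
        survivors = [c for c in survivors if c["e_form_mev"] <= STABILITY_MAX_MEV]
        return [c["formula"] for c in survivors]

    print(screen(candidates))  # -> ['CsNbO3']
    ```

    Ordering the cheap database lookups before the expensive first-principles stages is what makes this kind of funnel tractable over 1572 compounds.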

  11. High-dimensional quantum cloning and applications to quantum hacking.

    Science.gov (United States)

    Bouchard, Frédéric; Fickler, Robert; Boyd, Robert W; Karimi, Ebrahim

    2017-02-01

    Attempts at cloning a quantum system result in the introduction of imperfections in the state of the copies. This is a consequence of the no-cloning theorem, which is a fundamental law of quantum physics and the backbone of security for quantum communications. Although perfect copies are prohibited, a quantum state may be copied with maximal accuracy via various optimal cloning schemes. Optimal quantum cloning, which lies at the border of the physical limit imposed by the no-signaling theorem and the Heisenberg uncertainty principle, has been experimentally realized for low-dimensional photonic states. However, an increase in the dimensionality of quantum systems is greatly beneficial to quantum computation and communication protocols. Nonetheless, no experimental demonstration of optimal cloning machines has hitherto been shown for high-dimensional quantum systems. We perform optimal cloning of high-dimensional photonic states by means of the symmetrization method. We show the universality of our technique by conducting cloning of numerous arbitrary input states and fully characterize our cloning machine by performing quantum state tomography on cloned photons. In addition, a cloning attack on a Bennett and Brassard (BB84) quantum key distribution protocol is experimentally demonstrated to reveal the robustness of high-dimensional states in quantum cryptography.

  12. High-throughput screening to identify inhibitors of lysine demethylases.

    Science.gov (United States)

    Gale, Molly; Yan, Qin

    2015-01-01

    Lysine demethylases (KDMs) are epigenetic regulators whose dysfunction is implicated in the pathology of many human diseases, including various types of cancer, inflammation and X-linked intellectual disability. Particular demethylases have been identified as promising therapeutic targets, and tremendous efforts are being devoted toward developing suitable small-molecule inhibitors for clinical and research use. Several high-throughput screening strategies have been developed to screen for small-molecule inhibitors of KDMs, each with advantages and disadvantages in terms of time, cost, effort, reliability and sensitivity. In this Special Report, we review and evaluate the high-throughput screening methods utilized for discovery of novel small-molecule KDM inhibitors.

  13. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high throughput analysis of protein expression levels, interactions, and functional states.

  14. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification. Monoclonal antibodies were raised to different targets in single batch runs of 6-10 wk using multiplexed immunisations, automated fusion and cell-culture, and a novel antigen-coated microarray-screening assay. In a large-scale experiment, where eight mice were immunized with ten antigens each, we generated

  15. High throughput electrospinning of high-quality nanofibers via an aluminum disk spinneret

    Science.gov (United States)

    Zheng, Guokuo

    In this work, a simple and efficient needleless high-throughput electrospinning process using an aluminum disk spinneret with 24 holes is described. Electrospun mats produced by this setup consisted of fine nano-sized fibers of the highest quality, while the productivity (yield) was many times that obtained from conventional single-needle electrospinning. The goal was to produce scaled-up amounts of nanofibers of the same or better quality, under variable concentration, voltage, and working distance, than those produced with the single-needle lab setup. The fiber mats produced were either polymer or ceramic (such as molybdenum trioxide nanofibers). Through experimentation, the optimum process conditions were determined to be an applied voltage of 24 kV and a collector distance of 15 cm. More diluted solutions resulted in smaller-diameter fibers. Comparing the morphologies of the MoO3 nanofibers produced by the traditional and the high-throughput setups, it was found that they were very similar. Moreover, the nanofiber production rate is nearly 10 times that of traditional needle electrospinning. Thus, the high-throughput process has the potential to become an industrial nanomanufacturing process, and the materials processed by it may be used in filtration devices, in tissue engineering, and as sensors.

  16. Complementing high-throughput X-ray powder diffraction data with quantum-chemical calculations

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; van de Streek, Jacco; Rantanen, Jukka

    2012-01-01

    of piroxicam form III. These combined experimental/quantum-chemical methods can provide access to reliable structural information in the course of an intensive experimentally based solid-form screening activity or in other circumstances wherein single crystals might never be viable, for example, for polymorphs...

  17. Solion ion source for high-efficiency, high-throughput solar cell manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Koo, John, E-mail: john-koo@amat.com; Binns, Brant; Miller, Timothy; Krause, Stephen; Skinner, Wesley; Mullin, James [Applied Materials, Inc., Varian Semiconductor Equipment Business Unit, 35 Dory Road, Gloucester, Massachusetts 01930 (United States)

    2014-02-15

    In this paper, we introduce the Solion ion source for high-throughput solar cell doping. As the source power is increased to enable higher throughput, negative effects degrade the lifetime of the plasma chamber and the extraction electrodes. In order to improve efficiency, we have explored a wide range of electron energies and determined the conditions that best suit production. To extend the lifetime of the source we have developed an in situ cleaning method using only existing hardware. With these combinations, source lifetimes of >200 h for phosphorus and >100 h for boron ion beams have been achieved while maintaining 1100 cell-per-hour production.

  18. High throughput integrated thermal characterization with non-contact optical calorimetry

    Science.gov (United States)

    Hou, Sichao; Huo, Ruiqing; Su, Ming

    2017-10-01

    Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments limited to low throughput, where only one sample is examined at a time. This work reports an infrared-based optical calorimetry, with its theoretical foundation, which is able to provide an integrated solution to characterize thermal properties of materials with high throughput. By taking time-domain temperature information of spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). This method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
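    One piece of the time-domain analysis described above is identifying a melting event as a plateau in each sample's temperature trace. The sketch below uses synthetic data and an assumed slope tolerance; it is not the authors' algorithm:

    ```python
    # Toy sketch: a melting plateau appears as a near-zero slope in the
    # time-domain temperature trace recorded for each sample position.
    def plateau_temperature(times, temps, slope_tol=0.05):
        """Return the mean temperature over samples where |dT/dt| < slope_tol."""
        plateau = []
        for i in range(1, len(times)):
            dTdt = (temps[i] - temps[i - 1]) / (times[i] - times[i - 1])
            if abs(dTdt) < slope_tol:
                plateau.append(temps[i])
        return sum(plateau) / len(plateau)

    # Synthetic trace: heating, a plateau near 60 C (melting), heating again.
    times = list(range(30))
    temps = ([20 + 2 * t for t in range(20)]      # heating at 2 C per step
             + [60.0] * 5                          # melting plateau
             + [60 + 2 * t for t in range(1, 6)])  # heating resumes
    print(plateau_temperature(times, temps))  # -> 60.0
    ```

    In the real system the same per-pixel trace would also yield heat-capacity and conductivity information from the heating and cooling slopes.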

  19. Filtering high-throughput protein-protein interaction data using a combination of genomic features

    Directory of Open Access Journals (Sweden)

    Patil Ashwini

    2005-04-01

    Abstract Background Protein-protein interaction data used in the creation or prediction of molecular networks is usually obtained from large scale or high-throughput experiments. This experimental data is liable to contain a large number of spurious interactions. Hence, there is a need to validate the interactions and filter out the incorrect data before using them in prediction studies. Results In this study, we use a combination of 3 genomic features – structurally known interacting Pfam domains, Gene Ontology annotations and sequence homology – as a means to assign reliability to the protein-protein interactions in Saccharomyces cerevisiae determined by high-throughput experiments. Using Bayesian network approaches, we show that protein-protein interactions from high-throughput data supported by one or more genomic features have a higher likelihood ratio and hence are more likely to be real interactions. Our method has a high sensitivity (90%) and good specificity (63%). We show that 56% of the interactions from high-throughput experiments in Saccharomyces cerevisiae have high reliability. We use the method to estimate the fraction of true interactions in the high-throughput protein-protein interaction data sets in Caenorhabditis elegans, Drosophila melanogaster and Homo sapiens to be 27%, 18% and 68%, respectively. Our results are available for searching and downloading at http://helix.protein.osaka-u.ac.jp/htp/. Conclusion A combination of genomic features that include sequence, structure and annotation information is a good predictor of true interactions in large and noisy high-throughput data sets. The method has a very high sensitivity and good specificity and can be used to assign a likelihood ratio, corresponding to the reliability, to each interaction.
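    The likelihood-ratio idea above can be sketched in a naive-Bayes style: each observed genomic feature multiplies the odds that an interaction is real. The per-feature ratios and the threshold below are assumed numbers for illustration, not values from the paper:

    ```python
    # Toy sketch: combine independent genomic features into a likelihood
    # ratio for a candidate protein-protein interaction.
    from math import prod

    # Hypothetical ratios P(feature observed | true) / P(feature observed | false)
    FEATURE_LR = {"pfam_domain": 8.0, "go_annotation": 4.0, "homology": 5.0}

    def likelihood_ratio(features_present):
        """Product of per-feature ratios, assuming feature independence."""
        return prod(FEATURE_LR[f] for f in features_present)

    def is_reliable(features_present, threshold=10.0):
        return likelihood_ratio(features_present) >= threshold

    print(likelihood_ratio({"pfam_domain", "go_annotation"}))  # -> 32.0
    print(is_reliable({"homology"}))                           # -> False
    ```

    Interactions supported by more (or stronger) features accumulate a larger ratio, matching the observation that multiply-supported interactions are more likely to be real.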

  20. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  1. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, we present our recent efforts in developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.

  2. Controlling high-throughput manufacturing at the nano-scale

    Science.gov (United States)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications: materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.

  3. Inventory management and reagent supply for automated chemistry.

    Science.gov (United States)

    Kuzniar, E

    1999-08-01

    Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.

  4. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing

  5. Advances in high temperature chemistry 1

    CERN Document Server

    Eyring, Leroy

    2013-01-01

    Advances in High Temperature Chemistry, Volume 1 describes the complexities and special and changing characteristics of high temperature chemistry. After providing a brief definition of high temperature chemistry, this nine-chapter book goes on describing the experiments and calculations of diatomic transition metal molecules, as well as the advances in applied wave mechanics that may contribute to an understanding of the bonding, structure, and spectra of the molecules of high temperature interest. The next chapter provides a summary of gaseous ternary compounds of the alkali metals used in

  6. High-Throughput Cloning and Expression Library Creation for Functional Proteomics

    Science.gov (United States)

    Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua

    2013-01-01

    The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. These collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This Tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047

  7. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Abstract Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: (1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; (2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and (3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines.
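    The Scheduler's core job, deciding which interdependent pipeline steps are runnable, amounts to a topological ordering of the job dependency graph. The sketch below is a generic illustration of that idea, not Cyrille2 code; the job names are hypothetical:

    ```python
    # Minimal scheduler sketch: order jobs so every job runs after its
    # prerequisites, as a pipeline scheduler must.
    from collections import deque

    def schedule(jobs, deps):
        """Topologically order jobs; deps maps job -> set of prerequisites."""
        indeg = {j: len(deps.get(j, set())) for j in jobs}
        ready = deque(j for j in jobs if indeg[j] == 0)
        order = []
        while ready:
            j = ready.popleft()
            order.append(j)
            for k in jobs:  # release jobs whose last prerequisite just finished
                if j in deps.get(k, set()):
                    indeg[k] -= 1
                    if indeg[k] == 0:
                        ready.append(k)
        return order

    jobs = ["preprocess", "align", "annotate", "report"]
    deps = {"align": {"preprocess"}, "annotate": {"align"},
            "report": {"align", "annotate"}}
    print(schedule(jobs, deps))  # -> ['preprocess', 'align', 'annotate', 'report']
    ```

    A production scheduler would additionally track data arrival and dispatch the ready jobs to an executor on a compute cluster, as the Cyrille2 design separates those concerns.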

  8. Chemistry of natural products: A veritable approach to the ...

    African Journals Online (AJOL)

    Even with the advent of newer technologies such as combinatorial chemistry, robotics, high throughput screening (HTS), bioinformatics, and in silico molecular modelling, natural products still play a crucial role in drug discovery. This is because they provide an unparalleled range of chemical diversity on which the newer ...

  9. Automated quantum chemistry based molecular dynamics simulations of electron ionization induced fragmentations of the nucleobases Uracil, Thymine, Cytosine, and Guanine.

    Science.gov (United States)

    Grimme, Stefan; Bauer, Christopher Alexander

    2015-01-01

    The gas-phase decomposition pathways of electron ionization (EI)-induced radical cations of the nucleobases uracil, thymine, cytosine, and guanine are investigated by means of mixed quantum-classical molecular dynamics. No preconceived fragmentation channels are used in the calculations. The results compare well to a plethora of experimental and theoretical data for these important biomolecules. With our combined stochastic and dynamic approach, one can access in an unbiased way the energetically available decomposition mechanisms. Additionally, we are able to separate the EI mass spectra of different tautomers of cytosine and guanine. Our method (previously termed quantum chemistry electron ionization mass spectra) reproduces free nucleobase experimental mass spectra well and provides detailed mechanistic insight into high-energy unimolecular decomposition processes.

  10. Sol-Gel Chemistry for Carbon Dots.

    Science.gov (United States)

    Malfatti, Luca; Innocenzi, Plinio

    2018-03-14

    Carbon dots are an emerging class of carbon-based nanostructures produced from low-cost raw materials which exhibit widely tunable photoluminescence and a high quantum yield. The potential of these nanomaterials as a substitute for semiconductor quantum dots in optoelectronics and biomedicine is very high; however, they need a customized chemistry to be integrated in host-guest systems or functionalized in core-shell structures. This review is focused on recent advances of the sol-gel chemistry applied to C-dots technology. Surface modification, fine tailoring of the chemical composition and embedding into a complex nanostructured material are the main targets of combining sol-gel processing with C-dots chemistry. In addition, the synergistic effect of the sol-gel precursor combined with the C-dots contributes to modifying the intrinsic chemo-physical properties of the dots, enhancing the emission efficiency or enabling the tuning of the photoluminescence over a wide range of the visible spectrum. © 2018 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Elementary and brief introduction of hadronic chemistry

    Science.gov (United States)

    Tangde, Vijay M.

    2013-10-01

    The discipline today known as quantum chemistry, covering atomic and subatomic level interactions, has no doubt made significant historical contributions to society. Despite its significant achievements, quantum chemistry is also known for widespread denial of the insufficiencies it inherits. The Italian-American scientist Professor Ruggero Maria Santilli, during his more than five decades of dedicated and sustained research, has denounced the fact that quantum chemistry is mostly based on mere nomenclatures without any quantitative scientific content. Professor R. M. Santilli first formulated the iso-, geno- and hyper-mathematics [1-4] that helped in understanding numerous diversified problems and removing inadequacies in most of the established and celebrated theories of 20th-century physics and chemistry. This involves the isotopic, genotopic, etc. lifting of Lie algebra that generated Lie-admissible mathematics to properly describe irreversible processes. The studies on Hadronic Mechanics in general, and chemistry in particular, based on Santilli's mathematics [3-5] have for the first time removed the very fundamental limitations of quantum chemistry [2, 6-8]. In the present discussion, we briefly review the conceptual foundations of Hadronic Chemistry, which imparts completeness to Quantum Chemistry via the addition of effects at distances of the order of 1 fm (only), which are assumed to be non-linear, non-local, non-potential, non-Hamiltonian and thus non-unitary, and its application in the development of a new chemical species called Magnecules.

  12. Current status and future prospects for enabling chemistry technology in the drug discovery process.

    Science.gov (United States)

    Djuric, Stevan W; Hutchins, Charles W; Talaty, Nari N

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of "dangerous" reagents. Also featured are advances in the "computer-assisted drug design" area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities.

  13. High throughput materials research and development for lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Parker Liu

    2017-09-01

    Development of next generation batteries requires a breakthrough in materials. The traditional one-by-one method, which synthesizes one single-composition material at a time, is time-consuming and costly. High-throughput and combinatorial experimentation is an effective method to synthesize and characterize a huge number of materials over a broader compositional region in a short time, which makes it possible to greatly speed up the discovery and optimization of materials at lower cost. In this work, high-throughput and combinatorial materials synthesis technologies for lithium ion battery research are discussed, and our efforts on developing such instrumentation are introduced.

  14. Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime

    Science.gov (United States)

    Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie

    2017-09-01

    Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting in high-throughput conditions. Droplet microfluidics is used as a promising platform for the very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently-labeled biomolecules. Fluorescence lifetime detection by time-correlated single photon counting is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.
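    For a mono-exponential decay, the fluorescence lifetime can be estimated directly from the photon arrival delays that time-correlated single photon counting records: the mean delay after the excitation pulse equals the lifetime. The sketch below uses synthetic photon data with an assumed lifetime; it is not the assay's analysis code:

    ```python
    # Toy sketch: estimate a fluorescence lifetime from simulated TCSPC
    # photon arrival delays (mono-exponential decay, no background).
    import random

    random.seed(1)
    tau_true = 2.5  # ns, assumed lifetime of the labeled substrate
    arrivals = [random.expovariate(1 / tau_true) for _ in range(100_000)]

    # For an exponential decay, the sample mean estimates the lifetime.
    tau_est = sum(arrivals) / len(arrivals)
    print(round(tau_est, 2))  # close to 2.5
    ```

    Discriminating positive from negative droplets then reduces to comparing the fitted lifetime (or mean delay) against a threshold between the two expected lifetimes, which is why lifetime readout is robust to intensity fluctuations.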

  15. Parallel algorithms for quantum chemistry. I. Integral transformations on a hypercube multiprocessor

    International Nuclear Information System (INIS)

    Whiteside, R.A.; Binkley, J.S.; Colvin, M.E.; Schaefer, H.F. III

    1987-01-01

    For many years it has been recognized that fundamental physical constraints such as the speed of light will limit the ultimate speed of single processor computers to less than about three billion floating point operations per second (3 GFLOPS). This limitation is becoming increasingly restrictive as commercially available machines are now within an order of magnitude of this asymptotic limit. A natural way to avoid this limit is to harness together many processors to work on a single computational problem. In principle, these parallel processing computers have speeds limited only by the number of processors one chooses to acquire. The usefulness of potentially unlimited processing speed to a computationally intensive field such as quantum chemistry is obvious. If these methods are to be applied to significantly larger chemical systems, parallel schemes will have to be employed. For this reason we have developed distributed-memory algorithms for a number of standard quantum chemical methods. We are currently implementing these on a 32-processor Intel hypercube. In this paper we present our algorithm and benchmark results for one of the bottleneck steps in quantum chemical calculations: the four-index integral transformation.
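    The four-index transformation referred to above converts two-electron integrals from the atomic-orbital to the molecular-orbital basis. Done naively it scales as O(N^8), but factoring it into four sequential quarter-transformations reduces it to O(N^5), which is the step worth parallelizing. A serial NumPy sketch (standing in for the distributed hypercube version, with random test data):

    ```python
    # Sketch: AO -> MO four-index integral transformation as four
    # quarter-transformations, O(N^5), instead of one O(N^8) contraction.
    import numpy as np

    def ao_to_mo(eri_ao, C):
        """Transform (pq|rs) integrals to the MO basis with coefficients C."""
        t = np.einsum("pqrs,pi->iqrs", eri_ao, C)  # first quarter-transform
        t = np.einsum("iqrs,qj->ijrs", t, C)       # second
        t = np.einsum("ijrs,rk->ijks", t, C)       # third
        return np.einsum("ijks,sl->ijkl", t, C)    # fourth

    rng = np.random.default_rng(0)
    n = 4
    eri = rng.random((n, n, n, n))
    C = rng.random((n, n))
    mo = ao_to_mo(eri, C)

    # Reference: the single direct contraction gives the same tensor.
    ref = np.einsum("pqrs,pi,qj,rk,sl->ijkl", eri, C, C, C, C)
    assert np.allclose(mo, ref)
    ```

    On a distributed-memory machine, each quarter-transformation becomes a block-wise matrix multiply whose tiles are spread across processors, which is the structure the hypercube algorithm exploits.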

  16. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an

  17. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high-content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies (small-pore microchannel plate devices with very high time resolution and high-speed multi-channel ASIC electronics developed for the LHC at CERN) provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration, which uses the multi-channel HPTDC timing ASIC.

  18. A high-throughput, multi-channel photon-counting detector with picosecond timing

    International Nuclear Information System (INIS)

    Lapington, J.S.; Fraser, G.W.; Miller, G.M.; Ashton, T.J.R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high-content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies (small-pore microchannel plate devices with very high time resolution and high-speed multi-channel ASIC electronics developed for the LHC at CERN) provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration, which uses the multi-channel HPTDC timing ASIC.

  19. A high-throughput reactor system for optimization of Mo–V–Nb mixed oxide catalyst composition in ethane ODH

    KAUST Repository

    Zhu, Haibo; Laveille, Paco; Rosenfeld, Devon C.; Hedhili, Mohamed N.; Basset, Jean-Marie

    2015-01-01

    Seventy-five Mo-V-Nb mixed oxide catalysts with a broad range of compositions were prepared by a simple evaporation method and screened for the ethane oxidative dehydrogenation (ODH) reaction. The compositions of these 75 catalysts were systematically changed by varying the Nb loading and the Mo/V molar ratio. Characterization by XRD, XPS, H2-TPR and SEM revealed that an intimate structure is formed among the 3 components. The strong interaction among the different components leads to the formation of a new phase or an "intimate structure". The dependence of conversion and selectivity on the catalyst composition was clearly demonstrated by the results of high-throughput testing. The optimized Mo-V-Nb molar composition was confirmed to comprise a Nb content of 4-8%, a Mo content of 70-83%, and a V content of 12-25%. The enhanced catalytic performance of the mixed oxides is attributed to the synergistic effects of the different components. The optimized compositions for ethane ODH revealed in our high-throughput tests and the structural information provided by our characterization studies can serve as the starting point for future efforts to improve the catalytic performance of Mo-V-Nb oxides. This journal is © The Royal Society of Chemistry.

  20. Fluorescence-based high-throughput screening of dicer cleavage activity.

    Science.gov (United States)

    Podolska, Katerina; Sedlak, David; Bartunek, Petr; Svoboda, Petr

    2014-03-01

    Production of small RNAs by the ribonuclease III Dicer is a key step in the microRNA and RNA interference pathways, which employ Dicer-produced small RNAs as sequence-specific silencing guides. Further studies and manipulations of the microRNA and RNA interference pathways would benefit from the identification of small-molecule modulators. Here, we report a fluorescence-based in vitro Dicer cleavage assay, which was adapted for high-throughput screening. The kinetic assay can be performed under single-turnover conditions (35 nM substrate and 70 nM Dicer) in a small volume (5 µL), which makes it suitable for high-throughput screening in a 1536-well format. As a proof of principle, a small library of bioactive compounds was analyzed, demonstrating the potential of the assay.
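
Under the single-turnover conditions quoted above (enzyme in excess over substrate), the fluorescence time course follows pseudo-first-order kinetics, F(t) = F0 + ΔF(1 − exp(−k_obs·t)), and the fitted k_obs is the readout a screening hit would perturb. A minimal sketch with synthetic, noiseless data (all parameter values are illustrative, not the assay's):

```python
import numpy as np

def single_turnover(t, F0, dF, k_obs):
    """Pseudo-first-order fluorescence rise: F0 + dF*(1 - exp(-k_obs*t))."""
    return F0 + dF * (1.0 - np.exp(-k_obs * t))

# Synthetic, noiseless trace with k_obs = 0.05 /s (illustrative values)
t = np.linspace(0.0, 60.0, 30)
y = single_turnover(t, 100.0, 400.0, 0.05)

# Linearize: ln(1 - (y - F0)/dF) = -k_obs * t, then take the least-squares slope
z = np.log(1.0 - (y - 100.0) / 400.0)
k_fit = -np.polyfit(t, z, 1)[0]
print(round(k_fit, 3))  # 0.05
```

With noisy plate-reader data one would instead use a nonlinear least-squares fit of the exponential form directly; the linearization above just shows the structure of the model.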

  1. Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.

    Science.gov (United States)

    Yang, Darren; Wong, Wesley P

    2018-01-01

    We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.

  2. High-fidelity quantum driving

    DEFF Research Database (Denmark)

    Bason, Mark George; Viteau, Matthieu; Malossi, Nicola

    2011-01-01

    Accurately controlling a quantum system is a fundamental requirement in quantum information processing and the coherent manipulation of molecular systems. The ultimate goal in quantum control is to prepare a desired state with the highest fidelity allowed by the available resources...... and the experimental constraints. Here we experimentally implement two optimal high-fidelity control protocols using a two-level quantum system comprising Bose–Einstein condensates in optical lattices. The first is a short-cut protocol that reaches the maximum quantum-transformation speed compatible...

  3. Proceedings of the meeting on tunneling reaction and low temperature chemistry, 97 October. Tunneling reaction and quantum medium

    Energy Technology Data Exchange (ETDEWEB)

    Miyazaki, Tetsuo; Aratono, Yasuyuki; Ichikawa, Tsuneki; Shiotani, Masaru [eds.

    1998-02-01

    The present report is the proceedings of the 3rd Meeting on Tunneling Reaction and Low Temperature Chemistry held on Oct. 13 and 14, 1997. The main subject of the meeting was 'Tunneling Reaction and Quantum Medium'. In the meeting, physical and chemical phenomena in liquid helium, such as quantum nucleation, spectroscopy of atoms and molecules, and the tunneling abstraction reaction of the tritium atom, were discussed as the main topics, as well as tunneling reactions in solid hydrogen and organic compounds. Through the meetings held in 1995, 1996, and 1997, tunneling phenomena proceeding at various temperatures (room temperature to mK) in the wide fields of chemistry, biology, and physics were discussed intensively, and the importance of tunneling phenomena in science has become clear. The 12 presented papers are indexed individually. (J.P.N.)

  4. Proceedings of the meeting on tunneling reaction and low temperature chemistry, 97 October. Tunneling reaction and quantum medium

    International Nuclear Information System (INIS)

    Miyazaki, Tetsuo; Aratono, Yasuyuki; Ichikawa, Tsuneki; Shiotani, Masaru

    1998-02-01

    The present report is the proceedings of the 3rd Meeting on Tunneling Reaction and Low Temperature Chemistry held on Oct. 13 and 14, 1997. The main subject of the meeting was 'Tunneling Reaction and Quantum Medium'. In the meeting, physical and chemical phenomena in liquid helium, such as quantum nucleation, spectroscopy of atoms and molecules, and the tunneling abstraction reaction of the tritium atom, were discussed as the main topics, as well as tunneling reactions in solid hydrogen and organic compounds. Through the meetings held in 1995, 1996, and 1997, tunneling phenomena proceeding at various temperatures (room temperature to mK) in the wide fields of chemistry, biology, and physics were discussed intensively, and the importance of tunneling phenomena in science has become clear. The 12 presented papers are indexed individually. (J.P.N.)

  5. Evaluation of Capacity on a High Throughput Vol-oxidizer for Operability

    International Nuclear Information System (INIS)

    Kim, Young Hwan; Park, Geun Il; Lee, Jung Won; Jung, Jae Hoo; Kim, Ki Ho; Lee, Yong Soon; Lee, Do Youn; Kim, Su Sung

    2010-01-01

    KAERI is developing a pyro-process. As a piece of process equipment, a high-throughput vol-oxidizer that can handle several tens of kg HM/batch was developed to supply U3O8 powders to an electrolytic reduction (ER) reactor. To increase the reduction yield, UO2 pellets should be converted into uniform powders. In this paper, we evaluate the operability of the high-throughput vol-oxidizer. The evaluation consisted of three targets: a mechanical motion test, a heating test and a hull separation test. Using a control system, mechanical motion tests of the vol-oxidizer were conducted and the heating rates were analyzed. Separation tests of hulls were also conducted to determine the recovery rate. The test results of the vol-oxidizer will be applied to its operability assessment. A study on the characteristics of the volatile gas produced during the vol-oxidation process is not included in this study.

  6. Fun with High Throughput Toxicokinetics (CalEPA webinar)

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21. These chemicals are tested in part because there are limited or no data on hazard, exposure, or toxicokinetics (TK). TK models aid in predicting tissue concentrations ...

  7. Screening small-molecule compound microarrays for protein ligands without fluorescence labeling with a high-throughput scanning microscope.

    Science.gov (United States)

    Fei, Yiyan; Landry, James P; Sun, Yungshin; Zhu, Xiangdong; Wang, Xiaobing; Luo, Juntao; Wu, Chun-Yi; Lam, Kit S

    2010-01-01

    We describe a high-throughput scanning optical microscope for detecting small-molecule compound microarrays on functionalized glass slides. It is based on measurements of oblique-incidence reflectivity difference and employs a combination of a y-scan galvanometer mirror and an x-scan translation stage with an effective field of view of 2 cm x 4 cm. Such a field of view can accommodate a printed small-molecule compound microarray with as many as 10,000 to 20,000 targets. The scanning microscope is capable of measuring kinetics as well as endpoints of protein-ligand reactions simultaneously. We present experimental results on solution-phase protein reactions with small-molecule compound microarrays synthesized from one-bead, one-compound combinatorial chemistry and immobilized on a streptavidin-functionalized glass slide.

  8. A frequency and sensitivity tunable microresonator array for high-speed quantum processor readout

    Energy Technology Data Exchange (ETDEWEB)

    Whittaker, J. D., E-mail: jwhittaker@dwavesys.com; Swenson, L. J.; Volkmann, M. H.; Spear, P.; Altomare, F.; Berkley, A. J.; Bunyk, P.; Harris, R.; Hilton, J. P.; Hoskinson, E.; Johnson, M. W.; Ladizinsky, E.; Lanting, T.; Oh, T.; Perminov, I.; Tolkacheva, E.; Yao, J. [D-Wave Systems, Inc., Burnaby, British Columbia V5G 4M9 (Canada); Bumble, B.; Day, P. K.; Eom, B. H. [Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California 91109 (United States); and others

    2016-01-07

    Superconducting microresonators have been successfully utilized as detection elements for a wide variety of applications. With multiplexing factors exceeding 1000 detectors per transmission line, they are the most scalable low-temperature detector technology demonstrated to date. For high-throughput applications, fewer detectors can be coupled to a single wire but utilize a larger per-detector bandwidth. For all existing designs, fluctuations in fabrication tolerances result in a non-uniform shift in resonance frequency and sensitivity, which ultimately limits the efficiency of bandwidth utilization. Here, we present the design, implementation, and initial characterization of a superconducting microresonator readout integrating two tunable inductances per detector. We demonstrate that these tuning elements provide independent control of both the detector frequency and sensitivity, allowing us to maximize the transmission line bandwidth utilization. Finally, we discuss the integration of these detectors in a multilayer fabrication stack for high-speed readout of the D-Wave quantum processor, highlighting the use of control and routing circuitry composed of single-flux-quantum loops to minimize the number of control wires at the lowest temperature stage.
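
The per-detector tuning works because each microresonator's frequency depends on its total inductance, f = 1/(2π√(LC)), so a tunable inductance in series shifts the resonance. A numerical illustration with hypothetical component values (not taken from the paper):

```python
import math

def resonance_hz(L_total_h, C_f):
    """Resonance frequency of an LC resonator: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_total_h * C_f))

L_fixed = 10e-9  # 10 nH geometric inductance (hypothetical)
C = 0.4e-12      # 0.4 pF capacitance (hypothetical)

# Increasing the tunable contribution lowers the resonance frequency
for L_tune in (0.0, 0.5e-9, 1.0e-9):
    f = resonance_hz(L_fixed + L_tune, C)
    print(f"L_tune = {L_tune * 1e9:.1f} nH -> f = {f / 1e9:.3f} GHz")
```

The second tuning element in the paper adjusts sensitivity independently; this sketch only illustrates the frequency side of the design.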

  9. Symposium on high temperature and materials chemistry

    International Nuclear Information System (INIS)

    1989-10-01

    This volume contains the written proceedings of the Symposium on High Temperature and Materials Chemistry held in Berkeley, California on October 24--25, 1989. The Symposium was sponsored by the Materials and Chemical Sciences Division of Lawrence Berkeley Laboratory and by the College of Chemistry of the University of California at Berkeley to discuss directions, trends, and accomplishments in the field of high temperature and materials chemistry. Its purpose was to provide a snapshot of high temperature and materials chemistry and, in so doing, to define status and directions

  10. Symposium on high temperature and materials chemistry

    Energy Technology Data Exchange (ETDEWEB)

    1989-10-01

    This volume contains the written proceedings of the Symposium on High Temperature and Materials Chemistry held in Berkeley, California on October 24--25, 1989. The Symposium was sponsored by the Materials and Chemical Sciences Division of Lawrence Berkeley Laboratory and by the College of Chemistry of the University of California at Berkeley to discuss directions, trends, and accomplishments in the field of high temperature and materials chemistry. Its purpose was to provide a snapshot of high temperature and materials chemistry and, in so doing, to define status and directions.

  11. Surface Plasmon Resonance: New Biointerface Designs and High-Throughput Affinity Screening

    Science.gov (United States)

    Linman, Matthew J.; Cheng, Quan Jason

    Surface plasmon resonance (SPR) is a surface optical technique that measures minute changes in refractive index at a metal-coated surface. It has become increasingly popular in the study of biological and chemical analytes because of its label-free measurement feature. In addition, SPR allows for both quantitative and qualitative assessment of binding interactions in real time, making it ideally suited for probing weak interactions that are often difficult to study with other methods. This chapter presents biosensor developments of the last 3 years or so utilizing SPR as the principal analytical technique, along with a concise background of the technique itself. While SPR has demonstrated many advantages, it is a nonselective method, and so building reproducible and functional interfaces is vital to sensing applications. This chapter, therefore, focuses mainly on unique surface chemistries and assay approaches to examine biological interactions with SPR. In addition, SPR imaging for high-throughput screening based on microarrays and novel hyphenated techniques involving the coupling of SPR to other analytical methods is discussed. The chapter concludes with a commentary on the current state of SPR biosensing technology and the general direction of future biosensor research.

  12. A photoelectron imaging and quantum chemistry study of the deprotonated indole anion.

    Science.gov (United States)

    Parkes, Michael A; Crellin, Jonathan; Henley, Alice; Fielding, Helen H

    2018-05-29

    Indole is an important molecular motif in many biological molecules and exists in its deprotonated anionic form in the cyan fluorescent protein, an analogue of green fluorescent protein. However, the electronic structure of the deprotonated indole anion has been relatively unexplored. Here, we use a combination of anion photoelectron velocity-map imaging measurements and quantum chemistry calculations to probe the electronic structure of the deprotonated indole anion. We report vertical detachment energies (VDEs) of 2.45 ± 0.05 eV and 3.20 ± 0.05 eV for the D0 and D1 detachment channels, respectively. The value for D0 is in agreement with recent high-resolution measurements, whereas the value for D1 is a new measurement. We find that the first electronically excited singlet state of the anion, S1(ππ*), lies above the VDE and has shape resonance character with respect to the D0 detachment continuum and Feshbach resonance character with respect to the D1 continuum.

  13. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations

    Science.gov (United States)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-12-01

    Li-ion batteries are a key technology for addressing the global challenges of clean renewable energy and environmental pollution. Their contemporary applications, in portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize the performance of currently known materials. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only the bulk capacity, voltage and volume change were considered. It is important to include more structure-property relationships, such as point defects, surface and interface effects, doping and metal mixing, and nanosize effects, in high-throughput calculations. In this review, we establish a quantitative description of structure-property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.
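
A high-throughput screening flow path of the kind proposed can be sketched as a sequence of filters over computed bulk descriptors, where candidates surviving every filter move on to more expensive calculations. The descriptor names, values, and cutoffs below are purely illustrative assumptions, not criteria from the review:

```python
# Sketch of a sequential screening funnel for candidate cathode materials.
# All descriptor names and cutoffs are illustrative, not from the review.
candidates = [
    {"id": "A", "voltage_V": 3.9, "capacity_mAh_g": 180, "volume_change_pct": 3.5},
    {"id": "B", "voltage_V": 2.1, "capacity_mAh_g": 250, "volume_change_pct": 1.0},
    {"id": "C", "voltage_V": 4.1, "capacity_mAh_g": 150, "volume_change_pct": 9.0},
]

filters = [
    ("voltage",       lambda m: m["voltage_V"] >= 3.0),
    ("capacity",      lambda m: m["capacity_mAh_g"] >= 140),
    ("volume change", lambda m: m["volume_change_pct"] <= 6.0),
]

survivors = candidates
for name, keep in filters:
    survivors = [m for m in survivors if keep(m)]
    print(f"after {name} filter: {[m['id'] for m in survivors]}")
```

The review's point is that such funnels should also include filters built on defect, surface/interface, doping, and nanosize descriptors, not just the bulk three shown here.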

  14. Current status and future prospects for enabling chemistry technology in the drug discovery process

    Science.gov (United States)

    Djuric, Stevan W.; Hutchins, Charles W.; Talaty, Nari N.

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of “dangerous” reagents. Also featured are advances in the “computer-assisted drug design” area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities. PMID:27781094

  15. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  16. Towards low-delay and high-throughput cognitive radio vehicular networks

    Directory of Open Access Journals (Sweden)

    Nada Elgaml

    2017-12-01

    Cognitive Radio Vehicular Ad-hoc Networks (CR-VANETs) exploit cognitive radios to allow vehicles to access unused channels in their radio environment. Thus, CR-VANETs not only suffer from the traditional CR problems, especially spectrum sensing, but also face new challenges due to the highly dynamic nature of VANETs. In this paper, we present a low-delay and high-throughput radio environment assessment scheme for CR-VANETs that can be easily incorporated into the IEEE 802.11p standard developed for VANETs. Simulation results show that the proposed scheme significantly reduces the time to get the radio environment map and increases the CR-VANET throughput.

  17. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    International Nuclear Information System (INIS)

    Chen, Jinyang; Ji, Xinghu; He, Zhike

    2015-01-01

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy was realized by connecting the components of the positive pressure input device, sample container and microfluidic chip through tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so the sample was delivered into the microchip from the sample container under the driving of positive pressure. This sample-introduction technique is so robust and compatible that it could be integrated with T-junction, flow-focus or valve-assisted droplet microchips. By choosing a PDMS adaptor with the proper dimensions, the microchip could be flexibly equipped with various types of familiar sample containers, making the sampling more straightforward without trivial sample transfer or loading. Convenient sample changing was easily achieved by repositioning the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied for quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through the investigation of the quenching efficiency of a ruthenium complex on the fluorescence of the QD. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, which shows the practicability and potential for high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy of concentration gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • A multiplex DNA assay was successfully carried out in the droplet platform

  18. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jinyang; Ji, Xinghu [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); He, Zhike, E-mail: zhkhe@whu.edu.cn [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); Suzhou Institute of Wuhan University, Suzhou 215123 (China)

    2015-08-12

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy was realized by connecting the components of the positive pressure input device, sample container and microfluidic chip through tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so the sample was delivered into the microchip from the sample container under the driving of positive pressure. This sample-introduction technique is so robust and compatible that it could be integrated with T-junction, flow-focus or valve-assisted droplet microchips. By choosing a PDMS adaptor with the proper dimensions, the microchip could be flexibly equipped with various types of familiar sample containers, making the sampling more straightforward without trivial sample transfer or loading. Convenient sample changing was easily achieved by repositioning the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied for quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through the investigation of the quenching efficiency of a ruthenium complex on the fluorescence of the QD. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, which shows the practicability and potential for high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy of concentration gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • A multiplex DNA assay was successfully carried out in the droplet platform.

  19. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S r......RNA gene amplicon sequencing can be used to reveal factors of importance for the operation of full-scale nutrient removal plants related to settling problems and floc properties. Using optimized DNA extraction protocols, indexed primers and our in-house Illumina platform, we prepared multiple samples...... be correlated to the presence of the species that are regarded as “strong” and “weak” floc formers. In conclusion, 16S rRNA gene amplicon sequencing provides a high throughput approach for a rapid and cheap community profiling of activated sludge that in combination with multivariate statistics can be used...

  20. Analysis of temporal evolution of quantum dot surface chemistry by surface-enhanced Raman scattering.

    Science.gov (United States)

    Doğan, İlker; Gresback, Ryan; Nozaki, Tomohiro; van de Sanden, Mauritius C M

    2016-07-08

    The temporal evolution of surface chemistry during oxidation of silicon quantum dot (Si-QD) surfaces was probed using surface-enhanced Raman scattering (SERS). A monolayer of hydrogen- and chlorine-terminated plasma-synthesized Si-QDs was spin-coated on silver oxide thin films. A clearly enhanced signal of surface modes, including Si-Clx and Si-Hx modes, was observed from as-synthesized Si-QDs as a result of the plasmonic enhancement of the Raman signal at the Si-QD/silver oxide interface. Upon oxidation, a gradual decrease of the Si-Clx and Si-Hx modes and an emergence of Si-Ox and Si-O-Hx modes were observed. In addition, the first, second and third transverse optical modes of the Si-QDs were also observed in the SERS spectra, revealing information on the crystalline morphology of the Si-QDs. The absence of any of the abovementioned spectral features, apart from the first transverse optical mode, in spectra from thick Si-QD films confirmed that the spectral features observed from Si-QDs on silver oxide thin films originate from the SERS effect. These results indicate that real-time SERS is a powerful diagnostic tool and a novel approach to probe the dynamic surface/interface chemistry of quantum dots, especially when they are involved in oxidative, catalytic, and electrochemical surface/interface reactions.

  1. The Quixote project: Collaborative and Open Quantum Chemistry data management in the Internet age.

    Science.gov (United States)

    Adams, Sam; de Castro, Pablo; Echenique, Pablo; Estrada, Jorge; Hanwell, Marcus D; Murray-Rust, Peter; Sherwood, Paul; Thomas, Jens; Townsend, Joe

    2011-10-14

    Computational Quantum Chemistry has developed into a powerful, efficient, reliable and increasingly routine tool for exploring the structure and properties of small to medium sized molecules. Many thousands of calculations are performed every day, some offering results which approach experimental accuracy. However, in contrast to other disciplines, such as crystallography or bioinformatics, where standard formats and well-known, unified databases exist, this QC data is generally destined to remain locally held in files which are not designed to be machine-readable. Only a very small subset of these results will become accessible to the wider community through publication. In this paper we describe how the Quixote Project is developing the infrastructure required to convert output from a number of different molecular quantum chemistry packages to a common, semantically rich, machine-readable format and to build repositories of QC results. Such an infrastructure offers benefits at many levels. The standardised representation of the results will facilitate software interoperability, for example making it easier for analysis tools to take data from different QC packages, and will also help with archival and deposition of results. The repository infrastructure, which is lightweight and built using Open software components, can be implemented at individual researcher, project, organisation or community level, offering the exciting possibility that in future many of these QC results can be made publicly available, to be searched and interpreted just as crystallography and bioinformatics results are today. Although we believe that quantum chemists will appreciate the contribution the Quixote infrastructure can make to the organisation and exchange of their results, we anticipate that greater rewards will come from enabling their results to be consumed by a wider community. As the repositories grow they will become a valuable source of chemical data for use by other

  2. The Quixote project: Collaborative and Open Quantum Chemistry data management in the Internet age

    Directory of Open Access Journals (Sweden)

    Adams Sam

    2011-10-01

    Full Text Available Abstract Computational Quantum Chemistry has developed into a powerful, efficient, reliable and increasingly routine tool for exploring the structure and properties of small to medium sized molecules. Many thousands of calculations are performed every day, some offering results which approach experimental accuracy. However, in contrast to other disciplines, such as crystallography or bioinformatics, where standard formats and well-known, unified databases exist, this QC data is generally destined to remain locally held in files which are not designed to be machine-readable. Only a very small subset of these results will become accessible to the wider community through publication. In this paper we describe how the Quixote Project is developing the infrastructure required to convert output from a number of different molecular quantum chemistry packages to a common, semantically rich, machine-readable format and to build repositories of QC results. Such an infrastructure offers benefits at many levels. The standardised representation of the results will facilitate software interoperability, for example making it easier for analysis tools to take data from different QC packages, and will also help with archival and deposition of results. The repository infrastructure, which is lightweight and built using Open software components, can be implemented at individual researcher, project, organisation or community level, offering the exciting possibility that in future many of these QC results can be made publicly available, to be searched and interpreted just as crystallography and bioinformatics results are today. Although we believe that quantum chemists will appreciate the contribution the Quixote infrastructure can make to the organisation and exchange of their results, we anticipate that greater rewards will come from enabling their results to be consumed by a wider community. As the repositories grow they will become a valuable source of

  3. Reverse Phase Protein Arrays for High-throughput Toxicity Screening

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    High-throughput screening is extensively applied for identification of drug targets and drug discovery, and it has recently found entry into toxicity testing. Reverse phase protein arrays (RPPAs) are widely used for quantification of protein markers. We reasoned that RPPAs can also be utilized...... beneficially in automated high-throughput toxicity testing. An advantage of using RPPAs is that, in addition to the baseline toxicity readout, they allow testing of multiple markers of toxicity, such as inflammatory responses, which do not necessarily culminate in cell death. We used transfection of si......RNAs with known killing effects as a model system to demonstrate that RPPA-based protein quantification can serve as a substitute readout of cell viability, thereby reliably reflecting toxicity. In terms of automation, cell exposure, protein harvest, serial dilution and sample reformatting were performed using...

  4. Integrating Computational Chemistry into a Course in Classical Thermodynamics

    Science.gov (United States)

    Martini, Sheridan R.; Hartzell, Cynthia J.

    2015-01-01

    Computational chemistry is commonly addressed in the quantum mechanics course of undergraduate physical chemistry curricula. Since quantum mechanics traditionally follows the thermodynamics course, there is a lack of curricula relating computational chemistry to thermodynamics. A method integrating molecular modeling software into a semester long…

  5. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high content analysis of high-throughput screens, in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  6. Quantitative in vitro-to-in vivo extrapolation in a high-throughput environment

    International Nuclear Information System (INIS)

    Wetmore, Barbara A.

    2015-01-01

    High-throughput in vitro toxicity screening provides an efficient way to identify potential biological targets for environmental and industrial chemicals while conserving limited testing resources. However, reliance on the nominal chemical concentrations in these in vitro assays as an indicator of bioactivity may misrepresent potential in vivo effects of these chemicals due to differences in clearance, protein binding, bioavailability, and other pharmacokinetic factors. Development of high-throughput in vitro hepatic clearance and protein binding assays and refinement of quantitative in vitro-to-in vivo extrapolation (QIVIVE) methods have provided key tools to predict xenobiotic steady state pharmacokinetics. Using a process known as reverse dosimetry, knowledge of the chemical steady state behavior can be incorporated with HTS data to determine the external in vivo oral exposure needed to achieve internal blood concentrations equivalent to those eliciting bioactivity in the assays. These daily oral doses, known as oral equivalents, can be compared to chronic human exposure estimates to assess whether in vitro bioactivity would be expected at the dose-equivalent level of human exposure. This review will describe the use of QIVIVE methods in a high-throughput environment and the promise they hold in shaping chemical testing priorities and, potentially, high-throughput risk assessment strategies
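    The reverse-dosimetry step described above can be sketched numerically. A common simplification (an assumption here, not a claim about any specific QIVIVE implementation) is that steady-state blood concentration scales linearly with daily oral dose, so the oral equivalent is the in vitro bioactive concentration divided by the Css produced per unit dose; all numbers below are hypothetical.

    ```python
    # Reverse dosimetry sketch: convert an in vitro bioactive concentration
    # (e.g. an AC50) into an oral equivalent dose, assuming the steady-state
    # blood concentration (Css) scales linearly with the daily oral dose.
    def oral_equivalent(ac50_uM, css_uM_per_mg_kg_day):
        """Oral equivalent dose (mg/kg/day) giving a blood concentration
        equal to the in vitro bioactive concentration."""
        return ac50_uM / css_uM_per_mg_kg_day

    # Hypothetical chemical: AC50 = 5 uM; Css = 2 uM at 1 mg/kg/day.
    dose = oral_equivalent(5.0, 2.0)        # -> 2.5 mg/kg/day
    exposure = 0.01                         # assumed human exposure estimate
    print(f"oral equivalent: {dose} mg/kg/day; "
          f"margin vs. exposure: {dose / exposure:.0f}x")
    ```

    Comparing the oral equivalent against an exposure estimate, as in the last line, is the prioritization step the abstract describes.
    
    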

  7. Photodissociation of ultracold diatomic strontium molecules with quantum state control.

    Science.gov (United States)

    McDonald, M; McGuyer, B H; Apfelbeck, F; Lee, C-H; Majewska, I; Moszynski, R; Zelevinsky, T

    2016-07-07

    Chemical reactions at ultracold temperatures are expected to be dominated by quantum mechanical effects. Although progress towards ultracold chemistry has been made through atomic photoassociation, Feshbach resonances and bimolecular collisions, these approaches have been limited by imperfect quantum state selectivity. In particular, attaining complete control of the ground or excited continuum quantum states has remained a challenge. Here we achieve this control using photodissociation, an approach that encodes a wealth of information in the angular distribution of outgoing fragments. By photodissociating ultracold (88)Sr2 molecules with full control of the low-energy continuum, we access the quantum regime of ultracold chemistry, observing resonant and nonresonant barrier tunnelling, matter-wave interference of reaction products and forbidden reaction pathways. Our results illustrate the failure of the traditional quasiclassical model of photodissociation and instead are accurately described by a quantum mechanical model. The experimental ability to produce well-defined quantum continuum states at low energies will enable high-precision studies of long-range molecular potentials for which accurate quantum chemistry models are unavailable, and may serve as a source of entangled states and coherent matter waves for a wide range of experiments in quantum optics.

  8. Machine learning in computational biology to accelerate high-throughput protein expression

    DEFF Research Database (Denmark)

    Sastry, Anand; Monk, Jonathan M.; Tegel, Hanna

    2017-01-01

    and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide...... the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. Availability and implementation: We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template...
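    The three sequence properties named in the record (aromaticity, hydropathy, isoelectric point) can each be computed directly from a fragment's sequence. The sketch below uses the standard Kyte-Doolittle hydropathy scale and textbook side-chain pKa values with bisection for the pI; the example fragment and the exact pKa set are illustrative assumptions, not the HPA pipeline's actual feature code.

    ```python
    # Sequence features often used to predict expression/solubility:
    # aromaticity, Kyte-Doolittle hydropathy (GRAVY), isoelectric point.
    KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
          'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
          'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
          'Y': -1.3, 'V': 4.2}
    PKA_POS = {'nterm': 9.0, 'K': 10.5, 'R': 12.5, 'H': 6.0}   # +1 when protonated
    PKA_NEG = {'cterm': 3.1, 'D': 3.9, 'E': 4.1, 'C': 8.3, 'Y': 10.1}  # -1 when deprotonated

    def aromaticity(seq):
        return sum(seq.count(a) for a in 'FWY') / len(seq)

    def gravy(seq):
        return sum(KD[a] for a in seq) / len(seq)

    def net_charge(seq, ph):
        pos = [PKA_POS['nterm']] + [PKA_POS[a] for a in seq if a in 'KRH']
        neg = [PKA_NEG['cterm']] + [PKA_NEG[a] for a in seq if a in 'DECY']
        q = sum(1 / (1 + 10 ** (ph - pk)) for pk in pos)
        return q - sum(1 / (1 + 10 ** (pk - ph)) for pk in neg)

    def isoelectric_point(seq, lo=0.0, hi=14.0):
        for _ in range(60):              # bisection: net charge falls with pH
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if net_charge(seq, mid) > 0 else (lo, mid)
        return (lo + hi) / 2

    frag = "MKWVTFISLLFLFSSAYS"          # made-up example fragment
    print(aromaticity(frag), gravy(frag), round(isoelectric_point(frag), 2))
    ```

    In a pipeline like the one described, these three numbers per fragment would form the feature vector fed to the classifier.
    
    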

  9. Quality control methodology for high-throughput protein-protein interaction screening.

    Science.gov (United States)

    Vazquez, Alexei; Rual, Jean-François; Venkatesan, Kavitha

    2011-01-01

    Protein-protein interactions are key to many aspects of the cell, including its cytoskeletal structure, the signaling processes in which it is involved, or its metabolism. Failure to form protein complexes or signaling cascades may sometimes translate into pathologic conditions such as cancer or neurodegenerative diseases. The set of all protein interactions between the proteins encoded by an organism constitutes its protein interaction network, representing a scaffold for biological function. Knowing the protein interaction network of an organism, combined with other sources of biological information, can unravel fundamental biological circuits and may help better understand the molecular basics of human diseases. The protein interaction network of an organism can be mapped by combining data obtained from both low-throughput screens, i.e., "one gene at a time" experiments and high-throughput screens, i.e., screens designed to interrogate large sets of proteins at once. In either case, quality controls are required to deal with the inherent imperfect nature of experimental assays. In this chapter, we discuss experimental and statistical methodologies to quantify error rates in high-throughput protein-protein interactions screens.
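    One simple statistical methodology of the kind the chapter discusses is to retest a random sample of a screen's positives in an orthogonal assay and infer the false-discovery rate from the fraction that fail to validate. The sketch below (counts are hypothetical, and the Wald interval is one of several possible choices) shows the arithmetic.

    ```python
    import math

    # Estimate a screen's false-discovery rate from orthogonal validation
    # of a random sample of its positive pairs (hypothetical counts).
    def fdr_estimate(n_sampled, n_validated):
        """Point estimate and ~95% Wald interval for the FDR."""
        p = 1 - n_validated / n_sampled
        se = math.sqrt(p * (1 - p) / n_sampled)
        return p, (max(0.0, p - 1.96 * se), min(1.0, p + 1.96 * se))

    fdr, ci = fdr_estimate(100, 82)    # 82 of 100 sampled pairs validate
    print(f"estimated FDR: {fdr:.2f}, 95% CI: ({ci[0]:.2f}, {ci[1]:.2f})")
    ```
    
    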

  10. High-throughput screening to enhance oncolytic virus immunotherapy

    Directory of Open Access Journals (Sweden)

    Allan KJ

    2016-04-01

    Full Text Available KJ Allan,1,2 David F Stojdl,1–3 SL Swift1 1Children’s Hospital of Eastern Ontario (CHEO) Research Institute, 2Department of Biology, Microbiology and Immunology, 3Department of Pediatrics, University of Ottawa, Ottawa, ON, Canada Abstract: High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs) are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets to improve therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. Keywords: oncolytic, virus, screen, high-throughput, cancer, chemical, genomic, immunotherapy

  11. Development of a High-Throughput Screen for Inhibitors of Epstein-Barr Virus EBNA1

    Science.gov (United States)

    Thompson, Scott; Messick, Troy; Schultz, David C.; Reichman, Melvin; Lieberman, Paul M.

    2012-01-01

    Latent infection with Epstein-Barr Virus (EBV) is a carcinogenic cofactor in several lymphoid and epithelial cell malignancies. At present, there are no small molecule inhibitors that specifically target EBV latent infection or latency-associated oncoproteins. EBNA1 is an EBV-encoded sequence-specific DNA-binding protein that is consistently expressed in EBV-associated tumors and required for stable maintenance of the viral genome in proliferating cells. EBNA1 is also thought to provide cell survival function in latently infected cells. In this work we describe the development of a biochemical high-throughput screening (HTS) method using a homogenous fluorescence polarization (FP) assay monitoring EBNA1 binding to its cognate DNA binding site. An FP-based counterscreen was developed using another EBV-encoded DNA binding protein, Zta, and its cognate DNA binding site. We demonstrate that EBNA1 binding to a fluorescent labeled DNA probe provides a robust assay with a Z-factor consistently greater than 0.6. A pilot screen of a small molecule library of ~14,000 compounds identified 3 structurally related molecules that selectively inhibit EBNA1, but not Zta. All three compounds had activity in a cell-based assay specific for the disruption of EBNA1 transcription repression function. One of the compounds was effective in reducing EBV genome copy number in Raji Burkitt lymphoma cells. These experiments provide a proof-of-concept that small molecule inhibitors of EBNA1 can be identified by biochemical high-throughput screening of compound libraries. Further screening in conjunction with medicinal chemistry optimization may provide a selective inhibitor of EBNA1 and EBV latent infection. PMID:20930215
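    The Z-factor quoted in the abstract is a standard HTS quality metric (Zhang, Chung & Oldenburg, 1999): it compares the separation of positive and negative control distributions, with Z' > 0.5 generally regarded as an excellent assay. A minimal sketch, with made-up fluorescence-polarization control readings:

    ```python
    import statistics

    # Z-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    def z_factor(pos, neg):
        mu_p, mu_n = statistics.mean(pos), statistics.mean(neg)
        sd_p, sd_n = statistics.stdev(pos), statistics.stdev(neg)
        return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

    # Hypothetical FP readings (mP): protein-bound probe vs. probe alone.
    bound = [180, 175, 182, 178, 181, 176]
    free  = [60, 64, 58, 62, 61, 59]
    print(f"Z' = {z_factor(bound, free):.2f}")
    ```

    A value above 0.6, as reported for the EBNA1 assay, indicates the control distributions are well separated relative to their noise.
    
    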

  12. Spins in chemistry

    CERN Document Server

    McWeeny, Roy

    2004-01-01

    Originally delivered as a series of lectures, this volume systematically traces the evolution of the "spin" concept from its role in quantum mechanics to its assimilation into the field of chemistry. Author Roy McWeeny presents an in-depth illustration of the deductive methods of quantum theory and their application to spins in chemistry, following the path from the earliest concepts to the sophisticated physical methods employed in the investigation of molecular structure and properties. Starting with the origin and development of the spin concept, the text advances to an examination of sp

  13. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels....... A cornerstone in current drug discovery is high throughput screening assays which allow examination of the activity of specific ion channels though only to a limited extent. Conventional patch clamp remains the sole technique with sufficiently high time resolution and sensitivity required for precise and direct...... characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of high throughput philosophy are emerging and predicted to make a number of ion...

  14. Quantum secure direct communication with high-dimension quantum superdense coding

    International Nuclear Information System (INIS)

    Wang Chuan; Li Yansong; Liu Xiaoshu; Deng Fuguo; Long Guilu

    2005-01-01

    A protocol for quantum secure direct communication with quantum superdense coding is proposed. It combines the ideas of block transmission, the ping-pong quantum secure direct communication protocol, and quantum superdense coding, and has the advantages of security and high source capacity.
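    The superdense-coding primitive underlying this protocol can be simulated directly with state vectors: one transmitted qubit carries two classical bits when sender and receiver pre-share a Bell pair, because the four locally encoded states are mutually orthogonal. A minimal numpy sketch (of the textbook primitive, not this paper's full block-transmission protocol):

    ```python
    import numpy as np

    I2 = np.eye(2)
    X = np.array([[0., 1.], [1., 0.]])
    Z = np.diag([1., -1.])
    bell = np.array([1., 0., 0., 1.]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

    def encode(b1, b2):
        """Alice applies Z^b1 X^b2 to her half of the shared Bell pair."""
        U = np.linalg.matrix_power(Z, b1) @ np.linalg.matrix_power(X, b2)
        return np.kron(U, I2) @ bell

    def decode(state):
        """Bob's Bell measurement: the candidate Bell state with unit overlap."""
        return max(((b1, b2) for b1 in (0, 1) for b2 in (0, 1)),
                   key=lambda bits: abs(encode(*bits) @ state))

    for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        assert decode(encode(*bits)) == bits
    print("all four 2-bit messages recovered from one transmitted qubit")
    ```

    The four encoded states are the four Bell states, which is why a single Bell measurement recovers both bits; "high-dimension" variants of the paper generalize this to qudits.
    
    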

  15. High throughput screening of starch structures using carbohydrate microarrays

    DEFF Research Database (Denmark)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated...

  16. A High Five for ChemistryOpen.

    Science.gov (United States)

    Peralta, David; Ortúzar, Natalia

    2016-02-01

    Fabulous at five! When ChemistryOpen was launched in 2011, it was the first society-owned general chemistry journal to publish open-access articles exclusively. Five years down the line, it has featured excellent work in all fields of chemistry, leading to an impressive first full impact factor of 3.25. In this Editorial, read about how ChemistryOpen has grown over the past five years and made its mark as a high-quality open-access journal with impact.

  17. Achieving high data throughput in research networks

    International Nuclear Information System (INIS)

    Matthews, W.; Cottrell, L.

    2001-01-01

    After less than a year of operation, the BaBar experiment at SLAC has collected almost 100 million particle collision events in a database approaching 165TB. Around 20 TB of data has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, and around 40TB of simulated data has been imported from the Lawrence Livermore National Laboratory (LLNL). BaBar collaborators plan to double data collection each year and export a third of the data to IN2P3. So within a few years the SLAC OC3 (155 Mbps) connection will be fully utilized by file transfer to France alone. Upgrades to infrastructure are essential, and detailed understanding of performance issues and of the requirements for reliable high-throughput transfers is critical. In this talk results from active and passive monitoring and direct measurements of throughput will be reviewed. Methods for achieving the ambitious requirements will be discussed
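    The "fully utilized within a few years" claim can be checked with back-of-envelope arithmetic: an OC3 carries roughly 155 Mbps, and the abstract's export volume doubles annually from about 20 TB. The sketch below assumes 100% link utilization and decimal units, so it is an upper bound on capacity, not a network model.

    ```python
    # Years until an annually doubling export volume saturates an OC3 link.
    LINK_MBPS = 155
    SECONDS_PER_YEAR = 365 * 24 * 3600
    # TB transferable per year at full utilization (decimal units assumed).
    capacity_tb = LINK_MBPS / 8 * SECONDS_PER_YEAR / 1e6

    export_tb, years = 20.0, 0
    while export_tb < capacity_tb:
        export_tb *= 2                 # data collection doubles each year
        years += 1
    print(f"link capacity ~{capacity_tb:.0f} TB/yr; saturated after {years} years")
    ```

    At roughly 600 TB/year of theoretical capacity, five doublings of a 20 TB export exhaust the link, consistent with the abstract's "within a few years".
    
    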

  18. Achieving High Data Throughput in Research Networks

    International Nuclear Information System (INIS)

    Matthews, W

    2004-01-01

    After less than a year of operation, the BaBar experiment at SLAC has collected almost 100 million particle collision events in a database approaching 165TB. Around 20 TB of data has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, and around 40TB of simulated data has been imported from the Lawrence Livermore National Laboratory (LLNL). BaBar collaborators plan to double data collection each year and export a third of the data to IN2P3. So within a few years the SLAC OC3 (155Mbps) connection will be fully utilized by file transfer to France alone. Upgrades to infrastructure are essential, and detailed understanding of performance issues and of the requirements for reliable high-throughput transfers is critical. In this talk results from active and passive monitoring and direct measurements of throughput will be reviewed. Methods for achieving the ambitious requirements will be discussed

  19. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequences analysis.

  20. Relativistic quantum chemistry on quantum computers

    Czech Academy of Sciences Publication Activity Database

    Veis, Libor; Višňák, Jakub; Fleig, T.; Knecht, S.; Saue, T.; Visscher, L.; Pittner, Jiří

    2012-01-01

    Roč. 85, č. 3 (2012), 030304 ISSN 1050-2947 R&D Projects: GA ČR GA203/08/0626 Institutional support: RVO:61388955 Keywords : simulation * algorithm * computation Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 3.042, year: 2012

  1. Quantum Chemistry on Quantum Computers: A Polynomial-Time Quantum Algorithm for Constructing the Wave Functions of Open-Shell Molecules.

    Science.gov (United States)

    Sugisaki, Kenji; Yamamoto, Satoru; Nakazawa, Shigeaki; Toyota, Kazuo; Sato, Kazunobu; Shiomi, Daisuke; Takui, Takeji

    2016-08-18

    Quantum computers can efficiently perform full configuration interaction (FCI) calculations of atoms and molecules by using the quantum phase estimation (QPE) algorithm. Because the success probability of QPE depends on the overlap between approximate and exact wave functions, efficient methods to prepare initial guess wave functions accurate enough to have sufficiently large overlap with the exact ones are highly desired. Here, we propose a quantum algorithm, based on the addition theorem of angular momentum, to construct a wave function consisting of one configuration state function, which is suitable as the initial guess wave function in QPE-based FCI calculations of open-shell molecules. The proposed quantum algorithm enables us to prepare a wave function consisting of an exponential number of Slater determinants using only a polynomial number of quantum operations.
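    The overlap dependence mentioned above is easy to illustrate classically: QPE projects onto the exact eigenstate with probability |⟨trial|exact⟩|², so the expected number of repetitions scales like 1/|⟨trial|exact⟩|². A toy numpy sketch (the 4×4 "Hamiltonian" is random, purely for illustration):

    ```python
    import numpy as np

    # QPE success probability ~ |<trial|exact ground state>|^2.
    rng = np.random.default_rng(0)
    H = rng.normal(size=(4, 4))
    H = (H + H.T) / 2                          # toy Hermitian "Hamiltonian"
    evals, evecs = np.linalg.eigh(H)
    exact = evecs[:, 0]                        # exact ground state (lowest eigenvalue)

    trial = np.array([1.0, 0.0, 0.0, 0.0])     # single-determinant-like guess
    p = abs(trial @ exact) ** 2                # success probability per QPE run
    runs = int(np.ceil(1 / p))                 # expected repetitions ~ 1/p
    print(f"overlap^2 = {p:.3f}; ~{runs} QPE repetitions expected")
    ```

    This is exactly why the paper's better initial guesses (configuration state functions rather than single determinants for open shells) matter: a larger overlap directly reduces the repetition count.
    
    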

  2. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.; Folkers, G.E.

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning

  3. Fluorescent-magnetic dual-encoded nanospheres: a promising tool for fast-simultaneous-addressable high-throughput analysis

    Science.gov (United States)

    Xie, Min; Hu, Jun; Wen, Cong-Ying; Zhang, Zhi-Ling; Xie, Hai-Yan; Pang, Dai-Wen

    2012-01-01

    Bead-based optical encoding or magnetic encoding techniques are promising in high-throughput multiplexed detection and separation of numerous species under complicated conditions. Therefore, a self-assembly strategy implemented in an organic solvent is put forward to fabricate fluorescent-magnetic dual-encoded nanospheres. Briefly, hydrophobic trioctylphosphine oxide-capped CdSe/ZnS quantum dots (QDs) and oleic acid-capped nano-γ-Fe2O3 magnetic particles are directly, selectively and controllably assembled on branched poly(ethylene imine)-coated nanospheres without any pretreatment, which is crucial to keep the high quantum yield of QDs and good dispersibility of γ-Fe2O3. Owing to the tunability of coating amounts of QDs and γ-Fe2O3 as well as controllable fluorescent emissions of deposited-QDs, dual-encoded nanospheres with different photoluminescent emissions and gradient magnetic susceptibility are constructed. Using this improved layer-by-layer self-assembly approach, deposition of hydrophobic nanoparticles onto hydrophilic carriers in organic media can be easily realized; meanwhile, fluorescent-magnetic dual-functional nanospheres can be further equipped with readable optical and magnetic addresses. The resultant fluorescent-magnetic dual-encoded nanospheres possess both the unique optical properties of QDs and the superparamagnetic properties of γ-Fe2O3, exhibiting good monodispersibility, huge encoding capacity and nanoscale particle size. Compared with the encoded microbeads reported by others, the nanometre scale of the dual-encoded nanospheres gives them minimum steric hindrance and higher flexibility.

  4. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of immobilized enzyme on nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product need not have optical properties different from those of the substrate. UV absorption detection allows almost universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening of local distribution variations of specific bio-molecules in a tissue or in screening of multiple immobilized catalysts. A second high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of an enzyme microarray is focused onto the CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the light-intensity change over time on a spot gives information on the reaction rate. The same microarray can be used many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different

  5. Spectroscopy and Chemistry of Cold Molecules

    Science.gov (United States)

    Momose, Takamasa

    2012-06-01

    Molecules at low temperatures are expected to behave quite differently from those at high temperatures because pronounced quantum effects emerge from thermal averages. Even at 10 K, a significant enhancement of reaction cross section is expected due to tunneling and resonance effects. Chemistry at this temperature is very important in order to understand chemical reactions in interstellar molecular clouds. At temperatures lower than 1 K, collisions and intermolecular interactions become qualitatively different from those at high temperatures because of the large thermal de Broglie wavelength of molecules. Collisions at these temperatures must be treated as the interference of molecular matter waves, not as hard-sphere collisions. A Bose-Einstein condensate is a significant state of matter resulting from coherent matter-wave interaction; in particular, dense para-H_2 molecules are predicted to form a condensate even around 1 K. A convenient method to investigate molecules around 1 K is to dope them into cold matrices. Among various matrices, quantum hosts such as solid para-H_2 and superfluid He nano-droplets have proven to be excellent hosts for high-resolution spectroscopy. Rovibrational motion of molecules in these quantum hosts is well quantized on account of the weak interactions and the softness of the quantum environment. The linewidths of infrared spectra of molecules in the quantum hosts are extremely narrow compared with those in other matrices. The sharp linewidths allow us to resolve fine spectral structures originating from subtle interactions between guest and host molecules. In this talk, I will describe how the splitting and lineshape of high-resolution spectra of molecules in quantum hosts give us new information on the static and dynamical interactions of molecules in a quantum medium. The topics include the dynamical response of the superfluid environment upon rotational excitation, and the possible superfluid phase of para-H_2 clusters. I will also
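    The "large thermal de Broglie wavelength" argument above is quantitative: Λ = h / √(2π m k_B T) grows as T falls, and near 1 K it exceeds the ~0.3 nm size of H₂, so collisions must be treated as matter-wave interference. A small sketch using CODATA constants (the 2 amu mass for H₂ is the only physical input):

    ```python
    import math

    # Thermal de Broglie wavelength: Lambda = h / sqrt(2*pi*m*k_B*T).
    h, kB, amu = 6.62607015e-34, 1.380649e-23, 1.66053907e-27

    def thermal_wavelength_nm(mass_amu, T):
        return h / math.sqrt(2 * math.pi * mass_amu * amu * kB * T) * 1e9

    for T in (300.0, 10.0, 1.0):
        print(f"H2 at {T:6.1f} K: {thermal_wavelength_nm(2.0, T):6.3f} nm")
    ```

    At 300 K the wavelength is well below molecular dimensions; at 1 K it is over a nanometre, several times the size of the molecule itself.
    
    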

  6. A rapid enzymatic assay for high-throughput screening of adenosine-producing strains

    Science.gov (United States)

    Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei

    2015-01-01

    Adenosine is a major local regulator of tissue function and is industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay for adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutritional components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
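    Quantification in a coupled colorimetric assay like this one typically runs through a linear standard curve: absorbance of the indophenol product is fit against known adenosine standards, and unknowns are read off the inverted fit. A sketch with made-up readings (the wavelength and values are illustrative, not from the paper):

    ```python
    # Linear standard curve: indophenol absorbance vs. adenosine standards,
    # fit by least squares and inverted to quantify an unknown sample.
    conc = [0.0, 0.1, 0.2, 0.4, 0.8]        # mM adenosine standards
    absb = [0.02, 0.13, 0.24, 0.46, 0.90]   # absorbance after ADA + indophenol

    n = len(conc)
    mx, my = sum(conc) / n, sum(absb) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, absb))
             / sum((x - mx) ** 2 for x in conc))
    intercept = my - slope * mx

    def adenosine_mM(a):
        """Invert the standard curve: absorbance -> concentration."""
        return (a - intercept) / slope

    print(f"unknown at A = 0.35 -> {adenosine_mM(0.35):.2f} mM")
    ```

    In the 96-well format described, the same fit would be applied to every plate alongside its own standards.
    
    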

  7. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high-throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite having only minimal information about data characteristics, and without resorting to human subjectivity. Such an automated process for univariate data is implemented here by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. A probability density estimate is then determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function, which identifies atypical fluctuations. This criterion resists under- and overfitting of the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic for visualizing the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities, including cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate that the method has general applicability for high-throughput statistical inference.
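    The scoring idea, judging a trial CDF by how closely F applied to the sorted sample reproduces uniform order statistics, can be illustrated with a simplified stand-in for the paper's universal scoring function. The actual method uses an empirically calibrated quasi-log-likelihood; here a plain standardized-deviation score is assumed for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = np.sort(rng.exponential(scale=2.0, size=500))
n = len(x)
k = np.arange(1, n + 1)

# Under a correct CDF F, u_(k) = F(x_(k)) behaves like uniform order
# statistics: u_(k) ~ Beta(k, n+1-k), with mean k/(n+1) and variance
# k(n+1-k) / ((n+1)^2 (n+2)).
mean = k / (n + 1)
var = k * (n + 1 - k) / ((n + 1) ** 2 * (n + 2))

def score(trial_cdf):
    """Mean squared standardized deviation of F(x_(k)) from the
    expected uniform order statistics; lower means a better fit."""
    u = trial_cdf(x)
    return np.mean((u - mean) ** 2 / var)

good = stats.expon(scale=2.0).cdf   # the true generating model
bad = stats.expon(scale=1.0).cdf    # wrong scale parameter

assert score(good) < score(bad)     # the correct CDF scores lower
```

The same score applied across a family of trial CDFs is what drives the iterative improvement described in the abstract.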

  8. Identification of Rift Valley fever virus nucleocapsid protein-RNA binding inhibitors using a high-throughput screening assay.

    Science.gov (United States)

    Ellenbecker, Mary; Lanchy, Jean-Marc; Lodmell, J Stephen

    2012-09-01

    Rift Valley fever virus (RVFV) is an emerging infectious pathogen that causes severe disease in humans and livestock and has the potential for global spread. Currently, there is no proven effective treatment for RVFV infection, and there is no licensed vaccine. Inhibition of RNA binding to the essential viral nucleocapsid (N) protein represents a potential antiviral therapeutic strategy because all of the functions performed by N during infection involve RNA binding. To target this interaction, we developed a fluorescence polarization-based high-throughput drug-screening assay and tested 26 424 chemical compounds for their ability to disrupt an N-RNA complex. From libraries of Food and Drug Administration-approved drugs, druglike molecules, and natural product extracts, we identified several lead compounds that are promising candidates for medicinal chemistry.
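    A fluorescence polarization readout of this kind reduces to a standard calculation: polarization is computed from the parallel and perpendicular emission intensities, and displacement of the labeled RNA from the N protein lowers the signal toward that of free probe. A sketch with hypothetical plate-reader intensities (not values from the study):

```python
def polarization_mP(i_parallel, i_perpendicular):
    """Fluorescence polarization in millipolarization (mP) units."""
    return 1000.0 * (i_parallel - i_perpendicular) / (i_parallel + i_perpendicular)

def percent_inhibition(sample_mP, bound_mP, free_mP):
    """0% = intact N-RNA complex (high mP); 100% = fully displaced probe."""
    return 100.0 * (bound_mP - sample_mP) / (bound_mP - free_mP)

# Hypothetical intensities (arbitrary units) for the three control states.
bound = polarization_mP(1.30, 1.00)  # labeled RNA bound to N protein
free = polarization_mP(1.08, 1.00)   # free labeled RNA
hit = polarization_mP(1.12, 1.00)    # well with a displacing compound

print(round(percent_inhibition(hit, bound, free), 1))
```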

  9. High-throughput exploration of thermoelectric and mechanical properties of amorphous NbO_2 with transition metal additions

    International Nuclear Information System (INIS)

    Music, Denis; Geyer, Richard W.; Hans, Marcus

    2016-01-01

    To increase the thermoelectric efficiency and reduce thermal fatigue upon cyclic heat loading, alloying of amorphous NbO_2 with all 3d and 5d transition metals has been systematically investigated using density functional theory. It was found that Ta fulfills the key design criteria, namely enhancement of the Seebeck coefficient and a positive Cauchy pressure (a ductility gauge). These quantum mechanical predictions were validated by assessing the thermoelectric and elastic properties of combinatorial thin films, which is a high-throughput approach. The maximum power factor is 2813 μW m^-1 K^-2 for a Ta/Nb ratio of 0.25, a hundredfold increase compared to pure NbO_2 that exceeds many oxide thermoelectrics. Based on the elasticity measurements, consistency between theory and experiment for the Cauchy pressure was attained within 2%. On the basis of the electronic structure analysis, these configurations can be perceived as metallic, which is consistent with the low electrical resistivity and ductile behavior. Furthermore, a pronounced quantum confinement effect occurs, which is identified as the physical origin of the Seebeck coefficient enhancement.
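    The reported power factor follows from the standard definition PF = S^2/ρ (Seebeck coefficient squared over electrical resistivity). The sketch below reproduces the order of magnitude with illustrative values of S and ρ; these inputs are assumptions for the exercise, not numbers from the paper:

```python
def power_factor(seebeck_uV_per_K, resistivity_ohm_m):
    """Thermoelectric power factor PF = S^2 / rho, in uW m^-1 K^-2."""
    s = seebeck_uV_per_K * 1e-6           # convert uV/K -> V/K
    return s ** 2 / resistivity_ohm_m * 1e6  # W -> uW

# Hypothetical inputs: S = 75 uV/K and rho = 2.0e-6 ohm*m give a power
# factor on the scale of the reported maximum (~2813 uW m^-1 K^-2).
print(round(power_factor(75.0, 2.0e-6), 1))
```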

  10. High-throughput full-automatic synchrotron-based tomographic microscopy

    International Nuclear Information System (INIS)

    Mader, Kevin; Marone, Federica; Hintermueller, Christoph; Mikuljan, Gordan; Isenegger, Andreas; Stampanoni, Marco

    2011-01-01

    At the TOMCAT (TOmographic Microscopy and Coherent rAdiology experimenTs) beamline of the Swiss Light Source, with an energy range of 8-45 keV and voxel sizes from 0.37 μm to 7.4 μm, full tomographic datasets are typically acquired in 5 to 10 min. To exploit the speed of the system and enable high-throughput studies to be performed in a fully automatic manner, a package of automation tools has been developed. The samples are automatically exchanged, aligned, moved to the correct region of interest, and scanned. This task is accomplished through the coordination of Python scripts, a robot-based sample-exchange system, sample positioning motors and a CCD camera. The tools are suited for any samples that can be mounted on a standard SEM stub, and require no specific environmental conditions. Up to 60 samples can be analyzed at a time without user intervention. The throughput of the system is dependent on resolution, energy and sample size, but rates of four samples per hour have been achieved with 0.74 μm voxel size at 17.5 keV. The maximum intervention-free scanning time is theoretically unlimited, and in practice experiments have been running unattended as long as 53 h (the average beam time allocation at TOMCAT is 48 h per user). The system is the first fully automated high-throughput tomography station: mounting samples, finding regions of interest, scanning and reconstructing can be performed without user intervention. The system also includes many features which accelerate and simplify the process of tomographic microscopy.
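    The quoted rate of roughly four samples per hour is consistent with the 5-10 min scan times once exchange and alignment overhead are included. A back-of-envelope sketch; the 10 min scan / 5 min overhead split is an assumption, not a figure from the paper:

```python
def samples_per_hour(scan_min, overhead_min):
    """Steady-state throughput given per-sample scan and exchange times."""
    return 60.0 / (scan_min + overhead_min)

def unattended_hours(n_samples, scan_min, overhead_min):
    """Time to work through a full sample holder without intervention."""
    return n_samples * (scan_min + overhead_min) / 60.0

rate = samples_per_hour(10, 5)          # ~4 samples/hour, as reported
holder = unattended_hours(60, 10, 5)    # hours to clear a 60-sample holder
print(rate, holder)
```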

  11. High-throughput characterization for solar fuels materials discovery

    Science.gov (United States)

    Mitrovic, Slobodan; Becerra, Natalie; Cornell, Earl; Guevarra, Dan; Haber, Joel; Jin, Jian; Jones, Ryan; Kan, Kevin; Marcin, Martin; Newhouse, Paul; Soedarmadji, Edwin; Suram, Santosh; Xiang, Chengxiang; Gregoire, John; High-Throughput Experimentation Team

    2014-03-01

    In this talk I will present the status of the High-Throughput Experimentation (HTE) project of the Joint Center for Artificial Photosynthesis (JCAP). JCAP is an Energy Innovation Hub of the U.S. Department of Energy with a mandate to deliver a solar fuel generator based on an integrated photoelectrochemical cell (PEC). However, efficient and commercially viable catalysts and light absorbers for the PEC do not yet exist. The mission of HTE is to provide accelerated discovery through combinatorial synthesis and rapid screening of material properties. The HTE pipeline also features high-throughput material characterization using x-ray diffraction and x-ray photoemission spectroscopy (XPS). In this talk I present the currently operating pipeline and focus on our combinatorial XPS efforts to build the largest free database of spectra from mixed-metal oxides, nitrides, sulfides and alloys. This work was performed at the Joint Center for Artificial Photosynthesis, a DOE Energy Innovation Hub, supported through the Office of Science of the U.S. Department of Energy under Award No. DE-SC0004993.

  12. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow the quantitation of fluorescent foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation, and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
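    The core of foci quantitation of this kind, thresholding, labelling connected components, and integrating the fluorescence within each focus, can be sketched in a few lines. This is a generic stand-in on synthetic data, not the FociQuant implementation:

```python
import numpy as np
from scipy import ndimage

# Synthetic 2D fluorescence image: two bright foci on a dim background.
img = np.zeros((64, 64))
img[10:13, 10:13] = 100.0   # focus 1 (3x3 pixels)
img[40:44, 45:49] = 80.0    # focus 2 (4x4 pixels)
img += 1.0                  # uniform background level

# Threshold above background, label connected components, and
# integrate the fluorescence within each labelled focus.
mask = img > 10.0
labels, n_foci = ndimage.label(mask)
intensities = ndimage.sum(img, labels, index=range(1, n_foci + 1))

print(n_foci)                              # number of foci detected
print(sorted(intensities, reverse=True))   # integrated intensity per focus
```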

  13. High-Throughput Screening by Nuclear Magnetic Resonance (HTS by NMR) for the Identification of PPIs Antagonists.

    Science.gov (United States)

    Wu, Bainan; Barile, Elisa; De, Surya K; Wei, Jun; Purves, Angela; Pellecchia, Maurizio

    2015-01-01

    In recent years, the ever more complex field of drug discovery has embraced novel design strategies based on biophysical fragment screening (fragment-based drug design, FBDD) using nuclear magnetic resonance (NMR) spectroscopy and/or structure-guided approaches, most often using X-ray crystallography and computer modeling. Experience from recent years has shown that these methods are more effective and less prone to artifacts than biochemical high-throughput screening (HTS) of large collections of compounds when designing protein inhibitors, and these strategies are increasingly becoming the most utilized in the modern pharmaceutical industry. Nonetheless, there is still a pressing need to develop innovative and effective strategies to tackle other, more challenging targets, such as those involving protein-protein interactions (PPIs). While HTS strategies notoriously fail to identify viable hits against such targets, only a few successful examples of PPI antagonists derived from FBDD strategies exist. Recently, we reported on a new strategy that combines some of the basic principles of fragment-based screening with combinatorial chemistry and NMR-based screening. The approach, termed HTS by NMR, combines the advantages of combinatorial chemistry and NMR-based screening to rapidly and unambiguously identify bona fide inhibitors of PPIs. This review reiterates the critical aspects of the approach, with examples of possible applications.

  14. The surface chemistry determines the spatio-temporal interaction dynamics of quantum dots in atherosclerotic lesions.

    Science.gov (United States)

    Uhl, Bernd; Hirn, Stephanie; Mildner, Karina; Coletti, Raffaele; Massberg, Steffen; Reichel, Christoph A; Rehberg, Markus; Zeuschner, Dagmar; Krombach, Fritz

    2018-03-01

    To optimize the design of nanoparticles for diagnosis or therapy of vascular diseases, it is mandatory to characterize the determinants of nano-bio interactions in vascular lesions. Using ex vivo and in vivo microscopy, we analyzed the interactive behavior of quantum dots with different surface functionalizations in atherosclerotic lesions of ApoE-deficient mice. We demonstrate that quantum dots with different surface functionalizations exhibit specific interactive behaviors with distinct molecular and cellular components of the injured vessel wall. Moreover, we show a role for fibrinogen in the regulation of the spatio-temporal interaction dynamics in atherosclerotic lesions. Our findings emphasize the relevance of surface chemistry-driven nano-bio interactions on the differential in vivo behavior of nanoparticles in diseased tissue.

  15. High-throughput electrical characterization for robust overlay lithography control

    Science.gov (United States)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high-throughput, and robust overlay measurement is a challenge in the current 14 nm node and upcoming advanced nodes, with the transition to 300 mm and upcoming 450 mm semiconductor manufacturing, where slight deviations in overlay have a significant impact on reliability and yield [1]. The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications [2]. Here, we discuss the limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements due to sensitivity, throughput, and low contrast [3]. We demonstrate a new electrical measurement based technique in which resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by fitting a parabolic model to the resistance, whose minimum and inflection points are extracted to characterize overlay control and the process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and the overlay obtained from the fit. Additionally, excellent correlation of overlay from electrical measurements with existing image- and diffraction-based techniques is found. We also discuss the challenges of integrating an electrical measurement based approach into semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for assessing overlay as well as process window and margins simultaneously with a robust, high-throughput electrical measurement approach.
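    The parabolic-fitting step reduces to an ordinary least-squares quadratic fit: the vertex of the fitted parabola marks the programmed offset at which the two layers are best aligned, i.e. the overlay error. A sketch on synthetic resistance data (the coefficients and offsets are hypothetical):

```python
import numpy as np

# Hypothetical resistance (ohm) measured on macros with intentional
# misalignment (nm) programmed between the two layers.
offset_nm = np.array([-30, -20, -10, 0, 10, 20, 30], dtype=float)
true_overlay = 4.0  # nm, built into the synthetic data
resistance = 50.0 + 0.02 * (offset_nm - true_overlay) ** 2

# Fit R = a*x^2 + b*x + c; the vertex -b/(2a) is the misalignment at
# which resistance is minimal, i.e. the measured overlay error.
a, b, c = np.polyfit(offset_nm, resistance, 2)
overlay = -b / (2.0 * a)
print(round(overlay, 2))
```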

  16. High-throughput Sequencing Based Immune Repertoire Study during Infectious Disease

    Directory of Open Access Journals (Sweden)

    Dongni Hou

    2016-08-01

    Full Text Available The selectivity of the adaptive immune response is based on the enormous diversity of T and B cell antigen-specific receptors. The immune repertoire, the collection of T and B cells with functional diversity in the circulatory system at any given time, is dynamic and reflects the essence of immune selectivity. In this article, we review recent advances in the study of the immune repertoire in infectious diseases, achieved with both traditional techniques and high-throughput sequencing techniques. High-throughput sequencing enables the determination of the complementarity-determining regions of lymphocyte receptors with unprecedented efficiency and scale. This progress in methodology enhances the understanding of immunologic changes during pathogen challenge, and also provides a basis for the further development of novel diagnostic markers, immunotherapies and vaccines.

  17. Heterogenous phase as a mean in combinatorial chemistry

    International Nuclear Information System (INIS)

    Abdel-Hamid, S.G.

    2007-01-01

    Combinatorial chemistry is a rapid and inexpensive technique for the synthesis of hundreds of thousands of organic compounds of potential medicinal activity. In the past few decades a large number of combinatorial libraries have been constructed, and significantly supplement the chemical diversity of the traditional collections of the potentially active medicinal compounds. Solid phase synthesis was used to enrich the combinatorial chemistry libraries, through the use of solid supports (resins) and their modified forms. Most of the new libraries of compounds appeared recently, were synthesized by the use of solid-phase. Solid-phase combinatorial chemistry (SPCC) is now considered as an outstanding branch in pharmaceutical chemistry research and used extensively as a tool for drug discovery within the context of high-throughput chemical synthesis. The best pure libraries synthesized by the use of solid phase combinatorial chemistry (SPCC) may well be those of intermediate complexity that are free of artifact-causing nuisance compounds. (author)

  18. High energy chemistry. Modern state and trends in development

    International Nuclear Information System (INIS)

    Pikaev, A.K.

    1990-01-01

    This review considers the current state of studies in the field of high energy chemistry. The most important achievements, and the problems facing the further development of radiation chemistry, plasma chemistry, photochemistry, laser chemistry and some other branches of high energy chemistry, are discussed.

  19. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques: most studies still use manual counting or classification, and are hence limited to counting a low number of foci per nucleus (about 5) because the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged in 3D, with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended maxima transform based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to the 2D techniques.
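    Foci counting in 3D can be approximated by a local-maximum search above a height threshold. The snippet below is a simplified stand-in for the paper's 3D extended maxima transform, shown on a synthetic image stack:

```python
import numpy as np
from scipy import ndimage

# Synthetic 3D image stack (z, y, x) with three well-separated foci,
# blurred to mimic the microscope's point spread function.
vol = np.zeros((16, 64, 64))
for z, y, x in [(4, 10, 10), (8, 30, 40), (12, 50, 20)]:
    vol[z, y, x] = 1.0
vol = ndimage.gaussian_filter(vol, sigma=1.5)

# Extended-maxima-style detection: keep voxels that are the maximum of
# their 3D neighbourhood AND rise at least h above the background.
h = 0.5 * vol.max()
local_max = vol == ndimage.maximum_filter(vol, size=5)
peaks = local_max & (vol > h)
labels, n_foci = ndimage.label(peaks)
print(n_foci)
```

The true extended maxima transform is a morphological h-maxima operation; the neighbourhood-maximum filter above captures the same counting behaviour for isolated foci.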

  20. A gist of comprehensive review of hadronic chemistry and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Tangde, Vijay M. [Post Graduate Teaching Department of Chemistry, Rashtrasant Tukadoji Maharaj Nagpur University, Amravati Road Campus, NAGPUR - 440 033, India, Email: vijaytn6@gmail.com (India)

    2015-03-10

    20th century theories of Quantum Mechanics and Quantum Chemistry are exactly valid only when considered to represent atomic structures. When considering the more general aspects of atomic combinations, these theories fail to explain all the related experimental data from first unadulterated axiomatic principles. According to Quantum Chemistry, two valence electrons should repel each other, and as such there is no mathematical representation of the strong attractive forces between such valence electrons. In view of these and other insufficiencies of Quantum Chemistry, the Italian-American scientist Professor Ruggero Maria Santilli, during his more than five decades of dedicated and sustained research, has denounced the fact that quantum chemistry is mostly based on mere nomenclatures. Professor R. M. Santilli first formulated the iso-, geno- and hyper-mathematics [1, 2, 3, 4] that helped in understanding numerous diversified problems and in removing inadequacies in most of the established and celebrated theories of 20th century physics and chemistry. This involves the isotopic, genotopic, etc. lifting of Lie algebra, which generated the Lie-admissible mathematics needed to properly describe irreversible processes. The studies on Hadronic Mechanics in general, and chemistry in particular, based on Santilli’s mathematics [3, 4, 5] have for the first time removed the very fundamental limitations of quantum chemistry [2, 6, 7, 8]. In the present discussion, a comprehensive review of Hadronic Chemistry is presented that imparts completeness to Quantum Chemistry via the addition of effects at distances of the order of 1 fm (only), which are assumed to be non-linear, non-local, non-potential, non-hamiltonian and thus non-unitary, along with the stepwise successes of Hadronic Chemistry and its application in the development of a new chemical species called Magnecules.

  1. A gist of comprehensive review of hadronic chemistry and its applications

    International Nuclear Information System (INIS)

    Tangde, Vijay M.

    2015-01-01

    20th century theories of Quantum Mechanics and Quantum Chemistry are exactly valid only when considered to represent atomic structures. When considering the more general aspects of atomic combinations, these theories fail to explain all the related experimental data from first unadulterated axiomatic principles. According to Quantum Chemistry, two valence electrons should repel each other, and as such there is no mathematical representation of the strong attractive forces between such valence electrons. In view of these and other insufficiencies of Quantum Chemistry, the Italian-American scientist Professor Ruggero Maria Santilli, during his more than five decades of dedicated and sustained research, has denounced the fact that quantum chemistry is mostly based on mere nomenclatures. Professor R. M. Santilli first formulated the iso-, geno- and hyper-mathematics [1, 2, 3, 4] that helped in understanding numerous diversified problems and in removing inadequacies in most of the established and celebrated theories of 20th century physics and chemistry. This involves the isotopic, genotopic, etc. lifting of Lie algebra, which generated the Lie-admissible mathematics needed to properly describe irreversible processes. The studies on Hadronic Mechanics in general, and chemistry in particular, based on Santilli’s mathematics [3, 4, 5] have for the first time removed the very fundamental limitations of quantum chemistry [2, 6, 7, 8]. In the present discussion, a comprehensive review of Hadronic Chemistry is presented that imparts completeness to Quantum Chemistry via the addition of effects at distances of the order of 1 fm (only), which are assumed to be non-linear, non-local, non-potential, non-hamiltonian and thus non-unitary, along with the stepwise successes of Hadronic Chemistry and its application in the development of a new chemical species called Magnecules.

  2. A high throughput array microscope for the mechanical characterization of biomaterials

    Science.gov (United States)

    Cribb, Jeremy; Osborne, Lukas D.; Hsiao, Joe Ping-Lin; Vicci, Leandra; Meshram, Alok; O'Brien, E. Tim; Spero, Richard Chasen; Taylor, Russell; Superfine, Richard

    2015-02-01

    In the last decade, the emergence of high-throughput screening has enabled the development of novel drug therapies and elucidated many complex cellular processes. Concurrently, the mechanobiology community has developed tools and methods showing that the dysregulation of biophysical properties, and of the biochemical mechanisms controlling those properties, contributes significantly to many human diseases. Despite these advances, a complete understanding of the connection between biomechanics and disease will require advances in instrumentation that enable parallelized, high-throughput assays capable of probing complex signaling pathways, studying biology in physiologically relevant conditions, and capturing specimen and mechanical heterogeneity. Traditional biophysical instruments are unable to meet this need. To address the challenge of large-scale, parallelized biophysical measurements, we have developed an automated array high-throughput microscope system that utilizes passive microbead diffusion to characterize the mechanical properties of biomaterials. The instrument is capable of acquiring data on twelve channels simultaneously, where each channel in the system can independently drive two-channel fluorescence imaging at up to 50 frames per second. We employ this system to measure the concentration-dependent apparent viscosity of hyaluronan, an essential polymer found in connective tissue whose expression has been implicated in cancer progression.
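    Passive microbead diffusion yields an apparent viscosity through the Stokes-Einstein relation: the diffusion coefficient D is read off the bead's mean squared displacement (MSD = 4Dt for free diffusion in the 2D imaging plane), and eta = k_B*T / (3*pi*D*d) for a bead of diameter d. A sketch with hypothetical bead-tracking numbers:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def apparent_viscosity(msd_um2, lag_s, bead_diameter_m, temp_k=298.0):
    """Stokes-Einstein viscosity (Pa*s) from a passive 2D bead trace.

    MSD(t) = 4*D*t for free in-plane diffusion, so D = MSD/(4t);
    then eta = k_B*T / (3*pi*D*d) for a sphere of diameter d.
    """
    msd_m2 = msd_um2 * 1e-12                 # um^2 -> m^2
    d_coeff = msd_m2 / (4.0 * lag_s)         # m^2/s
    return K_B * temp_k / (3.0 * math.pi * d_coeff * bead_diameter_m)

# Hypothetical: a 1 um bead covering ~1.96 um^2 per second, which
# should come out near the viscosity of water (~0.89 mPa*s at 25 C).
eta = apparent_viscosity(msd_um2=1.96, lag_s=1.0, bead_diameter_m=1e-6)
print(round(eta * 1000, 2))  # mPa*s
```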

  3. High-throughput bioinformatics with the Cyrille2 pipeline system.

    NARCIS (Netherlands)

    Fiers, M.W.E.J.; Burgt, van der A.; Datema, E.; Groot, de J.C.W.; Ham, van R.C.H.J.

    2008-01-01

    Background - Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses

  4. Technetium Chemistry in High-Level Waste

    International Nuclear Information System (INIS)

    Hess, Nancy J.

    2006-01-01

    Tc contamination is found within the DOE complex at those sites whose mission involved the extraction of plutonium from irradiated uranium fuel or the isotopic enrichment of uranium. At the Hanford Site, chemical separations and extraction processes generated large amounts of high-level and transuranic wastes that are currently stored in underground High Level Waste (HLW) tanks. However, the chemistry of the HLW in any given tank is greatly complicated by repeated efforts to reduce volume and recover isotopes, which ultimately resulted in the mixing of waste streams from different processes. As a result, the chemistry and the fate of Tc in HLW tanks are not well understood. This lack of understanding has been made evident in the failed efforts to leach Tc from sludge and to remove Tc from supernatants prior to immobilization. Although recent interest in Tc chemistry has shifted from pretreatment chemistry to waste residuals, both needs are served by a fundamental understanding of Tc chemistry.

  5. High-throughput preparation and testing of ion-exchanged zeolites

    International Nuclear Information System (INIS)

    Janssen, K.P.F.; Paul, J.S.; Sels, B.F.; Jacobs, P.A.

    2007-01-01

    A high-throughput research platform was developed for the preparation and subsequent catalytic liquid-phase screening of ion-exchanged zeolites, for instance with regard to their use as heterogeneous catalysts. In this system, aqueous solutions and other liquid as well as solid reagents are employed as starting materials, and 24 samples are prepared on a library plate with a 4 x 6 layout. Volumetric dispensing of metal precursor solutions, weighing of zeolite, the subsequent mixing/washing cycles of the starting materials, and distribution of the reaction mixtures to the library plate are automatically performed by liquid and solid handlers controlled by a single, common, easy-to-use programming software interface. The prepared materials are automatically contacted with reagent solutions, heated, stirred and sampled continuously using a modified liquid handler. The high-throughput platform is highly promising for accelerating the synthesis of catalysts and their screening. In this paper the preparation of lanthanum-exchanged NaY zeolites (LaNaY) on the platform is reported, along with their use as catalysts for the conversion of renewables.

  6. A Self-Reporting Photocatalyst for Online Fluorescence Monitoring of High Throughput RAFT Polymerization.

    Science.gov (United States)

    Yeow, Jonathan; Joshi, Sanket; Chapman, Robert; Boyer, Cyrille Andre Jean Marie

    2018-04-25

    Translating controlled/living radical polymerization (CLRP) from batch to the high throughput production of polymer libraries presents several challenges in terms of both polymer synthesis and characterization. Although recently there have been significant advances in the field of low volume, high throughput CLRP, techniques able to simultaneously monitor multiple polymerizations in an "online" manner have not yet been developed. Here, we report our discovery that 5,10,15,20-tetraphenyl-21H,23H-porphine zinc (ZnTPP) is a self-reporting photocatalyst that can mediate PET-RAFT polymerization as well as report on monomer conversion via changes in its fluorescence properties. This enables the use of a microplate reader to conduct high throughput "online" monitoring of PET-RAFT polymerizations performed directly in 384-well, low volume microtiter plates. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Automated chemical kinetic modeling via hybrid reactive molecular dynamics and quantum chemistry simulations.

    Science.gov (United States)

    Döntgen, Malte; Schmalz, Felix; Kopp, Wassja A; Kröger, Leif C; Leonhard, Kai

    2018-06-13

    An automated scheme for obtaining chemical kinetic models from scratch using reactive molecular dynamics and quantum chemistry simulations is presented. This methodology combines the phase space sampling of reactive molecular dynamics with the thermochemistry and kinetics prediction capabilities of quantum mechanics. This scheme provides the NASA polynomial and modified Arrhenius equation parameters for all species and reactions that are observed during the simulation and supplies them in the ChemKin format. The ab initio level of theory for predictions is easily exchangeable and the presently used G3MP2 level of theory is found to reliably reproduce hydrogen and methane oxidation thermochemistry and kinetics data. Chemical kinetic models obtained with this approach are ready-to-use for, e.g., ignition delay time simulations, as shown for hydrogen combustion. The presented extension of the ChemTraYzer approach can be used as a basis for methodologically advancing chemical kinetic modeling schemes and as a black-box approach to generate chemical kinetic models.
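    The modified Arrhenius parameters the scheme supplies in ChemKin format encode each rate constant as k(T) = A * T^n * exp(-Ea / (R*T)). A minimal sketch of evaluating that form; the parameter values below are illustrative assumptions, not numbers from the paper:

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def rate_constant(a, n, ea_j_mol, temp_k):
    """Modified Arrhenius form k(T) = A * T^n * exp(-Ea / (R*T)),
    the per-reaction parameterization used in ChemKin-format models."""
    return a * temp_k ** n * math.exp(-ea_j_mol / (R * temp_k))

# Hypothetical parameters for a single elementary reaction.
k_1000 = rate_constant(a=1.0e8, n=1.5, ea_j_mol=40_000.0, temp_k=1000.0)
k_1500 = rate_constant(a=1.0e8, n=1.5, ea_j_mol=40_000.0, temp_k=1500.0)
assert k_1500 > k_1000  # rate rises with temperature for positive Ea
```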

  8. High-throughput platform assay technology for the discovery of pre-microrna-selective small molecule probes.

    Science.gov (United States)

    Lorenz, Daniel A; Song, James M; Garner, Amanda L

    2015-01-21

    MicroRNAs (miRNA) play critical roles in human development and disease. As such, the targeting of miRNAs is considered attractive as a novel therapeutic strategy. A major bottleneck toward this goal, however, has been the identification of small molecule probes that are specific for select RNAs and methods that will facilitate such discovery efforts. Using pre-microRNAs as proof-of-concept, herein we report a conceptually new and innovative approach for assaying RNA-small molecule interactions. Through this platform assay technology, which we term catalytic enzyme-linked click chemistry assay or cat-ELCCA, we have designed a method that can be implemented in high throughput, is virtually free of false readouts, and is general for all nucleic acids. Through cat-ELCCA, we envision the discovery of selective small molecule ligands for disease-relevant miRNAs to promote the field of RNA-targeted drug discovery and further our understanding of the role of miRNAs in cellular biology.

  9. High-throughput fragment screening by affinity LC-MS.

    Science.gov (United States)

    Duong-Thi, Minh-Dao; Bergström, Maria; Fex, Tomas; Isaksson, Roland; Ohlson, Sten

    2013-02-01

    Fragment screening, an emerging approach to hit finding in drug discovery, has recently been proven effective by its first approved drug, vemurafenib, for cancer treatment. Techniques such as nuclear magnetic resonance, surface plasmon resonance, and isothermal titration calorimetry, each with their own pros and cons, have been employed for screening fragment libraries. As an alternative approach, screening based on high-performance liquid chromatography separation has been developed. In this work, we present weak affinity LC/MS as a method to screen fragments under high-throughput conditions. Affinity-based capillary columns with immobilized thrombin were used to screen a collection of 590 compounds from a fragment library. The collection was divided into 11 mixtures (each containing 35 to 65 fragments) and screened by MS detection. The primary screening was performed at a rate of over 3500 fragments per day. Thirty hits were identified, which subsequently entered a secondary screening using an active-site-blocked thrombin column for confirmation of specificity. One hit showed selective binding to thrombin with an estimated dissociation constant (K_D) in the 0.1 mM range. This study shows that affinity LC/MS is characterized by high throughput, ease of operation, and low consumption of target and fragments, and therefore promises to be a valuable method for fragment screening.

  10. Multireference quantum chemistry through a joint density matrix renormalization group and canonical transformation theory.

    Science.gov (United States)

    Yanai, Takeshi; Kurashige, Yuki; Neuscamman, Eric; Chan, Garnet Kin-Lic

    2010-01-14

    We describe the joint application of the density matrix renormalization group and canonical transformation theory to multireference quantum chemistry. The density matrix renormalization group provides the ability to describe static correlation in large active spaces, while the canonical transformation theory provides a high-order description of the dynamic correlation effects. We demonstrate the joint theory in two benchmark systems designed to test the dynamic and static correlation capabilities of the methods, namely, (i) total correlation energies in long polyenes and (ii) the isomerization curve of the [Cu(2)O(2)](2+) core. The largest complete active spaces and atomic orbital basis sets treated by the joint DMRG-CT theory in these systems correspond to a (24e,24o) active space and 268 atomic orbitals in the polyenes and a (28e,32o) active space and 278 atomic orbitals in [Cu(2)O(2)](2+).

  11. High-speed quantum networking by ship

    Science.gov (United States)

    Devitt, Simon J.; Greentree, Andrew D.; Stephens, Ashley M.; van Meter, Rodney

    2016-11-01

    Networked entanglement is an essential component for a plethora of quantum computation and communication protocols. Direct transmission of quantum signals over long distances is prevented by fibre attenuation and the no-cloning theorem, motivating the development of quantum repeaters, designed to purify entanglement, extending its range. Quantum repeaters have been demonstrated over short distances, but error-corrected, global repeater networks with high bandwidth require new technology. Here we show that error corrected quantum memories installed in cargo containers and carried by ship can provide a flexible connection between local networks, enabling low-latency, high-fidelity quantum communication across global distances at higher bandwidths than previously proposed. With demonstrations of technology with sufficient fidelity to enable topological error-correction, implementation of the quantum memories is within reach, and bandwidth increases with improvements in fabrication. Our approach to quantum networking avoids technological restrictions of repeater deployment, providing an alternate path to a worldwide Quantum Internet.

  12. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  13. Multiplex enrichment quantitative PCR (ME-qPCR): a high-throughput, highly sensitive detection method for GMO identification.

    Science.gov (United States)

    Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang

    2017-04-01

    Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible compared with next-generation sequencing or ChIP-based methods. However, PCR-based methods can only achieve multiplex detection up to 15-plex because of multiplex primer interactions, so their throughput cannot meet the demands of high-throughput applications such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which combines qPCR with nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, showing that the sensitivity of ME-qPCR is higher than that of the original qPCR. The absolute limit of detection for ME-qPCR was as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The amplification results, more stable than those of qPCR, showed that ME-qPCR is suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crop or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression analysis of food or agricultural samples. Graphical abstract: For the first-step amplification, four primers (A, B, C, and D) are added to the reaction volume, generating four kinds of amplicons. All four amplicons can serve as targets of the second-step PCR. 
For the second-step amplification, three parallels have been taken for
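
    The abstract's claim that lower Ct values indicate higher sensitivity follows from the exponential nature of PCR. A minimal sketch of that relationship, assuming ideal (100%) amplification efficiency; the numbers are illustrative and not taken from the ME-qPCR study:

```python
# Sketch: relating a qPCR Ct shift to a fold change in starting template.
# Assumes ideal amplification efficiency (doubling each cycle); real
# assays calibrate efficiency from a standard curve.

def fold_change_from_delta_ct(delta_ct: float, efficiency: float = 1.0) -> float:
    """Fold difference in starting template implied by a Ct shift.

    A reaction crossing threshold delta_ct cycles *earlier* started with
    (1 + efficiency) ** delta_ct times more effective template.
    """
    return (1.0 + efficiency) ** delta_ct

# One cycle earlier implies ~2x more template; ~3.32 cycles implies ~10x.
print(fold_change_from_delta_ct(1.0))   # 2.0
print(fold_change_from_delta_ct(3.32))
```

This is why a systematically lower Ct for the same sample, as reported for ME-qPCR versus plain qPCR, corresponds to more effective template amplification and hence higher sensitivity.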

  14. A gas trapping method for high-throughput metabolic experiments.

    Science.gov (United States)

    Krycer, James R; Diskin, Ciana; Nelson, Marin E; Zeng, Xiao-Yi; Fazakerley, Daniel J; James, David E

    2018-01-01

    Research into cellular metabolism has become more high-throughput, with typical cell-culture experiments being performed in multiwell plates (microplates). This format presents a challenge when trying to collect gaseous products, such as carbon dioxide (CO2), which requires a sealed environment and a vessel separate from the biological sample. To address this limitation, we developed a gas trapping protocol using perforated plastic lids in sealed cell-culture multiwell plates. We used this trap design to measure CO2 production from glucose and fatty acid metabolism, as well as hydrogen sulfide production from cysteine-treated cells. Our data clearly show that this gas trap can be applied to liquid and solid gas-collection media and can be used to study gaseous product generation by both adherent cells and cells in suspension. Since our gas traps can be adapted to multiwell plates of various sizes, they present a convenient, cost-effective solution that can accommodate the trend toward high-throughput measurements in metabolic research.

  15. High-throughput technology for novel SO2 oxidation catalysts

    International Nuclear Information System (INIS)

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method of monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also overview some elements used for prescreening and those remaining after the screening of the first catalyst generations. (topical review)

  16. High throughput nanoimprint lithography for semiconductor memory applications

    Science.gov (United States)

    Ye, Zhengmao; Zhang, Wei; Khusnatdinov, Niyaz; Stachowiak, Tim; Irving, J. W.; Longsine, Whitney; Traub, Matthew; Fletcher, Brian; Liu, Weijun

    2017-03-01

    Imprint lithography is a promising technology for replication of nano-scale features. For semiconductor device applications, Canon deposits a low viscosity resist on a field by field basis using jetting technology. A patterned mask is lowered into the resist fluid, which then quickly flows into the relief patterns in the mask by capillary action. Following this filling step, the resist is crosslinked under UV radiation, and then the mask is removed, leaving a patterned resist on the substrate. There are two critical components to meeting throughput requirements for imprint lithography. Using a similar approach to what is already done for many deposition and etch processes, imprint stations can be clustered to enhance throughput. The FPA-1200NZ2C is a four station cluster system designed for high volume manufacturing. For a single station, throughput includes overhead, resist dispense, resist fill time (or spread time), exposure and separation. Resist exposure and mask/wafer separation are well understood processing steps with typical durations on the order of 0.10 to 0.20 seconds. To achieve a total process throughput of 17 wafers per hour (wph) for a single station, it is necessary to complete the fluid fill step in 1.2 seconds. For a throughput of 20 wph, fill time must be reduced to 1.1 seconds. Several parameters can impact resist filling. Key parameters include resist drop volume (smaller is better), system controls (which address drop spreading after jetting), Design for Imprint or DFI (to accelerate drop spreading) and material engineering (to promote wetting between the resist and underlying adhesion layer). In addition, it is mandatory to maintain fast filling, even for edge field imprinting. In this paper, we address the improvements made in all of these parameters to enable a 1.20 second filling process for a device-like pattern and demonstrate this capability for both full fields and edge fields. Non
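
    The throughput budget described above can be sketched as a simple per-field time sum. This is an illustration only: apart from the paper's 0.10-0.20 s range for exposure and separation, the step durations, field count, and overhead below are hypothetical placeholders, not the tool's actual figures:

```python
# Sketch of a single-station imprint throughput budget. All parameter
# defaults are illustrative assumptions; only the qualitative point that
# fill time dominates per-field time is taken from the text.

def wafers_per_hour(fill_s, fields_per_wafer=84, dispense_s=0.1,
                    exposure_s=0.15, separation_s=0.15, overhead_s=30.0):
    """Single-station throughput in wafers/hour."""
    per_field = dispense_s + fill_s + exposure_s + separation_s
    wafer_time = fields_per_wafer * per_field + overhead_s  # seconds
    return 3600.0 / wafer_time

# Because fill dominates the per-field time, shaving 0.1 s off the fill
# step yields a disproportionate throughput gain.
print(wafers_per_hour(1.2), wafers_per_hour(1.1))
```

Under these assumed numbers the model shows why the paper targets tenth-of-a-second fill-time reductions: with ~80+ fields per wafer, each 0.1 s saved per field removes several seconds per wafer.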

  17. Molecular quantum dynamics. From theory to applications

    International Nuclear Information System (INIS)

    Gatti, Fabien

    2014-01-01

    An educational and accessible introduction to the field of molecular quantum dynamics. Illustrates the importance of the topic for broad areas of science: from astrophysics and the physics of the atmosphere, over elementary processes in chemistry, to biological processes. Presents chosen examples of striking applications, highlighting success stories, summarized by the internationally renowned experts. Including a foreword by Lorenz Cederbaum (University Heidelberg, Germany). This book focuses on current applications of molecular quantum dynamics. Examples from all main subjects in the field, presented by the internationally renowned experts, illustrate the importance of the domain. Recent success in helping to understand experimental observations in fields like heterogeneous catalysis, photochemistry, reactive scattering, optical spectroscopy, or femto- and attosecond chemistry and spectroscopy underline that nuclear quantum mechanical effects affect many areas of chemical and physical research. In contrast to standard quantum chemistry calculations, where the nuclei are treated classically, molecular quantum dynamics can cover quantum mechanical effects in their motion. Many examples, ranging from fundamental to applied problems, are known today that are impacted by nuclear quantum mechanical effects, including phenomena like tunneling, zero point energy effects, or non-adiabatic transitions. Being important to correctly understand many observations in chemical, organic and biological systems, or for the understanding of molecular spectroscopy, the range of applications covered in this book comprises broad areas of science: from astrophysics and the physics and chemistry of the atmosphere, over elementary processes in chemistry, to biological processes (such as the first steps of photosynthesis or vision). Nevertheless, many researchers refrain from entering this domain. The book ''Molecular Quantum Dynamics'' offers them an accessible introduction. 
Although the

  18. Molecular quantum dynamics. From theory to applications

    Energy Technology Data Exchange (ETDEWEB)

    Gatti, Fabien (ed.) [Montpellier 2 Univ. (France). Inst. Charles Gerhardt - CNRS 5253

    2014-09-01

    An educational and accessible introduction to the field of molecular quantum dynamics. Illustrates the importance of the topic for broad areas of science: from astrophysics and the physics of the atmosphere, over elementary processes in chemistry, to biological processes. Presents chosen examples of striking applications, highlighting success stories, summarized by the internationally renowned experts. Including a foreword by Lorenz Cederbaum (University Heidelberg, Germany). This book focuses on current applications of molecular quantum dynamics. Examples from all main subjects in the field, presented by the internationally renowned experts, illustrate the importance of the domain. Recent success in helping to understand experimental observations in fields like heterogeneous catalysis, photochemistry, reactive scattering, optical spectroscopy, or femto- and attosecond chemistry and spectroscopy underline that nuclear quantum mechanical effects affect many areas of chemical and physical research. In contrast to standard quantum chemistry calculations, where the nuclei are treated classically, molecular quantum dynamics can cover quantum mechanical effects in their motion. Many examples, ranging from fundamental to applied problems, are known today that are impacted by nuclear quantum mechanical effects, including phenomena like tunneling, zero point energy effects, or non-adiabatic transitions. Being important to correctly understand many observations in chemical, organic and biological systems, or for the understanding of molecular spectroscopy, the range of applications covered in this book comprises broad areas of science: from astrophysics and the physics and chemistry of the atmosphere, over elementary processes in chemistry, to biological processes (such as the first steps of photosynthesis or vision). Nevertheless, many researchers refrain from entering this domain. The book ''Molecular Quantum Dynamics'' offers them an accessible

  19. High-throughput exploration of thermoelectric and mechanical properties of amorphous NbO₂ with transition metal additions

    Energy Technology Data Exchange (ETDEWEB)

    Music, Denis, E-mail: music@mch.rwth-aachen.de; Geyer, Richard W.; Hans, Marcus [Materials Chemistry, RWTH Aachen University, Kopernikusstr. 10, 52074 Aachen (Germany)

    2016-07-28

    To increase the thermoelectric efficiency and reduce thermal fatigue upon cyclic heat loading, alloying of amorphous NbO₂ with all 3d and 5d transition metals has systematically been investigated using density functional theory. It was found that Ta fulfills the key design criteria, namely, enhancement of the Seebeck coefficient and a positive Cauchy pressure (a ductility gauge). These quantum mechanical predictions were validated by assessing the thermoelectric and elastic properties of combinatorial thin films, which is a high-throughput approach. The maximum power factor is 2813 μW m⁻¹ K⁻² for a Ta/Nb ratio of 0.25, a hundredfold increase compared to pure NbO₂ that exceeds many oxide thermoelectrics. Based on the elasticity measurements, consistency between theory and experiment for the Cauchy pressure was attained within 2%. On the basis of the electronic structure analysis, these configurations can be perceived as metallic, which is consistent with low electrical resistivity and ductile behavior. Furthermore, a pronounced quantum confinement effect occurs, which is identified as the physical origin of the Seebeck coefficient enhancement.
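
    The power factor quoted above combines the Seebeck coefficient S and the electrical resistivity ρ as PF = S²/ρ (equivalently S²σ). A minimal sketch with illustrative input values, not the measured film data:

```python
# Sketch: thermoelectric power factor PF = S^2 / rho = S^2 * sigma.
# Input values below are illustrative, not from the NbO2 study.

def power_factor(seebeck_v_per_k: float, resistivity_ohm_m: float) -> float:
    """Power factor in W m^-1 K^-2."""
    return seebeck_v_per_k ** 2 / resistivity_ohm_m

# e.g. S = 200 uV/K and rho = 1e-5 Ohm*m give 4e-3 W/(m K^2),
# i.e. 4000 uW m^-1 K^-2, the same order as the value reported above.
pf = power_factor(200e-6, 1e-5)
print(pf * 1e6, "uW m^-1 K^-2")
```

The quadratic dependence on S is why a modest Seebeck enhancement (here attributed to quantum confinement) can produce the large power-factor increase reported.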

  20. High-throughput screening of carbohydrate-degrading enzymes using novel insoluble chromogenic substrate assay kits

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Willats, William George Tycho

    2016-01-01

    for this is that advances in genome and transcriptome sequencing, together with associated bioinformatics tools allow for rapid identification of candidate CAZymes, but technology for determining an enzyme's biochemical characteristics has advanced more slowly. To address this technology gap, a novel high-throughput assay...... CPH and ICB substrates are provided in a 96-well high-throughput assay system. The CPH substrates can be made in four different colors, enabling them to be mixed together and thus increasing assay throughput. The protocol describes a 96-well plate assay and illustrates how this assay can be used...... for screening the activities of enzymes, enzyme cocktails, and broths....

  1. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    Directory of Open Access Journals (Sweden)

    Joseph P. Kenny

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  2. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high......-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can...

  3. High-throughput screening of filamentous fungi using nanoliter-range droplet-based microfluidics

    Science.gov (United States)

    Beneyton, Thomas; Wijaya, I. Putu Mahendra; Postros, Prexilia; Najah, Majdi; Leblond, Pascal; Couvent, Angélique; Mayot, Estelle; Griffiths, Andrew D.; Drevelle, Antoine

    2016-06-01

    Filamentous fungi are an extremely important source of industrial enzymes because of their capacity to secrete large quantities of proteins. Currently, functional screening of fungi is associated with low throughput and high costs, which severely limits the discovery of novel enzymatic activities and better production strains. Here, we describe a nanoliter-range droplet-based microfluidic system specially adapted for the high-throughput screening (HTS) of large filamentous fungi libraries for secreted enzyme activities. The platform allowed (i) compartmentalization of single spores in ~10 nl droplets, (ii) germination and mycelium growth and (iii) high-throughput sorting of fungi based on enzymatic activity. A 10⁴-clone UV-mutated library of Aspergillus niger was screened based on α-amylase activity in just 90 minutes. Active clones were enriched 196-fold after a single round of microfluidic HTS. The platform is a powerful tool for the development of new production strains with a low cost, space and time footprint and should bring enormous benefit for improving the viability of biotechnological processes.

  4. The application of the high throughput sequencing technology in the transposable elements.

    Science.gov (United States)

    Liu, Zhen; Xu, Jian-hong

    2015-09-01

    High throughput sequencing technology has dramatically improved the efficiency of DNA sequencing and greatly decreased its cost, while typically offering better specificity, higher sensitivity and accuracy. It has therefore been applied to research on genetic variation, transcriptomics and epigenomics. Recently, this technology has been widely employed in studies of transposable elements and has achieved fruitful results. In this review, we summarize the application of high throughput sequencing technology to transposable elements, including the estimation of transposon content, preference of target sites and distribution, insertion polymorphism and population frequency, identification of rare copies, transposon horizontal transfer, and transposon tagging. We also briefly introduce the major sequencing strategies and algorithms, their advantages and disadvantages, and the corresponding solutions. Finally, we discuss future trends in high throughput sequencing, especially third-generation sequencing, and its prospective applications in transposon studies, with the aim of providing a comprehensive overview and reference for researchers in the field.

  5. High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.

    Science.gov (United States)

    Chen, Yu-Chih; Yoon, Euisik

    2017-01-01

    Three-dimensional (3D) cell culture is critical in studying cancer pathology and drug response. Though 3D cancer sphere culture can be performed in low-adherent dishes or well plates, the unregulated cell aggregation may skew the results. In contrast, microfluidic 3D culture allows precise control of the cell microenvironment and provides throughput higher by orders of magnitude. In this chapter, we examine engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.

  6. A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting

    Science.gov (United States)

    Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.

    2016-01-01

    Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945

  7. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    Technological advances in mass spectrometry and meticulous method development have produced several shotgun lipidomic approaches capable of characterizing lipid species by direct analysis of total lipid extracts. Shotgun lipidomics by hybrid quadrupole time-of-flight mass spectrometry allows...... the absolute quantification of hundreds of molecular glycerophospholipid species, glycerolipid species, sphingolipid species and sterol lipids. Future applications in clinical cohort studies demand detailed lipid molecule information and the application of high-throughput lipidomics platforms. In this review...... we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by microfluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer...

  8. Advances in high temperature water chemistry and future issues

    International Nuclear Information System (INIS)

    Millett, P.J.

    2005-01-01

    This paper traces the development of advances in high temperature water chemistry with emphasis in the field of nuclear power. Many of the water chemistry technologies used in plants throughout the world today would not have been possible without the underlying scientific advances made in this field. In recent years, optimization of water chemistry has been accomplished by the availability of high temperature water chemistry codes such as MULTEQ. These tools have made the science of high temperature chemistry readily accessible for engineering purposes. The paper closes with a discussion of what additional scientific data and insights must be pursued in order to support the further development of water chemistry technologies for the nuclear industry. (orig.)

  9. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Directory of Open Access Journals (Sweden)

    Ankush Prashar

    The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.
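
    The normalization step described above (expressing each genotype's canopy temperature as a difference from its image mean) can be sketched in a few lines. The genotype names and temperatures below are made-up illustrative values, not data from the trial:

```python
# Sketch: normalize per-image canopy temperatures to image-mean deltas so
# genotype rankings are comparable across images taken under different
# ambient conditions. Data are illustrative, not from the potato trial.

def normalize_image(temps: dict) -> dict:
    """Return genotype temperature deltas relative to the image mean."""
    mean = sum(temps.values()) / len(temps)
    return {g: t - mean for g, t in temps.items()}

morning = {"G1": 21.0, "G2": 22.0, "G3": 23.5}   # cooler image overall
midday  = {"G1": 27.5, "G2": 28.5, "G3": 30.0}   # warmer image, same ranking

for image in (morning, midday):
    deltas = normalize_image(image)
    print(sorted(deltas, key=deltas.get))  # same genotype order both times
```

Subtracting the image mean removes the shared ambient component, so only the genotype-specific signal (driven by stomatal behaviour) remains for ranking.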

  10. High-throughput purification of recombinant proteins using self-cleaving intein tags.

    Science.gov (United States)

    Coolbaugh, M J; Shakalli Tang, M J; Wood, D W

    2017-01-01

    High throughput methods for recombinant protein production using E. coli typically involve the use of affinity tags for simple purification of the protein of interest. One drawback of these techniques is the occasional need for tag removal before study, which can be hard to predict. In this work, we demonstrate two high throughput purification methods for untagged protein targets based on simple and cost-effective self-cleaving intein tags. Two model proteins, E. coli beta-galactosidase (βGal) and superfolder green fluorescent protein (sfGFP), were purified using self-cleaving versions of the conventional chitin-binding domain (CBD) affinity tag and the nonchromatographic elastin-like-polypeptide (ELP) precipitation tag in a 96-well filter plate format. Initial tests with shake flask cultures confirmed that the intein purification scheme could be scaled down, with >90% pure product generated in a single step using both methods. The scheme was then validated in a high throughput expression platform using 24-well plate cultures followed by purification in 96-well plates. For both tags and with both target proteins, the purified product was consistently obtained in a single-step, with low well-to-well and plate-to-plate variability. This simple method thus allows the reproducible production of highly pure untagged recombinant proteins in a convenient microtiter plate format. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Coupled effects of solution chemistry and hydrodynamics on the mobility and transport of quantum dot nanomaterials in the Vadose Zone

    Science.gov (United States)

    To investigate the coupled effects of solution chemistry and vadose zone processes on the mobility of quantum dot (QD) nanoparticles, laboratory scale transport experiments were performed. The complex coupled effects of ionic strength, size of QD aggregates, surface tension, contact angle, infiltrat...

  12. FPS scientific computers and supercomputers in chemistry

    International Nuclear Information System (INIS)

    Curington, I.J.

    1987-01-01

    FPS Array Processors, scientific computers, and highly parallel supercomputers are used in nearly all aspects of compute-intensive computational chemistry. A survey is made of work utilizing this equipment, both published and current research. The relationship of the computer architecture to computational chemistry is discussed, with specific reference to Molecular Dynamics, Quantum Monte Carlo simulations, and Molecular Graphics applications. Recent installations of the FPS T-Series are highlighted, and examples of Molecular Graphics programs running on the FPS-5000 are shown

  13. Machine learning in computational biology to accelerate high-throughput protein expression.

    Science.gov (United States)

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning, we identify protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for analysis of further expression and solubility datasets. Contact: ebrunk@ucsd.edu or johanr@biotech.kth.se. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved.
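
    The three sequence features named above can be computed in a few lines of plain Python. This is an illustrative sketch, not the paper's pipeline: the Kyte-Doolittle hydropathy scale and the pKa table are standard textbook values, and the isoelectric point here is a crude bisection on the Henderson-Hasselbalch net charge.

```python
# Sketch of the three sequence features reported as most predictive:
# aromaticity, hydropathy (GRAVY), and isoelectric point.

KD_HYDROPATHY = {  # Kyte-Doolittle hydropathy scale
    'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
    'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
    'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
    'Y': -1.3, 'V': 4.2,
}

# Side-chain pKa values (plus free termini) for a crude isoelectric point.
PKA_POSITIVE = {'K': 10.5, 'R': 12.5, 'H': 6.0}
PKA_NEGATIVE = {'D': 3.9, 'E': 4.1, 'C': 8.3, 'Y': 10.1}
PKA_NTERM, PKA_CTERM = 9.0, 2.0

def aromaticity(seq: str) -> float:
    """Fraction of aromatic residues (F, W, Y)."""
    return sum(seq.count(aa) for aa in "FWY") / len(seq)

def gravy(seq: str) -> float:
    """Grand average of hydropathy: mean Kyte-Doolittle score per residue."""
    return sum(KD_HYDROPATHY[aa] for aa in seq) / len(seq)

def net_charge(seq: str, ph: float) -> float:
    """Henderson-Hasselbalch net charge of the chain at a given pH."""
    charge = 1.0 / (1.0 + 10 ** (ph - PKA_NTERM))   # protonated N-terminus
    charge -= 1.0 / (1.0 + 10 ** (PKA_CTERM - ph))  # deprotonated C-terminus
    for aa, pka in PKA_POSITIVE.items():
        charge += seq.count(aa) / (1.0 + 10 ** (ph - pka))
    for aa, pka in PKA_NEGATIVE.items():
        charge -= seq.count(aa) / (1.0 + 10 ** (pka - ph))
    return charge

def isoelectric_point(seq: str) -> float:
    """pH at which the net charge crosses zero, found by bisection."""
    lo, hi = 0.0, 14.0
    while hi - lo > 1e-4:
        mid = (lo + hi) / 2.0
        if net_charge(seq, mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

    In a screening setting, feature vectors like these would then be fed to a classifier; the accuracies quoted in the abstract refer to the authors' trained models, not to this sketch.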

  14. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
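
    The core cross-referencing step CrossCheck performs, intersecting a user gene list with each published dataset, can be sketched as a set intersection. The function name and dictionary layout below are assumptions made for illustration; the actual tool is a web application backed by 16,231 curated datasets.

```python
# Illustrative sketch of gene-list cross-referencing: for each published
# dataset, report which of the user's genes it contains.

def cross_check(user_genes, datasets):
    """Map each dataset name to the sorted list of user genes found in it."""
    user = {g.upper() for g in user_genes}  # compare gene symbols case-insensitively
    hits = {}
    for name, genes in datasets.items():
        overlap = user & {g.upper() for g in genes}
        if overlap:
            hits[name] = sorted(overlap)
    return hits

# Example: two published screens, one shared hit.
published = {
    "crispr_viability_screen": ["TP53", "MYC", "KRAS"],
    "kinase_interactome": ["AKT1", "MTOR"],
}
print(cross_check(["tp53", "BRCA1"], published))  # {'crispr_viability_screen': ['TP53']}
```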

  15. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. 
We find that parameter-optimized transformations improve visualization, reduce

  16. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter
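
    The idea of choosing a transformation parameter by maximum likelihood can be sketched for one of the transformations listed above, the generalized hyperbolic arcsine. This is a simplified stand-in for the paper's flowCore-based criteria: it assumes the transformed data is normal and adds the Jacobian of the transformation so that likelihoods are comparable across cofactors.

```python
import math
import random

def arcsinh_transform(raw, cofactor):
    """Generalized hyperbolic arcsine: linear near zero, logarithmic for large values."""
    return [math.asinh(v / cofactor) for v in raw]

def transform_loglik(raw, cofactor):
    """Log-likelihood of a normal fit to the transformed data, including the
    Jacobian of the transformation so different cofactors are comparable."""
    y = arcsinh_transform(raw, cofactor)
    n = len(y)
    mu = sum(v for v in y) / n
    var = sum((v - mu) ** 2 for v in y) / n
    normal_ll = -0.5 * n * (math.log(2.0 * math.pi * var) + 1.0)
    # d/dx asinh(x/c) = 1 / sqrt(c^2 + x^2)
    jacobian = -sum(0.5 * math.log(cofactor ** 2 + v ** 2) for v in raw)
    return normal_ll + jacobian

def best_cofactor(raw, grid=(5.0, 50.0, 150.0, 500.0, 1000.0)):
    """Pick the maximum-likelihood cofactor from a candidate grid."""
    return max(grid, key=lambda c: transform_loglik(raw, c))
```

    On data simulated so that asinh(x/150) is standard normal, the grid search should favour a cofactor near 150; the paper's criteria are analogous but tailored to each transformation family.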

  17. High resolution STEM of quantum dots and quantum wires

    DEFF Research Database (Denmark)

    Kadkhodazadeh, Shima

    2013-01-01

    This article reviews the application of high resolution scanning transmission electron microscopy (STEM) to semiconductor quantum dots (QDs) and quantum wires (QWRs). Different imaging and analytical techniques in STEM are introduced and key examples of their application to QDs and QWRs...

  18. Simultaneous measurements of auto-immune and infectious disease specific antibodies using a high throughput multiplexing tool.

    Directory of Open Access Journals (Sweden)

    Atul Asati

    Full Text Available Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bioplex/Luminex platform. In this report, we demonstrate that the ganglioside high throughput multiplexing tool is robust, highly specific and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. Influenza beads provided an added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for the assay of both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders.

  19. Solved and unsolved problems in relativistic quantum chemistry

    International Nuclear Information System (INIS)

    Kutzelnigg, Werner

    2012-01-01

    Graphical abstract: The graphical abstract represents the Dirac-Coulomb Hamiltonian in Fock space in a diagrammatic notation. A line (vertical or slanted) with an upgoing arrow represents an electron, with a downgoing arrow a positron. A cross in the first line means the potential created by a nucleus, a broken line represents the Coulomb interaction between electrons and positrons. Highlights: ► Relativistic many-electron theory needs a Fock space and a field-dependent vacuum. ► A good starting point is QED in Coulomb gauge without transversal photons. ► The Dirac underworld picture is obsolete. ► A kinetically balanced even-tempered Gaussian basis is complete. ► ‘Quantum chemistry in Fock space’ is preferable over QED. - Abstract: A hierarchy of approximations in relativistic many-electron theory is discussed that starts with the Dirac equation and its expansion in a kinetically balanced basis, via a formulation of non-interacting electrons in Fock space (which is the only consistent way to deal with negative-energy states). The most straightforward approximate Hamiltonian for interacting electrons is derived from quantum electrodynamics (QED) in Coulomb gauge with the neglect of transversal photons. This allows an exact (non-perturbative) decoupling of the electromagnetic field from the fermionic field. The electric interaction of the fermions is non-retarded and non-quantized. The quantization of the fermionic field leads to a polarizable vacuum. The simplest (but somewhat problematic) approximation is a no-pair projected theory with external-field projectors. The Dirac-Coulomb operator in configuration space (first quantization) is not acceptable, even if the Brown–Ravenhall disease is much less virulent than often claimed. Effects of transversal photons, such as the Breit interaction and renormalized self-interaction can be taken care of perturbatively at the end, but there are still many open questions.

  20. Modular high-throughput test stand for versatile screening of thin-film materials libraries

    International Nuclear Information System (INIS)

    Thienhaus, Sigurd; Hamann, Sven; Ludwig, Alfred

    2011-01-01

    Versatile high-throughput characterization tools are required for the development of new materials using combinatorial techniques. Here, we describe a modular, high-throughput test stand for the screening of thin-film materials libraries, which can carry out automated electrical, magnetic and magnetoresistance measurements in the temperature range of −40 to 300 °C. As a proof of concept, we measured the temperature-dependent resistance of Fe–Pd–Mn ferromagnetic shape-memory alloy materials libraries, revealing reversible martensitic transformations and the associated transformation temperatures. Magneto-optical screening measurements of a materials library identify ferromagnetic samples, whereas resistivity maps support the discovery of new phases. A distance sensor in the same setup allows stress measurements in materials libraries deposited on cantilever arrays. A combination of these methods offers a fast and reliable high-throughput characterization technology for searching for new materials. Using this approach, a composition region has been identified in the Fe–Pd–Mn system that combines ferromagnetism and martensitic transformation.
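
    A toy version of the resistance-screening analysis described above: the transformation temperature of one measurement sweep can be estimated as the temperature of steepest change in the R(T) curve. The real test stand records full heating/cooling loops for each sample in a materials library; this sketch only shows the numerical idea.

```python
# Estimate a martensitic transformation temperature as the point of steepest
# change in a resistance-vs-temperature curve (finite differences).

def transition_temperature(temps, resistances):
    """Midpoint of the temperature interval with the steepest R(T) slope."""
    slopes = [
        (resistances[i + 1] - resistances[i]) / (temps[i + 1] - temps[i])
        for i in range(len(temps) - 1)
    ]
    i = max(range(len(slopes)), key=lambda j: abs(slopes[j]))
    return 0.5 * (temps[i] + temps[i + 1])

# Example: a resistance step between 50 and 60 °C marks the transformation.
temps = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
res = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 2.0, 2.0, 2.0, 2.0, 2.0]
print(transition_temperature(temps, res))  # 55.0
```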

  1. Spectrophotometric Enzyme Assays for High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Jean-Louis Reymond

    2004-01-01

    Full Text Available This paper reviews high-throughput screening enzyme assays developed in our laboratory over the last ten years. These enzyme assays were initially developed for the purpose of discovering catalytic antibodies by screening cell culture supernatants, but have proved generally useful for testing enzyme activities. Examples include TLC-based screening using acridone-labeled substrates, fluorogenic assays based on the β-elimination of umbelliferone or nitrophenol, and indirect assays such as the back-titration method with adrenaline and the copper-calcein fluorescence assay for amino acids.

  2. comparative assessment of university chemistry undergraduate

    African Journals Online (AJOL)

    Temechegn

    The areas of chemistry covered are Introductory, Inorganic, Physical, Organic, and Quantum and ... various specialisations like Pure and Applied Chemistry, Analytical ... even engineering disciplines, a degree in chemistry can be the starting point. .... It is also to show the relevance of the instructional methods relative to the.

  3. High-resolution and high-throughput multichannel Fourier transform spectrometer with two-dimensional interferogram warping compensation

    Science.gov (United States)

    Watanabe, A.; Furukawa, H.

    2018-04-01

    The resolution of multichannel Fourier transform (McFT) spectroscopy is insufficient for many applications despite its extreme advantage of high throughput. We propose an improved configuration that achieves both high resolution and high throughput using a two-dimensional area sensor. For the spectral resolution, we obtained the interferogram of a larger optical path difference by shifting the area sensor without altering any optical components. The non-linear phase error of the interferometer was successfully corrected using a phase-compensation calculation. Warping compensation was also applied to realise a higher throughput by accumulating the signal between vertical pixels. Our approach significantly improved the resolution and signal-to-noise ratio by factors of 1.7 and 34, respectively. This high-resolution and high-sensitivity McFT spectrometer will be useful for detecting weak light signals such as those in non-invasive diagnosis.
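
    The Fourier-transform step that turns an interferogram into a spectrum can be sketched with a naive DFT (a real instrument would use an FFT plus the phase and warping corrections described above):

```python
import math

def power_spectrum(interferogram):
    """Magnitude-squared DFT of a real-valued interferogram (bins 0..n/2-1)."""
    n = len(interferogram)
    spectrum = []
    for k in range(n // 2):
        re = sum(v * math.cos(2.0 * math.pi * k * i / n) for i, v in enumerate(interferogram))
        im = sum(v * math.sin(2.0 * math.pi * k * i / n) for i, v in enumerate(interferogram))
        spectrum.append(re * re + im * im)
    return spectrum

# A monochromatic source modulates the interferogram with one spatial
# frequency; its power spectrum peaks at the corresponding bin.
n = 256
interferogram = [1.0 + math.cos(2.0 * math.pi * 20 * i / n) for i in range(n)]
```

    A longer interferogram (a larger maximum optical path difference) yields narrower spectral bins, which is exactly the resolution gain the authors obtain by shifting the area sensor.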

  4. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  5. A pocket device for high-throughput optofluidic holographic microscopy

    Science.gov (United States)

    Mandracchia, B.; Bianco, V.; Wang, Z.; Paturzo, M.; Bramanti, A.; Pioggia, G.; Ferraro, P.

    2017-06-01

    Here we introduce a compact holographic microscope embedded onboard a Lab-on-a-Chip (LoC) platform. A wavefront division interferometer is realized by writing a polymer grating onto the channel to extract a reference wave from the object wave impinging the LoC. A portion of the beam reaches the samples flowing along the channel path, carrying their information content to the recording device, while one of the diffraction orders from the grating acts as an off-axis reference wave. Polymeric micro-lenses are delivered in front of the chip by Pyro-ElectroHydroDynamic (Pyro-EHD) inkjet printing techniques. Thus, all the required optical components are embedded onboard a pocket device, and fast, non-iterative, reconstruction algorithms can be used. We use our device in combination with a novel high-throughput technique, named Space-Time Digital Holography (STDH). STDH exploits the samples motion inside microfluidic channels to obtain a synthetic hologram, mapped in a hybrid space-time domain, and with intrinsic useful features. Indeed, a single Linear Sensor Array (LSA) is sufficient to build up a synthetic representation of the entire experiment (i.e. the STDH) with unlimited Field of View (FoV) along the scanning direction, independently of the magnification factor. The throughput of the imaging system is dramatically increased as STDH provides unlimited FoV, refocusable imaging of samples inside the liquid volume with no need for hologram stitching. To test our embedded STDH microscopy module, we counted, imaged and tracked in 3D, with high throughput, red blood cells moving inside the channel volume under non-ideal flow conditions.

  6. A comparison of high-throughput techniques for assaying circadian rhythms in plants.

    Science.gov (United States)

    Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony

    2015-01-01

    Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.

  7. Enabling tools for high-throughput detection of metabolites: Metabolic engineering and directed evolution applications.

    Science.gov (United States)

    Lin, Jyun-Liang; Wagner, James M; Alper, Hal S

    2017-12-01

    Within the Design-Build-Test Cycle for strain engineering, rapid product detection and selection strategies remain challenging and limit overall throughput. Here we summarize a wide variety of modalities that transduce chemical concentrations into easily measured absorbance, luminescence, and fluorescence signals. Specifically, we cover protein-based biosensors (including transcription factors), nucleic acid-based biosensors, coupled enzyme reactions, bioorthogonal chemistry, and fluorescent and chromogenic dyes and substrates as modalities for detection. We focus on the use of these methods for strain engineering and enzyme discovery and conclude with remarks on the current and future state of biosensor development for application in the metabolic engineering field. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate a considerable amount of data, which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as the Open Source software (https://github.com/pmadanecki/htdp).
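
    One of the operations HTDP automates, filtering character-delimited column data, can be sketched with the Python standard library. HTDP itself is a Java GUI program; the function name and the BED-like columns below are assumptions made for the example.

```python
import csv
import io

def filter_rows(delimited_text, column, predicate, delimiter="\t"):
    """Return the rows whose value in `column` satisfies `predicate`."""
    reader = csv.DictReader(io.StringIO(delimited_text), delimiter=delimiter)
    return [row for row in reader if predicate(row[column])]

# A small BED-like table with a header row.
bed = (
    "chrom\tstart\tend\n"
    "chr1\t100\t200\n"
    "chr2\t5\t50\n"
    "chr1\t300\t400\n"
)
chr1_rows = filter_rows(bed, "chrom", lambda v: v == "chr1")
print(len(chr1_rows))  # 2
```

    Merging multiple files and applying itemized condition sets, as HTDP does, would layer further passes of the same row-wise machinery.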

  9. The JCSG high-throughput structural biology pipeline

    International Nuclear Information System (INIS)

    Elsliger, Marc-André; Deacon, Ashley M.; Godzik, Adam; Lesley, Scott A.; Wooley, John; Wüthrich, Kurt; Wilson, Ian A.

    2010-01-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years and has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  10. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches for large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch and the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which ensures much more rapid data acquisition, higher accuracy of the assessment of phenotypic features, measurement of new parameters of these features and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between the genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  11. High-throughput selection for cellulase catalysts using chemical complementation.

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T; Lin, Hening; Tao, Haiyan; Cornish, Virginia W

    2008-12-24

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because of the large number of enzyme variants that selections can now test as compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity.

  12. Blood group genotyping: from patient to high-throughput donor screening.

    Science.gov (United States)

    Veldhuisen, B; van der Schoot, C E; de Haas, M

    2009-10-01

    Blood group antigens, present on the cell membrane of red blood cells and platelets, can be defined either serologically or predicted based on the genotypes of genes encoding for blood group antigens. At present, the molecular basis of many antigens of the 30 blood group systems and 17 human platelet antigens is known. In many laboratories, blood group genotyping assays are routinely used for diagnostics in cases where patient red cells cannot be used for serological typing due to the presence of auto-antibodies or after recent transfusions. In addition, DNA genotyping is used to support (un)expected serological findings. Fetal genotyping is routinely performed when there is a risk of alloimmune-mediated red cell or platelet destruction. In case of patient blood group antigen typing, it is important that a genotyping result is quickly available to support the selection of donor blood, and high throughput of the genotyping method is not a prerequisite. In addition, genotyping of blood donors will be extremely useful to obtain donor blood with rare phenotypes, for example lacking a high-frequency antigen, and to obtain a fully typed donor database to be used for a better matching between recipient and donor to prevent adverse transfusion reactions. Serological typing of large cohorts of donors is a labour-intensive and expensive exercise and hampered by the lack of sufficient amounts of approved typing reagents for all blood group systems of interest. Currently, high-throughput genotyping based on DNA micro-arrays is a very feasible method to obtain a large pool of well-typed blood donors. Several systems for high-throughput blood group genotyping are developed and will be discussed in this review.

  13. High-throughput assessment of context-dependent effects of chromatin proteins

    NARCIS (Netherlands)

    Brueckner, L. (Laura); Van Arensbergen, J. (Joris); Akhtar, W. (Waseem); L. Pagie (Ludo); B. van Steensel (Bas)

    2016-01-01

    textabstractBackground: Chromatin proteins control gene activity in a concerted manner. We developed a high-throughput assay to study the effects of the local chromatin environment on the regulatory activity of a protein of interest. The assay combines a previously reported multiplexing strategy

  14. High-throughput open source computational methods for genetics and genomics

    NARCIS (Netherlands)

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross-over between the disciplines of biology, informatics and statistics; which is clearly

  15. tcpl: The ToxCast Pipeline for High-Throughput Screening Data

    Science.gov (United States)

    Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPA ToxCast program require an efficient, transparent, and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...

  16. Combinatorial chemoenzymatic synthesis and high-throughput screening of sialosides.

    Science.gov (United States)

    Chokhawala, Harshal A; Huang, Shengshu; Lau, Kam; Yu, Hai; Cheng, Jiansong; Thon, Vireak; Hurtado-Ziola, Nancy; Guerrero, Juan A; Varki, Ajit; Chen, Xi

    2008-09-19

    Although the vital roles of structures containing sialic acid in biomolecular recognition are well documented, limited information is available on how sialic acid structural modifications, sialyl linkages, and the underlying glycan structures affect the binding or the activity of sialic acid-recognizing proteins and related downstream biological processes. A novel combinatorial chemoenzymatic method has been developed for the highly efficient synthesis of biotinylated sialosides containing different sialic acid structures and different underlying glycans in 96-well plates from biotinylated sialyltransferase acceptors and sialic acid precursors. By transferring the reaction mixtures to NeutrAvidin-coated plates and assaying for the yields of enzymatic reactions using lectins recognizing sialyltransferase acceptors but not the sialylated products, the biotinylated sialoside products can be directly used, without purification, for high-throughput screening to quickly identify the ligand specificity of sialic acid-binding proteins. For a proof-of-principle experiment, 72 biotinylated alpha2,6-linked sialosides were synthesized in 96-well plates from 4 biotinylated sialyltransferase acceptors and 18 sialic acid precursors using a one-pot three-enzyme system. High-throughput screening assays performed in NeutrAvidin-coated microtiter plates show that whereas Sambucus nigra Lectin binds to alpha2,6-linked sialosides with high promiscuity, human Siglec-2 (CD22) is highly selective for a number of sialic acid structures and the underlying glycans in its sialoside ligands.

  17. A high-throughput surface plasmon resonance biosensor based on differential interferometric imaging

    International Nuclear Information System (INIS)

    Wang, Daqian; Ding, Lili; Zhang, Wei; Zhang, Enyao; Yu, Xinglong; Luo, Zhaofeng; Ou, Huichao

    2012-01-01

    A new high-throughput surface plasmon resonance (SPR) biosensor based on differential interferometric imaging is reported. The two SPR interferograms of the sensing surface are imaged on two CCD cameras, with a phase difference of 180° between them. The refractive index related factor (RIRF) of the sensing surface is calculated from the two simultaneously acquired interferograms. Simulation results indicate that the RIRF varies linearly with the refractive index of the sensing surface and is unaffected by the noise, drift, and intensity distribution of the light source. Affinity and kinetic information can be extracted in real time from continuously acquired RIRF distributions. Refractometry experiments show that the dynamic detection range of the SPR differential interferometric imaging system exceeds 0.015 refractive index units (RIU), with a refractive index resolution as fine as 0.45 RU (1 RU = 1 × 10⁻⁶ RIU). Imaging and protein microarray experiments demonstrate the capability for high-throughput detection, and aptamer experiments demonstrate that the sensor is well suited to high-throughput aptamer kinetic evaluation. These results suggest that this biosensor has the potential to be utilized in proteomics and drug discovery after further improvement. (paper)
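
    The abstract does not give the functional form of the RIRF, so the benefit of differential detection can only be sketched with a toy model: for two interferograms 180° out of phase, a normalized difference cancels common-mode source intensity drift. The formula and values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Toy model: two interferograms 180 deg out of phase share the same
# background A and modulation depth m, with opposite-sign fringe terms:
#   I1 = A * (1 + m*cos(phi)),  I2 = A * (1 - m*cos(phi))
def rirf(i1, i2):
    """Illustrative refractive-index-related factor: the normalized
    difference cancels the common background A (lamp drift, vignetting)."""
    return (i1 - i2) / (i1 + i2)

phi = 1.2  # phase shift set by the surface refractive index (radians)
m = 0.8    # fringe modulation depth
for drift in (1.0, 1.5):            # simulate a 50% source-intensity drift
    a = 1000.0 * drift
    i1 = a * (1 + m * np.cos(phi))
    i2 = a * (1 - m * np.cos(phi))
    print(rirf(i1, i2))             # same value for both drift levels
```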

  18. Current status and future prospects for enabling chemistry technology in the drug discovery process [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Stevan W. Djuric

    2016-09-01

    Full Text Available This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of “dangerous” reagents. Also featured are advances in the “computer-assisted drug design” area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities.

  19. 40 CFR Table 3 to Subpart Eeee of... - Operating Limits-High Throughput Transfer Racks

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Operating Limits-High Throughput Transfer Racks 3 Table 3 to Subpart EEEE of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION... Throughput Transfer Racks As stated in § 63.2346(e), you must comply with the operating limits for existing...

  20. High-Throughput Non-destructive Phenotyping of Traits that Contribute to Salinity Tolerance in Arabidopsis thaliana

    KAUST Repository

    Awlia, Mariam

    2016-09-28

    Reproducible and efficient high-throughput phenotyping approaches, combined with advances in genome sequencing, are facilitating the discovery of genes affecting plant performance. Salinity tolerance is a desirable trait that can be achieved through breeding, where most efforts have aimed at selecting plants that effectively exclude ions from their shoots. To determine overall plant performance under salt stress, it is helpful to investigate several plant traits collectively in one experimental setup. Hence, we developed a quantitative phenotyping protocol using a high-throughput phenotyping system, with RGB and chlorophyll fluorescence (ChlF) imaging, which captures the growth, morphology, color and photosynthetic performance of Arabidopsis thaliana plants in response to salt stress. We optimized our salt treatment by controlling the soil-water content prior to introducing salt stress. We investigated these traits over time in two accessions in soil at 150, 100, or 50 mM NaCl to find that the plants subjected to 100 mM NaCl showed the most prominent responses in the absence of symptoms of severe stress. In these plants, salt stress induced significant changes in rosette area and morphology, but less prominent changes in rosette coloring and photosystem II efficiency. Clustering of ChlF traits with plant growth of nine accessions maintained at 100 mM NaCl revealed that in the early stage of salt stress, salinity tolerance correlated with non-photochemical quenching processes and during the later stage, plant performance correlated with quantum yield. This integrative approach allows the simultaneous analysis of several phenotypic traits. In combination with various genetic resources, the phenotyping protocol described here is expected to increase our understanding of plant performance and stress responses, ultimately identifying genes that improve plant performance in salt stress conditions.
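
    The photosystem II efficiency and quantum yield referred to above are standard ChlF-derived quantities; the dark-adapted maximum quantum yield, for example, is Fv/Fm = (Fm − F0)/Fm. A minimal sketch with hypothetical fluorescence values (not data from this study):

```python
import numpy as np

def fv_fm(f0, fm):
    """Maximum quantum yield of photosystem II from dark-adapted
    chlorophyll fluorescence: Fv/Fm = (Fm - F0) / Fm."""
    f0, fm = np.asarray(f0, float), np.asarray(fm, float)
    return (fm - f0) / fm

# Illustrative per-plant measurements (arbitrary fluorescence units):
f0 = np.array([210.0, 230.0, 260.0])   # minimal fluorescence, F0
fm = np.array([1050.0, 980.0, 890.0])  # maximal fluorescence, Fm
print(fv_fm(f0, fm).round(3))  # unstressed plants typically sit near ~0.8
```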

  1. Management of High-Throughput DNA Sequencing Projects: Alpheus.

    Science.gov (United States)

    Miller, Neil A; Kingsmore, Stephen F; Farmer, Andrew; Langley, Raymond J; Mudge, Joann; Crow, John A; Gonzalez, Alvaro J; Schilkey, Faye D; Kim, Ryan J; van Velkinburgh, Jennifer; May, Gregory D; Black, C Forrest; Myers, M Kathy; Utsey, John P; Frost, Nicholas S; Sugarbaker, David J; Bueno, Raphael; Gullans, Stephen R; Baxter, Susan M; Day, Steve W; Retzel, Ernest F

    2008-12-26

    High-throughput DNA sequencing has enabled systems biology to begin to address areas in health, agricultural and basic biological research. Concomitant with the opportunities is an absolute necessity to manage significant volumes of high-dimensional and inter-related data and analysis. Alpheus is an analysis pipeline, database and visualization software for use with massively parallel DNA sequencing technologies that feature multi-gigabase throughput characterized by relatively short reads, such as Illumina-Solexa (sequencing-by-synthesis), Roche-454 (pyrosequencing) and Applied Biosystems' SOLiD (sequencing-by-ligation). Alpheus enables alignment to reference sequence(s), detection of variants and enumeration of sequence abundance, including expression levels in transcriptome sequence. Alpheus is able to detect several types of variants, including non-synonymous and synonymous single nucleotide polymorphisms (SNPs), insertions/deletions (indels), premature stop codons, and splice isoforms. Variant detection is aided by the ability to filter variant calls based on consistency, expected allele frequency, sequence quality, coverage, and variant type in order to minimize false positives while maximizing the identification of true positives. Alpheus also enables comparisons of genes with variants between cases and controls or bulk segregant pools. Sequence-based differential expression comparisons can be developed, with data export to SAS JMP Genomics for statistical analysis.
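
    The variant-filtering step described above amounts to a predicate over per-variant annotations. The field names and thresholds in this sketch are hypothetical, not Alpheus's actual schema:

```python
# Sketch of filtering variant calls on quality, coverage, allele frequency,
# and variant type, as described above. Keys and cutoffs are illustrative.
def passes_filters(v, min_qual=30, min_cov=10, min_af=0.2,
                   keep_types=("snp", "indel", "stop_gained")):
    return (v["qual"] >= min_qual and
            v["coverage"] >= min_cov and
            v["allele_freq"] >= min_af and
            v["type"] in keep_types)

calls = [
    {"pos": 101, "type": "snp",   "qual": 55, "coverage": 42, "allele_freq": 0.48},
    {"pos": 202, "type": "snp",   "qual": 12, "coverage": 40, "allele_freq": 0.50},  # low quality
    {"pos": 303, "type": "indel", "qual": 60, "coverage": 4,  "allele_freq": 0.90},  # low coverage
]
kept = [v["pos"] for v in calls if passes_filters(v)]
print(kept)  # only the first call survives the filters
```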

  2. High-throughput screening with micro-x-ray fluorescence

    International Nuclear Information System (INIS)

    Havrilla, George J.; Miller, Thomasin C.

    2005-01-01

    Micro-x-ray fluorescence (MXRF) is a useful characterization tool for high-throughput screening of combinatorial libraries. Due to the increasing threat of use of chemical warfare (CW) agents both in military actions and against civilians by terrorist extremists, there is a strong push to improve existing methods and develop means for the detection of a broad spectrum of CW agents in a minimal amount of time to increase national security. This paper describes a combinatorial high-throughput screening technique for CW receptor discovery to aid in sensor development. MXRF can screen materials for elemental composition at the mesoscale level (tens to hundreds of micrometers). The key aspect of this work is the use of commercial MXRF instrumentation coupled with the inherent heteroatom elements within the target molecules of the combinatorial reaction to provide rapid and specific identification of lead species. The method is demonstrated by screening an 11-mer oligopeptide library for selective binding of the degradation products of the nerve agent VX. The identified oligopeptides can be used as selective molecular receptors for sensor development. The MXRF screening method is nondestructive, requires minimal sample preparation or special tags for analysis, and the screening time depends on the desired sensitivity

  3. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    Science.gov (United States)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including diseases pathogenesis.
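
    As a stand-in for the t-SNE embedding and MANOVA discriminant analysis used in the paper, a minimal nearest-centroid classifier over two synthetic biophysical features (say, area and dry mass) illustrates the shape of the label-free phase-prediction task; all numbers are simulated, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D biophysical phenotypes (e.g. cell area, dry mass);
# cells roughly double these quantities from G1 through S to G2.
centers = {"G1": (1.0, 1.0), "S": (1.4, 1.5), "G2": (1.9, 2.0)}
train = {ph: rng.normal(c, 0.08, size=(200, 2)) for ph, c in centers.items()}

# Nearest-centroid classification: a simple stand-in for the discriminant
# analysis in the paper, just to show how phase labels follow phenotypes.
centroids = {ph: x.mean(axis=0) for ph, x in train.items()}

def predict(x):
    return min(centroids, key=lambda ph: np.linalg.norm(x - centroids[ph]))

test = rng.normal(centers["G2"], 0.08, size=(100, 2))
acc = np.mean([predict(x) == "G2" for x in test])
print(f"G2 accuracy: {acc:.2f}")
```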

  4. High-Throughput Next-Generation Sequencing of Polioviruses

    Science.gov (United States)

    Montmayeur, Anna M.; Schmidt, Alexander; Zhao, Kun; Magaña, Laura; Iber, Jane; Castro, Christina J.; Chen, Qi; Henderson, Elizabeth; Ramos, Edward; Shaw, Jing; Tatusov, Roman L.; Dybdahl-Sissoko, Naomi; Endegue-Zanga, Marie Claire; Adeniji, Johnson A.; Oberste, M. Steven; Burns, Cara C.

    2016-01-01

    The poliovirus (PV) is currently targeted for worldwide eradication and containment. Sanger-based sequencing of the viral protein 1 (VP1) capsid region is currently the standard method for PV surveillance. However, the whole-genome sequence is sometimes needed for higher resolution global surveillance. In this study, we optimized whole-genome sequencing protocols for poliovirus isolates and FTA cards using next-generation sequencing (NGS), aiming for high sequence coverage, efficiency, and throughput. We found that DNase treatment of poliovirus RNA followed by random reverse transcription (RT), amplification, and the use of the Nextera XT DNA library preparation kit produced significantly better results than other preparations. The average viral reads per total reads, a measurement of efficiency, was as high as 84.2% ± 15.6%. PV genomes covering >99 to 100% of the reference length were obtained and validated with Sanger sequencing. A total of 52 PV genomes were generated, multiplexing as many as 64 samples in a single Illumina MiSeq run. This high-throughput, sequence-independent NGS approach facilitated the detection of a diverse range of PVs, especially vaccine-derived polioviruses (VDPVs), circulating VDPVs, and immunodeficiency-related VDPVs. In contrast to results from previous studies on other viruses, our results showed that filtration and nuclease treatment did not discernibly increase the sequencing efficiency of PV isolates. However, DNase treatment after nucleic acid extraction to remove host DNA significantly improved the sequencing results. This NGS method has been successfully implemented to generate PV genomes for molecular epidemiology of the most recent PV isolates. Additionally, the ability to obtain full PV genomes from FTA cards will facilitate global poliovirus surveillance. PMID:27927929

  5. Virtual high-throughput screening and design of 14α-lanosterol ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-07-06

    Jul 6, 2009 ... Virtual high-throughput screening and design of 14α-lanosterol demethylase inhibitors against Mycobacterium tuberculosis. Hildebert B. Maurice, Esther Tuarira and Kennedy Mwambete. School of Pharmaceutical Sciences, Institute of Allied Health Sciences, Muhimbili University of Health and...

  6. PUFKEY: A High-Security and High-Throughput Hardware True Random Number Generator for Sensor Networks

    Directory of Open Access Journals (Sweden)

    Dongfang Li

    2015-10-01

    Full Text Available Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard National Institute of Standards and Technology (NIST) randomness tests and is resilient to a wide range of security attacks.
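
    The conditioning algorithm that turns biased SRAM start-up noise into full-entropy seeds is not specified in the abstract; a classic illustration of such conditioning is the von Neumann extractor, which emits unbiased bits from any independent-but-biased bit source. This is a conceptual stand-in, not PUFKEY's actual conditioner:

```python
import random

def von_neumann_extract(bits):
    """Classic von Neumann conditioner: consume raw bits in pairs, emit
    the first bit of each discordant pair (0,1)->0, (1,0)->1, and drop
    concordant pairs. Output is unbiased if input bits are independent."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

random.seed(1)
# A heavily biased source (70% ones), like skewed SRAM start-up cells:
raw = [1 if random.random() < 0.7 else 0 for _ in range(100_000)]
seed = von_neumann_extract(raw)
ones = sum(seed) / len(seed)
print(f"{len(seed)} bits extracted, fraction of ones = {ones:.3f}")  # ~0.5
```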

  7. PUFKEY: a high-security and high-throughput hardware true random number generator for sensor networks.

    Science.gov (United States)

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-10-16

    Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks.

  8. Infrared and Raman spectroscopy and quantum chemistry calculation studies of C-H...O hydrogen bondings and thermal behavior of biodegradable polyhydroxyalkanoate

    Czech Academy of Sciences Publication Activity Database

    Sato, H.; Dybal, Jiří; Murakami, R.; Noda, I.; Ozaki, Y.

    744-747, - (2005), s. 35-46 ISSN 0022-2860 R&D Projects: GA AV ČR IAA4050208 Keywords : infrared and Raman spectroscopy * quantum chemical calculation * C-H...O hydrogen bonding Subject RIV: CD - Macromolecular Chemistry Impact factor: 1.440, year: 2005

  9. Introduction to quantum graphs

    CERN Document Server

    Berkolaiko, Gregory

    2012-01-01

    A "quantum graph" is a graph considered as a one-dimensional complex and equipped with a differential operator ("Hamiltonian"). Quantum graphs arise naturally as simplified models in mathematics, physics, chemistry, and engineering when one considers propagation of waves of various nature through a quasi-one-dimensional (e.g., "meso-" or "nano-scale") system that looks like a thin neighborhood of a graph. Works that currently would be classified as discussing quantum graphs have been appearing since at least the 1930s, and since then, quantum graphs techniques have been applied successfully in various areas of mathematical physics, mathematics in general and its applications. One can mention, for instance, dynamical systems theory, control theory, quantum chaos, Anderson localization, microelectronics, photonic crystals, physical chemistry, nano-sciences, superconductivity theory, etc. Quantum graphs present many non-trivial mathematical challenges, which makes them dear to a mathematician's heart. Work on qu...

  10. Technological Innovations for High-Throughput Approaches to In Vitro Allergy Diagnosis.

    Science.gov (United States)

    Chapman, Martin D; Wuenschmann, Sabina; King, Eva; Pomés, Anna

    2015-07-01

    Allergy diagnostics is being transformed by the advent of in vitro IgE testing using purified allergen molecules, combined with multiplex technology and biosensors, to deliver discriminating, sensitive, and high-throughput molecular diagnostics at the point of care. Essential elements of IgE molecular diagnostics are purified natural or recombinant allergens with defined purity and IgE reactivity, planar or bead-based multiplex systems to enable IgE to multiple allergens to be measured simultaneously, and, most recently, nanotechnology-based biosensors that facilitate rapid reaction rates and delivery of test results via mobile devices. Molecular diagnostics relies on measurement of IgE to purified allergens, the "active ingredients" of allergenic extracts. Typically, this involves measuring IgE to multiple allergens which is facilitated by multiplex technology and biosensors. The technology differentiates between clinically significant cross-reactive allergens (which could not be deduced by conventional IgE assays using allergenic extracts) and provides better diagnostic outcomes. Purified allergens are manufactured under good laboratory practice and validated using protein chemistry, mass spectrometry, and IgE antibody binding. Recently, multiple allergens (from dog) were expressed as a single molecule with high diagnostic efficacy. Challenges faced by molecular allergy diagnostic companies include generation of large panels of purified allergens with known diagnostic efficacy, access to flexible and robust array or sensor technology, and, importantly, access to well-defined serum panels from allergic patients for product development and validation. Innovations in IgE molecular diagnostics are rapidly being brought to market and will strengthen allergy testing at the point of care.

  11. High-throughput fractionation of human plasma for fast enrichment of low- and high-abundance proteins.

    Science.gov (United States)

    Breen, Lucas; Cao, Lulu; Eom, Kirsten; Srajer Gajdosik, Martina; Camara, Lila; Giacometti, Jasminka; Dupuy, Damian E; Josic, Djuro

    2012-05-01

    Fast, cost-effective and reproducible isolation of IgM from plasma is invaluable to the study of IgM and subsequent understanding of the human immune system. Additionally, vast amounts of information regarding human physiology and disease can be derived from analysis of the low abundance proteome of the plasma. In this study, methods were optimized for both the high-throughput isolation of IgM from human plasma, and the high-throughput isolation and fractionation of low abundance plasma proteins. To optimize the chromatographic isolation of IgM from human plasma, many variables were examined including chromatography resin, mobile phases, and order of chromatographic separations. Purification of IgM was achieved most successfully through isolation of immunoglobulin from human plasma using Protein A chromatography with a specific resin followed by subsequent fractionation using QA strong anion exchange chromatography. Through these optimization experiments, an additional method was established to prepare plasma for analysis of low abundance proteins. This method involved chromatographic depletion of high-abundance plasma proteins and reduction of plasma proteome complexity through further chromatographic fractionation. Purification of IgM was achieved with high purity as confirmed by SDS-PAGE and IgM-specific immunoblot. Isolation and fractionation of low abundance protein was also performed successfully, as confirmed by SDS-PAGE and mass spectrometry analysis followed by label-free quantitative spectral analysis. The level of purity of the isolated IgM allows for further IgM-specific analysis of plasma samples. The developed fractionation scheme can be used for high throughput screening of human plasma in order to identify low and high abundance proteins as potential prognostic and diagnostic disease biomarkers.

  12. GPU Linear Algebra Libraries and GPGPU Programming for Accelerating MOPAC Semiempirical Quantum Chemistry Calculations.

    Science.gov (United States)

    Maia, Julio Daniel Carvalho; Urquiza Carvalho, Gabriel Aires; Mangueira, Carlos Peixoto; Santana, Sidney Ramos; Cabral, Lucidio Anjos Formiga; Rocha, Gerd B

    2012-09-11

    In this study, we present some modifications in the semiempirical quantum chemistry MOPAC2009 code that accelerate single-point energy calculations (1SCF) of medium-size (up to 2500 atoms) molecular systems using GPU coprocessors and multithreaded shared-memory CPUs. Our modifications consisted of using a combination of highly optimized linear algebra libraries for both CPU (LAPACK and BLAS from Intel MKL) and GPU (MAGMA and CUBLAS) to hasten time-consuming parts of MOPAC such as the pseudodiagonalization, full diagonalization, and density matrix assembling. We have shown that it is possible to obtain large speedups just by using CPU serial linear algebra libraries in the MOPAC code. As a special case, we show a speedup of up to 14 times for a methanol simulation box containing 2400 atoms and 4800 basis functions, with even greater gains in performance when using multithreaded CPUs (2.1 times in relation to the single-threaded CPU code using linear algebra libraries) and GPUs (3.8 times). This degree of acceleration opens new perspectives for modeling larger structures which appear in inorganic chemistry (such as zeolites and MOFs), biochemistry (such as polysaccharides, small proteins, and DNA fragments), and materials science (such as nanotubes and fullerenes). In addition, we believe that this parallel (GPU-GPU) MOPAC code will make it feasible to use semiempirical methods in lengthy molecular simulations using both hybrid QM/MM and QM/QM potentials.
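
    One of the hot spots named above, density-matrix assembly, is a single dense matrix product, which is why routing it through tuned BLAS (or CUBLAS/MAGMA on a GPU) pays off. A minimal closed-shell sketch in numpy (illustrative sizes, not MOPAC code):

```python
import numpy as np

# For a closed-shell (RHF-like) system the density matrix is
#   P = 2 * C_occ @ C_occ.T
# where C_occ holds the occupied MO coefficients. This is a plain GEMM,
# exactly the kind of kernel MKL/CUBLAS/MAGMA accelerate.
rng = np.random.default_rng(42)
n_basis, n_occ = 400, 50
c_occ = rng.standard_normal((n_basis, n_occ))

p = 2.0 * c_occ @ c_occ.T

# Sanity checks: P is symmetric, and with orthonormal MOs its trace
# equals the number of electrons (2 per occupied orbital).
q, _ = np.linalg.qr(rng.standard_normal((n_basis, n_occ)))
p_ortho = 2.0 * q @ q.T
print(np.allclose(p, p.T), round(float(np.trace(p_ortho)), 6))  # True 100.0
```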

  13. Chromatographic Monoliths for High-Throughput Immunoaffinity Isolation of Transferrin from Human Plasma

    Directory of Open Access Journals (Sweden)

    Irena Trbojević-Akmačić

    2016-06-01

    Full Text Available Changes in protein glycosylation are related to different diseases and have potential as diagnostic and prognostic disease biomarkers. Transferrin (Tf) glycosylation changes are a common marker for congenital disorders of glycosylation. However, the biological interindividual variability of Tf N-glycosylation and the genes involved in glycosylation regulation are not known. Therefore, a high-throughput Tf isolation method and large-scale glycosylation studies are needed in order to address these questions. Due to their unique chromatographic properties, chromatographic monoliths enable a very fast analysis cycle, significantly increasing sample preparation throughput. Here, we describe the characterization of novel immunoaffinity-based monolithic columns in a 96-well plate format for specific high-throughput purification of human Tf from blood plasma. We optimized the isolation and glycan preparation procedure for subsequent ultra performance liquid chromatography (UPLC) analysis of Tf N-glycosylation and increased the sensitivity approximately threefold compared with the initial experimental conditions, with very good reproducibility.

  14. Self-Assembled BN and BCN Quantum Dots Obtained from High Intensity Ultrasound Exfoliated Nanosheets

    Czech Academy of Sciences Publication Activity Database

    Štengl, Václav; Henych, Jiří; Kormunda, M.

    2014-01-01

    Roč. 6, č. 6 (2014), s. 1106-1116 ISSN 1947-2935 Institutional support: RVO:61388980 Keywords : Ultrasound * Exfoliation * BN * BCN * Quantum Dots Subject RIV: CA - Inorganic Chemistry Impact factor: 2.598, year: 2014

  15. High-throughput verification of transcriptional starting sites by Deep-RACE

    DEFF Research Database (Denmark)

    Olivarius, Signe; Plessy, Charles; Carninci, Piero

    2009-01-01

    We present a high-throughput method for investigating the transcriptional starting sites of genes of interest, which we named Deep-RACE (Deep–rapid amplification of cDNA ends). Taking advantage of the latest sequencing technology, it allows the parallel analysis of multiple genes and is free...

  16. High-throughput microfluidics automated cytogenetic processing for effectively lowering biological process time and aid triage during radiation accidents

    International Nuclear Information System (INIS)

    Ramakumar, Adarsh

    2016-01-01

    Nuclear or radiation mass casualties require individual, rapid, and accurate dose-based triage of exposed subjects for cytokine therapy and supportive care, to save life. Radiation mass casualties will demand high-throughput individual diagnostic dose assessment for medical management of exposed subjects. Cytogenetic techniques are widely used for triage and definitive radiation biodosimetry. A prototype platform has been developed that demonstrates high-throughput microfluidic micro-incubation, supporting the logistics of transporting samples in miniaturized incubators from the accident site to analytical laboratories. Efforts have been made both in concept development and in advanced systems for higher-throughput sample processing, as well as in more efficient logistics enabling lab-on-chip analyses. An automated high-throughput platform with automated feature extraction, storage, cross-platform data linkage, cross-platform validation, and multi-parametric biomarker approaches will provide the first generation of high-throughput systems for effective medical management, particularly during radiation mass-casualty events.

  17. Crystal Symmetry Algorithms in a High-Throughput Framework for Materials

    Science.gov (United States)

    Taylor, Richard

    The high-throughput framework AFLOW that has been developed and used successfully over the last decade is improved to include fully-integrated software for crystallographic symmetry characterization. The standards used in the symmetry algorithms conform with the conventions and prescriptions given in the International Tables of Crystallography (ITC). A standard cell choice with standard origin is selected, and the space group, point group, Bravais lattice, crystal system, lattice system, and representative symmetry operations are determined. Following the conventions of the ITC, the Wyckoff sites are also determined and their labels and site symmetry are provided. The symmetry code makes no assumptions on the input cell orientation, origin, or reduction and has been integrated in the AFLOW high-throughput framework for materials discovery by adding to the existing code base and making use of existing classes and functions. The software is written in object-oriented C++ for flexibility and reuse. A performance analysis and examination of the algorithms scaling with cell size and symmetry is also reported.
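
    An illustration of the kind of test such symmetry code performs (a sketch, not AFLOW's implementation): a candidate integer matrix W, acting in fractional coordinates, is a point-symmetry operation of a lattice exactly when it preserves the metric tensor G = L·Lᵀ, i.e. WᵀGW = G, with the lattice vectors as the rows of L.

```python
import numpy as np

def is_symmetry_op(lattice, w):
    """Check whether the integer matrix w (fractional coordinates) is a
    point-symmetry operation of the lattice: it must preserve the metric
    tensor, w.T @ G @ w == G with G = L @ L.T (lattice vectors as rows)."""
    g = lattice @ lattice.T
    w = np.asarray(w)
    return bool(np.allclose(w.T @ g @ w, g))

# Simple tetragonal lattice: a = b = 1, c = 2 (rows are lattice vectors).
tetragonal = np.diag([1.0, 1.0, 2.0])

rot90_z = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]   # 4-fold rotation about c
swap_ac = [[0, 0, 1], [0, 1, 0], [1, 0, 0]]    # would require a == c

print(is_symmetry_op(tetragonal, rot90_z))  # True: the 4-fold axis exists
print(is_symmetry_op(tetragonal, swap_ac))  # False: a and c differ
```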

  18. Development of Control Applications for High-Throughput Protein Crystallography Experiments

    International Nuclear Information System (INIS)

    Gaponov, Yurii A.; Matsugaki, Naohiro; Honda, Nobuo; Sasajima, Kumiko; Igarashi, Noriyuki; Hiraki, Masahiko; Yamada, Yusuke; Wakatsuki, Soichi

    2007-01-01

    An integrated client-server control system (PCCS) with a unified relational database (PCDB) has been developed for high-throughput protein crystallography experiments on synchrotron beamlines. The major steps in protein crystallographic experiments (purification, crystallization, crystal harvesting, data collection, and data processing) are integrated into the software. All information necessary for performing protein crystallography experiments is stored in the PCDB database (except raw X-ray diffraction data, which is stored in the Network File Server). To allow all members of a protein crystallography group to participate in experiments, the system was developed as a multi-user system with secure network access based on TCP/IP secure UNIX sockets. Secure remote access to the system is possible from any operating system with X-terminal and SSH/X11 (Secure Shell with graphical user interface) support. Currently, the system covers the high-throughput X-ray data collection stages and is being commissioned at BL5A and NW12A (PF, PF-AR, KEK, Tsukuba, Japan)

  19. Fluorescence-based high-throughput screening of dicer cleavage activity

    Czech Academy of Sciences Publication Activity Database

    Podolská, Kateřina; Sedlák, David; Bartůněk, Petr; Svoboda, Petr

    2014-01-01

    Roč. 19, č. 3 (2014), s. 417-426 ISSN 1087-0571 R&D Projects: GA ČR GA13-29531S; GA MŠk(CZ) LC06077; GA MŠk LM2011022 Grant - others:EMBO(DE) 1483 Institutional support: RVO:68378050 Keywords : Dicer * siRNA * high-throughput screening Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 2.423, year: 2014

  20. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    … A cornerstone of current drug discovery is high-throughput screening assays, which allow examination of the activity of specific ion channels, though only to a limited extent. Conventional patch clamp remains the sole technique with the high time resolution and sensitivity required for precise and direct… The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery…

  1. Novel high-throughput cell-based hybridoma screening methodology using the Celigo Image Cytometer.

    Science.gov (United States)

    Zhang, Haohai; Chan, Leo Li-Ying; Rice, William; Kassam, Nasim; Longhi, Maria Serena; Zhao, Haitao; Robson, Simon C; Gao, Wenda; Wu, Yan

    2017-08-01

    Hybridoma screening is a critical step for antibody discovery, which necessitates prompt identification of potential clones from hundreds to thousands of hybridoma cultures against the desired immunogen. Technical issues associated with ELISA- and flow cytometry-based screening limit accuracy and diminish high-throughput capability, increasing time and cost. Conventional ELISA screening with coated antigen is also impractical for difficult-to-express hydrophobic membrane antigens or multi-chain protein complexes. Here, we demonstrate novel high-throughput screening methodology employing the Celigo Image Cytometer, which avoids nonspecific signals by contrasting antibody binding signals directly on living cells, with and without recombinant antigen expression. The image cytometry-based high-throughput screening method was optimized by detecting the binding of hybridoma supernatants to the recombinant antigen CD39 expressed on Chinese hamster ovary (CHO) cells. Next, the sensitivity of the image cytometer was demonstrated by serial dilution of purified CD39 antibody. Celigo was used to measure antibody affinities of commercial and in-house antibodies to membrane-bound CD39. This cell-based screening procedure can be completely accomplished within one day, significantly improving throughput and efficiency of hybridoma screening. Furthermore, measuring direct antibody binding to living cells eliminated both false positive and false negative hits. The image cytometry method was highly sensitive and versatile, and could detect positive antibody in supernatants at concentrations as low as ~5 ng/mL, with concurrent Kd binding-affinity determination. We propose that this screening method will greatly facilitate antibody discovery and screening technologies.
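
    The Kd determination mentioned above can be sketched as fitting a one-site binding isotherm, signal = Bmax·[Ab]/(Kd + [Ab]), to a serial-dilution titration. The data and the grid-search fit below are illustrative, not the Celigo workflow:

```python
import numpy as np

def one_site(conc, kd, bmax):
    """One-site binding isotherm: signal = Bmax * [Ab] / (Kd + [Ab])."""
    return bmax * conc / (kd + conc)

# Hypothetical serial-dilution titration (nM), signals generated at
# Kd = 2.0 nM, Bmax = 100 (arbitrary fluorescence units).
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
signal = one_site(conc, kd=2.0, bmax=100.0)

# Crude grid search over Kd, standing in for nonlinear least squares.
kd_grid = np.linspace(0.1, 10.0, 991)
sse = [np.sum((signal - one_site(conc, kd, 100.0)) ** 2) for kd in kd_grid]
kd_fit = kd_grid[int(np.argmin(sse))]
print(f"fitted Kd = {kd_fit:.2f} nM")  # recovers the generating Kd of 2.0
```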

  2. Quantum Accelerators for High-performance Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S. [ORNL; Britt, Keith A. [ORNL; Mohiyaddin, Fahd A. [ORNL

    2017-11-01

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system in managing these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems, including the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  3. High throughput diffractive multi-beam femtosecond laser processing using a spatial light modulator

    Energy Technology Data Exchange (ETDEWEB)

    Kuang Zheng [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom)], E-mail: z.kuang@liv.ac.uk; Perrie, Walter [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom); Leach, Jonathan [Department of Physics and Astronomy, University of Glasgow, Glasgow G12 8QQ (United Kingdom); Sharp, Martin; Edwardson, Stuart P. [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom); Padgett, Miles [Department of Physics and Astronomy, University of Glasgow, Glasgow G12 8QQ (United Kingdom); Dearden, Geoff; Watkins, Ken G. [Laser Group, Department of Engineering, University of Liverpool Brownlow Street, Liverpool L69 3GQ (United Kingdom)

    2008-12-30

    High throughput femtosecond laser processing is demonstrated by creating multiple beams using a spatial light modulator (SLM). The diffractive multi-beam patterns are modulated in real time by computer generated holograms (CGHs), which can be calculated by appropriate algorithms. An interactive LabVIEW program is adopted to generate the relevant CGHs. Optical efficiency at this stage is shown to be ~50% into first order beams and real time processing has been carried out at 50 Hz refresh rate. Results obtained demonstrate high precision surface micro-structuring on silicon and Ti6Al4V with throughput gain >1 order of magnitude.
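    CGHs for multi-beam patterns like these are commonly computed with iterative Fourier-transform algorithms such as Gerchberg–Saxton; the abstract does not name the algorithm used, so the following is a hedged sketch of that generic approach (grid size, spot positions, and iteration count are illustrative):

```python
import numpy as np

def gerchberg_saxton(target_intensity, n_iter=50):
    """Find a phase-only SLM pattern whose far field approximates the target:
    iterate between the SLM plane (uniform laser amplitude, free phase) and
    the focal plane (target amplitude, free phase)."""
    target_amp = np.sqrt(target_intensity)
    rng = np.random.default_rng(0)
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)
    for _ in range(n_iter):
        far = np.fft.fft2(np.exp(1j * phase))            # propagate SLM -> focal plane
        far = target_amp * np.exp(1j * np.angle(far))    # impose target amplitude
        near = np.fft.ifft2(far)                         # propagate back to SLM plane
        phase = np.angle(near)                           # keep only the phase
    return phase

# Target far field: two diffracted beams (a minimal multi-beam pattern)
N = 64
target = np.zeros((N, N))
target[16, 16] = target[48, 48] = 1.0
cgh = gerchberg_saxton(target)
spots = np.abs(np.fft.fft2(np.exp(1j * cgh))) ** 2  # resulting far-field intensity
```

    In practice the phase array would be quantized to the SLM's phase levels and streamed to the device at its refresh rate; hardware losses of that kind are one reason measured first-order efficiencies sit below the ideal.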

  4. Building blocks for the development of an interface for high-throughput thin layer chromatography/ambient mass spectrometric analysis: a green methodology.

    Science.gov (United States)

    Cheng, Sy-Chyi; Huang, Min-Zong; Wu, Li-Chieh; Chou, Chih-Chiang; Cheng, Chu-Nian; Jhang, Siou-Sian; Shiea, Jentaie

    2012-07-17

    Interfacing thin layer chromatography (TLC) with ambient mass spectrometry (AMS) has been an important area of analytical chemistry because of its capability to rapidly separate and characterize chemical compounds. In this study, we have developed a high-throughput TLC-AMS system using building blocks to deal, deliver, and collect TLC plates through an electrospray-assisted laser desorption ionization (ELDI) source. This is the first demonstration of the use of building blocks to construct and test a TLC-MS interfacing system. With the advantages of being readily available, cheap, reusable, and extremely easy to modify without consuming any material or reagent, the use of building blocks to develop the TLC-AMS interface is undoubtedly a green methodology. The TLC plate delivery system consists of a storage box, plate dealing component, conveyer, light sensor, and plate collecting box. During a TLC-AMS analysis, the TLC plate was sent to the conveyer from a stack of TLC plates placed in the storage box. As the TLC plate passed through the ELDI source, the chemical compounds separated on the plate were desorbed by laser desorption and subsequently post-ionized by electrospray ionization. The samples, including a mixture of synthetic dyes and extracts of pharmaceutical drugs, were analyzed to demonstrate the capability of this TLC-ELDI/MS system for high-throughput analysis.

  5. Chemistry of high-energy materials

    Energy Technology Data Exchange (ETDEWEB)

    Klapoetke, Thomas M. [Ludwig-Maximilians-Univ., Muenchen (Germany). Dept. of Chemistry; Maryland Univ., College Park, MD (US). Center of Energetic Concepts Development (CECD)

    2011-07-01

    The graduate-level textbook Chemistry of High-Energy Materials provides an introduction to and an overview of primary and secondary (high) explosives as well as propellant charges, rocket propellants and pyrotechnics. After a brief historical overview, the main classes of energetic materials are discussed systematically. Thermodynamic aspects, as far as relevant to energetic materials, are discussed, as well as modern computational approaches to predict performance and sensitivity parameters. The most important performance criteria such as detonation velocity, detonation pressure and heat of explosion, as well as the relevant sensitivity parameters such as impact and friction sensitivity and electrostatic discharge sensitivity, are explored in detail. Modern aspects of chemical synthesis including lead-free primary explosives and high-nitrogen compounds are also included in this book together with a discussion of high-energy materials for future defense needs. The most important goal of this book, based on a lecture course which has now been held at LMU Munich for over 12 years, is to increase knowledge and know-how in the synthesis and safe handling of high-energy materials. Society now needs, as much as ever, advanced explosives, propellant charges, rocket propellants and pyrotechnics to meet the demands in defense and engineering. This book is first and foremost aimed at advanced students in chemistry, engineering and materials sciences. However, it is also intended to provide a good introduction to the chemistry of energetic materials and chemical defense technology for scientists in the defense industry and government-run defense organizations. (orig.)

  6. Toward reliable and repeatable automated STEM-EDS metrology with high throughput

    Science.gov (United States)

    Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii

    2018-03-01

    New materials and designs in complex 3D architectures in logic and memory devices have raised complexity in S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based, energy dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable and efficient, automated STEM-EDS metrology with high throughput are presented: we introduce the best known auto-EDS acquisition and quantification methods for robust and reliable metrology and present how electron exposure dose impacts the EDS metrology reproducibility, either due to poor signal-to-noise ratio (SNR) at low dose or due to sample modifications at high dose conditions. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process both in terms of throughput and metrology reliability.
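    The dose/SNR trade-off described above follows Poisson counting statistics: the net-peak SNR grows as the square root of the electron dose, so throughput gains from shorter dwell times directly cost measurement precision. A small illustrative calculation (the count values are invented, not from the paper):

```python
import math

def peak_snr(counts_peak, counts_background):
    """SNR of a background-subtracted EDS peak under Poisson statistics:
    variance of the net counts is the sum of both raw counts."""
    net = counts_peak - counts_background
    return net / math.sqrt(counts_peak + counts_background)

low_dose = peak_snr(120, 20)        # short dwell time
high_dose = peak_snr(12000, 2000)   # 100x the dose, same peak-to-background
# A 100x dose increase improves SNR by sqrt(100) = 10x.
```

    The square-root scaling is why halving acquisition time cannot simply be compensated in software, and why the high-dose end of the trade-off is instead limited by beam-induced sample modification.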

  7. Modeling Disordered Materials with a High Throughput ab-initio Approach

    Science.gov (United States)

    2015-11-13

    Modeling Disordered Materials with a High Throughput ab-initio Approach. Kesong Yang, Corey Oses, and Stefano Curtarolo. [Record garbled in source; only the author list and a trailing reference fragment survive: J. Furthmüller, Efficient iterative schemes for ab initio total-energy calculations using a plane-wave basis set, Phys. Rev. B 54, 11169–11186 (1996).]

  8. Sign me up! Determining motivation for high school chemistry students enrolling in a second year chemistry course

    Science.gov (United States)

    Camarena, Nilda N.

    A sample of 108 Pre-AP Chemistry students in Texas participated in a study to determine motivational factors for enrolling in AP Chemistry and University Chemistry. The factors measured were academic attitude, perceptions of chemistry, confidence level in chemistry, and expectations/experiences in the chemistry class. Students completed two questionnaires, one at the beginning of the year and one at the end. Four high school campuses from two school districts in Texas participated. Two campuses were traditional high schools and two were smaller magnet schools. The results from this study confirm that there are definite correlations between academic attitudes, perceptions, confidence level, and experiences and a student's plans to enroll in AP and University Chemistry. The type of school as well as the student's gender seemed to have an influence on a student's plan to enroll in a second year of chemistry.

  9. A high-throughput pipeline for the design of real-time PCR signatures

    Directory of Open Access Journals (Sweden)

    Reifman Jaques

    2010-06-01

    Full Text Available Abstract Background Pathogen diagnostic assays based on polymerase chain reaction (PCR technology provide high sensitivity and specificity. However, the design of these diagnostic assays is computationally intensive, requiring high-throughput methods to identify unique PCR signatures in the presence of an ever increasing availability of sequenced genomes. Results We present the Tool for PCR Signature Identification (TOPSI, a high-performance computing pipeline for the design of PCR-based pathogen diagnostic assays. The TOPSI pipeline efficiently designs PCR signatures common to multiple bacterial genomes by obtaining the shared regions through pairwise alignments between the input genomes. TOPSI successfully designed PCR signatures common to 18 Staphylococcus aureus genomes in less than 14 hours using 98 cores on a high-performance computing system. Conclusions TOPSI is a computationally efficient, fully integrated tool for high-throughput design of PCR signatures common to multiple bacterial genomes. TOPSI is freely available for download at http://www.bhsai.org/downloads/topsi.tar.gz.
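    As a toy illustration of the shared-region idea behind TOPSI (this is not the TOPSI code, which uses pairwise whole-genome alignments plus uniqueness screening), one can intersect the k-mer sets of the input genomes to obtain candidate signature regions common to all of them:

```python
def shared_kmers(genomes, k):
    """Return the set of k-mers present in every input genome."""
    common = None
    for g in genomes:
        kmers = {g[i:i + k] for i in range(len(g) - k + 1)}
        common = kmers if common is None else common & kmers
    return common

# Tiny invented "genomes" sharing the region ACGTACGTTT
genomes = [
    "ACGTACGTTTGCA",
    "GGACGTACGTTTC",
    "TTACGTACGTTTG",
]
sigs = shared_kmers(genomes, 8)
```

    A real signature pipeline would additionally screen the candidates for uniqueness against non-target genomes and for primer-design constraints such as melting temperature and GC content.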

  10. High-throughput, temperature-controlled microchannel acoustophoresis device made with rapid prototyping

    DEFF Research Database (Denmark)

    Adams, Jonathan D; Ebbesen, Christian L.; Barnkob, Rune

    2012-01-01

    -slide format using low-cost, rapid-prototyping techniques. This high-throughput acoustophoresis chip (HTAC) utilizes a temperature-stabilized, standing ultrasonic wave, which imposes differential acoustic radiation forces that can separate particles according to size, density and compressibility. The device...

  11. A Functional High-Throughput Assay of Myelination in Vitro

    Science.gov (United States)

    2014-07-01

    Keywords: human induced pluripotent stem cells, hydrogels, 3D culture, electrophysiology, high-throughput assay. [Abstract fragmentary in source:] ... image the 3D rat dorsal root ganglion (DRG) cultures with sufficiently low background as to detect electrically-evoked depolarization events ... of voltage-sensitive dyes. We have made substantial progress in Task 4.1. We have fabricated neural fiber tracts from DRG explants and

  12. High-throughput anisotropic plasma etching of polyimide for MEMS

    International Nuclear Information System (INIS)

    Bliznetsov, Vladimir; Manickam, Anbumalar; Ranganathan, Nagarajan; Chen, Junwei

    2011-01-01

    This note describes a new high-throughput process of polyimide etching for the fabrication of MEMS devices with an organic sacrificial layer approach. Using dual frequency superimposed capacitively coupled plasma we achieved a vertical profile of polyimide with an etching rate as high as 3.5 µm min⁻¹. After the fabrication of vertical structures in a polyimide material, additional steps were performed to fabricate structural elements of MEMS by deposition of a SiO₂ layer and performing release etching of polyimide. (technical note)

  13. REDItools: high-throughput RNA editing detection made easy.

    Science.gov (United States)

    Picardi, Ernesto; Pesole, Graziano

    2013-07-15

    The reliable detection of RNA editing sites from massive sequencing data remains challenging and, although several methodologies have been proposed, no computational tools have been released to date. Here, we introduce REDItools, a suite of Python scripts to perform high-throughput investigation of RNA editing using next-generation sequencing data. REDItools is written in the Python programming language and is freely available at http://code.google.com/p/reditools/. ernesto.picardi@uniba.it or graziano.pesole@uniba.it Supplementary data are available at Bioinformatics online.
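    The core quantity behind RNA-editing detection can be illustrated with a toy function (this is not REDItools itself, whose scripts handle BAM parsing, quality filtering, and strand inference): at a reference-A position, the fraction of aligned reads calling G is the candidate A-to-I editing level.

```python
def editing_fraction(ref_base, read_bases, edited_base="G"):
    """Fraction of informative (A or G) read calls supporting editing
    at a reference-A site; returns 0.0 for non-A reference positions."""
    if ref_base != "A":
        return 0.0
    informative = [b for b in read_bases if b in ("A", edited_base)]
    if not informative:
        return 0.0
    return informative.count(edited_base) / len(informative)

# Ten reads covering one site: six call A, four call G
frac = editing_fraction("A", list("AAAGAGAGGA"))  # -> 0.4
```

    Real callers then test this fraction against sequencing-error models and known SNP databases before reporting a site as edited.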

  14. Applications of high-throughput clonogenic survival assays in high-LET particle microbeams

    Directory of Open Access Journals (Sweden)

    Antonios eGeorgantzoglou

    2016-01-01

    Full Text Available Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-LET particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells’ clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a micron field size high-LET microbeam. Post-irradiation, time-lapse imaging follows cells’ response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on cell’s capacity to divide at least 4-5 times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.
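    The tracking step described above, linking trajectories by finding each cell's nearest position in the next frame, can be sketched as follows (a hedged illustration; the coordinates and function names are invented, not the authors' implementation):

```python
import math

def link_frames(frames):
    """frames: list of per-time-point detection lists of (x, y) positions.
    Each track follows one initial detection, greedily matched to the
    nearest detection in the next frame. Returns per-frame index lists."""
    tracks = [[i] for i in range(len(frames[0]))]
    for t in range(len(frames) - 1):
        nxt = frames[t + 1]
        for track in tracks:
            x, y = frames[t][track[-1]]
            j = min(range(len(nxt)),
                    key=lambda k: math.hypot(nxt[k][0] - x, nxt[k][1] - y))
            track.append(j)
    return tracks

# Two cells drifting slowly over three frames
frames = [
    [(0.0, 0.0), (10.0, 10.0)],
    [(0.5, 0.2), (10.3, 9.8)],
    [(1.1, 0.4), (10.6, 9.5)],
]
tracks = link_frames(frames)
```

    Greedy nearest-neighbor linking is adequate when displacements between frames are small compared with cell spacing; division detection would then examine appearance changes along each recovered track, as the abstract describes.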

  15. Applications of High-Throughput Clonogenic Survival Assays in High-LET Particle Microbeams.

    Science.gov (United States)

    Georgantzoglou, Antonios; Merchant, Michael J; Jeynes, Jonathan C G; Mayhead, Natalie; Punia, Natasha; Butler, Rachel E; Jena, Rajesh

    2015-01-01

    Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-linear energy transfer (LET) particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells' clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a micron field size high-LET microbeam. Post-irradiation, time-lapse imaging follows cells' response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on cell's capacity to divide at least four to five times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.

  16. Study on a digital pulse processing algorithm based on template-matching for high-throughput spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wen, Xianfei; Yang, Haori

    2015-06-01

    A major challenge in utilizing spectroscopy techniques for nuclear safeguards is to perform high-resolution measurements at an ultra-high throughput rate. Traditionally, piled-up pulses are rejected to ensure good energy resolution. To improve throughput rate, high-pass filters are normally implemented to shorten pulses. However, this reduces signal-to-noise ratio and causes degradation in energy resolution. In this work, a pulse pile-up recovery algorithm based on template-matching was proved to be an effective approach to achieve high-throughput gamma ray spectroscopy. First, a discussion of the algorithm was given in detail. Second, the algorithm was then successfully utilized to process simulated piled-up pulses from a scintillator detector. Third, the algorithm was implemented to analyze high rate data from a NaI detector, a silicon drift detector and a HPGe detector. The promising results demonstrated the capability of this algorithm to achieve high-throughput rate without significant sacrifice in energy resolution. The performance of the template-matching algorithm was also compared with traditional shaping methods.
    Highlights:
    • A detailed discussion on the template-matching algorithm was given.
    • The algorithm was tested on data from a NaI and a Si detector.
    • The algorithm was successfully implemented on high rate data from a HPGe detector.
    • The performance of the algorithm was compared with traditional shaping methods.
    • The advantage of the algorithm in active interrogation was discussed.
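    The pile-up recovery idea can be sketched as a linear least-squares fit of shifted copies of a known pulse template to the digitized waveform (a hedged illustration, not the authors' implementation: the pulse shape, shifts, and amplitudes are synthetic, and the arrival times are assumed known here rather than scanned for):

```python
import numpy as np

def unpile(signal, template, shifts):
    """Model signal ≈ sum_k a_k * template shifted by shifts[k], and
    recover the amplitudes a_k by linear least squares."""
    n = len(signal)
    basis = np.zeros((n, len(shifts)))
    for k, d in enumerate(shifts):
        basis[d:d + len(template), k] = template[: n - d]
    amps, *_ = np.linalg.lstsq(basis, signal, rcond=None)
    return amps

# Synthetic detector pulse template and two overlapping (piled-up) pulses
t = np.arange(40)
template = np.exp(-t / 8.0) * (1 - np.exp(-t / 2.0))
signal = np.zeros(100)
signal[10:50] += 3.0 * template
signal[25:65] += 1.5 * template
amps = unpile(signal, template, shifts=[10, 25])
```

    Because each fitted amplitude is proportional to deposited energy, both events in the piled-up trace contribute to the spectrum instead of being rejected, which is how the throughput gain arises.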

  17. Caveats and limitations of plate reader-based high-throughput kinetic measurements of intracellular calcium levels

    International Nuclear Information System (INIS)

    Heusinkveld, Harm J.; Westerink, Remco H.S.

    2011-01-01

    Calcium plays a crucial role in virtually all cellular processes, including neurotransmission. The intracellular Ca²⁺ concentration ([Ca²⁺]ᵢ) is therefore an important readout in neurotoxicological and neuropharmacological studies. Consequently, there is an increasing demand for high-throughput measurements of [Ca²⁺]ᵢ, e.g. using multi-well microplate readers, in hazard characterization, human risk assessment and drug development. However, changes in [Ca²⁺]ᵢ are highly dynamic, thereby creating challenges for high-throughput measurements. Nonetheless, several protocols are now available for real-time kinetic measurement of [Ca²⁺]ᵢ in plate reader systems, though the results of such plate reader-based measurements have been questioned. In view of the increasing use of plate reader systems for measurements of [Ca²⁺]ᵢ, a careful evaluation of current technologies is warranted. We therefore performed an extensive set of experiments, using two cell lines (PC12 and B35) and two fluorescent calcium-sensitive dyes (Fluo-4 and Fura-2), for comparison of a linear plate reader system with single cell fluorescence microscopy. Our data demonstrate that the use of plate reader systems for high-throughput real-time kinetic measurements of [Ca²⁺]ᵢ is associated with many pitfalls and limitations, including erroneous sustained increases in fluorescence, limited sensitivity and lack of single cell resolution. Additionally, our data demonstrate that probenecid, which is often used to prevent dye leakage, effectively inhibits the depolarization-evoked increase in [Ca²⁺]ᵢ. Overall, the data indicate that the use of current plate reader-based strategies for high-throughput real-time kinetic measurements of [Ca²⁺]ᵢ is associated with caveats and limitations that require further investigation.
    Research highlights:
    → The use of plate readers for high-throughput screening of intracellular Ca²⁺ is associated with many pitfalls and limitations.
    → Single cell

  18. 3D material cytometry (3DMaC): a very high-replicate, high-throughput analytical method using microfabricated, shape-specific, cell-material niches.

    Science.gov (United States)

    Parratt, Kirsten; Jeong, Jenny; Qiu, Peng; Roy, Krishnendu

    2017-08-08

    Studying cell behavior within 3D material niches is key to understanding cell biology in health and diseases, and developing biomaterials for regenerative medicine applications. Current approaches to studying these cell-material niches have low throughput and can only analyze a few replicates per experiment, resulting in reduced measurement assurance and analytical power. Here, we report 3D material cytometry (3DMaC), a novel high-throughput method based on microfabricated, shape-specific 3D cell-material niches and imaging cytometry. 3DMaC achieves rapid and highly multiplexed analyses of very high replicate numbers ("n" of 10⁴–10⁶) of 3D biomaterial constructs. 3DMaC overcomes current limitations of low "n", low-throughput, and "noisy" assays, to provide rapid and simultaneous analyses of potentially hundreds of parameters in 3D biomaterial cultures. The method is demonstrated here for a set of 85 000 events containing twelve distinct cell-biomaterial micro-niches along with robust, customized computational methods for high-throughput analytics with potentially unprecedented statistical power.

  19. A robust robotic high-throughput antibody purification platform.

    Science.gov (United States)

    Schmidt, Peter M; Abdo, Michael; Butcher, Rebecca E; Yap, Min-Yin; Scotney, Pierre D; Ramunno, Melanie L; Martin-Roussety, Genevieve; Owczarek, Catherine; Hardy, Matthew P; Chen, Chao-Guang; Fabri, Louis J

    2016-07-15

    Monoclonal antibodies (mAbs) have become the fastest growing segment in the drug market with annual sales of more than 40 billion US$ in 2013. The selection of lead candidate molecules involves the generation of large repertoires of antibodies from which to choose a final therapeutic candidate. Improvements in the ability to rapidly produce and purify many antibodies in sufficient quantities reduce the lead time for selection, which ultimately impacts the speed with which an antibody may transition through the research stage and into product development. Miniaturization and automation of chromatography using micro columns (RoboColumns® from Atoll GmbH) coupled to an automated liquid handling instrument (ALH; Freedom EVO® from Tecan) has been a successful approach to establish high throughput process development platforms. Recent advances in transient gene expression (TGE) using the high-titre Expi293F™ system have enabled recombinant mAb titres of greater than 500 mg/L. These relatively high protein titres reduce the volume required to generate several milligrams of individual antibodies for initial biochemical and biological downstream assays, making TGE in the Expi293F™ system ideally suited to high throughput chromatography on an ALH. The present publication describes a novel platform for purifying Expi293F™-expressed recombinant mAbs directly from cell-free culture supernatant on a Perkin Elmer JANUS-VariSpan ALH equipped with a plate shuttle device. The purification platform allows automated 2-step purification (Protein A-desalting/size exclusion chromatography) of several hundred mAbs per week. The new robotic method can purify mAbs with high recovery (>90%) at sub-milligram level with yields of up to 2 mg from 4 mL of cell-free culture supernatant. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Reverse Phase Protein Arrays for High-Throughput Protein Measurements in Mammospheres

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    Protein Array (RPPA)-based readout format integrated into robotic siRNA screening. This technique would allow post-screening high-throughput quantification of protein changes. Recently, breast cancer stem cells (BCSCs) have attracted much attention, as a tumor- and metastasis-driving subpopulation...

  1. High Throughput WAN Data Transfer with Hadoop-based Storage

    Science.gov (United States)

    Amin, A.; Bockelman, B.; Letts, J.; Levshina, T.; Martin, T.; Pi, H.; Sfiligoi, I.; Thomas, M.; Wüerthwein, F.

    2011-12-01

    The Hadoop distributed file system (HDFS) has become increasingly popular in recent years as a key building block of integrated grid storage solutions in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process petabyte-scale datasets in a highly distributed grid computing environment. In this paper, we present our experience of high throughput WAN data transfer with an HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  2. High Throughput WAN Data Transfer with Hadoop-based Storage

    International Nuclear Information System (INIS)

    Amin, A; Thomas, M; Bockelman, B; Letts, J; Martin, T; Pi, H; Sfiligoi, I; Wüerthwein, F; Levshina, T

    2011-01-01

    The Hadoop distributed file system (HDFS) has become increasingly popular in recent years as a key building block of integrated grid storage solutions in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process petabyte-scale datasets in a highly distributed grid computing environment. In this paper, we present our experience of high throughput WAN data transfer with an HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  3. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    Full Text Available High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of
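    The batch-and-pipeline principle above can be sketched with a local worker pool (purely illustrative: the per-trait "model" is a stand-in, and a production HTC setup would submit jobs to a cluster scheduler such as HTCondor rather than run a pool on one node):

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_trait(job):
    """Stand-in for fitting a genomic-prediction model for one trait:
    here just a weighted sum over the trait's marker values."""
    trait, markers = job
    score = sum((i + 1) * m for i, m in enumerate(markers))
    return trait, score

def run_pipeline(jobs, workers=4):
    """Fan independent per-trait jobs out over a worker pool."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(evaluate_trait, jobs))

# Eight independent traits, each with a toy marker vector
jobs = [(f"trait{k}", [k, k + 1, k + 2]) for k in range(8)]
results = run_pipeline(jobs)
```

    Because the per-trait evaluations are independent, throughput scales with the number of workers until I/O or scheduling overhead dominates, which is the effect the paper exploits at cluster scale.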

  4. Application of high-throughput sequencing in understanding human oral microbiome related with health and disease

    OpenAIRE

    Chen, Hui; Jiang, Wen

    2014-01-01

    The oral microbiome is one of the most diverse habitats in the human body, and it is closely related to oral health and disease. As sequencing techniques have developed, high-throughput sequencing has become a popular approach applied to oral microbial analysis. Oral bacterial profiles have been studied to explore the relationship between microbial diversity and oral diseases such as caries and periodontal disease. This review describes the application of high-throughput sequencing for characterizati...

  5. Accessing Specific Peptide Recognition by Combinatorial Chemistry

    DEFF Research Database (Denmark)

    Li, Ming

    Molecular recognition is at the basis of all processes for life, and plays a central role in many biological processes, such as protein folding, the structural organization of cells and organelles, signal transduction, and the immune response. Hence, my PhD project is entitled “Accessing Specific Peptide Recognition by Combinatorial Chemistry”. Molecular recognition is a specific interaction between two or more molecules through noncovalent bonding, such as hydrogen bonding, metal coordination, van der Waals forces, π−π, hydrophobic, or electrostatic interactions. The association involves kinetic... Combinatorial chemistry was invented in the 1980s based on observation of functional aspects of the adaptive immune system. It was employed for drug development and optimization in conjunction with high-throughput synthesis and screening. (chapter 2) Combinatorial chemistry is able to rapidly produce many thousands

  6. Accurate macromolecular crystallographic refinement: incorporation of the linear scaling, semiempirical quantum-mechanics program DivCon into the PHENIX refinement package

    Energy Technology Data Exchange (ETDEWEB)

    Borbulevych, Oleg Y.; Plumley, Joshua A.; Martin, Roger I. [QuantumBio Inc., 2790 West College Avenue, State College, PA 16801 (United States); Merz, Kenneth M. Jr [University of Florida, Gainesville, Florida (United States); Westerhoff, Lance M., E-mail: lance@quantumbioinc.com [QuantumBio Inc., 2790 West College Avenue, State College, PA 16801 (United States)

    2014-05-01

    Semiempirical quantum-chemical X-ray macromolecular refinement using the program DivCon integrated with PHENIX is described. Macromolecular crystallographic refinement relies on sometimes dubious stereochemical restraints and rudimentary energy functionals to ensure the correct geometry of the model of the macromolecule and any covalently bound ligand(s). The ligand stereochemical restraint file (CIF) requires a priori understanding of the ligand geometry within the active site, and creation of the CIF is often an error-prone process owing to the great variety of potential ligand chemistry and structure. Stereochemical restraints have been replaced with more robust functionals through the integration of the linear-scaling, semiempirical quantum-mechanics (SE-QM) program DivCon with the PHENIX X-ray refinement engine. The PHENIX/DivCon package has been thoroughly validated on a population of 50 protein–ligand Protein Data Bank (PDB) structures with a range of resolutions and chemistries. The PDB structures used for the validation were originally refined with various refinement packages and were published within the past five years. PHENIX/DivCon does not utilize CIFs, link restraints or other a priori parameters for refinement, and hence makes fewer assumptions about the model. Across the entire population, the method results in reasonable ligand geometries and low ligand strains, even when the original refinement exhibited difficulties, indicating that PHENIX/DivCon is applicable to both single-structure and high-throughput crystallography.

  7. Molecular Studies of Complex Soil Organic Matter Interactions with Metal Ions and Mineral Surfaces using Classical Molecular Dynamics and Quantum Chemistry Methods

    Science.gov (United States)

    Andersen, A.; Govind, N.; Laskin, A.

    2017-12-01

    Mineral surfaces have been implicated as potential protectors of soil organic matter (SOM) against decomposition and ultimate mineralization to small molecules, which can provide nutrients for plants and soil microbes and can also contribute to the Earth's elemental cycles. SOM is a complex mixture of organic molecules of biological origin at varying degrees of decomposition and can, itself, self-assemble in such a way as to expose some biomolecule types to biotic and abiotic attack while protecting others. The organization of SOM, and of SOM with mineral surfaces and solvated metal ions, is driven by an interplay of van der Waals and electrostatic interactions, leading to partitioning of hydrophilic (e.g., sugars) and hydrophobic (e.g., lipids) SOM components that can be bridged by amphiphilic molecules (e.g., proteins). Classical molecular dynamics simulations can shed light on assemblies of organic molecules alone or on their complexation with mineral surfaces. Chemical reactivity is also an important consideration: the organic species may undergo changes such as oxidation/reduction, degradation, chemisorption to mineral surfaces, and complexation with solvated metal ions to form organometallic systems. For the study of chemical reactivity, quantum chemistry methods can be employed and combined with structural insight provided by classical MD simulations. Moreover, quantum chemistry can simulate spectroscopic signatures based on chemical structure and is a valuable tool in interpreting spectra from, notably, X-ray absorption spectroscopy (XAS). In this presentation, we will discuss our classical MD and quantum chemistry findings on a model SOM system interacting with mineral surfaces and solvated metal ions.

  8. PCR cycles above routine numbers do not compromise high-throughput DNA barcoding results.

    Science.gov (United States)

    Vierna, J; Doña, J; Vizcaíno, A; Serrano, D; Jovani, R

    2017-10-01

    High-throughput DNA barcoding has become essential in ecology and evolution, but some technical questions still remain. Increasing the number of PCR cycles above the routine 20-30 cycles is a common practice when working with old specimens, which yield small amounts of DNA, or when facing annealing issues with the primers. However, increasing the number of cycles can raise the number of artificial mutations due to polymerase errors. In this work, we sequenced 20 COI libraries on the Illumina MiSeq platform. Libraries were prepared with 40, 45, 50, 55, and 60 PCR cycles from four individuals belonging to four species of four genera of cephalopods. We found no relationship between the number of PCR cycles and the number of mutations, despite using a nonproofreading polymerase. Moreover, even when using a high number of PCR cycles, the resulting number of mutations was low enough not to be an issue in the context of high-throughput DNA barcoding (but may still remain an issue in DNA metabarcoding due to chimera formation). We conclude that the common practice of increasing the number of PCR cycles should not negatively impact the outcome of a high-throughput DNA barcoding study in terms of the occurrence of point mutations.
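The abstract's conclusion can be sanity-checked with a back-of-the-envelope error model. The sketch below is illustrative only: the per-base, per-cycle error rate and the amplicon length are assumed values (typical for a non-proofreading polymerase and a COI barcode), not figures from the study.

```python
def expected_errors_per_read(error_rate, amplicon_len, cycles):
    """Rough expected polymerase errors per final amplicon: each cycle
    gives every position another chance to be miscopied, so the
    expectation grows as error_rate * length * cycles."""
    return error_rate * amplicon_len * cycles

# Assumed values: ~1e-5 errors/base/cycle for a non-proofreading
# polymerase and a ~650 bp COI barcode (not figures from the study).
for cycles in (30, 40, 60):
    e = expected_errors_per_read(1e-5, 650, cycles)
    print(f"{cycles} cycles -> ~{e:.2f} expected errors per read")
```

Even at 60 cycles the expectation stays well below one error per read under these assumptions, consistent with the authors' observation.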

  9. Combustion chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Brown, N.J. [Lawrence Berkeley Laboratory, CA (United States)

    1993-12-01

    This research is concerned with the development and use of sensitivity analysis tools to probe the response of dependent variables to model input variables. Sensitivity analysis is important at all levels of combustion modeling. This group's research continues to be focused on elucidating the interrelationship between features in the underlying potential energy surface (obtained from ab initio quantum chemistry calculations) and their responses in the quantum dynamics, e.g., reactive transition probabilities, cross sections, and thermal rate coefficients. The goals of this research are: (i) to provide feedback information to quantum chemists in their potential surface refinement efforts, and (ii) to gain a better understanding of how various regions in the potential influence the dynamics. These investigations are carried out with the methodology of quantum functional sensitivity analysis (QFSA).

  10. Quantum entanglement of high angular momenta.

    Science.gov (United States)

    Fickler, Robert; Lapkiewicz, Radek; Plick, William N; Krenn, Mario; Schaeff, Christoph; Ramelow, Sven; Zeilinger, Anton

    2012-11-02

    Single photons with helical phase structures may carry a quantized amount of orbital angular momentum (OAM), and their entanglement is important for quantum information science and fundamental tests of quantum theory. Because there is no theoretical upper limit on how many quanta of OAM a single photon can carry, it is possible to create entanglement between two particles with an arbitrarily high difference in quantum number. By transferring polarization entanglement to OAM with an interferometric scheme, we generate and verify entanglement between two photons differing by 600 in quantum number. The only restrictive factors toward higher numbers are current technical limitations. We also experimentally demonstrate that the entanglement of very high OAM can improve the sensitivity of angular resolution in remote sensing.

  11. The physical basis of chemistry

    CERN Document Server

    Warren, Warren S

    2000-01-01

    If the text you're using for general chemistry seems to lack sufficient mathematics and physics in its presentation of classical mechanics, molecular structure, and statistics, this complementary science series title may be just what you're looking for. Written for the advanced lower-division undergraduate chemistry course, The Physical Basis of Chemistry, Second Edition, offers students an opportunity to enrich their understanding of physical chemistry with some quantum mechanics, the Boltzmann distribution, and spectroscopy. Posed and answered are questions concerning eve…
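The Boltzmann distribution mentioned in this record has a compact computable form, p_i = exp(-E_i/kT)/Z. The sketch below evaluates it for a hypothetical two-level system; the energy levels are invented for illustration and are not from the book.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_populations(energies_j, temperature_k):
    """Fractional populations p_i = exp(-E_i/kT) / Z for a set of levels."""
    weights = [math.exp(-e / (K_B * temperature_k)) for e in energies_j]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

# Hypothetical two-level system separated by exactly 1 kT at 300 K
levels = [0.0, K_B * 300]
pops = boltzmann_populations(levels, 300.0)
print([round(p, 3) for p in pops])  # [0.731, 0.269]
```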

  12. Quantum dot conjugates in a sub-micrometer fluidic channel

    Science.gov (United States)

    Stavis, Samuel M.; Edel, Joshua B.; Samiee, Kevan T.; Craighead, Harold G.

    2010-04-13

    A nanofluidic channel fabricated in fused silica with an approximately 500 nm square cross section was used to isolate, detect and identify individual quantum dot conjugates. The channel enables the rapid detection of every fluorescent entity in solution. A laser of selected wavelength was used to excite multiple species of quantum dots and organic molecules, and the emission spectra were resolved without significant signal rejection. Quantum dots were then conjugated with organic molecules and detected to demonstrate efficient multicolor detection. PCH was used to analyze coincident detection and to characterize the degree of binding. The use of a small fluidic channel to detect quantum dots as fluorescent labels was shown to be an efficient technique for multiplexed single molecule studies. Detection of single molecule binding events has a variety of applications including high throughput immunoassays.

  13. Quantum dot conjugates in a sub-micrometer fluidic channel

    Science.gov (United States)

    Stavis, Samuel M [Ithaca, NY; Edel, Joshua B [Brookline, MA; Samiee, Kevan T [Ithaca, NY; Craighead, Harold G [Ithaca, NY

    2008-07-29

    A nanofluidic channel fabricated in fused silica with an approximately 500 nm square cross section was used to isolate, detect and identify individual quantum dot conjugates. The channel enables the rapid detection of every fluorescent entity in solution. A laser of selected wavelength was used to excite multiple species of quantum dots and organic molecules, and the emission spectra were resolved without significant signal rejection. Quantum dots were then conjugated with organic molecules and detected to demonstrate efficient multicolor detection. PCH was used to analyze coincident detection and to characterize the degree of binding. The use of a small fluidic channel to detect quantum dots as fluorescent labels was shown to be an efficient technique for multiplexed single molecule studies. Detection of single molecule binding events has a variety of applications including high throughput immunoassays.

  14. A high throughput mechanical screening device for cartilage tissue engineering.

    Science.gov (United States)

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments on engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required to test individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.
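Each conventional single-sample compression test that the HTMS device parallelizes reduces, at equilibrium, to a stress/strain calculation. The sketch below uses invented numbers (a 4 mm diameter construct, 0.05 N load, 10% strain) purely to illustrate the arithmetic; none of these values are from the paper.

```python
import math

def compressive_modulus(force_n, area_m2, strain):
    """Equilibrium compressive modulus E = stress / strain, in Pa."""
    stress = force_n / area_m2  # Pa
    return stress / strain

# Invented single-sample test: 0.05 N equilibrium load on a
# 4 mm diameter cylindrical construct held at 10% strain.
area = math.pi * 0.002 ** 2  # cross-sectional area, m^2
E = compressive_modulus(0.05, area, 0.10)
print(f"E = {E / 1000:.0f} kPa")  # E = 40 kPa
```

Values of tens to hundreds of kPa are typical for engineered cartilage constructs, which is what makes per-sample testing slow: each equilibrium measurement requires a long stress-relaxation hold.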

  15. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
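As a concrete instance of the statistical methods this review surveys, the sketch below shows one widely used normalization for arrayed screening data, a median/MAD-based robust z-score. The sample readouts are invented, and this is only one of many normalizations the article covers.

```python
import statistics

def robust_z_scores(values):
    """Median/MAD-based z-scores: less sensitive to true hits than
    mean/SD normalization, hence popular for plate-wise HTS data."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    scale = 1.4826 * mad  # consistency factor relative to a normal SD
    return [(v - med) / scale for v in values]

# Invented plate readouts: four typical wells and one strong hit
scores = robust_z_scores([10, 11, 9, 10, 50])
print([round(z, 2) for z in scores])  # [0.0, 0.67, -0.67, 0.0, 26.98]
```

Because the median and MAD ignore the outlying well, the hit stands out at |z| ≈ 27, whereas mean/SD normalization would be dragged toward it.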

  16. A continuous high-throughput bioparticle sorter based on 3D traveling-wave dielectrophoresis.

    Science.gov (United States)

    Cheng, I-Fang; Froude, Victoria E; Zhu, Yingxi; Chang, Hsueh-Chia; Chang, Hsien-Chang

    2009-11-21

    We present a high throughput (maximum flow rate approximately 10 microl/min or linear velocity approximately 3 mm/s) continuous bio-particle sorter based on 3D traveling-wave dielectrophoresis (twDEP) at an optimum AC frequency of 500 kHz. The high throughput sorting is achieved with a sustained twDEP particle force normal to the continuous through-flow, which is applied over the entire chip by a single 3D electrode array. The design allows continuous fractionation of micron-sized particles into different downstream sub-channels based on differences in their twDEP mobility on both sides of the cross-over. Conventional DEP is integrated upstream to focus the particles into a single levitated queue to allow twDEP sorting by mobility difference and to minimize sedimentation and field-induced lysis. The 3D electrode array design minimizes the offsetting effect of nDEP (negative DEP with particle force towards regions with weak fields) on twDEP such that both forces increase monotonically with voltage to further increase the throughput. Effective focusing and separation of red blood cells from debris-filled heterogeneous samples are demonstrated, as well as size-based separation of poly-dispersed liposome suspensions into two distinct bands at 2.3 to 4.6 microm and 1.5 to 2.7 microm, at the highest throughput recorded in hand-held chips of 6 microl/min.

  17. A cell-based high-throughput screening assay for radiation susceptibility using automated cell counting

    International Nuclear Information System (INIS)

    Hodzic, Jasmina; Dingjan, Ilse; Maas, Mariëlle JP; Meulen-Muileman, Ida H van der; Menezes, Renee X de; Heukelom, Stan; Verheij, Marcel; Gerritsen, Winald R; Geldof, Albert A; Triest, Baukelien van; Beusechem, Victor W van

    2015-01-01

    Radiotherapy is one of the mainstays in the treatment for cancer, but its success can be limited due to inherent or acquired resistance. Mechanisms underlying radioresistance in various cancers are poorly understood and available radiosensitizers have shown only modest clinical benefit. There is thus a need to identify new targets and drugs for more effective sensitization of cancer cells to irradiation. Compound and RNA interference high-throughput screening technologies allow comprehensive enterprises to identify new agents and targets for radiosensitization. However, the gold standard assay to investigate radiosensitivity of cancer cells in vitro, the colony formation assay (CFA), is unsuitable for high-throughput screening. We developed a new high-throughput screening method for determining radiation susceptibility. Fast and uniform irradiation of batches up to 30 microplates was achieved using a Perspex container and a clinically employed linear accelerator. The readout was done by automated counting of fluorescently stained nuclei using the Acumen eX3 laser scanning cytometer. Assay performance was compared to that of the CFA and the CellTiter-Blue homogeneous uniform-well cell viability assay. The assay was validated in a whole-genome siRNA library screening setting using PC-3 prostate cancer cells. On 4 different cancer cell lines, the automated cell counting assay produced radiation dose response curves that followed a linear-quadratic equation and that exhibited a better correlation to the results of the CFA than did the cell viability assay. Moreover, the cell counting assay could be used to detect radiosensitization by silencing DNA-PKcs or by adding caffeine. In a high-throughput screening setting, using 4 Gy irradiated and control PC-3 cells, the effects of DNA-PKcs siRNA and non-targeting control siRNA could be clearly discriminated. We developed a simple assay for radiation susceptibility that can be used for high-throughput screening. This will aid…
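The linear-quadratic dose response mentioned in this abstract has a compact closed form, S(D) = exp(-(αD + βD²)). The sketch below evaluates it with illustrative α and β values (an α/β ratio of 10 Gy, common for tumor cells); the study's fitted parameters are not given in the abstract.

```python
import math

def surviving_fraction(dose_gy, alpha, beta):
    """Linear-quadratic model: S(D) = exp(-(alpha*D + beta*D**2))."""
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

# Illustrative parameters only: alpha = 0.2 /Gy, beta = 0.02 /Gy^2
for d in (0, 2, 4, 8):
    print(f"{d} Gy -> surviving fraction {surviving_fraction(d, 0.2, 0.02):.3f}")
```

Fitting these two parameters to the automated cell counts is what lets the assay stand in for the colony formation assay's dose-response curves.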

  18. High-throughput sockets over RDMA for the Intel Xeon Phi coprocessor

    CERN Document Server

    Santogidis, Aram

    2017-01-01

    In this paper we describe the design, implementation and performance of Trans4SCIF, a user-level socket-like transport library for the Intel Xeon Phi coprocessor. Trans4SCIF library is primarily intended for high-throughput applications. It uses RDMA transfers over the native SCIF support, in a way that is transparent for the application, which has the illusion of using conventional stream sockets. We also discuss the integration of Trans4SCIF with the ZeroMQ messaging library, used extensively by several applications running at CERN. We show that this can lead to a substantial, up to 3x, increase of application throughput compared to the default TCP/IP transport option.

  19. Development of a high-throughput screening assay for stearoyl-CoA desaturase using rat liver microsomes, deuterium labeled stearoyl-CoA and mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Soulard, Patricia; McLaughlin, Meg; Stevens, Jessica; Connolly, Brendan; Coli, Rocco; Wang Leyu [Research Technology Center, Pfizer Global Research and Development, Cambridge, MA (United States); Moore, Jennifer; Kuo, Ming-Shang T. [Pfizer Global Research and Development, San Diego, CA (United States); LaMarr, William A.; Ozbal, Can C. [Biotrove, Inc., Woburn, MA (United States); Bhat, B. Ganesh [Pfizer Global Research and Development, San Diego, CA (United States)], E-mail: gbhat@gnf.org

    2008-10-03

    Several recent reports suggest that stearoyl-CoA desaturase 1 (SCD1), the rate-limiting enzyme in monounsaturated fatty acid synthesis, plays an important role in regulating lipid homeostasis and lipid oxidation in metabolically active tissues. As several manifestations of type 2 diabetes and related metabolic disorders are associated with alterations in intracellular lipid partitioning, pharmacological manipulation of SCD1 activity might be of benefit in the treatment of these disease states. In an effort to identify small molecule inhibitors of SCD1, we have developed a mass spectrometry based high-throughput screening (HTS) assay using deuterium labeled stearoyl-CoA substrate and induced rat liver microsomes. The methodology developed allows the use of a nonradioactive substrate which avoids interference by the endogenous SCD1 substrate and/or product that exist in the non-purified enzyme source. Throughput of the assay was up to twenty 384-well assay plates per day. The assay was linear with protein concentration and time, and was saturable for stearoyl-CoA substrate (Km = 10.5 μM). The assay was highly reproducible with an average Z' value = 0.6. Conjugated linoleic acid and sterculic acid, known inhibitors of SCD1, exhibited IC50 values of 0.88 and 0.12 μM, respectively. High-throughput mass spectrometry screening of over 1.7 million compounds in compressed format demonstrated that the enzyme target is druggable. A total of 2515 hits were identified (0.1% hit rate), and 346 were confirmed active (>40% inhibition of total SCD activity at 20 μM; 14% confirmation rate). Of the confirmed hits, 172 had IC50 values of <10 μM, including 111 at <1 μM and 48 at <100 nM. A large number of potent drug-like (MW < 450) hits representing six different chemical series were identified. The application of mass spectrometry to high-throughput screening permitted the development of a high-quality screening protocol for an otherwise intractable target, SCD1.
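The Z' value reported in this record is the standard screening-window statistic of Zhang, Chung and Oldenburg. The sketch below computes it from control-well means and standard deviations; the numbers in the example are hypothetical, not the assay's actual controls.

```python
def z_prime(pos_mean, pos_sd, neg_mean, neg_sd):
    """Screening window coefficient:
    Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Z' >= 0.5 is conventionally considered an excellent assay."""
    return 1.0 - 3.0 * (pos_sd + neg_sd) / abs(pos_mean - neg_mean)

# Hypothetical positive/negative control statistics
print(round(z_prime(100.0, 5.0, 10.0, 4.0), 2))  # 0.7
```

An average Z' of 0.6, as reported, therefore indicates a wide separation between controls relative to their noise, which is what made the 1.7-million-compound screen feasible.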

  20. Development of a high-throughput screening assay for stearoyl-CoA desaturase using rat liver microsomes, deuterium labeled stearoyl-CoA and mass spectrometry

    International Nuclear Information System (INIS)

    Soulard, Patricia; McLaughlin, Meg; Stevens, Jessica; Connolly, Brendan; Coli, Rocco; Wang Leyu; Moore, Jennifer; Kuo, Ming-Shang T.; LaMarr, William A.; Ozbal, Can C.; Bhat, B. Ganesh

    2008-01-01

    Several recent reports suggest that stearoyl-CoA desaturase 1 (SCD1), the rate-limiting enzyme in monounsaturated fatty acid synthesis, plays an important role in regulating lipid homeostasis and lipid oxidation in metabolically active tissues. As several manifestations of type 2 diabetes and related metabolic disorders are associated with alterations in intracellular lipid partitioning, pharmacological manipulation of SCD1 activity might be of benefit in the treatment of these disease states. In an effort to identify small molecule inhibitors of SCD1, we have developed a mass spectrometry based high-throughput screening (HTS) assay using deuterium labeled stearoyl-CoA substrate and induced rat liver microsomes. The methodology developed allows the use of a nonradioactive substrate which avoids interference by the endogenous SCD1 substrate and/or product that exist in the non-purified enzyme source. Throughput of the assay was up to twenty 384-well assay plates per day. The assay was linear with protein concentration and time, and was saturable for stearoyl-CoA substrate (Km = 10.5 μM). The assay was highly reproducible with an average Z' value = 0.6. Conjugated linoleic acid and sterculic acid, known inhibitors of SCD1, exhibited IC50 values of 0.88 and 0.12 μM, respectively. High-throughput mass spectrometry screening of over 1.7 million compounds in compressed format demonstrated that the enzyme target is druggable. A total of 2515 hits were identified (0.1% hit rate), and 346 were confirmed active (>40% inhibition of total SCD activity at 20 μM; 14% confirmation rate). Of the confirmed hits, 172 had IC50 values of <10 μM, including 111 at <1 μM and 48 at <100 nM. A large number of potent drug-like (MW < 450) hits representing six different chemical series were identified. The application of mass spectrometry to high-throughput screening permitted the development of a high-quality screening protocol for an otherwise intractable target, SCD1. Further medicinal…

  1. High-throughput diagnosis of potato cyst nematodes in soil samples.

    Science.gov (United States)

    Reid, Alex; Evans, Fiona; Mulholland, Vincent; Cole, Yvonne; Pickup, Jon

    2015-01-01

    Potato cyst nematode (PCN) is a damaging soilborne pest of potatoes which can cause major crop losses. In 2010, a new European Union directive (2007/33/EC) on the control of PCN came into force. Under the new directive, seed potatoes can only be planted on land which has been found to be free from PCN infestation following an official soil test. A major consequence of the new directive was the introduction of a new harmonized soil sampling rate resulting in a threefold increase in the number of samples requiring testing. To manage this increase with the same staffing resources, we have replaced the traditional diagnostic methods. A system has been developed for the processing of soil samples, extraction of DNA from float material, and detection of PCN by high-throughput real-time PCR. Approximately 17,000 samples are analyzed each year using this method. This chapter describes the high-throughput processes for the production of float material from soil samples, DNA extraction from the entire float, and subsequent detection and identification of PCN within these samples.

  2. A multilayer microdevice for cell-based high-throughput drug screening

    International Nuclear Information System (INIS)

    Liu, Chong; Wang, Lei; Li, Jingmin; Ding, Xiping; Chunyu, Li; Xu, Zheng; Wang, Qi

    2012-01-01

    A multilayer polydimethylsiloxane microdevice for cell-based high-throughput drug screening is described in this paper. This established microdevice was based on a modularization method and it integrated a drug/medium concentration gradient generator (CGG), pneumatic microvalves and a cell culture microchamber array. The CGG was able to generate five steps of linear concentrations with the same outlet flow rate. The medium/drug flowed through the CGG and then vertically into the pear-shaped cell culture microchambers. This vertical perfusion mode was used to reduce the impact on the physiology of cells of the shear stress induced by fluid flow in the microchambers. Pear-shaped microchambers with two arrays of micropillars at each outlet were adopted in this microdevice, which were beneficial to cell distribution. Apoptosis experiments in which the chemotherapeutic cisplatin (DDP) was applied to the cisplatin-resistant cell line A549/DDP were performed successfully on this platform. The results showed that this novel microdevice could not only provide well-defined and stable conditions for cell culture, but was also useful for cell-based high-throughput drug screening with less reagent and time consumption. (paper)
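The five linear concentration steps produced by the CGG can be written down directly. This sketch assumes an ideal generator spanning 0 to the stock concentration, an assumption of ours rather than a detail stated in the abstract.

```python
def linear_gradient(c_max, n_outlets):
    """Outlet concentrations of an ideal linear gradient generator:
    n_outlets evenly spaced steps from 0 up to c_max."""
    return [c_max * i / (n_outlets - 1) for i in range(n_outlets)]

# Five outlets from a hypothetical 100 uM stock, as in a five-step CGG
print(linear_gradient(100.0, 5))  # [0.0, 25.0, 50.0, 75.0, 100.0]
```

Because every outlet carries the same flow rate, each microchamber column sees one fixed concentration, which is what allows dose-response data to be collected in a single run.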

  3. A High-throughput Selection for Cellulase Catalysts Using Chemical Complementation

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T.; Lin, Hening; Tao, Haiyan; Cornish, Virginia W.

    2010-01-01

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical-dimerizer-activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and the resulting decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because selections can test far larger numbers of enzyme variants than existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries created by mutagenesis or obtained from natural biodiversity. PMID:19053460

  4. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high-throughput instrumentation to analyze large sets of clinical samples quantitatively and reproducibly, in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass-spectrometry-based proteomics, needs to be more accurate with rapid processing time. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples in a quantitative, qualitative and reproducible manner. The study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validates the system on a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The study shows that the online system, capable of handling high-throughput samples in 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase of peptide quantities with increasing concentration, with much better linearity than the off-line method. We therefore suggest that inclusion of this online system in a proteomic pipeline will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  5. Performance Evaluation of IEEE 802.11ah Networks With High-Throughput Bidirectional Traffic.

    Science.gov (United States)

    Šljivo, Amina; Kerkhove, Dwight; Tian, Le; Famaey, Jeroen; Munteanu, Adrian; Moerman, Ingrid; Hoebeke, Jeroen; De Poorter, Eli

    2018-01-23

    So far, existing sub-GHz wireless communication technologies focused on low-bandwidth, long-range communication with large numbers of constrained devices. Although these characteristics are fine for many Internet of Things (IoT) applications, more demanding application requirements could not be met and legacy Internet technologies such as Transmission Control Protocol/Internet Protocol (TCP/IP) could not be used. This has changed with the advent of the new IEEE 802.11ah Wi-Fi standard, which is much more suitable for reliable bidirectional communication and high-throughput applications over a wide area (up to 1 km). The standard offers great possibilities for network performance optimization through a number of physical- and link-layer configurable features. However, given that the optimal configuration parameters depend on traffic patterns, the standard does not dictate how to determine them. Such a large number of configuration options can lead to sub-optimal or even incorrect configurations. Therefore, we investigated how two key mechanisms, Restricted Access Window (RAW) grouping and Traffic Indication Map (TIM) segmentation, influence scalability, throughput, latency and energy efficiency in the presence of bidirectional TCP/IP traffic. We considered both high-throughput video streaming traffic and large-scale reliable sensing traffic and investigated TCP behavior in both scenarios when the link layer introduces long delays. This article presents the relations between attainable throughput per station and attainable number of stations, as well as the influence of RAW, TIM and TCP parameters on both. We found that up to 20 continuously streaming IP-cameras can be reliably connected via IEEE 802.11ah with a maximum average data rate of 160 kbps, whereas 10 IP-cameras can achieve average data rates of up to 255 kbps over 200 m. Up to 6960 stations transmitting every 60 s can be connected over 1 km with no lost packets. The presented results enable the fine tuning
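    The headline figures above can be sanity-checked with simple aggregate-rate arithmetic. This is only a sketch using the numbers quoted in the abstract, not a model of RAW/TIM behaviour:

```python
# Back-of-the-envelope check of the reported IEEE 802.11ah figures.
# All numbers are taken from the abstract; the formulas are plain
# aggregate-rate arithmetic, not a simulation of RAW/TIM mechanisms.

def aggregate_rate_kbps(n_stations: int, per_station_kbps: float) -> float:
    """Total offered load if every station streams at the same rate."""
    return n_stations * per_station_kbps

# 20 cameras at 160 kbps vs. 10 cameras at 255 kbps:
load_20 = aggregate_rate_kbps(20, 160)   # 3200 kbps total
load_10 = aggregate_rate_kbps(10, 255)   # 2550 kbps total

# 6960 sensor stations each transmitting once per 60 s:
transmissions_per_second = 6960 / 60     # 116 uplink packets/s on average

print(load_20, load_10, transmissions_per_second)
```

    Note that the 20-camera configuration actually offers a higher aggregate load than the 10-camera one; the per-station rate drops faster than the station count grows.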

  6. A high throughput platform for understanding the influence of excipients on physical and chemical stability

    DEFF Research Database (Denmark)

    Raijada, Dhara; Cornett, Claus; Rantanen, Jukka

    2013-01-01

    The present study puts forward a miniaturized high-throughput platform to understand the influence of excipient selection and processing on the stability of a given drug compound. Four model drugs (sodium naproxen, theophylline, amlodipine besylate and nitrofurantoin) and ten different excipients were...... for chemical degradation. The proposed high-throughput platform can be used during early drug development to simulate typical processing-induced stress on a small scale and to understand possible phase transformation behaviour and the influence of excipients on this....

  7. Application of fermionic marginal constraints to hybrid quantum algorithms

    Science.gov (United States)

    Rubin, Nicholas C.; Babbush, Ryan; McClean, Jarrod

    2018-05-01

    Many quantum algorithms, including recently proposed hybrid classical/quantum algorithms, make use of restricted tomography of the quantum state that measures the reduced density matrices, or marginals, of the full state. The most straightforward approach to this algorithmic step estimates each component of the marginal independently without making use of the algebraic and geometric structure of the marginals. Within the field of quantum chemistry, this structure is termed the fermionic n-representability conditions, and is supported by a vast amount of literature on both theoretical and practical results related to their approximations. In this work, we introduce these conditions in the language of quantum computation, and utilize them to develop several techniques to accelerate and improve practical applications for quantum chemistry on quantum computers. As a general result, we demonstrate how these marginals concentrate to diagonal quantities when measured on random quantum states. We also show that one can use fermionic n-representability conditions to reduce the total number of measurements required by more than an order of magnitude for medium sized systems in chemistry. As a practical demonstration, we simulate an efficient restoration of the physicality of energy curves for the dilation of a four qubit diatomic hydrogen system in the presence of three distinct one qubit error channels, providing evidence these techniques are useful for pre-fault tolerant quantum chemistry experiments.
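    The simplest of the n-representability conditions mentioned above can be illustrated directly. The sketch below is not code from the paper (the dimension and occupations are invented); it builds a valid one-particle reduced density matrix (1-RDM) and checks the basic necessary conditions: Hermiticity, eigenvalues between 0 and 1, and trace equal to the particle number.

```python
import numpy as np

# Necessary fermionic n-representability conditions for a 1-RDM D:
#   D = D^dagger, 0 <= eigenvalues of D <= 1, trace(D) = N.
# We construct a valid 1-RDM by rotating integer occupations with a
# random unitary, then verify those conditions numerically.

rng = np.random.default_rng(0)
norb = 4
occupations = np.array([1.0, 1.0, 0.0, 0.0])  # N = 2 electrons

# Random unitary from the QR decomposition of a complex Gaussian matrix.
a = rng.normal(size=(norb, norb)) + 1j * rng.normal(size=(norb, norb))
q, _ = np.linalg.qr(a)

d = q @ np.diag(occupations) @ q.conj().T

assert np.allclose(d, d.conj().T)                # Hermitian
evals = np.linalg.eigvalsh(d)
assert evals.min() > -1e-9 and evals.max() < 1 + 1e-9
assert np.isclose(d.trace().real, occupations.sum())  # trace = N
print("1-RDM passes the basic n-representability checks")
```

    A marginal measured on noisy hardware will generally violate these constraints slightly; projecting it back onto the constrained set is the kind of "restoration of physicality" the abstract describes.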

  8. Fabrication of combinatorial nm-planar electrode array for high throughput evaluation of organic semiconductors

    International Nuclear Information System (INIS)

    Haemori, M.; Edura, T.; Tsutsui, K.; Itaka, K.; Wada, Y.; Koinuma, H.

    2006-01-01

    We have fabricated a combinatorial nm-planar electrode array by using photolithography and chemical mechanical polishing processes for high throughput electrical evaluation of organic devices. Sub-nm precision was achieved with respect to the average level difference between each pair of electrodes and a dielectric layer. The insulating property between the electrodes is high enough to measure I-V characteristics of organic semiconductors. Bottom-contact field-effect-transistors (FETs) of pentacene were fabricated on this electrode array by use of molecular beam epitaxy. It was demonstrated that the array could be used as a pre-patterned device substrate for high throughput screening of the electrical properties of organic semiconductors

  9. 40 CFR Table 9 to Subpart Eeee of... - Continuous Compliance With Operating Limits-High Throughput Transfer Racks

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Continuous Compliance With Operating Limits-High Throughput Transfer Racks 9 Table 9 to Subpart EEEE of Part 63 Protection of Environment...—Continuous Compliance With Operating Limits—High Throughput Transfer Racks As stated in §§ 63.2378(a) and (b...

  10. Quantum mechanics a comprehensive text for chemistry

    CERN Document Server

    Arora, Kishor

    2010-01-01

    This book contains 14 chapters. The text covers the inadequacy of classical mechanics and the basic and fundamental concepts of quantum mechanics, including translational, vibrational, rotational and electronic energies, an introduction to the concepts of angular momenta, approximate methods and their applications, concepts related to electron spin, symmetry concepts in quantum mechanics, and ultimately the theories of chemical bonding and the use of software in quantum mechanics. The text of the book is presented in a lucid manner with ample examples and illustrations wherever

  11. Chemistry with bigger, better atoms

    Indian Academy of Sciences (India)

    Pandey, Anshu

    Anshu Pandey, Solid State and Structural Chemistry Unit, Indian Institute of Science. Quantum Dots: a coarse-grained view. Quantum dot electronic structure can be approximated remarkably well as a spherical particle-in-a-box problem. The concept of stoichiometry still holds. Rekha M. et. al.

  12. Meta-Analysis of High-Throughput Datasets Reveals Cellular Responses Following Hemorrhagic Fever Virus Infection

    Directory of Open Access Journals (Sweden)

    Gavin C. Bowick

    2011-05-01

    The continuing use of high-throughput assays to investigate cellular responses to infection is providing a large repository of information. Due to the large number of differentially expressed transcripts, often running into the thousands, the majority of these data have not been thoroughly investigated. Advances in techniques for the downstream analysis of high-throughput datasets are providing new methods for generating hypotheses for further investigation. The large number of experimental observations, combined with databases that correlate particular genes and proteins with canonical pathways, functions and diseases, allows for the bioinformatic exploration of functional networks that may be implicated in replication or pathogenesis. Herein, we provide an example of how analysis of published high-throughput datasets of cellular responses to hemorrhagic fever virus infection can generate additional functional data. We describe enrichment of genes involved in metabolism, post-translational modification and cardiac damage; potential roles for specific transcription factors; and a conserved involvement of a pathway based around cyclooxygenase-2. We believe that these types of analyses can provide virologists with additional hypotheses for continued investigation.

  13. Psi4NumPy: An Interactive Quantum Chemistry Programming Environment for Reference Implementations and Rapid Development.

    Science.gov (United States)

    Smith, Daniel G A; Burns, Lori A; Sirianni, Dominic A; Nascimento, Daniel R; Kumar, Ashutosh; James, Andrew M; Schriber, Jeffrey B; Zhang, Tianyuan; Zhang, Boyi; Abbott, Adam S; Berquist, Eric J; Lechner, Marvin H; Cunha, Leonardo A; Heide, Alexander G; Waldrop, Jonathan M; Takeshita, Tyler Y; Alenaizan, Asem; Neuhauser, Daniel; King, Rollin A; Simmonett, Andrew C; Turney, Justin M; Schaefer, Henry F; Evangelista, Francesco A; DePrince, A Eugene; Crawford, T Daniel; Patkowski, Konrad; Sherrill, C David

    2018-06-11

    Psi4NumPy demonstrates the use of efficient computational kernels from the open-source Psi4 program through the popular NumPy library for linear algebra in Python to facilitate the rapid development of clear, understandable Python computer code for new quantum chemical methods, while maintaining a relatively low execution time. Using these tools, reference implementations have been created for a number of methods, including self-consistent field (SCF), SCF response, many-body perturbation theory, coupled-cluster theory, configuration interaction, and symmetry-adapted perturbation theory. Furthermore, several reference codes have been integrated into Jupyter notebooks, allowing background, underlying theory, and formula information to be associated with the implementation. Psi4NumPy tools and associated reference implementations can lower the barrier for future development of quantum chemistry methods. These implementations also demonstrate the power of the hybrid C++/Python programming approach employed by the Psi4 program.
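    The "clear reference implementation" style described above can be suggested with plain NumPy alone. The following sketch is not Psi4NumPy code and does not call Psi4; it only illustrates the build-a-Hamiltonian-then-diagonalize pattern on a Hückel model of butadiene (alpha = 0, beta = -1, in units of |beta|):

```python
import numpy as np

# Illustration of the NumPy "reference implementation" style: a Hueckel
# pi-electron model of butadiene, solved by a single diagonalisation
# (the analogue of one SCF Fock-matrix diagonalisation step).

n = 4  # four conjugated carbon atoms
alpha, beta = 0.0, -1.0

# Hueckel Hamiltonian: alpha on the diagonal, beta between bonded neighbours.
H = alpha * np.eye(n)
for i in range(n - 1):
    H[i, i + 1] = H[i + 1, i] = beta

energies, orbitals = np.linalg.eigh(H)   # ascending orbital energies

# Two electrons in each of the two lowest pi orbitals:
total_pi_energy = 2.0 * energies[:2].sum()
print(energies)          # [-1.618..., -0.618..., 0.618..., 1.618...]
print(total_pi_energy)   # about -4.472 (i.e. -2*sqrt(5))
```

    The appeal of this style, as the abstract notes, is that the code is short enough to read next to the equations it implements; Psi4NumPy applies it to real integrals supplied by Psi4's C++ kernels.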

  14. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have developed a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
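    The orientation-scoring step described above can be sketched in a few lines. This is an illustration only; the angular tolerance and the scoring rule are assumptions, not the authors' published pipeline:

```python
import numpy as np

# Sketch: once each fish's heading has been extracted from video,
# rheotaxis can be scored as the fraction of fish oriented
# head-to-current, i.e. within some angular tolerance of upstream.

def rheotaxis_index(headings_deg, upstream_deg=0.0, tolerance_deg=45.0):
    """Fraction of fish whose heading lies within +/-tolerance of upstream."""
    diff = np.abs((np.asarray(headings_deg, dtype=float)
                   - upstream_deg + 180.0) % 360.0 - 180.0)
    return float(np.mean(diff <= tolerance_deg))

healthy = [3, -10, 15, 2, 350, 20, 5, -30]      # mostly facing upstream
exposed = [3, 120, 200, -90, 160, 75, 5, 260]   # disoriented after ototoxin

print(rheotaxis_index(healthy))   # 1.0
print(rheotaxis_index(exposed))   # 0.25
```

    The modular wrap-around in `diff` ensures that a heading of 350 degrees counts as 10 degrees away from an upstream direction of 0 degrees, which matters for any circular quantity.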

  15. Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice

    Science.gov (United States)

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong

    2014-01-01

    Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980

  16. Effects of quantum chemistry models for bound electrons on positron annihilation spectra for atoms and small molecules

    International Nuclear Information System (INIS)

    Wang Feng; Ma Xiaoguang; Selvam, Lalitha; Gribakin, Gleb; Surko, Clifford M

    2012-01-01

    The Doppler-shift spectra of the γ-rays from positron annihilation in molecules were determined by using the momentum distribution of the annihilating electron–positron pair. The effect of the positron wavefunction on the spectra was analysed in a recent paper (Green et al 2012 New J. Phys. 14 035021). In this companion paper, we focus on the dominant contribution to the spectra, which arises from the momenta of the bound electrons. In particular, we use computational quantum chemistry models (Hartree–Fock with two basis sets and density functional theory (DFT)) to calculate the wavefunctions of the bound electrons. Numerical results are presented for noble gases and small molecules such as H₂, N₂, O₂, CH₄ and CF₄. The calculations reveal relatively small effects on the Doppler-shift spectra from the level of inclusion of electron correlation energy in the models. For atoms, the difference in the full-width at half-maximum of the spectra obtained using the Hartree–Fock and DFT models does not exceed 2%. For molecules the difference can be much larger, reaching 8% for some molecular orbitals. These results indicate that the predicted positron annihilation spectra for molecules are generally more sensitive to the inclusion of electron correlation energies in the quantum chemistry model than the spectra for atoms are. (paper)
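    The FWHM comparison described above can be reproduced in miniature. The sketch below uses invented line shapes (two Gaussians differing by 2% in width, standing in for the Hartree-Fock and DFT spectra), not data from the paper:

```python
import numpy as np

# Measure the full-width at half-maximum (FWHM) of a sampled line shape
# numerically, then compare two model spectra of slightly different width.

def fwhm(x, y):
    """Numeric FWHM by linear interpolation of the half-maximum crossings."""
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i, j = above[0], above[-1]
    # Interpolate the crossing on each flank (np.interp needs increasing xp).
    xl = np.interp(half, [y[i - 1], y[i]], [x[i - 1], x[i]])
    xr = np.interp(half, [y[j + 1], y[j]], [x[j + 1], x[j]])
    return xr - xl

x = np.linspace(-10.0, 10.0, 2001)
g = lambda sigma: np.exp(-x**2 / (2.0 * sigma**2))

w_a = fwhm(x, g(2.00))   # "model A" spectrum
w_b = fwhm(x, g(2.04))   # "model B", 2% wider
print(abs(w_b - w_a) / w_a)  # about 0.02, i.e. a 2% FWHM difference
```

    For a Gaussian the analytic FWHM is 2*sqrt(2 ln 2)*sigma, so the numeric routine can be validated against a known answer before being applied to measured spectra.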

  17. Ethoscopes: An open platform for high-throughput ethomics.

    Directory of Open Access Journals (Sweden)

    Quentin Geissmann

    2017-10-01

    Here, we present ethoscopes, machines for the high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform real-time tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  18. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for handling and reducing Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which helps avoid the data glut associated with rapid, high-sample-throughput analysis.

  19. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, Jacqueline Alexandra [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for handling and reducing Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which helps avoid the data glut associated with rapid, high-sample-throughput analysis.

  20. Correction of Microplate Data from High-Throughput Screening.

    Science.gov (United States)

    Wang, Yuhong; Huang, Ruili

    2016-01-01

    High-throughput screening (HTS) makes it possible to collect cellular response data from a large number of cell lines and small molecules in a timely and cost-effective manner. The errors and noise in the microplate-formatted data from HTS have unique characteristics, and they can generally be grouped into three categories: run-wise (temporal, multiple plates), plate-wise (background pattern, single plate), and well-wise (single well). In this chapter, we describe a systematic solution for identifying and correcting such errors and noise, based mainly on pattern recognition and digital signal processing technologies.
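    One widely used plate-wise correction is Tukey's median polish, which strips smooth row and column background patterns from a single plate so that genuine well-wise hits stand out in the residuals. The sketch below illustrates that general idea; it is not the chapter's exact algorithm, and the plate layout and spike are invented:

```python
import numpy as np

# Plate-wise background correction via iterated row/column median
# subtraction (Tukey's median polish) on a simulated 96-well plate.

def median_polish(plate, n_iter=10):
    r = plate.astype(float).copy()
    for _ in range(n_iter):
        r -= np.median(r, axis=1, keepdims=True)  # remove row effects
        r -= np.median(r, axis=0, keepdims=True)  # remove column effects
    return r

rng = np.random.default_rng(1)
rows = np.arange(8)[:, None] * 0.5           # row gradient (edge effect)
cols = np.arange(12)[None, :] * 0.2          # column gradient
plate = 100.0 + rows + cols + rng.normal(0, 0.1, (8, 12))
plate[3, 7] += 25.0                          # one genuine "hit"

residuals = median_polish(plate)
hit = np.unravel_index(np.abs(residuals).argmax(), residuals.shape)
print(hit)  # (3, 7): the spiked well dominates after background removal
```

    Because medians are robust to a single outlier per row or column, the hit itself barely perturbs the estimated background, which is why median-based corrections are favored over mean-based ones for screening data.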

  1. Amine-derived synthetic approach to color-tunable InP/ZnS quantum dots with high fluorescent qualities

    International Nuclear Information System (INIS)

    Song, Woo-Seuk; Lee, Hye-Seung; Lee, Ju Chul; Jang, Dong Seon; Choi, Yoonyoung; Choi, Moongoo; Yang, Heesun

    2013-01-01

    High-quality, Cd-free InP quantum dots (QDs) have conventionally been synthesized by exclusively selecting tris(trimethylsilyl)phosphine (P(TMS)₃) as the phosphorus (P) precursor, which is problematic from the standpoint of green and economic chemistry. Thus, other synthetic chemistries adopting alternative P sources to P(TMS)₃ have been introduced; however, they could not guarantee the production of satisfactorily fluorescence-efficient, color-pure InP QDs. In this study, the unprecedented controlled synthesis of a series of band-gap-tuned InP QDs is demonstrated through hot-injection of a far safer and cheaper tris(dimethylamino)phosphine in the presence of a key coordinating solvent, oleylamine, that enables successful QD nucleation/growth. The effects of the co-existence of a Zn additive, the core growth temperature, and the amount of P source injected on the growth behavior of the InP QDs are investigated. After ZnS overcoating by a successive injection of 1-dodecanethiol only, high-fluorescence-quality, green-to-red emission-tunable core/shell InP/ZnS QDs are obtained. The fluorescent characteristics of the different color-emitting QDs desirably exhibit little fluctuation in quantum yield and emission bandwidth, ranging 51–53 % and 60–64 nm, respectively. Lastly, the utility of introducing a secondary shelling process in rendering the QDs brighter and more photostable is also demonstrated.

  2. Amine-derived synthetic approach to color-tunable InP/ZnS quantum dots with high fluorescent qualities

    Science.gov (United States)

    Song, Woo-Seuk; Lee, Hye-Seung; Lee, Ju Chul; Jang, Dong Seon; Choi, Yoonyoung; Choi, Moongoo; Yang, Heesun

    2013-06-01

    High-quality, Cd-free InP quantum dots (QDs) have conventionally been synthesized by exclusively selecting tris(trimethylsilyl)phosphine (P(TMS)₃) as the phosphorus (P) precursor, which is problematic from the standpoint of green and economic chemistry. Thus, other synthetic chemistries adopting alternative P sources to P(TMS)₃ have been introduced; however, they could not guarantee the production of satisfactorily fluorescence-efficient, color-pure InP QDs. In this study, the unprecedented controlled synthesis of a series of band-gap-tuned InP QDs is demonstrated through hot-injection of a far safer and cheaper tris(dimethylamino)phosphine in the presence of a key coordinating solvent, oleylamine, that enables successful QD nucleation/growth. The effects of the co-existence of a Zn additive, the core growth temperature, and the amount of P source injected on the growth behavior of the InP QDs are investigated. After ZnS overcoating by a successive injection of 1-dodecanethiol only, high-fluorescence-quality, green-to-red emission-tunable core/shell InP/ZnS QDs are obtained. The fluorescent characteristics of the different color-emitting QDs desirably exhibit little fluctuation in quantum yield and emission bandwidth, ranging 51-53 % and 60-64 nm, respectively. Lastly, the utility of introducing a secondary shelling process in rendering the QDs brighter and more photostable is also demonstrated.

  3. Amine-derived synthetic approach to color-tunable InP/ZnS quantum dots with high fluorescent qualities

    Energy Technology Data Exchange (ETDEWEB)

    Song, Woo-Seuk; Lee, Hye-Seung [Hongik University, Department of Materials Science and Engineering (Korea, Republic of); Lee, Ju Chul; Jang, Dong Seon; Choi, Yoonyoung; Choi, Moongoo [LGE Advanced Research Institute, LG Electronics, Materials and Devices Laboratory (Korea, Republic of); Yang, Heesun, E-mail: hyang@hongik.ac.kr [Hongik University, Department of Materials Science and Engineering (Korea, Republic of)

    2013-06-15

    High-quality, Cd-free InP quantum dots (QDs) have conventionally been synthesized by exclusively selecting tris(trimethylsilyl)phosphine (P(TMS)₃) as the phosphorus (P) precursor, which is problematic from the standpoint of green and economic chemistry. Thus, other synthetic chemistries adopting alternative P sources to P(TMS)₃ have been introduced; however, they could not guarantee the production of satisfactorily fluorescence-efficient, color-pure InP QDs. In this study, the unprecedented controlled synthesis of a series of band-gap-tuned InP QDs is demonstrated through hot-injection of a far safer and cheaper tris(dimethylamino)phosphine in the presence of a key coordinating solvent, oleylamine, that enables successful QD nucleation/growth. The effects of the co-existence of a Zn additive, the core growth temperature, and the amount of P source injected on the growth behavior of the InP QDs are investigated. After ZnS overcoating by a successive injection of 1-dodecanethiol only, high-fluorescence-quality, green-to-red emission-tunable core/shell InP/ZnS QDs are obtained. The fluorescent characteristics of the different color-emitting QDs desirably exhibit little fluctuation in quantum yield and emission bandwidth, ranging 51-53 % and 60-64 nm, respectively. Lastly, the utility of introducing a secondary shelling process in rendering the QDs brighter and more photostable is also demonstrated.

  4. Modern electronic structure theory and applications in organic chemistry

    CERN Document Server

    Davidson, ER

    1997-01-01

    This volume focuses on the use of quantum theory to understand and explain experiments in organic chemistry. High level ab initio calculations, when properly performed, are useful in making quantitative distinctions between various possible interpretations of structures, reactions and spectra. Chemical reasoning based on simpler quantum models is, however, essential to enumerating the likely possibilities. The simpler models also often suggest the type of wave function likely to be involved in ground and excited states at various points along reaction paths. This preliminary understanding is n

  5. High-Throughput Network Communication with NetIO

    CERN Document Server

    Schumacher, Jörn; The ATLAS collaboration; Vandelli, Wainer

    2016-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS target the HPC community exclusively and are not well suited to DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs (and this has been done), but it requires non-negligible effort and expert knowledge. On the other hand, message services like 0MQ have gained popularity in the HEP community. Such APIs allow building distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on to...

  6. High-Throughput Printing Process for Flexible Electronics

    Science.gov (United States)

    Hyun, Woo Jin

    Printed electronics is an emerging field for manufacturing electronic devices with low cost and minimal material waste for a variety of applications including displays, distributed sensing, smart packaging, and energy management. Moreover, its compatibility with roll-to-roll production formats and flexible substrates is desirable for continuous, high-throughput production of flexible electronics. Despite the promise, however, the roll-to-roll production of printed electronics is quite challenging due to web movement hindering accurate ink registration and high-fidelity printing. In this talk, I will present a promising strategy for roll-to-roll production using a novel printing process that we term SCALE (Self-aligned Capillarity-Assisted Lithography for Electronics). By utilizing capillarity of liquid inks on nano/micro-structured substrates, the SCALE process facilitates high-resolution and self-aligned patterning of electrically functional inks with greatly improved printing tolerance. I will show the fabrication of key building blocks (e.g. transistor, resistor, capacitor) for electronic circuits using the SCALE process on plastics.

  7. Chemistry of high-energy materials. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Klapoetke, Thomas M. [Munich Univ. (Germany). Chair of Inorganic Chemistry; Maryland Univ., College Park, MD (United States). Center of Energetic Concepts Development (CECD)

    2012-07-01

    This graduate-level textbook treats the basic chemistry of high-energy materials - primary and secondary explosives, propellants, rocket fuels and pyrotechnics - and provides a review of new research developments. Applications in both military and civil fields are discussed. The book also offers new insights into "green" chemistry requirements and strategies for military applications.

  8. Engineering two-photon high-dimensional states through quantum interference

    Science.gov (United States)

    Zhang, Yingwen; Roux, Filippus S.; Konrad, Thomas; Agnew, Megan; Leach, Jonathan; Forbes, Andrew

    2016-01-01

    Many protocols in quantum science, for example, linear optical quantum computing, require access to large-scale entangled quantum states. Such systems can be realized through many-particle qubits, but this approach often suffers from scalability problems. An alternative strategy is to consider a smaller number of particles that exist in high-dimensional states. The spatial modes of light are one such candidate that provides access to high-dimensional quantum states, and thus they increase the storage and processing potential of quantum information systems. We demonstrate the controlled engineering of two-photon high-dimensional states entangled in their orbital angular momentum through Hong-Ou-Mandel interference. We prepare a large range of high-dimensional entangled states and implement precise quantum state filtering. We characterize the full quantum state before and after the filter, and are thus able to determine that only the antisymmetric component of the initial state remains. This work paves the way for high-dimensional processing and communication of multiphoton quantum states, for example, in teleportation beyond qubits. PMID:26933685
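    The antisymmetric filtering described above has a compact linear-algebra picture. In the sketch below (the dimension and amplitudes are invented for illustration), projecting a two-photon amplitude vector with (I - SWAP)/2 on a d x d two-particle space keeps exactly the antisymmetric part of the amplitude matrix:

```python
import numpy as np

# Antisymmetric projector on a two-particle space: P = (I - SWAP)/2,
# where SWAP exchanges the two photons, SWAP|i,j> = |j,i>.

d = 3  # e.g. three orbital-angular-momentum modes
I = np.eye(d * d)
SWAP = np.zeros((d * d, d * d))
for i in range(d):
    for j in range(d):
        SWAP[i * d + j, j * d + i] = 1.0
P = (I - SWAP) / 2.0

# A generic (unnormalised) two-photon state |psi> with amplitudes c_ij:
c = np.arange(1, d * d + 1, dtype=float).reshape(d, d)
psi = c.flatten()

filtered = P @ psi
C = filtered.reshape(d, d)
print(np.allclose(C, -C.T))  # True: only the antisymmetric part survives
```

    Equivalently, the filtered amplitude matrix is (c - c^T)/2, which is why Hong-Ou-Mandel interference acts as a parity filter on the two-photon state.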

  9. High-throughput screening of tick-borne pathogens in Europe

    DEFF Research Database (Denmark)

    Michelet, Lorraine; Delannoy, Sabine; Devillers, Elodie

    2014-01-01

    was conducted on 7050 Ixodes ricinus nymphs collected from France, Denmark, and the Netherlands using a powerful new high-throughput approach. This advanced methodology permitted the simultaneous detection of 25 bacterial and 12 parasitic species (including the Borrelia, Anaplasma, Ehrlichia, Rickettsia......, Bartonella, Candidatus Neoehrlichia, Coxiella, Francisella, Babesia, and Theileria genera) across 94 samples. We successfully determined the prevalence of expected (Borrelia burgdorferi sensu lato, Anaplasma phagocytophilum, Rickettsia helvetica, Candidatus Neoehrlichia mikurensis, Babesia divergens, Babesia...

  10. High Throughput Single-cell and Multiple-cell Micro-encapsulation

    OpenAIRE

    Lagus, Todd P.; Edd, Jon F.

    2012-01-01

    Microfluidic encapsulation methods have been previously utilized to capture cells in picoliter-scale aqueous, monodisperse drops, providing confinement from a bulk fluid environment with applications in high throughput screening, cytometry, and mass spectrometry. We describe a method to not only encapsulate single cells, but to repeatedly capture a set number of cells (here we demonstrate one- and two-cell encapsulation) to study both isolation and the interactions between cells in groups of ...

  11. Low Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI

    Science.gov (United States)

    2017-10-01

    greater gas polarizations and production amounts/throughputs, benefiting in particular from the advent of compact, high-power, relatively low-cost ... Award Number: W81XWH-15-1-0271. TITLE: Low-Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI. DISTRIBUTION STATEMENT: Approved for Public Release; Distribution Unlimited. The views, opinions and/or findings contained in this report are those of the

  12. Design of a High-Throughput Biological Crystallography Beamline for Superconducting Wiggler

    International Nuclear Information System (INIS)

    Tseng, P.C.; Chang, C.H.; Fung, H.S.; Ma, C.I.; Huang, L.J.; Jean, Y.C.; Song, Y.F.; Huang, Y.S.; Tsang, K.L.; Chen, C.T.

    2004-01-01

    We are constructing a high-throughput biological crystallography beamline, BL13B, which utilizes the radiation generated from a 3.2 Tesla, 32-pole superconducting multipole wiggler for multi-wavelength anomalous diffraction (MAD), single-wavelength anomalous diffraction (SAD), and other related experiments. This beamline is a standard double-crystal monochromator (DCM) x-ray beamline equipped with a collimating mirror (CM) and a focusing mirror (FM). Both the CM and FM are one meter long and made of Si substrate, and the CM is side-cooled by water. Based on detailed thermal analysis, liquid nitrogen (LN2) cooling of both crystals of the DCM has been adopted to optimize the energy resolution and photon beam throughput. This beamline will deliver, through a 100 μm diameter pinhole, a photon flux of greater than 10¹¹ photons/s in the energy range from 6.5 keV to 19 keV, which is comparable to existing protein crystallography beamlines at bending magnet sources of high-energy storage rings.

  13. Large-scale DNA Barcode Library Generation for Biomolecule Identification in High-throughput Screens.

    Science.gov (United States)

    Lyons, Eli; Sheridan, Paul; Tremmel, Georg; Miyano, Satoru; Sugano, Sumio

    2017-10-24

    High-throughput screens allow for the identification of specific biomolecules with characteristics of interest. In barcoded screens, DNA barcodes are linked to target biomolecules in a manner allowing the target molecules making up a library to be identified by sequencing the DNA barcodes using Next Generation Sequencing. To be useful in experimental settings, the DNA barcodes in a library must satisfy certain constraints related to GC content, homopolymer length, Hamming distance, and blacklisted subsequences. Here we report a novel framework to quickly generate large-scale libraries of DNA barcodes for use in high-throughput screens. We show that our framework dramatically reduces the computation time required to generate large-scale DNA barcode libraries, compared with a naïve approach to DNA barcode library generation. As a proof of concept, we demonstrate that our framework is able to generate a library consisting of one million DNA barcodes for use in a fragment antibody phage display screening experiment. We also report generating a general-purpose one billion DNA barcode library, the largest such library yet reported in the literature. Our results demonstrate the value of our novel large-scale DNA barcode library generation framework for use in high-throughput screening applications.
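    A naïve generate-and-filter barcode builder fits in a few lines and makes the constraints above concrete. The thresholds and the blacklisted EcoRI site below are placeholders, not the paper's parameters, and the all-pairs Hamming check is the kind of bottleneck a dedicated framework would avoid at million-barcode scale:

```python
import itertools
import random

# Generate-and-filter DNA barcode library construction enforcing:
# GC content window, maximum homopolymer run, blacklisted subsequences,
# and a minimum pairwise Hamming distance between accepted barcodes.

def ok_barcode(b, gc=(0.4, 0.6), max_homopolymer=3, blacklist=("GAATTC",)):
    gc_frac = (b.count("G") + b.count("C")) / len(b)
    if not gc[0] <= gc_frac <= gc[1]:
        return False
    if any(len(list(run)) > max_homopolymer
           for _, run in itertools.groupby(b)):
        return False
    return not any(site in b for site in blacklist)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def build_library(n, length=12, min_dist=3, seed=0):
    rng = random.Random(seed)
    lib = []
    while len(lib) < n:
        cand = "".join(rng.choice("ACGT") for _ in range(length))
        # The all-pairs distance check makes this O(n^2) overall.
        if ok_barcode(cand) and all(hamming(cand, b) >= min_dist for b in lib):
            lib.append(cand)
    return lib

lib = build_library(50)
print("generated", len(lib), "barcodes")
```

    The minimum Hamming distance is what gives the library error tolerance: with `min_dist=3`, any single sequencing error still leaves a read closer to its true barcode than to any other.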

  14. Silicon quantum dots: surface matters

    Czech Academy of Sciences Publication Activity Database

    Dohnalová, K.; Gregorkiewicz, T.; Kůsová, Kateřina

    2014-01-01

    Roč. 26, č. 17 (2014), 1-28 ISSN 0953-8984 R&D Projects: GA ČR GPP204/12/P235 Institutional support: RVO:68378271 Keywords : silicon quantum dots * quantum dot * surface chemistry * quantum confinement Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 2.346, year: 2014

  15. Conflicts in Chemistry: The Case of Plastics, a Role-Playing Game for High School Chemistry Students

    Science.gov (United States)

    Cook, Deborah H.

    2014-01-01

    Conflicts in Chemistry: The Case of Plastics, an innovative role-playing activity for high school students, was developed by the Chemical Heritage Foundation to promote increased public understanding of chemistry. The pilot program included three high school teachers and their students at three different schools and documented implementation and…

  16. Teaching Introductory Quantum Physics and Chemistry: Caveats from the History of Science and Science Teaching to the Training of Modern Chemists

    Science.gov (United States)

    Greca, Ileana M.; Freire, Olival, Jr.

    2014-01-01

    Finding the best ways to introduce quantum physics to undergraduate students in all scientific areas, in particular for chemistry students, is a pressing, but hardly a simple task. In this paper, we discuss the relevance of taking into account lessons from the history of the discipline and the ongoing controversy over its interpretations and…

  17. Quantum mechanics

    International Nuclear Information System (INIS)

    Basdevant, J.L.; Dalibard, J.; Joffre, M.

    2008-01-01

    All physics is quantum, from elementary particles to stars and the big bang, via semiconductors and chemistry. The theory is very subtle and cannot be explained without the help of mathematical tools. This book presents the principles of quantum mechanics and describes its mathematical formalism (wave function, Schroedinger equation, quantum operators, spin, Hamiltonians, collisions, ...). Numerous applications are found in the fields of new technologies (masers, quantum computers, cryptography, ...) and in astrophysics. A series of about 90 exercises with their answers is included. This book is based on a physics course at the graduate level. (A.C.)

  18. Quantum physics for beginners

    CERN Document Server

    Ficek, Zbigniew

    2016-01-01

    The textbook introduces students to the main ideas of quantum physics and the basic mathematical methods and techniques used in the fields of advanced quantum physics, atomic physics, laser physics, nanotechnology, quantum chemistry, and theoretical mathematics. It explains how microscopic objects (particles) behave in unusual ways, giving rise to what are called quantum effects. It contains a wide range of tutorial problems, from simple confidence-builders to fairly challenging exercises, that provide adequate understanding of the basic concepts of quantum physics.

  19. Design of efficient molecular organic light-emitting diodes by a high-throughput virtual screening and experimental approach

    Science.gov (United States)

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Duvenaud, David; MacLaurin, Dougal; Blood-Forsythe, Martin A.; Chae, Hyun Sik; Einzinger, Markus; Ha, Dong-Gwang; Wu, Tony; Markopoulos, Georgios; Jeon, Soonok; Kang, Hosuk; Miyazaki, Hiroshi; Numata, Masaki; Kim, Sunghan; Huang, Wenliang; Hong, Seong Ik; Baldo, Marc; Adams, Ryan P.; Aspuru-Guzik, Alán

    2016-10-01

    Virtual screening is becoming a ground-breaking tool for molecular discovery due to the exponential growth of available computer time and constant improvement of simulation and machine learning techniques. We report an integrated organic functional material design process that incorporates theoretical insight, quantum chemistry, cheminformatics, machine learning, industrial expertise, organic synthesis, molecular characterization, device fabrication and optoelectronic testing. After exploring a search space of 1.6 million molecules and screening over 400,000 of them using time-dependent density functional theory, we identified thousands of promising novel organic light-emitting diode molecules across the visible spectrum. Our team collaboratively selected the best candidates from this set. The experimentally determined external quantum efficiencies for these synthesized candidates were as large as 22%.

  20. Laboratory study of nitrate photolysis in Antarctic snow. I. Observed quantum yield, domain of photolysis, and secondary chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Meusinger, Carl; Johnson, Matthew S. [Department of Chemistry, University of Copenhagen, Copenhagen (Denmark); Berhanu, Tesfaye A.; Erbland, Joseph; Savarino, Joel, E-mail: jsavarino@lgge.obs.ujf-grenoble.fr [Univ. Grenoble Alpes, LGGE, F-38000 Grenoble (France); CNRS, LGGE, F-38000 Grenoble (France)

    2014-06-28

    Post-depositional processes alter nitrate concentration and nitrate isotopic composition in the top layers of snow at sites with low snow accumulation rates, such as Dome C, Antarctica. Available nitrate ice core records can provide input for studying past atmospheres and climate if such processes are understood. It has been shown that photolysis of nitrate in the snowpack plays a major role in nitrate loss and that the photolysis products have a significant influence on the local troposphere as well as on other species in the snow. Reported quantum yields for the main reaction span orders of magnitude, apparently as a result of whether nitrate is located at the air-ice interface or in the ice matrix, and constitute the largest uncertainty in models of snowpack NOx emissions. Here, a laboratory study is presented that uses snow from Dome C and minimizes effects of desorption and recombination by flushing the snow during irradiation with UV light. A selection of UV filters allowed examination of the effects of the 200 and 305 nm absorption bands of nitrate. Nitrate concentration and photon flux were measured in the snow. The quantum yield for loss of nitrate was observed to decrease from 0.44 to 0.003 within what corresponds to days of UV exposure in Antarctica. The superposition of photolysis in two photochemical domains of nitrate in snow is proposed: one of photolabile nitrate, and one of buried nitrate. The difference lies in the ability of reaction products to escape the snow crystal, versus undergoing secondary (recombination) chemistry. Modeled NOx emissions may increase significantly above measured values due to the quantum yield observed in this study. The apparent quantum yield in the 200 nm band was found to be ∼1%, much lower than reported for aqueous chemistry. A companion paper presents an analysis of the change in isotopic composition of snowpack nitrate based on the same samples as in this study.

  1. High-Capacity Quantum Secure Direct Communication Based on Quantum Hyperdense Coding with Hyperentanglement

    International Nuclear Information System (INIS)

    Wang Tie-Jun; Li Tao; Du Fang-Fang; Deng Fu-Guo

    2011-01-01

    We first present a quantum hyperdense coding protocol that exploits hyperentanglement in the polarization and spatial-mode degrees of freedom of photons, and then give the details of a quantum secure direct communication (QSDC) protocol based on it. This QSDC protocol has the advantage of a higher capacity than quantum communication protocols based on a qubit system. Compared with the QSDC protocol based on superdense coding with d-dimensional systems, this QSDC protocol is more feasible, as the preparation of a high-dimensional quantum system is more difficult than that of a two-level quantum system at present. (general)
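
    As a back-of-envelope check of the capacity claim: dense coding over a maximally entangled pair of d-dimensional systems conveys 2·log2(d) classical bits per particle sent, and hyperentanglement in several degrees of freedom multiplies the dimensions. A minimal sketch (the function name is ours, not from the paper):

```python
import math

def dense_coding_capacity(dims):
    """Classical bits conveyed by sending one particle of a maximally
    entangled pair, where each entangled degree of freedom has
    dimension d: capacity = 2 * sum(log2(d))."""
    return 2 * sum(math.log2(d) for d in dims)

print(dense_coding_capacity([2]))     # ordinary qubit dense coding: 2.0 bits
print(dense_coding_capacity([2, 2]))  # polarization + spatial mode: 4.0 bits
```

    Doubling the number of two-level degrees of freedom thus doubles the bits carried per photon, which is the capacity advantage the protocol exploits.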

  2. High-throughput investigation of polymerization kinetics by online monitoring of GPC and GC

    NARCIS (Netherlands)

    Hoogenboom, R.; Fijten, M.W.M.; Abeln, C.H.; Schubert, U.S.

    2004-01-01

    Gel permeation chromatography (GPC) and gas chromatography (GC) were successfully introduced into a high-throughput workflow. The feasibility and limitations of online GPC with a high-speed column were evaluated by measuring polystyrene standards and comparing the results with regular offline GPC

  3. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  4. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analy...

  5. Applications of high-throughput sequencing to chromatin structure and function in mammals

    OpenAIRE

    Dunham, Ian

    2009-01-01

    High-throughput DNA sequencing approaches have enabled direct interrogation of chromatin samples from mammalian cells. We are beginning to develop a genome-wide description of nuclear function during development, but further data collection, refinement, and integration are needed.

  6. Low-Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI

    Science.gov (United States)

    2017-10-01

    Award Number: W81XWH-15-1-0272. Title: Low-Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI. Distribution statement: Approved for Public Release; Distribution Unlimited. The views, opinions and/or findings contained in this report are those of the author(s). Low cost and high throughput were key elements proposed for this project, which we believe will be of significant benefit to the patients suffering...

  7. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format

    OpenAIRE

    Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E

    2011-01-01

    The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated p...

  8. From Classical to High Throughput Screening Methods for Feruloyl Esterases: A Review.

    Science.gov (United States)

    Ramírez-Velasco, Lorena; Armendáriz-Ruiz, Mariana; Rodríguez-González, Jorge Alberto; Müller-Santos, Marcelo; Asaff-Torres, Ali; Mateos-Díaz, Juan Carlos

    2016-01-01

    Feruloyl esterases (FAEs) are a diverse group of hydrolases, widely distributed in plants and microorganisms, which catalyze the cleavage and formation of ester bonds between plant cell wall polysaccharides and phenolic acids. FAEs have gained importance in the biofuel, medicine and food industries due to their capability of acting on a large range of substrates, cleaving ester bonds and synthesizing high added-value molecules through esterification and transesterification reactions. During the past two decades extensive studies have been carried out on the production, characterization and classification of FAEs; however, few suitable High Throughput Screening assays for this kind of enzyme have been reported. This review offers a concise but complete survey of classical to High Throughput Screening methods for FAEs, highlighting their advantages and disadvantages, and finally suggesting future perspectives for this important research field.

  9. Towards realising high-speed large-bandwidth quantum memory

    Institute of Scientific and Technical Information of China (English)

    SHI BaoSen; DING DongSheng

    2016-01-01

    Indispensable for quantum communication and quantum computation, quantum memory executes on-demand storage and retrieval of quantum states, such as those of a single photon, an entangled pair, or squeezed states. Among the various forms of quantum memory, Raman quantum memory has advantages for its broadband and high-speed characteristics, which give it huge potential for applications in quantum networks and quantum computation. However, realising Raman quantum memory with true single photons and photonic entanglement is challenging. In this review, after briefly introducing the main benchmarks in the development of quantum memory and describing the state of the art, we focus on our recent experimental progress in quantum memory storage of quantum states using the Raman scheme.

  10. Advances in high temperature chemistry

    CERN Document Server

    Eyring, Leroy

    1969-01-01

    Advances in High Temperature Chemistry, Volume 2 covers the advances in the knowledge of the high temperature behavior of materials and the complex and unfamiliar characteristics of matter at high temperature. The book discusses the dissociation energies and free energy functions of gaseous monoxides; the matrix-isolation technique applied to high temperature molecules; and the main features, the techniques for the production, detection, and diagnosis, and the applications of molecular beams in high temperatures. The text also describes the chemical research in streaming thermal plasmas, as w

  11. Writing Chemistry Jingles as an Introductory Activity in a High School Chemistry Class

    Science.gov (United States)

    Heid, Peter F.

    2011-01-01

    Starting the school year in an introductory high school chemistry class can be a challenge. The topic and approach is new to the students; many of the early chapters in the texts can be a bit tedious; and for many students the activities are uninspiring. My goal in the first few weeks of school is to hook the students on chemistry by getting them…

  12. AELAS: Automatic ELAStic property derivations via high-throughput first-principles computation

    Science.gov (United States)

    Zhang, S. H.; Zhang, R. F.

    2017-11-01

    The elastic properties of crystalline materials are fundamental and important, as they relate to other mechanical properties, various thermodynamic quantities, and some critical physical properties. However, a complete set of experimentally determined elastic properties is available for only a small subset of known materials, and an automatic scheme for the derivation of elastic properties that is adapted to high-throughput computation is in high demand. In this paper, we present the AELAS code, an automated program for calculating second-order elastic constants of both two-dimensional and three-dimensional single-crystal materials with any symmetry, designed mainly for high-throughput first-principles computation. Derivations of other general elastic properties, such as the Young's, bulk and shear moduli and Poisson's ratio of polycrystalline materials, the Pugh ratio, the Cauchy pressure, elastic anisotropy and the elastic stability criterion, are also implemented in this code. The implementation has been critically validated by extensive evaluations and tests on a broad class of two-dimensional and three-dimensional materials, demonstrating its efficiency and capability for high-throughput screening of materials with targeted mechanical properties. Program Files doi:http://dx.doi.org/10.17632/f8fwg4j9tw.1 Licensing provisions: BSD 3-Clause Programming language: Fortran Nature of problem: To automate the calculation of second-order elastic constants and the derivation of other elastic properties for two-dimensional and three-dimensional materials with any symmetry via high-throughput first-principles computation. Solution method: The space-group number is first determined by the SPGLIB code [1] and the structure is then redefined to a unit cell in IEEE format [2]. Secondly, based on the determined space-group number, a set of distortion modes is automatically specified and the distorted structure files are generated
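
    The polycrystal derivations mentioned (Young's, bulk and shear moduli, Poisson's ratio, Pugh ratio, Cauchy pressure, stability) follow standard Voigt-Reuss-Hill relations once the single-crystal elastic constants are known. A minimal sketch for the cubic case, using textbook formulas rather than the AELAS source itself, with diamond-like illustrative inputs:

```python
def cubic_polycrystal_props(C11, C12, C44):
    """Voigt-Reuss-Hill averages for a cubic crystal (constants in GPa).
    Standard textbook relations, not AELAS's actual implementation."""
    KV = (C11 + 2 * C12) / 3.0           # Voigt bulk modulus (= Reuss for cubic)
    GV = (C11 - C12 + 3 * C44) / 5.0     # Voigt shear modulus
    GR = 5 * (C11 - C12) * C44 / (4 * C44 + 3 * (C11 - C12))  # Reuss shear
    K, G = KV, (GV + GR) / 2.0           # Hill averages
    E = 9 * K * G / (3 * K + G)          # Young's modulus
    nu = (3 * K - 2 * G) / (2 * (3 * K + G))  # Poisson's ratio
    stable = C11 - C12 > 0 and C11 + 2 * C12 > 0 and C44 > 0  # Born criteria
    return {"K": K, "G": G, "E": E, "nu": nu,
            "pugh": K / G, "cauchy": C12 - C44, "stable": stable}

# Diamond-like example values (GPa), for illustration only
props = cubic_polycrystal_props(C11=1079, C12=124, C44=578)
for name, value in props.items():
    print(name, round(value, 3) if isinstance(value, float) else value)
```

    Lower-symmetry crystals need the full 6x6 compliance/stiffness matrices, which is where an automated code earns its keep.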

  13. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy

    DEFF Research Database (Denmark)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H

    2017-01-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analy...

  14. High-throughput screening of small-molecule adsorption in MOF-74

    Science.gov (United States)

    Thonhauser, T.; Canepa, P.

    2014-03-01

    Using high-throughput screening coupled with state-of-the-art van der Waals density functional theory, we investigate the adsorption properties of four important molecules, H2, CO2, CH4, and H2O, in MOF-74-M with M = Be, Mg, Al, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Sr, Zr, Nb, Ru, Rh, Pd, La, W, Os, Ir, and Pt. We show that high-throughput techniques can aid in speeding up the development and refinement of effective materials for hydrogen storage, carbon capture, and gas separation. The exploration of the configurational adsorption space allows us to extract crucial information concerning, for example, the competition of water with CO2 for the adsorption binding sites. We find that only a few noble metals (Rh, Pd, Os, Ir, and Pt) favor the adsorption of CO2 and hence are potential candidates for effective carbon-capture materials. Our findings further reveal significant differences in the binding characteristics of H2, CO2, CH4, and H2O within the MOF structure, indicating that molecular blends can be successfully separated by these nano-porous materials. Supported by DOE DE-FG02-08ER46491.
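
    The selection criterion described, CO2 out-competing water for the open-metal binding site, reduces to a per-metal comparison of binding energies. The energies below are made-up illustrative numbers, not the paper's computed values:

```python
# Hypothetical binding energies (kJ/mol; more negative = stronger binding).
# Illustrative numbers only, standing in for DFT results.
binding = {
    "Mg": {"CO2": -40, "H2O": -75},
    "Ni": {"CO2": -38, "H2O": -60},
    "Rh": {"CO2": -55, "H2O": -45},
    "Pt": {"CO2": -52, "H2O": -40},
}

# A metal is a carbon-capture candidate when CO2 binds more strongly
# than water at the open-metal site.
candidates = sorted(m for m, e in binding.items() if e["CO2"] < e["H2O"])
print(candidates)  # → ['Pt', 'Rh']
```

    The same one-line filter generalizes to any property screened across the 25-metal series.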

  15. Quantum information and computation for chemistry

    CERN Document Server

    Kais, Sabre; Rice, Stuart A

    2014-01-01

    Examines the intersection of quantum information and chemical physics The Advances in Chemical Physics series is dedicated to reviewing new and emerging topics as well as the latest developments in traditional areas of study in the field of chemical physics. Each volume features detailed comprehensive analyses coupled with individual points of view that integrate the many disciplines of science that are needed for a full understanding of chemical physics. This volume of the series explores the latest research findings, applications, and new research paths from the quantum information science

  16. A primer on high-throughput computing for genomic selection.

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built at many institutions, such as the University of Wisconsin-Madison, which can be leveraged for genomic selection in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
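
    The batch-processing idea can be sketched in miniature: rather than evaluating traits one after another, dispatch each evaluation as an independent job and collect results as they complete. The trait names and worker function below are hypothetical stand-ins for real model-fitting jobs, and a local thread pool stands in for cluster-wide job submission:

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_trait(trait):
    # Hypothetical stand-in for one model-training job (e.g., fitting a
    # genomic prediction model for a single trait); returns a fake
    # "records processed" count.
    return trait, len(trait) * 1000

traits = ["milk_yield", "fertility", "longevity", "feed_efficiency"]

# Sequential evaluation handles one trait at a time; an HTC-style
# pipeline dispatches all traits at once and gathers results as they
# finish, raising throughput on a multi-core node or cluster.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(evaluate_trait, traits))

print(results)
```

    On a real cluster the same pattern is expressed with batch schedulers (e.g., HTCondor job submission) rather than an in-process pool, but the throughput argument is identical.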

  17. Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.

    Science.gov (United States)

    Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong

    2008-04-01

    The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.

  18. A quantum-chemical perspective into low optical-gap polymers for highly-efficient organic solar cells

    KAUST Repository

    Risko, Chad

    2011-03-15

    The recent and rapid enhancement in power conversion efficiencies of organic-based, bulk heterojunction solar cells has been a consequence of both improved materials design and better understanding of the underlying physical processes involved in photocurrent generation. In this Perspective, we first present an overview of the application of quantum-chemical techniques to study the intrinsic material properties and molecular- and nano-scale processes involved in device operation. In the second part, these quantum-chemical tools are applied to an oligomer-based study on a collection of donor-acceptor copolymers that have been used in the highest-efficiency solar cell devices reported to date. The quantum-chemical results are found to be in good agreement with the empirical data related to the electronic and optical properties. In particular, they provide insight into the natures of the electronic excitations responsible for the near-infrared/visible absorption profiles, as well as into the energetics of the low-lying singlet and triplet states. These results lead to a better understanding of the inherent differences among the materials, and highlight the usefulness of quantum chemistry as an instrument for material design. Importantly, the results also point to the need to continue the development of integrated, multiscale modeling approaches to provide a thorough understanding of the materials properties. © The Royal Society of Chemistry 2011.

  19. Novel Acoustic Loading of a Mass Spectrometer: Toward Next-Generation High-Throughput MS Screening.

    Science.gov (United States)

    Sinclair, Ian; Stearns, Rick; Pringle, Steven; Wingfield, Jonathan; Datwani, Sammy; Hall, Eric; Ghislain, Luke; Majlof, Lars; Bachman, Martin

    2016-02-01

    High-throughput, direct measurement of substrate-to-product conversion by label-free detection, without the need for engineered substrates or secondary assays, could be considered the "holy grail" of drug discovery screening. Mass spectrometry (MS) has the potential to be part of this ultimate screening solution, but is constrained by the limitations of existing MS sample introduction modes that cannot meet the throughput requirements of high-throughput screening (HTS). Here we report data from a prototype system (Echo-MS) that uses acoustic droplet ejection (ADE) to transfer femtoliter-scale droplets in a rapid, precise, and accurate fashion directly into the MS. The acoustic source can load samples into the MS from a microtiter plate at a rate of up to three samples per second. The resulting MS signal displays a very sharp attack profile and ions are detected within 50 ms of activation of the acoustic transducer. Additionally, we show that the system is capable of generating multiply charged ion species from simple peptides and large proteins. The combination of high speed and low sample volume has significant potential within not only drug discovery, but also other areas of the industry. © 2015 Society for Laboratory Automation and Screening.

  20. Novel method for the high-throughput processing of slides for the comet assay.

    Science.gov (United States)

    Karbaschi, Mahsa; Cooke, Marcus S

    2014-11-26

    Single cell gel electrophoresis (the comet assay) continues to gain popularity as a means of assessing DNA damage. However, the assay's low sample throughput and laborious sample workup procedure are limiting factors in its application. "Scoring", or individually determining DNA damage levels in 50 cells per treatment, is time-consuming, but with the advent of high-throughput scoring, the limitation is now the ability to process significant numbers of comet slides. We have developed a novel method by which multiple slides may be manipulated, and undergo electrophoresis, in batches of 25 rather than individually; importantly, it retains the use of standard microscope comet slides, which are the assay convention. This decreases assay time by 60%, and benefits from an electrophoresis tank with a substantially smaller footprint and more uniform orientation of gels during electrophoresis. Our high-throughput variant of the comet assay greatly increases the number of samples analysed and decreases assay time, the number of individual slide manipulations, reagent requirements and the risk of damage to slides. The compact nature of the electrophoresis tank is of particular benefit to laboratories where bench space is at a premium. This novel approach is a significant advance on the current comet assay procedure.

  1. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    Science.gov (United States)

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  2. Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)

    Energy Technology Data Exchange (ETDEWEB)

    Hura, Greg L.; Menon, Angeli L.; Hammel, Michal; Rambo, Robert P.; Poole II, Farris L.; Tsutakawa, Susan E.; Jenney Jr, Francis E.; Classen, Scott; Frankel, Kenneth A.; Hopkins, Robert C.; Yang, Sungjae; Scott, Joseph W.; Dillard, Bret D.; Adams, Michael W. W.; Tainer, John A.

    2009-07-20

    We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.

  3. High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials

    Science.gov (United States)

    United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High-Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...

  4. Role of Precursor-Conversion Chemistry in the Crystal-Phase Control of Catalytically Grown Colloidal Semiconductor Quantum Wires.

    Science.gov (United States)

    Wang, Fudong; Buhro, William E

    2017-12-26

    Crystal-phase control is one of the most challenging problems in nanowire growth. We demonstrate that, in the solution-phase catalyzed growth of colloidal cadmium telluride (CdTe) quantum wires (QWs), the crystal phase can be controlled by manipulating the reaction chemistry of the Cd precursors and tri-n-octylphosphine telluride (TOPTe) to favor the production of either a CdTe solute or Te, which consequently determines the composition and (liquid or solid) state of the BixCdyTez catalyst nanoparticles. Growth of single-phase (e.g., wurtzite) QWs is achieved only from solid catalysts (y ≪ z) that enable the solution-solid-solid growth of the QWs, whereas the liquid catalysts (y ≈ z) fulfill the solution-liquid-solid growth of the polytypic QWs. Factors that affect the precursor-conversion chemistry are systematically investigated and correlated with a kinetic study of the composition and state of the catalyst nanoparticles to understand the mechanism. This work reveals the role of the precursor-reaction chemistry in the crystal-phase control of catalytically grown colloidal QWs, opening the possibility of growing phase-pure QWs of other compositions.

  5. Recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers

    DEFF Research Database (Denmark)

    Andersen, Lars Dyrskjøt; Zieger, Karsten; Ørntoft, Torben Falck

    2007-01-01

    Bladder cancer is the fifth most common neoplasm in industrialized countries. Due to frequent recurrences of the superficial form of this disease, bladder cancer ranks as one of the most common cancers. Despite the description of a large number of tumor markers for bladder cancers, none have individually contributed to the management of the disease. However, the development of high-throughput techniques for simultaneous assessment of a large number of markers has allowed classification of tumors into clinically relevant molecular subgroups beyond those possible by pathological classification. Here, we review the recent advances in high-throughput molecular marker identification for superficial and invasive bladder cancers.

  6. High-throughput screening of ionic conductivity in polymer membranes

    International Nuclear Information System (INIS)

    Zapata, Pedro; Basak, Pratyay; Carson Meredith, J.

    2009-01-01

    Combinatorial and high-throughput techniques have been successfully used for efficient and rapid property screening in multiple fields. The use of these techniques can be an advantageous new approach to assay ionic conductivity and accelerate the development of novel materials in research areas such as fuel cells. A high-throughput ionic conductivity (HTC) apparatus is described and applied to screening candidate polymer electrolyte membranes for fuel cell applications. The device uses a miniature four-point probe for rapid, automated point-to-point AC electrochemical impedance measurements in both liquid and humid air environments. The conductivity of Nafion 112 HTC validation standards was within 1.8% of the manufacturer's specification. HTC screening of 40 novel Kynar poly(vinylidene fluoride) (PVDF)/acrylic polyelectrolyte (PE) membranes focused on varying the Kynar type (five types) and PE composition (eight compositions) using reduced sample sizes. Two factors were found to be significant in determining the proton conducting capacity: (1) Kynar PVDF series: membranes containing a particular Kynar PVDF type exhibited mean conductivity statistically identical to that of membranes containing different Kynar PVDF types from the same series or family. (2) Maximum effective amount of polyelectrolyte: increments in polyelectrolyte content from 55 wt% to 60 wt% showed no statistically significant effect in increasing conductivity. In fact, some membranes experienced a reduction in conductivity.
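
    The point-to-point impedance measurements convert to conductivity through the probe and sample geometry, sigma = L / (R * W * t); a minimal sketch with hypothetical dimensions and resistance (not values from the study):

```python
def membrane_conductivity(resistance_ohm, probe_spacing_cm, width_cm, thickness_cm):
    """In-plane ionic conductivity (S/cm) from a four-point probe
    impedance measurement: sigma = L / (R * A), where L is the spacing
    between the inner voltage probes and A = width * thickness is the
    cross-sectional area of the membrane strip."""
    area = width_cm * thickness_cm
    return probe_spacing_cm / (resistance_ohm * area)

# Hypothetical Nafion-like strip: 2 kOhm resistance, 0.425 cm probe spacing,
# 0.5 cm wide, 51 um thick -> ~0.083 S/cm, a plausible hydrated value.
sigma = membrane_conductivity(resistance_ohm=2000.0, probe_spacing_cm=0.425,
                              width_cm=0.5, thickness_cm=0.0051)
```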

  7. Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice

    OpenAIRE

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong

    2014-01-01

    Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gen...

  8. Creation of a small high-throughput screening facility.

    Science.gov (United States)

    Flak, Tod

    2009-01-01

    The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.

  9. Trapping shape-controlled nanoparticle nucleation and growth stages via continuous-flow chemistry.

    Science.gov (United States)

    LaGrow, Alec P; Besong, Tabot M D; AlYami, Noktan M; Katsiev, Khabiboulakh; Anjum, Dalaver H; Abdelkader, Ahmed; Costa, Pedro M F J; Burlakov, Victor M; Goriely, Alain; Bakr, Osman M

    2017-02-21

    Continuous flow chemistry is used to trap the nucleation and growth stages of platinum-nickel nano-octahedra with second time resolution and high throughputs to probe their properties ex situ. The growth starts from poorly crystalline particles (nucleation) at 5 seconds, to crystalline 1.5 nm particles bounded by the {111}-facets at 7.5 seconds, followed by truncation and further growth to octahedral nanoparticles at 20 seconds.
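
    In such a flow reactor, the trapped reaction time at the sampling point is set by tubing volume over volumetric flow rate, t = V / Q; a minimal sketch with hypothetical tubing dimensions (not the authors' apparatus):

```python
import math

def residence_time_s(tube_id_mm, length_cm, flow_rate_ml_min):
    """Reaction (residence) time at the outlet of a tubular flow reactor:
    t = V / Q, with V the tubing volume from the mixing point to the
    sampling point. Varying length or flow rate selects the growth
    stage that is trapped."""
    radius_cm = tube_id_mm / 20.0                     # mm diameter -> cm radius
    volume_ml = math.pi * radius_cm ** 2 * length_cm  # 1 cm^3 == 1 mL
    return volume_ml / flow_rate_ml_min * 60.0

# Hypothetical: 1 mm ID tubing, 10 cm to the sampling point, 1 mL/min.
t = residence_time_s(tube_id_mm=1.0, length_cm=10.0, flow_rate_ml_min=1.0)
```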

  10. Trapping shape-controlled nanoparticle nucleation and growth stages via continuous-flow chemistry

    KAUST Repository

    LaGrow, Alec P.; Besong, Tabot M.D.; AlYami, Noktan; Katsiev, Khabiboulakh; Anjum, Dalaver H.; Abdelkader, Ahmed; Da Costa, Pedro M. F. J.; Burlakov, Victor M.; Goriely, Alain; Bakr, Osman

    2017-01-01

    Continuous flow chemistry is used to trap the nucleation and growth stages of platinum-nickel nano-octahedra with second time resolution and high throughputs to probe their properties ex situ. The growth starts from poorly crystalline particles (nucleation) at 5 seconds, to crystalline 1.5 nm particles bounded by the {111}-facets at 7.5 seconds, followed by truncation and further growth to octahedral nanoparticles at 20 seconds.

  12. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    Directory of Open Access Journals (Sweden)

    Marcin Słomka

    2017-11-01

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.

  14. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies.

    Science.gov (United States)

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz; Strapagiel, Dominik

    2017-11-03

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.
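
    At its simplest, the melting-curve analysis underlying HRM genotyping reduces to normalizing the fluorescence signal and locating the peak of -dF/dT; a minimal sketch on a synthetic melt curve (real HRM software does substantially more, e.g. temperature shifting and reference-curve difference plots):

```python
import numpy as np

def melt_curve_tm(temps, fluorescence):
    """Estimate the melting temperature Tm of an HRM curve as the peak
    of the negative derivative -dF/dT after min-max normalization."""
    f = np.asarray(fluorescence, dtype=float)
    f = (f - f.min()) / (f.max() - f.min())  # normalize to [0, 1]
    neg_dfdt = -np.gradient(f, temps)        # -dF/dT
    return temps[np.argmax(neg_dfdt)]

# Synthetic sigmoidal melt curve with Tm = 80 C.
temps = np.linspace(70, 90, 201)
fluor = 1.0 / (1.0 + np.exp((temps - 80.0) / 0.5))
tm = melt_curve_tm(temps, fluor)
```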

  15. Identifying Inhibitors of Inflammation: A Novel High-Throughput MALDI-TOF Screening Assay for Salt-Inducible Kinases (SIKs).

    Science.gov (United States)

    Heap, Rachel E; Hope, Anthony G; Pearson, Lesley-Anne; Reyskens, Kathleen M S E; McElroy, Stuart P; Hastie, C James; Porter, David W; Arthur, J Simon C; Gray, David W; Trost, Matthias

    2017-12-01

    Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry has become a promising alternative for high-throughput drug discovery as new instruments offer high speed, flexibility and sensitivity, and the ability to measure physiological substrates label free. Here we developed and applied high-throughput MALDI-TOF mass spectrometry to identify inhibitors of the salt-inducible kinase (SIK) family, which are interesting drug targets in the field of inflammatory disease as they control production of the anti-inflammatory cytokine interleukin-10 (IL-10) in macrophages. Using peptide substrates in in vitro kinase assays, we show that hit identification in the MALDI-TOF kinase assay correlates with that of indirect ADP-Hunter kinase assays. Moreover, we show that both techniques generate comparable IC50 data for a number of hit compounds and known inhibitors of SIK kinases. We further took these inhibitors into a fluorescence-based cellular assay using the SIK-activity-dependent translocation of CRTC3 into the nucleus, thereby providing a complete assay pipeline for the identification of SIK kinase inhibitors in vitro and in cells. Our data demonstrate that MALDI-TOF mass spectrometry is fully applicable to high-throughput kinase screening, providing label-free data comparable to that of current high-throughput fluorescence assays.
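
    IC50 values like those compared above are read off a dose-response curve at 50% residual activity; a minimal log-linear interpolation sketch on synthetic data (production analyses would typically fit a four-parameter logistic model instead):

```python
import numpy as np

def ic50_from_curve(concs, activity):
    """Estimate IC50 by log-linear interpolation of a dose-response
    curve at 50% residual kinase activity (activity normalized to 1.0
    for the no-inhibitor control; assumed monotonically decreasing)."""
    logc = np.log10(concs)
    activity = np.asarray(activity, dtype=float)
    # Find the pair of doses bracketing 50% activity.
    idx = np.searchsorted(-activity, -0.5)
    x0, x1 = logc[idx - 1], logc[idx]
    y0, y1 = activity[idx - 1], activity[idx]
    return 10 ** (x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0))

# Synthetic inhibitor with true IC50 = 1 uM and Hill slope 1.
concs = np.logspace(-3, 3, 25)          # uM
activity = 1.0 / (1.0 + concs / 1.0)
ic50 = ic50_from_curve(concs, activity)
```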

  16. High-throughput peptide mass fingerprinting and protein macroarray analysis using chemical printing strategies

    International Nuclear Information System (INIS)

    Sloane, A.J.; Duff, J.L.; Hopwood, F.G.; Wilson, N.L.; Smith, P.E.; Hill, C.J.; Packer, N.H.; Williams, K.L.; Gooley, A.A.; Cole, R.A.; Cooley, P.W.; Wallace, D.B.

    2001-01-01

    We describe a 'chemical printer' that uses piezoelectric pulsing for rapid and accurate microdispensing of picolitre volumes of fluid for proteomic analysis of 'protein macroarrays'. Unlike positive transfer and pin transfer systems, our printer dispenses fluid in a non-contact process that ensures that the fluid source cannot be contaminated by substrate during a printing event. We demonstrate automated delivery of enzyme and matrix solutions for on-membrane protein digestion and subsequent peptide mass fingerprinting (pmf) analysis directly from the membrane surface using matrix-assisted laser-desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS). This approach bypasses the more commonly used multi-step procedures, thereby permitting a more rapid procedure for protein identification. We also highlight the advantage of printing different chemistries onto an individual protein spot for multiple microscale analyses. This ability is particularly useful when detailed characterisation of rare and valuable sample is required. Using a combination of PNGase F and trypsin we have mapped sites of N-glycosylation using on-membrane digestion strategies. We also demonstrate the ability to print multiple serum samples in a micro-ELISA format and rapidly screen a protein macroarray of human blood plasma for pathogen-derived antigens. We anticipate that the 'chemical printer' will be a major component of proteomic platforms for high-throughput protein identification and characterisation with widespread applications in biomedical and diagnostic discovery

  17. Label-free detection of cellular drug responses by high-throughput bright-field imaging and machine learning.

    Science.gov (United States)

    Kobayashi, Hirofumi; Lei, Cheng; Wu, Yi; Mao, Ailin; Jiang, Yiyue; Guo, Baoshan; Ozeki, Yasuyuki; Goda, Keisuke

    2017-09-29

    In the last decade, high-content screening based on multivariate single-cell imaging has been proven effective in drug discovery to evaluate drug-induced phenotypic variations. Unfortunately, this method inherently requires fluorescent labeling which has several drawbacks. Here we present a label-free method for evaluating cellular drug responses only by high-throughput bright-field imaging with the aid of machine learning algorithms. Specifically, we performed high-throughput bright-field imaging of numerous drug-treated and -untreated cells (N = ~240,000) by optofluidic time-stretch microscopy with high throughput up to 10,000 cells/s and applied machine learning to the cell images to identify their morphological variations which are too subtle for human eyes to detect. Consequently, we achieved a high accuracy of 92% in distinguishing drug-treated and -untreated cells without the need for labeling. Furthermore, we also demonstrated that dose-dependent, drug-induced morphological change from different experiments can be inferred from the classification accuracy of a single classification model. Our work lays the groundwork for label-free drug screening in pharmaceutical science and industry.
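
    The classification step can be illustrated with any simple model over extracted morphological features; a minimal nearest-centroid sketch on synthetic feature vectors (a stand-in for the paper's machine-learning model; the feature distributions and the small mean shift are invented to mimic "subtle" drug-induced change):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "morphological feature" vectors (e.g. size, circularity, texture)
# for untreated vs drug-treated cells, with a small per-feature mean shift.
untreated = rng.normal(0.0, 1.0, size=(1000, 3))
treated = rng.normal(0.3, 1.0, size=(1000, 3))

# Train a nearest-centroid classifier on the first half of each class.
c0 = untreated[:500].mean(axis=0)
c1 = treated[:500].mean(axis=0)

def classify(x):
    """Label 1 (treated) if x is closer to the treated centroid."""
    return int(np.linalg.norm(x - c1) < np.linalg.norm(x - c0))

# Evaluate on the held-out half.
test_x = np.vstack([untreated[500:], treated[500:]])
test_y = np.array([0] * 500 + [1] * 500)
pred = np.array([classify(x) for x in test_x])
accuracy = (pred == test_y).mean()
```

With a shift this subtle the attainable accuracy is modest; the 92% reported in the study reflects far richer image-derived features and a stronger model.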

  18. A high throughput architecture for a low complexity soft-output demapping algorithm

    Science.gov (United States)

    Ali, I.; Wasenmüller, U.; Wehn, N.

    2015-11-01

    Iterative channel decoders such as Turbo-Code and LDPC decoders show exceptional performance and therefore they are a part of many wireless communication receivers nowadays. These decoders require a soft input, i.e., the logarithmic likelihood ratio (LLR) of the received bits with a typical quantization of 4 to 6 bits. For computing the LLR values from a received complex symbol, a soft demapper is employed in the receiver. The implementation cost of traditional soft-output demapping methods is relatively large in high order modulation systems, and therefore low complexity demapping algorithms are indispensable in low power receivers. In the presence of multiple wireless communication standards where each standard defines multiple modulation schemes, there is a need to have an efficient demapper architecture covering all the flexibility requirements of these standards. Another challenge associated with hardware implementation of the demapper is to achieve a very high throughput in double iterative systems, for instance, MIMO and Code-Aided Synchronization. In this paper, we present a comprehensive communication and hardware performance evaluation of low complexity soft-output demapping algorithms to select the best algorithm for implementation. The main goal of this work is to design a high throughput, flexible, and area efficient architecture. We describe architectures to execute the investigated algorithms and implement them on an FPGA device to evaluate their hardware performance. The work has resulted in a hardware architecture, based on the best of the investigated low-complexity algorithms, that delivers a high throughput of 166 Msymbols/second for Gray-mapped 16-QAM modulation on a Virtex-5. This efficient architecture occupies only 127 slice registers, 248 slice LUTs, and 2 DSP48Es.
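
    The max-log approximation that low-complexity demappers build on computes each bit LLR from the squared distances to the nearest constellation points with that bit at 0 and at 1; a minimal, non-optimized reference sketch for Gray-mapped 16-QAM (the bit-to-axis assignment is one common convention, not necessarily the mapping used in the paper):

```python
import itertools
import numpy as np

# Gray mapping per axis: 2 bits select a level in {-3, -1, +1, +3} / sqrt(10).
GRAY_2BIT = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}
CONST = {}
for bits in itertools.product((0, 1), repeat=4):
    # First two bits map to the I axis, last two to the Q axis.
    CONST[bits] = complex(GRAY_2BIT[bits[:2]], GRAY_2BIT[bits[2:]]) / np.sqrt(10)

def maxlog_llr(rx, noise_var):
    """Max-log LLRs for one received 16-QAM symbol:
    LLR(b_i) = (min_{s: b_i=1} |rx - s|^2 - min_{s: b_i=0} |rx - s|^2) / N0,
    so a positive LLR favors bit value 0."""
    llrs = []
    for i in range(4):
        d0 = min(abs(rx - s) ** 2 for b, s in CONST.items() if b[i] == 0)
        d1 = min(abs(rx - s) ** 2 for b, s in CONST.items() if b[i] == 1)
        llrs.append((d1 - d0) / noise_var)
    return llrs

# Noise-free symbol for bits (0,0,0,0): all four LLRs should be positive.
llrs = maxlog_llr(CONST[(0, 0, 0, 0)], noise_var=0.1)
```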

  19. Multilayer Porous Crucibles for the High Throughput Salt Separation from Uranium Deposits

    International Nuclear Information System (INIS)

    Kwon, S. W.; Park, K. M.; Kim, J. G.; Kim, I. T.; Seo, B. K.; Moon, J. G.

    2013-01-01

    Solid cathode processing is necessary to separate the salt from the cathode since the uranium deposit in a solid cathode contains electrolyte salt. A physical separation process, such as distillation, is more attractive than a chemical or dissolution process because physical processes generate much less secondary waste. A distillation process was employed for the cathode processing due to its advantages of minimal secondary waste generation, a compact unit process, and simple, low-cost equipment. The basis for vacuum distillation separation is the difference in vapor pressures between salt and uranium. A solid cathode deposit is heated in a heating region and the salt vaporizes, while the nonvolatile uranium remains behind. It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites. The evaporation rate of the LiCl-KCl eutectic salt in a vacuum distiller is not high enough to keep up with the generation capacity of uranium dendrites in an electrorefiner. Therefore, a wide evaporation area or a high distillation temperature is necessary for successful salt separation. In this study, we attempted to enlarge the throughput of the salt distiller with multilayer porous crucibles for the separation of adhered salt from the uranium deposits generated by the electrorefiner, and examined the feasibility of the porous crucibles in salt distillation experiments. It was found that effective separation of salt from uranium deposits was possible with the multilayer porous crucibles
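
    The vapor-pressure-driven separation rate can be bounded by the Langmuir (free-surface) evaporation flux, which scales linearly with evaporation area and so motivates stacking crucibles; a minimal sketch (the vapor pressure, molar mass, and temperature below are illustrative assumptions, not measured values from this work):

```python
import math

def langmuir_evaporation_flux(p_vap_pa, molar_mass_kg, temp_k):
    """Maximum (Langmuir) evaporation flux into vacuum, in kg m^-2 s^-1:
    G = p_vap * sqrt(M / (2 * pi * R * T)).
    Total distillation rate is flux times evaporation area, so stacked
    multilayer crucibles raise throughput at a fixed temperature."""
    R = 8.314  # J mol^-1 K^-1
    return p_vap_pa * math.sqrt(molar_mass_kg / (2 * math.pi * R * temp_k))

# Hypothetical: LiCl-KCl eutectic (~0.0555 kg/mol average molar mass),
# assumed vapor pressure of 100 Pa at 1150 K.
flux = langmuir_evaporation_flux(100.0, 0.0555, 1150.0)
```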

  20. Multilayer Porous Crucibles for the High Throughput Salt Separation from Uranium Deposits

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, S. W.; Park, K. M.; Kim, J. G.; Kim, I. T.; Seo, B. K.; Moon, J. G. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

    Solid cathode processing is necessary to separate the salt from the cathode since the uranium deposit in a solid cathode contains electrolyte salt. A physical separation process, such as distillation, is more attractive than a chemical or dissolution process because physical processes generate much less secondary waste. A distillation process was employed for the cathode processing due to its advantages of minimal secondary waste generation, a compact unit process, and simple, low-cost equipment. The basis for vacuum distillation separation is the difference in vapor pressures between salt and uranium. A solid cathode deposit is heated in a heating region and the salt vaporizes, while the nonvolatile uranium remains behind. It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites. The evaporation rate of the LiCl-KCl eutectic salt in a vacuum distiller is not high enough to keep up with the generation capacity of uranium dendrites in an electrorefiner. Therefore, a wide evaporation area or a high distillation temperature is necessary for successful salt separation. In this study, we attempted to enlarge the throughput of the salt distiller with multilayer porous crucibles for the separation of adhered salt from the uranium deposits generated by the electrorefiner, and examined the feasibility of the porous crucibles in salt distillation experiments. It was found that effective separation of salt from uranium deposits was possible with the multilayer porous crucibles.

  1. Retrofit Strategies for Incorporating Xenobiotic Metabolism into High Throughput Screening Assays (EMGS)

    Science.gov (United States)

    The US EPA’s ToxCast program is designed to assess chemical perturbations of molecular and cellular endpoints using a variety of high-throughput screening (HTS) assays. However, existing HTS assays have limited or no xenobiotic metabolism which could lead to a mischaracterization...

  2. Dimensioning storage and computing clusters for efficient High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Scientific experiments are producing huge amounts of data, and they continue increasing the size of their datasets and the total volume of data. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of Scientific Data Centres has shifted from coping efficiently with PetaByte scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centres is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to loss of CPU cycles and batch job failures. In this paper we point out relevant features for running a successful s...

  3. Dimensioning storage and computing clusters for efficient high throughput computing

    International Nuclear Information System (INIS)

    Accion, E; Bria, A; Bernabeu, G; Caubet, M; Delfino, M; Espinal, X; Merino, G; Lopez, F; Martinez, F; Planas, E

    2012-01-01

    Scientific experiments are producing huge amounts of data, and the size of their datasets and total volume of data continues increasing. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of scientific data centers has shifted from efficiently coping with PetaByte scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to loss of CPU cycles and batch job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.
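
    Dimensioning the storage layer against the compute farm starts from an aggregate I/O demand estimate, which is then compared with deliverable storage bandwidth; a minimal sketch with hypothetical farm numbers (not figures from the paper):

```python
def cluster_io_demand_mb_s(n_cores, event_size_mb, events_per_core_per_s):
    """Aggregate read throughput (MB/s) a batch farm demands from storage
    when every core is busy; if this exceeds deliverable disk bandwidth,
    CPU cycles are wasted waiting on I/O."""
    return n_cores * event_size_mb * events_per_core_per_s

# Hypothetical farm: 4000 cores, each reading 2 MB events at 5 events/s.
demand = cluster_io_demand_mb_s(4000, 2.0, 5.0)   # MB/s required
disk_servers_needed = demand / 1000.0             # assuming 1 GB/s per disk server
```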

  4. High precision optical spectroscopy and quantum state selected photodissociation of ultracold 88Sr2 molecules in an optical lattice

    Science.gov (United States)

    McDonald, Mickey

    2017-04-01

    Over the past several decades, rapid progress has been made toward the accurate characterization and control of atoms, epitomized by the ever-increasing accuracy and precision of optical atomic lattice clocks. Extending this progress to molecules will have exciting implications for chemistry, condensed matter physics, and precision tests of physics beyond the Standard Model. My thesis describes work performed over the past six years to establish the state of the art in manipulation and quantum control of ultracold molecules. We describe a thorough set of measurements characterizing the rovibrational structure of weakly bound 88Sr2 molecules from several different perspectives, including determinations of binding energies; linear, quadratic, and higher order Zeeman shifts; transition strengths between bound states; and lifetimes of narrow subradiant states. Finally, we discuss measurements of photofragment angular distributions produced by photodissociation of molecules in single quantum states, leading to an exploration of quantum-state-resolved ultracold chemistry. The images of exploding photofragments produced in these studies exhibit dramatic interference effects and strongly violate semiclassical predictions, instead requiring a fully quantum mechanical description.

  5. High-throughput gene expression profiling of memory differentiation in primary human T cells

    Directory of Open Access Journals (Sweden)

    Russell Kate

    2008-08-01

    Background: The differentiation of naive T and B cells into memory lymphocytes is essential for immunity to pathogens. Therapeutic manipulation of this cellular differentiation program could improve vaccine efficacy and the in vitro expansion of memory cells. However, chemical screens to identify compounds that induce memory differentiation have been limited by the lack of (1) reporter-gene or functional assays that can distinguish naive and memory-phenotype T cells at high throughput and (2) a suitable cell line representative of naive T cells. Results: Here, we describe a method for gene-expression based screening that allows primary naive and memory-phenotype lymphocytes to be discriminated based on complex gene signatures corresponding to these differentiation states. We used ligation-mediated amplification and a fluorescent, bead-based detection system to quantify simultaneously 55 transcripts representing naive and memory-phenotype signatures in purified populations of human T cells. The use of a multi-gene panel allowed better resolution than any constituent single gene. The method was precise, correlated well with Affymetrix microarray data, and could be easily scaled up for high throughput. Conclusion: This method provides a generic solution for high-throughput differentiation screens in primary human T cells where no single-gene or functional assay is available. This screening platform will allow the identification of small molecules, genes or soluble factors that direct memory differentiation in naive human lymphocytes.
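
    A multi-gene signature readout like the 55-transcript panel can be scored, in its simplest form, as the difference in mean expression between the two gene sets; a minimal sketch with a hypothetical 4-gene panel and invented expression values (the real panel and scoring are richer):

```python
import numpy as np

def signature_score(expression, memory_genes, naive_genes):
    """Score a sample against a two-state gene signature: mean expression
    of memory-associated transcripts minus mean of naive-associated ones.
    A positive score calls the sample memory-like."""
    mem = np.mean([expression[g] for g in memory_genes])
    nav = np.mean([expression[g] for g in naive_genes])
    return mem - nav

# Hypothetical 4-gene panel (illustrative gene choices and log2 values).
memory_genes = ["IL7R", "KLRG1"]
naive_genes = ["CCR7", "SELL"]
memory_like_sample = {"IL7R": 8.0, "KLRG1": 7.5, "CCR7": 3.0, "SELL": 2.5}
score = signature_score(memory_like_sample, memory_genes, naive_genes)
```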

  6. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    Science.gov (United States)

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with highest product purity requiring suboptimal values for other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018.

  7. General chemistry

    International Nuclear Information System (INIS)

    Kwon, Yeong Sik; Lee, Dong Seop; Ryu, Haung Ryong; Jang, Cheol Hyeon; Choi, Bong Jong; Choi, Sang Won

    1993-07-01

    The book concentrates on the latest general chemistry and is divided into twenty-three chapters. It deals with basic concepts and stoichiometry, the nature of gases, atomic structure, quantum mechanics, electronic structure of ions and molecules, chemical thermodynamics, the nature of solids, changes of state and liquids, properties of solutions, chemical equilibrium, solutions and acid-base chemistry, equilibria in aqueous solution, electrochemistry, chemical reaction rates, molecular spectroscopy, hydrogen, oxygen and water, the metallic elements of groups IA, IIA and IIIA, carbon and the group IVA elements, nonmetallic elements and the inert gases, transition metals, lanthanides and actinides, nuclear properties and radioactivity, and biochemistry and environmental chemistry.

  8. High-Throughput Process Development for Biopharmaceuticals.

    Science.gov (United States)

    Shukla, Abhinav A; Rameez, Shahid; Wolfe, Leslie S; Oien, Nathan

    2017-11-14

    The ability to conduct multiple experiments in parallel significantly reduces the time that it takes to develop a manufacturing process for a biopharmaceutical. This is particularly significant before clinical entry, because process development and manufacturing are on the "critical path" for a drug candidate to enter clinical development. High-throughput process development (HTPD) methodologies can be similarly impactful during late-stage development, both for developing the final commercial process as well as for process characterization and scale-down validation activities that form a key component of the licensure filing package. This review examines the current state of the art for HTPD methodologies as they apply to cell culture, downstream purification, and analytical techniques. In addition, we provide a vision of how HTPD activities across all of these spaces can integrate to create a rapid process development engine that can accelerate biopharmaceutical drug development.

  9. High Throughput Line-of-Sight MIMO Systems for Next Generation Backhaul Applications

    Science.gov (United States)

    Song, Xiaohang; Cvetkovski, Darko; Hälsig, Tim; Rave, Wolfgang; Fettweis, Gerhard; Grass, Eckhard; Lankl, Berthold

    2017-09-01

    The evolution to ultra-dense next generation networks requires a massive increase in throughput and deployment flexibility. Therefore, novel wireless backhaul solutions that can support these demands are needed. In this work we present an approach for a millimeter wave line-of-sight MIMO backhaul design, targeting transmission rates in the order of 100 Gbit/s. We provide theoretical foundations for the concept showcasing its potential, which are confirmed through channel measurements. Furthermore, we provide insights into the system design with respect to antenna array setup, baseband processing, synchronization, and channel equalization. Implementation in a 60 GHz demonstrator setup proves the feasibility of the system concept for high throughput backhauling in next generation networks.

  10. Application of the CRAY-1 for quantum chemistry calculations

    International Nuclear Information System (INIS)

    Saunders, V.R.; Guest, M.F.

    1982-01-01

    The following steps in a typical quantum chemistry calculation will be considered: 1. Gaussian integrals evaluation. 2. Hartree-Fock computation of an uncorrelated wavefunction. 3. 4-index transformation of two-electron integrals. 4. Configuration interaction calculations of a correlated wavefunction. In all the above steps we have found that algorithms may be devised which formulate the problem as being dominated by a series of matrix multiplications: R=AB, where A (or B) is sparse. A routine for performing the sparse matrix multiply has been prepared with a maximum measured performance of 147 Mflops. When this routine is used in our applications packages, overall performances of approximately 50, 100 and 120 Mflops are observed for steps 1, 3 and 4, respectively. The result in step 2 is not so successful, as effective implementation of the matrix multiplication requires efficient performance of data gather and scatter sequences (not vectorisable on the CRAY-1), and a performance of 10 Mflops is observed. The importance of gather/scatter sequences in such operations as file sorting is pointed out. The present performance is compared with that previously obtained on CDC 7600 equipment and from this data we deduce the cost-effectiveness of the CRAY-1 in our field. (orig.)
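
    The kernel described in this record, R = AB with a sparse A, can be sketched in a few lines. The following is an illustrative Python version of the idea (skipping the zeros of A), not the CRAY-1 routine from the record; the row-wise dict-of-columns storage is an assumption made for clarity:

```python
def sparse_matmul(a_rows, b, n_cols_b):
    """Multiply a sparse matrix A by a dense matrix B.

    A is given row-wise as a list of {column: value} dicts, so only
    nonzero entries of A contribute work -- the same idea that makes
    R = AB cheap when A is sparse.
    """
    result = []
    for row in a_rows:
        out = [0.0] * n_cols_b
        for j, a_ij in row.items():          # iterate only over nonzeros of A
            b_row = b[j]
            for k in range(n_cols_b):
                out[k] += a_ij * b_row[k]    # R[i][k] += A[i][j] * B[j][k]
        result.append(out)
    return result

# A = [[2, 0], [0, 3]] stored sparsely; B = [[1, 2], [3, 4]]
A = [{0: 2.0}, {1: 3.0}]
B = [[1.0, 2.0], [3.0, 4.0]]
print(sparse_matmul(A, B, 2))  # [[2.0, 4.0], [9.0, 12.0]]
```

    Because the inner loop over B's columns is a contiguous streaming operation, this formulation vectorises well, which is exactly why the record reports near-peak rates for the integral and CI steps.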

  11. Green wet chemical route to synthesize capped CdSe quantum dots

    Indian Academy of Sciences (India)

    In the present work, we report green synthesis of tartaric acid (TA) and triethanolamine (TEA) capped ... CdSe quantum dots; chemical bath deposition; capping; green chemistry; nanomaterials. 1. .... at high concentration of nanoparticles.

  12. Operational evaluation of high-throughput community-based mass prophylaxis using Just-in-time training.

    Science.gov (United States)

    Spitzer, James D; Hupert, Nathaniel; Duckart, Jonathan; Xiong, Wei

    2007-01-01

    Community-based mass prophylaxis is a core public health operational competency, but staffing needs may overwhelm the local trained health workforce. Just-in-time (JIT) training of emergency staff and computer modeling of workforce requirements represent two complementary approaches to address this logistical problem. Multnomah County, Oregon, conducted a high-throughput point of dispensing (POD) exercise to test JIT training and computer modeling to validate POD staffing estimates. The POD had 84% non-health-care worker staff and processed 500 patients per hour. Post-exercise modeling replicated observed staff utilization levels and queue formation, including development and amelioration of a large medical evaluation queue caused by lengthy processing times and understaffing in the first half-hour of the exercise. The exercise confirmed the feasibility of using JIT training for high-throughput antibiotic dispensing clinics staffed largely by nonmedical professionals. Patient processing times varied over the course of the exercise, with important implications for both staff reallocation and future POD modeling efforts. Overall underutilization of staff revealed the opportunity for greater efficiencies and even higher future throughputs.

  13. Risk-based high-throughput chemical screening and prioritization using exposure models and in vitro bioactivity assays

    DEFF Research Database (Denmark)

    Shin, Hyeong-Moo; Ernstoff, Alexi; Arnot, Jon

    2015-01-01

    We present a risk-based high-throughput screening (HTS) method to identify chemicals for potential health concerns or for which additional information is needed. The method is applied to 180 organic chemicals as a case study. We first obtain information on how the chemical is used and identify....../oral contact, or dermal exposure. The method provides high-throughput estimates of exposure and important input for decision makers to identify chemicals of concern for further evaluation with additional information or more refined models....
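
    The prioritization idea in this record can be illustrated with a toy bioactivity-exposure ratio (BER) ranking: chemicals whose in vitro active concentration is close to their estimated exposure rank as higher concern. The field names and the simple ratio below are assumptions for illustration, not the authors' exposure or bioactivity models:

```python
def prioritize(chemicals):
    """Rank chemicals by a bioactivity-exposure ratio (BER):
    BER = in vitro active concentration / estimated exposure.
    A lower BER means less margin between exposure and bioactivity,
    hence higher potential concern (ranked first).
    """
    ranked = sorted(chemicals, key=lambda c: c["bioactive_conc"] / c["exposure"])
    return [c["name"] for c in ranked]

chems = [
    {"name": "chem_A", "bioactive_conc": 10.0, "exposure": 0.001},    # BER 1e4
    {"name": "chem_B", "bioactive_conc": 1.0, "exposure": 0.1},       # BER 10
    {"name": "chem_C", "bioactive_conc": 100.0, "exposure": 0.0001},  # BER 1e6
]
print(prioritize(chems))  # ['chem_B', 'chem_A', 'chem_C']
```

    In a real screen both quantities carry large uncertainties, so such rankings flag chemicals for refined evaluation rather than deliver risk verdicts.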

  14. Raman spectroscopy in high temperature chemistry

    International Nuclear Information System (INIS)

    Drake, M.C.; Rosenblatt, G.M.

    1979-01-01

    Raman spectroscopy (largely because of advances in laser and detector technology) is assuming a rapidly expanding role in many areas of research. This paper reviews the contribution of Raman spectroscopy in high temperature chemistry including molecular spectroscopy on static systems and gas diagnostic measurements on reactive systems. An important aspect of high temperature chemistry has been the identification and study of the new, and often unusual, gaseous molecules which form at high temperatures. Particularly important is the investigation of vibrational-rotational energy levels and electronic states which determine thermodynamic properties and describe chemical bonding. Some advantages and disadvantages of high temperature Raman spectroscopy for molecular studies on static systems are compared: (1) Raman vs infrared; (2) gas-phase vs condensed in matrices; and (3) atmospheric pressure Raman vs low pressure techniques, including mass spectroscopy, matrix isolation, and molecular beams. Raman studies on molecular properties of gases, melts, and surfaces are presented with emphasis on work not covered in previous reviews of high temperature and matrix isolation Raman spectroscopy.

  16. High-throughput computational methods and software for quantitative trait locus (QTL) mapping

    NARCIS (Netherlands)

    Arends, Danny

    2014-01-01

    In recent years, many new technologies, such as tiling arrays and high-throughput DNA sequencing, have come to play an important role in the field of systems genetics. For researchers it is extremely important to understand how these methods will change their way of working

  17. Insights into Sonogashira cross-coupling by high-throughput kinetics and descriptor modeling

    NARCIS (Netherlands)

    an der Heiden, M.R.; Plenio, H.; Immel, S.; Burello, E.; Rothenberg, G.; Hoefsloot, H.C.J.

    2008-01-01

    A method is presented for the high-throughput monitoring of reaction kinetics in homogeneous catalysis, running up to 25 coupling reactions in a single reaction vessel. This method is demonstrated and validated on the Sonogashira reaction, analyzing the kinetics for almost 500 coupling reactions.

  18. Development of rapid high throughput biodosimetry tools for radiological triage

    International Nuclear Information System (INIS)

    Balajee, Adayabalam S.; Escalona, Maria; Smith, Tammy; Ryan, Terri; Dainiak, Nicholas

    2018-01-01

    Accidental or intentional radiological or nuclear (R/N) disasters constitute a major threat around the globe that can affect several tens, hundreds and thousands of humans. Currently available cytogenetic biodosimeters are time consuming and laborious to perform making them impractical for triage scenarios. Therefore, it is imperative to develop high throughput techniques which will enable timely assessment of personalized dose for making an appropriate 'life-saving' clinical decision

  19. Intel: High Throughput Computing Collaboration: A CERN openlab / Intel collaboration

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The Intel/CERN High Throughput Computing Collaboration studies the application of upcoming Intel technologies to the very challenging environment of the LHC trigger and data-acquisition systems. These systems will need to transport and process many terabits of data every second, in some cases with tight latency constraints. Parallelisation and tight integration of accelerators and classical CPU via Intel's OmniPath fabric are the key elements in this project.

  20. The structural chemistry of metallocorroles: combined X-ray crystallography and quantum chemistry studies afford unique insights.

    Science.gov (United States)

    Thomas, Kolle E; Alemayehu, Abraham B; Conradie, Jeanet; Beavers, Christine M; Ghosh, Abhik

    2012-08-21

    Although they share some superficial structural similarities with porphyrins, corroles, trianionic ligands with contracted cores, give rise to fundamentally different transition metal complexes in comparison with the dianionic porphyrins. Many metallocorroles are formally high-valent, although a good fraction of them are also noninnocent, with significant corrole radical character. These electronic-structural characteristics result in a variety of fascinating spectroscopic behavior, including highly characteristic, paramagnetically shifted NMR spectra and textbook cases of charge-transfer spectra. Although our early research on corroles focused on spectroscopy, we soon learned that the geometric structures of metallocorroles provide a fascinating window into their electronic-structural characteristics. Thus, we used X-ray structure determinations and quantum chemical studies, chiefly using DFT, to obtain a comprehensive understanding of metallocorrole geometric and electronic structures. This Account describes our studies of the structural chemistry of metallocorroles. At first blush, the planar or mildly domed structure of metallocorroles might appear somewhat uninteresting particularly when compared to metalloporphyrins. Metalloporphyrins, especially sterically hindered ones, are routinely ruffled or saddled, but the missing meso carbon apparently makes the corrole skeleton much more resistant to nonplanar distortions. Ruffling, where the pyrrole rings are alternately twisted about the M-N bonds, is energetically impossible for metallocorroles. Saddling is also uncommon; thus, a number of sterically hindered, fully substituted metallocorroles exhibit almost perfectly planar macrocycle cores. Against this backdrop, copper corroles stand out as an important exception. As a result of an energetically favorable Cu(d(x2-y2))-corrole(π) orbital interaction, copper corroles, even sterically unhindered ones, are inherently saddled. Sterically hindered substituents

  1. Printing Proteins as Microarrays for High-Throughput Function Determination

    Science.gov (United States)

    MacBeath, Gavin; Schreiber, Stuart L.

    2000-09-01

    Systematic efforts are currently under way to construct defined sets of cloned genes for high-throughput expression and purification of recombinant proteins. To facilitate subsequent studies of protein function, we have developed miniaturized assays that accommodate extremely low sample volumes and enable the rapid, simultaneous processing of thousands of proteins. A high-precision robot designed to manufacture complementary DNA microarrays was used to spot proteins onto chemically derivatized glass slides at extremely high spatial densities. The proteins attached covalently to the slide surface yet retained their ability to interact specifically with other proteins, or with small molecules, in solution. Three applications for protein microarrays were demonstrated: screening for protein-protein interactions, identifying the substrates of protein kinases, and identifying the protein targets of small molecules.

  2. A high throughput biochemical fluorometric method for measuring lipid peroxidation in HDL.

    Directory of Open Access Journals (Sweden)

    Theodoros Kelesidis

    Full Text Available Current cell-based assays for determining the functional properties of high-density lipoproteins (HDL) have limitations. We report here the development of a new, robust fluorometric cell-free biochemical assay that measures HDL lipid peroxidation (HDLox) based on the oxidation of the fluorochrome Amplex Red. HDLox correlated with previously validated cell-based (r = 0.47, p<0.001) and cell-free assays (r = 0.46, p<0.001). HDLox distinguished dysfunctional HDL in established animal models of atherosclerosis and Human Immunodeficiency Virus (HIV) patients. Using an immunoaffinity method for capturing HDL, we demonstrate the utility of this novel assay for measuring HDLox in a high-throughput format. Furthermore, HDLox correlated significantly with measures of cardiovascular disease, including carotid intima media thickness (r = 0.35, p<0.01) and subendocardial viability ratio (r = -0.21, p = 0.05), and with physiological parameters such as metabolic and anthropometric parameters (p<0.05). In conclusion, we report the development of a new fluorometric method that offers a reproducible and rapid means for determining HDL function/quality that is suitable for high-throughput implementation.

  3. A High-Throughput SU-8 Microfluidic Magnetic Bead Separator

    DEFF Research Database (Denmark)

    Bu, Minqiang; Christensen, T. B.; Smistrup, Kristian

    2007-01-01

    We present a novel microfluidic magnetic bead separator based on an SU-8 fabrication technique for high-throughput applications. The experimental results show that magnetic beads can be captured at efficiencies of 91% and 54% at flow rates of 1 mL/min and 4 mL/min, respectively. Integration... of soft magnetic elements in the chip leads to a slightly higher capturing efficiency and a more uniform distribution of captured beads over the separation chamber than the system without soft magnetic elements....

  4. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  5. A high-throughput and quantitative method to assess the mutagenic potential of translesion DNA synthesis

    Science.gov (United States)

    Taggart, David J.; Camerlengo, Terry L.; Harrison, Jason K.; Sherrer, Shanen M.; Kshetry, Ajay K.; Taylor, John-Stephen; Huang, Kun; Suo, Zucai

    2013-01-01

    Cellular genomes are constantly damaged by endogenous and exogenous agents that covalently and structurally modify DNA to produce DNA lesions. Although most lesions are mended by various DNA repair pathways in vivo, a significant number of damage sites persist during genomic replication. Our understanding of the mutagenic outcomes derived from these unrepaired DNA lesions has been hindered by the low throughput of existing sequencing methods. Therefore, we have developed a cost-effective high-throughput short oligonucleotide sequencing assay that uses next-generation DNA sequencing technology for the assessment of the mutagenic profiles of translesion DNA synthesis catalyzed by any error-prone DNA polymerase. The vast amount of sequencing data produced were aligned and quantified by using our novel software. As an example, the high-throughput short oligonucleotide sequencing assay was used to analyze the types and frequencies of mutations upstream, downstream and at a site-specifically placed cis–syn thymidine–thymidine dimer generated individually by three lesion-bypass human Y-family DNA polymerases. PMID:23470999
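
    The core counting step of such an assay — tallying how often each position of an aligned short read differs from the reference — can be sketched as follows. This is a hypothetical minimal helper, not the authors' software; it assumes reads have already been aligned and padded to the reference length, with '-' marking deletions:

```python
from collections import Counter

def mutation_profile(reference, reads):
    """Tally per-position substitutions/deletions in aligned short reads.

    Returns {position: Counter({observed_base: count})} for every
    position where a read differs from the reference base.
    """
    profile = {}
    for read in reads:
        for pos, (ref_base, base) in enumerate(zip(reference, read)):
            if base != ref_base:
                profile.setdefault(pos, Counter())[base] += 1
    return profile

ref = "ACGTACGT"
reads = ["ACGTACGT", "ACGAACGT", "ACGAACGT", "ACGTAC-T"]
# position 3: T->A substitution in two reads; position 6: G deleted in one read
print(mutation_profile(ref, reads))
```

    Dividing each count by the read depth then gives the per-position mutation frequencies that the assay reports upstream of, downstream of, and at the lesion site.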

  6. The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio).

    Science.gov (United States)

    Usui, Takuji; Noble, Daniel W A; O'Dea, Rose E; Fangmeier, Melissa L; Lagisz, Malgorzata; Hesselson, Daniel; Nakagawa, Shinichi

    2018-01-01

    Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34-0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes.
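
    The repeatability figure quoted above is an intra-class correlation over repeated trials per fish. The record does not state which ICC estimator was used, so the one-way ICC(1) form below is an assumption, and the data are invented:

```python
def icc_oneway(measurements):
    """One-way intra-class correlation ICC(1) for repeated trials.

    `measurements` is a list of per-individual lists, each containing
    the same number k of repeated trials (e.g. swim-speed tests).
    ICC(1) = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB and MSW
    are the between- and within-individual mean squares.
    """
    n = len(measurements)
    k = len(measurements[0])
    grand = sum(sum(m) for m in measurements) / (n * k)
    means = [sum(m) / k for m in measurements]
    msb = k * sum((mi - grand) ** 2 for mi in means) / (n - 1)
    msw = sum((x - mi) ** 2
              for m, mi in zip(measurements, means)
              for x in m) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# two trials for each of three fish; trials agree closely within fish
print(round(icc_oneway([[10, 12], [20, 22], [30, 33]]), 2))  # 0.97
```

    Values near 1 mean most variance lies between individuals, i.e. the protocol yields repeatable per-fish estimates.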

  7. A High-Throughput Antibody-Based Microarray Typing Platform

    Directory of Open Access Journals (Sweden)

    Ashan Perera

    2013-05-01

    Full Text Available Many rapid methods have been developed for screening foods for the presence of pathogenic microorganisms. Rapid methods that have the additional ability to identify microorganisms via multiplexed immunological recognition have the potential for classification or typing of microbial contaminants, thus facilitating epidemiological investigations that aim to identify outbreaks and trace back the contamination to its source. This manuscript introduces a novel, high-throughput typing platform that employs microarrayed multiwell plate substrates and laser-induced fluorescence of the nucleic acid intercalating dye/stain SYBR Gold for detection of antibody-captured bacteria. The aim of this study was to use this platform for comparison of different sets of antibodies raised against the same pathogens as well as demonstrate its potential effectiveness for serotyping. To that end, two sets of antibodies raised against each of the “Big Six” non-O157 Shiga toxin-producing E. coli (STEC) as well as E. coli O157:H7 were array-printed into microtiter plates, and serial dilutions of the bacteria were added and subsequently detected. Though antibody specificity was not sufficient for the development of an STEC serotyping method, the STEC antibody sets performed reasonably well, showing that specificity increased at lower capture antibody concentrations or, conversely, at lower bacterial target concentrations. The favorable results indicated that with sufficiently selective and ideally concentrated sets of biorecognition elements (e.g., antibodies or aptamers), this high-throughput platform can be used to rapidly type microbial isolates derived from food samples within ca. 80 min of total assay time. It can also potentially be used to detect the pathogens from food enrichments and at least serve as a platform for testing antibodies.

  8. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    Science.gov (United States)

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
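
    The kind of connected export GlycoExtractor performs — a series of sample sets serialized to one structured file (JSON or CSV) instead of a collection of disconnected files — can be sketched as below. The field names (peak, area, gu) mirror the quantities named in the record, but the schema itself is invented for illustration:

```python
import csv
import io
import json

def export_profiles(samples, fmt="json"):
    """Serialize per-sample glycan peak data (peak number, peak area,
    glucose unit value) to a single JSON or CSV document.

    `samples` maps sample name -> list of {"peak", "area", "gu"} dicts.
    """
    if fmt == "json":
        return json.dumps(samples, indent=2)
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["sample", "peak", "area", "gu"])  # one header for all samples
    for sample, peaks in samples.items():
        for p in peaks:
            writer.writerow([sample, p["peak"], p["area"], p["gu"]])
    return buf.getvalue()

data = {"IgG_1": [{"peak": 1, "area": 12.4, "gu": 5.9},
                  {"peak": 2, "area": 33.1, "gu": 6.8}]}
print(export_profiles(data, fmt="csv"))
```

    Emitting one machine-readable file per batch is what enables the downstream automation the record describes, such as biomarker discovery pipelines consuming the data directly.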

  9. High-speed noise-free optical quantum memory

    Science.gov (United States)

    Kaczmarek, K. T.; Ledingham, P. M.; Brecht, B.; Thomas, S. E.; Thekkadath, G. S.; Lazo-Arjona, O.; Munns, J. H. D.; Poem, E.; Feizpour, A.; Saunders, D. J.; Nunn, J.; Walmsley, I. A.

    2018-04-01

    Optical quantum memories are devices that store and recall quantum light and are vital to the realization of future photonic quantum networks. To date, much effort has been put into improving storage times and efficiencies of such devices to enable long-distance communications. However, less attention has been devoted to building quantum memories which add zero noise to the output. Even small additional noise can render the memory classical by destroying the fragile quantum signatures of the stored light. Therefore, noise performance is a critical parameter for all quantum memories. Here we introduce an intrinsically noise-free quantum memory protocol based on two-photon off-resonant cascaded absorption (ORCA). We demonstrate successful storage of GHz-bandwidth heralded single photons in a warm atomic vapor with no added noise, confirmed by the unaltered photon-number statistics upon recall. Our ORCA memory meets the stringent noise requirements for quantum memories while combining high-speed and room-temperature operation with technical simplicity, and therefore is immediately applicable to low-latency quantum networks.

  10. High temperature water chemistry monitoring

    International Nuclear Information System (INIS)

    Aaltonen, P.

    1992-01-01

    Almost all corrosion phenomena in nuclear power plants can be prevented, or at least damped, by water chemistry control or by a change of water chemistry. Successful water chemistry control needs regular and continuous monitoring of water chemistry parameters such as dissolved oxygen content, pH, conductivity and impurity contents. Conventionally the monitoring is carried out at low pressures and temperatures, a method which, however, has some shortcomings. Recently electrodes have been developed which enable direct monitoring at operating pressures and temperatures. (author). 2 refs, 5 figs

  11. High-throughput measurement of polymer film thickness using optical dyes

    Science.gov (United States)

    Grunlan, Jaime C.; Mehrabi, Ali R.; Ly, Tien

    2005-01-01

    Optical dyes were added to polymer solutions in an effort to create a technique for high-throughput screening of dry polymer film thickness. Arrays of polystyrene films, cast from a toluene solution, containing methyl red or solvent green were used to demonstrate the feasibility of this technique. Measurements of the peak visible absorbance of each film were converted to thickness using the Beer-Lambert relationship. These absorbance-based thickness calculations agreed within 10% of thickness measured using a micrometer for polystyrene films that were 10-50 µm. At these thicknesses it is believed that the absorbance values are actually more accurate. At least for this solvent-based system, thickness was shown to be accurately measured in a high-throughput manner that could potentially be applied to other equivalent systems. Similar water-based films made with poly(sodium 4-styrenesulfonate) dyed with malachite green oxalate or congo red did not show the same level of agreement with the micrometer measurements. Extensive phase separation between polymer and dye resulted in inflated absorbance values and calculated thickness that was often more than 25% greater than that measured with the micrometer. Only at thicknesses below 15 µm could reasonable accuracy be achieved for the water-based films.
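
    The absorbance-to-thickness conversion rests on the Beer-Lambert law, A = εcl, solved for the path length l. A minimal sketch follows; the dye parameters in the example are illustrative values, not numbers from the paper:

```python
def film_thickness_um(absorbance, epsilon, dye_conc):
    """Estimate dry-film thickness from peak absorbance via Beer-Lambert:

        A = epsilon * c * l   =>   l = A / (epsilon * c)

    epsilon: molar absorptivity of the dye (L mol^-1 cm^-1)
    dye_conc: dye concentration in the dry film (mol L^-1)
    Returns thickness in micrometres (1 cm = 10^4 um).
    """
    path_cm = absorbance / (epsilon * dye_conc)
    return path_cm * 1.0e4

# e.g. A = 0.50, epsilon = 2.0e4 L/(mol*cm), c = 1.0e-2 mol/L
print(film_thickness_um(0.50, 2.0e4, 1.0e-2))  # ~25 um
```

    The phase-separation problem noted for the water-based films corresponds to a locally elevated effective dye concentration, which inflates A and hence the inferred l.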

  12. Time-dependent quantum chemistry of laser driven many-electron molecules

    International Nuclear Information System (INIS)

    Nguyen-Dang, Thanh-Tung; Couture-Bienvenue, Étienne; Viau-Trudel, Jérémy; Sainjon, Amaury

    2014-01-01

    A Time-Dependent Configuration Interaction approach using multiple Feshbach partitionings, corresponding to multiple ionization stages of a laser-driven molecule, has recently been proposed [T.-T. Nguyen-Dang and J. Viau-Trudel, J. Chem. Phys. 139, 244102 (2013)]. To complete this development toward a fully ab-initio method for the calculation of time-dependent electronic wavefunctions of an N-electron molecule, we describe how tools of multiconfiguration quantum chemistry such as the management of the configuration expansion space using Graphical Unitary Group Approach concepts can be profitably adapted to the new context, that of time-resolved electronic dynamics, as opposed to stationary electronic structure. The method is applied to calculate the detailed, sub-cycle electronic dynamics of BeH₂, treated in a 3-21G bound-orbital basis augmented by a set of orthogonalized plane-waves representing continuum-type orbitals, including its ionization under an intense λ = 800 nm or λ = 80 nm continuous-wave laser field. The dynamics is strongly non-linear at the field intensity considered (I ≃ 10¹⁵ W/cm²), featuring important ionization of an inner-shell electron and strong post-ionization bound-electron dynamics.

  13. Quantum-path control in high-order harmonic generation at high photon energies

    International Nuclear Information System (INIS)

    Zhang Xiaoshi; Lytle, Amy L; Cohen, Oren; Murnane, Margaret M; Kapteyn, Henry C

    2008-01-01

    We show through experiment and calculations how all-optical quasi-phase-matching of high-order harmonic generation can be used to selectively enhance emission from distinct quantum trajectories at high photon energies. Electrons rescattered in a strong field can traverse short and long quantum trajectories that exhibit differing coherence lengths as a result of variations in intensity of the driving laser along the direction of propagation. By varying the separation of the pulses in a counterpropagating pulse train, we selectively enhance either the long or the short quantum trajectory, and observe distinct spectral signatures in each case. This demonstrates a new type of coupling between the coherence of high-order harmonic beams and the attosecond time-scale quantum dynamics inherent in the process

  14. Multiplex High-Throughput Targeted Proteomic Assay To Identify Induced Pluripotent Stem Cells.

    Science.gov (United States)

    Baud, Anna; Wessely, Frank; Mazzacuva, Francesca; McCormick, James; Camuzeaux, Stephane; Heywood, Wendy E; Little, Daniel; Vowles, Jane; Tuefferd, Marianne; Mosaku, Olukunbi; Lako, Majlinda; Armstrong, Lyle; Webber, Caleb; Cader, M Zameel; Peeters, Pieter; Gissen, Paul; Cowley, Sally A; Mills, Kevin

    2017-02-21

    Induced pluripotent stem cells have great potential as a human model system in regenerative medicine, disease modeling, and drug screening. However, their use in medical research is hampered by laborious reprogramming procedures that yield low numbers of induced pluripotent stem cells. For further applications in research, only the best, competent clones should be used. The standard assays for pluripotency are based on genomic approaches, which take up to 1 week to perform and incur significant cost. Therefore, there is a need for a rapid and cost-effective assay able to distinguish between pluripotent and nonpluripotent cells. Here, we describe a novel multiplexed, high-throughput, and sensitive peptide-based multiple reaction monitoring mass spectrometry assay, allowing for the identification and absolute quantitation of multiple core transcription factors and pluripotency markers. This assay provides simpler, high-throughput classification into either pluripotent or nonpluripotent cells in a 7 min analysis, while being more cost-effective than conventional genomic tests.

  15. Application of high-throughput DNA sequencing in phytopathology.

    Science.gov (United States)

    Studholme, David J; Glover, Rachel H; Boonham, Neil

    2011-01-01

    The new sequencing technologies are already making a big impact in academic research on medically important microbes and may soon revolutionize diagnostics, epidemiology, and infection control. Plant pathology also stands to gain from exploiting these opportunities. This manuscript reviews some applications of these high-throughput sequencing methods that are relevant to phytopathology, with emphasis on the associated computational and bioinformatics challenges and their solutions. Second-generation sequencing technologies have recently been exploited in genomics of both prokaryotic and eukaryotic plant pathogens. They are also proving to be useful in diagnostics, especially with respect to viruses. Copyright © 2011 by Annual Reviews. All rights reserved.

  16. High-throughput ab-initio dilute solute diffusion database.

    Science.gov (United States)

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-07-19

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world.
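
    The weighted activation-barrier RMS error quoted above is a standard figure of merit for comparing calculated and experimental barriers. A minimal sketch of that metric, using hypothetical barrier values rather than entries from the actual database:

```python
import math

def weighted_rmse(calc, expt, weights):
    """Weighted RMS error between calculated and experimental activation barriers (eV)."""
    num = sum(w * (c - e) ** 2 for c, e, w in zip(calc, expt, weights))
    return math.sqrt(num / sum(weights))

# Hypothetical DFT vs. experimental barriers (eV) for three solute/host pairs,
# with weights reflecting, e.g., the reliability of each experimental value:
calc = [1.32, 0.95, 2.10]
expt = [1.20, 1.00, 2.30]
w = [1.0, 2.0, 1.0]
print(round(weighted_rmse(calc, expt, w), 3))  # -> 0.122
```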

  17. Highly Efficient Spontaneous Emission from Self-Assembled Quantum Dots

    DEFF Research Database (Denmark)

    Johansen, Jeppe; Lund-Hansen, Toke; Hvam, Jørn Märcher

    2006-01-01

    We present time resolved measurements of spontaneous emission (SE) from InAs/GaAs quantum dots (QDs). The measurements are interpreted using Fermi's Golden Rule and from this analysis we establish the parameters for high quantum efficiency.
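
    In this kind of analysis, the quantum efficiency follows from separating the measured decay into radiative and nonradiative rates: QE = Γ_rad / (Γ_rad + Γ_nr). A minimal sketch with hypothetical rates standing in for values fitted from a time-resolved decay:

```python
def quantum_efficiency(gamma_rad, gamma_nr):
    """QE = radiative rate / total decay rate; rates in s^-1 from a fitted SE decay."""
    return gamma_rad / (gamma_rad + gamma_nr)

# Hypothetical decay rates (s^-1) extracted from a fit of the SE transient:
print(quantum_efficiency(1.0e9, 1.0e8))  # ~0.91 with these illustrative rates
```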

  18. The status of safety in the public high school chemistry laboratories in Mississippi

    Science.gov (United States)

    Lacy, Sarah Louise Trotman

    Since laboratory-based science courses have become an essential element of any science curriculum and are required by the Mississippi Department of Education for graduation, the chemistry laboratories in the public high schools in Mississippi must be safe. The purpose of this study was to determine: the safety characteristics of a high school chemistry laboratory; the perceived safety characteristics of the chemistry laboratories in public high schools in Mississippi; the basic safety knowledge of chemistry teachers in public high schools in Mississippi; where chemistry teachers in Mississippi gain knowledge about laboratory safety and instruction; whether public high school chemistry laboratories in Mississippi adhere to recommended class size, laboratory floor space per student, safety education, safety equipment, and chemical storage; and the relationship between teacher knowledge of chemistry laboratory safety and the safety status of the laboratory in which they teach. The survey instrument was composed of three parts. Part I, Teacher Knowledge, consisted of 23 questions concerning high school chemistry laboratory safety. Part II, Chemistry Laboratory Safety Information, consisted of 40 items divided into four areas of interest concerning safety in high school chemistry laboratories. Part III, Demographics, consisted of 11 questions relating to teacher certification, experience, education, and safety training. The survey was mailed to a designated chemistry teacher in every public high school in Mississippi. The responses to Part I of the survey indicated that the majority of the teachers have a good understanding of chemistry laboratory safety but need more instruction on the requirements for a safe high school chemistry laboratory. Less than 50% of the responding teachers thought they had received adequate preparation from their college classes to conduct a safe chemistry laboratory.
According to the responses of the teachers, most of their high school

  19. Recent development in computational actinide chemistry

    International Nuclear Information System (INIS)

    Li Jun

    2008-01-01

    Ever since the Manhattan project in World War II, actinide chemistry has been essential for nuclear science and technology. Yet scientists still seek the ability to interpret and predict chemical and physical properties of actinide compounds and materials using first-principle theory and computational modeling. Actinide compounds are challenging to computational chemistry because of their complicated electron correlation effects and relativistic effects, including spin-orbit coupling effects. There have been significant developments in theoretical studies on actinide compounds in the past several years. The theoretical capabilities coupled with new experimental characterization techniques now offer a powerful combination for unraveling the complexities of actinide chemistry. In this talk, we will provide an overview of our own research in this field, with particular emphasis on applications of relativistic density functional and ab initio quantum chemical methods to the geometries, electronic structures, spectroscopy and excited-state properties of small actinide molecules such as CUO and UO2 and some large actinide compounds relevant to separation and environmental science. The performance of various density functional approaches and wavefunction theory-based electron correlation methods will be compared. The results of computational modeling on the vibrational, electronic, and NMR spectra of actinide compounds will be briefly discussed as well [1-4]. We will show that progress in relativistic quantum chemistry, computer hardware and computational chemistry software has enabled computational actinide chemistry to emerge as a powerful and predictive tool for research in actinide chemistry. (authors)

  20. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    Science.gov (United States)

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.
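
    For a qualitative TaqMan readout like this, the per-target call typically reduces to whether a quantification cycle (Ct) was observed below a validated cutoff. A minimal sketch of that decision rule; the 35-cycle cutoff and target names are hypothetical placeholders, not values from the published method:

```python
def call_target(ct, cutoff=35.0):
    """Call a TaqMan target 'detected' when its Ct is below a validated cutoff.

    `ct` is None when no amplification was observed. The cutoff of 35 cycles
    is an illustrative placeholder; real assays validate a cutoff per target.
    """
    return ct is not None and ct < cutoff

# Hypothetical Ct values for one sample across three of the 48 targets:
sample = {"lectin": 24.8, "P-35S": 28.3, "MON810": None}
calls = {target: call_target(ct) for target, ct in sample.items()}
print(calls)  # MON810 is reported as not detected
```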