WorldWideScience

Sample records for non-radioactive dapi-based high-throughput

  1. Development and application of a fluorescent glucose uptake assay for the high-throughput screening of non-glycoside SGLT2 inhibitors.

    Science.gov (United States)

    Wu, Szu-Huei; Yao, Chun-Hsu; Hsieh, Chieh-Jui; Liu, Yu-Wei; Chao, Yu-Sheng; Song, Jen-Shin; Lee, Jinq-Chyi

    2015-07-10

    Sodium-dependent glucose co-transporter 2 (SGLT2) inhibitors are of current interest as a treatment for type 2 diabetes. Efforts have been made to discover phlorizin-related glycosides with good SGLT2 inhibitory activity. To increase structural diversity and better understand the role of non-glycoside SGLT2 inhibitors on glycemic control, we initiated a research program to identify non-glycoside hits from high-throughput screening. Here, we report the development of a novel, fluorogenic probe-based glucose uptake system based on a Cu(I)-catalyzed [3+2] cycloaddition. The safer processes and cheaper reagents made the developed assay our first choice for large-scale primary screening compared with the well-known [14C]-labeled α-methyl-D-glucopyranoside ([14C]-AMG) radioactive assay. This effort culminated in the identification of a benzimidazole, non-glycoside SGLT2 hit with an EC50 value of 0.62 μM by high-throughput screening of 41,000 compounds. Copyright © 2015 Elsevier B.V. All rights reserved.
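
    The EC50 reported above is the kind of value typically obtained by fitting a four-parameter logistic (Hill) curve to concentration-response data from the uptake assay. The sketch below illustrates that generic fit; the concentrations, responses and starting guesses are hypothetical and not taken from the paper.

```python
# Illustrative four-parameter logistic (Hill) fit for estimating an EC50 from
# concentration-response data, as is typical in HTS hit confirmation.
# The concentrations and responses below are hypothetical, not from the paper.
import numpy as np
from scipy.optimize import curve_fit

def hill(c, bottom, top, ec50, n):
    """Four-parameter logistic: response as a function of concentration c."""
    return bottom + (top - bottom) / (1.0 + (ec50 / c) ** n)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])    # µM (hypothetical)
resp = np.array([3.0, 7.0, 18.0, 38.0, 62.0, 85.0, 95.0])  # % effect (hypothetical)

popt, _ = curve_fit(hill, conc, resp, p0=[0.0, 100.0, 0.5, 1.0], maxfev=10000)
print(f"EC50 ≈ {popt[2]:.2f} µM, Hill slope ≈ {popt[3]:.2f}")
```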

  2. Vision-based Nano Robotic System for High-throughput Non-embedded Cell Cutting.

    Science.gov (United States)

    Shang, Wanfeng; Lu, Haojian; Wan, Wenfeng; Fukuda, Toshio; Shen, Yajing

    2016-03-04

    Cell cutting is an important task in biological studies, but highly productive non-embedded cell cutting remains a major challenge for current techniques. This paper proposes a vision-based nano robotic system and then realizes automatic non-embedded cell cutting with this system. First, the nano robotic system is developed and integrated with a nanoknife inside an environmental scanning electron microscope (ESEM). Then, the positions of the nanoknife and the single cell are recognized, and the distance between them is calculated dynamically based on image processing. To guarantee positioning accuracy and working efficiency, we propose a distance-regulated speed-adapting strategy, in which the moving speed is adjusted intelligently based on the distance between the nanoknife and the target cell. The results indicate that automatic non-embedded cutting can be achieved within 1-2 min with low invasiveness, benefiting from the highly precise nanorobot system and the sharp edge of the nanoknife. This research paves the way for high-throughput cell cutting under the cell's natural conditions, which is expected to have a significant impact on biological studies, especially in-situ analysis at the cellular and subcellular scale, such as cell interaction investigation, neural signal transduction and low-invasive cell surgery.
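
    A minimal sketch of the distance-regulated speed-adapting idea described above: the approach speed is scaled down as the image-derived knife-cell distance shrinks. The speed limits, slow-zone width and linear scaling are illustrative assumptions, not the authors' parameters.

```python
# Minimal sketch of a distance-regulated speed-adapting strategy:
# the knife approaches quickly while far from the target cell and slows down
# as the image-derived distance shrinks, to protect positioning accuracy.
# All thresholds and speeds are illustrative assumptions, not the paper's values.

def adapted_speed(distance_um: float,
                  v_max_um_s: float = 50.0,
                  v_min_um_s: float = 0.5,
                  slow_zone_um: float = 20.0) -> float:
    """Return approach speed (µm/s) as a function of knife-cell distance (µm)."""
    if distance_um >= slow_zone_um:
        return v_max_um_s                       # far away: move at full speed
    if distance_um <= 0.0:
        return 0.0                              # contact reached: stop
    # inside the slow zone: scale speed linearly with remaining distance
    return v_min_um_s + (v_max_um_s - v_min_um_s) * distance_um / slow_zone_um

for d in (100.0, 20.0, 10.0, 2.0, 0.0):
    print(f"distance {d:5.1f} µm -> speed {adapted_speed(d):5.1f} µm/s")
```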

  3. Vision-based Nano Robotic System for High-throughput Non-embedded Cell Cutting

    Science.gov (United States)

    Shang, Wanfeng; Lu, Haojian; Wan, Wenfeng; Fukuda, Toshio; Shen, Yajing

    2016-03-01

    Cell cutting is an important task in biological studies, but highly productive non-embedded cell cutting remains a major challenge for current techniques. This paper proposes a vision-based nano robotic system and then realizes automatic non-embedded cell cutting with this system. First, the nano robotic system is developed and integrated with a nanoknife inside an environmental scanning electron microscope (ESEM). Then, the positions of the nanoknife and the single cell are recognized, and the distance between them is calculated dynamically based on image processing. To guarantee positioning accuracy and working efficiency, we propose a distance-regulated speed-adapting strategy, in which the moving speed is adjusted intelligently based on the distance between the nanoknife and the target cell. The results indicate that automatic non-embedded cutting can be achieved within 1-2 min with low invasiveness, benefiting from the highly precise nanorobot system and the sharp edge of the nanoknife. This research paves the way for high-throughput cell cutting under the cell's natural conditions, which is expected to have a significant impact on biological studies, especially in-situ analysis at the cellular and subcellular scale, such as cell interaction investigation, neural signal transduction and low-invasive cell surgery.

  4. Noise and non-linearities in high-throughput data

    International Nuclear Information System (INIS)

    Nguyen, Viet-Anh; Lió, Pietro; Koukolíková-Nicola, Zdena; Bagnoli, Franco

    2009-01-01

    High-throughput data analyses are becoming common in biology, communications, economics and sociology. The vast amounts of data are usually represented in the form of matrices and can be considered as knowledge networks. Spectra-based approaches have proved useful for extracting hidden information within such networks and for estimating missing data, but these methods are based essentially on linear assumptions. The physical models of matching, when applicable, often suggest non-linear mechanisms that may sometimes be mistaken for noise. The use of non-linear models in data analysis, however, may require the introduction of many parameters, which lowers the statistical weight of the model. Depending on the quality of the data, a simpler linear analysis may be preferable to more complex approaches. In this paper, we show how a simple non-parametric Bayesian model may be used to explore the role of non-linearities and noise in synthetic and experimental data sets.
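
    As a concrete reference point for the linear, spectra-based approaches the abstract contrasts with non-linear models, the sketch below reconstructs a synthetic noisy matrix with a rank-truncated SVD and compares the residual to the injected noise. It is an illustration of the linear baseline only, not the authors' non-parametric Bayesian model.

```python
# Sketch of the linear, spectra-based idea the abstract refers to: approximate a
# noisy data matrix by a low-rank truncated SVD and compare the residual with the
# injected noise level. Synthetic data only; not the authors' Bayesian model.
import numpy as np

rng = np.random.default_rng(0)
n, m, rank = 200, 150, 3
signal = rng.normal(size=(n, rank)) @ rng.normal(size=(rank, m))  # true low-rank structure
noise = 0.5 * rng.normal(size=(n, m))
data = signal + noise

U, s, Vt = np.linalg.svd(data, full_matrices=False)
approx = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]   # rank-truncated reconstruction

residual_rms = np.sqrt(np.mean((data - approx) ** 2))
signal_rms_error = np.sqrt(np.mean((signal - approx) ** 2))
print(f"residual vs data (≈ noise level): {residual_rms:.3f}")
print(f"error vs true signal:             {signal_rms_error:.3f}")
```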

  5. High throughput integrated thermal characterization with non-contact optical calorimetry

    Science.gov (United States)

    Hou, Sichao; Huo, Ruiqing; Su, Ming

    2017-10-01

    Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments and limited by low throughput, since only one sample is examined at a time. This work reports an infrared-based optical calorimetry, together with its theoretical foundation, which provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information of spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). This method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
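
    A much-simplified illustration of how a melting temperature and latent heat can be read from a time-domain temperature trace: with an assumed constant net heat input, melting appears as a plateau and the latent heat follows from the plateau duration. The heat input, mass and heat capacity below are assumptions, and this lumped model is not the calibration used in the paper.

```python
# Simplified lumped-heat-balance sketch: with an (assumed) constant net heat
# input P, the melting temperature shows up as a plateau in the temperature-time
# trace and the latent heat follows from the plateau duration, L = P * dt / m.
# This illustrates the principle only, not the calibration used in the paper.
import numpy as np

P = 0.20          # net heat input to the sample, W (assumed)
m = 0.001         # sample mass, kg (assumed)
c = 2000.0        # specific heat, J/(kg K) (assumed)

# Build a synthetic heating curve: ramp, 60 s melting plateau near 55 °C, ramp.
t = np.arange(0.0, 300.0, 1.0)
T = np.empty_like(t)
T[:150] = 40.0 + (P / (m * c)) * t[:150]           # pre-melt ramp
T[150:210] = T[149]                                # melting plateau
T[210:] = T[149] + (P / (m * c)) * (t[210:] - 210) # post-melt ramp

# Detect the plateau as the region where the heating rate is near zero.
rate = np.gradient(T, t)
plateau = rate < 0.01 * rate.max()
T_melt = T[plateau].mean()
dt_plateau = plateau.sum() * (t[1] - t[0])
latent_heat = P * dt_plateau / m                   # J/kg

print(f"melting temperature ≈ {T_melt:.1f} °C")
print(f"latent heat ≈ {latent_heat / 1000:.0f} kJ/kg")
```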

  6. Non-radioactive stand-in for radioactive contamination. I. Non-radioactive tests

    International Nuclear Information System (INIS)

    Rohe, M.J.; Rankin, W.N.; Postles, R.L.

    1985-01-01

    Candidate non-radioactive materials for use as a stand-in for radioactive contamination during application of high-pressure, hot-water decontamination were identified and evaluated. A stand-in for radioactive contamination is needed to evaluate the decontaminability of replacement canyon cranes at the manufacturer's location, where actual radioactive contamination cannot be used. This evaluation was conducted using high-pressure, hot water at 420 psi, 190°F, and 20 gal/min through a 1/8-in.-diam nozzle, the decontamination technique preferred by the SRP Separations Department for this application. A non-radioactive stand-in for radioactive contamination was desired that would be removed by direct blast stream contact but would remain intact on surfaces where direct contact does not occur. This memorandum describes the identification of candidate non-radioactive stand-in materials and the evaluation of these materials in screening tests and tests with high-pressure, hot-water blasting. The following non-radioactive materials were tested: carpenter's line chalk; typing correction fluid; dye penetrant developer; latex paint with attapulgite added; unaltered latex paint; gold enamel; layout fluid; and black enamel. Results show that blue layout fluid and gold enamel have similar adherence that is within the range expected for actual radioactive contamination. White latex paint has less adherence than expected for actual radioactive contamination. The film was removed at a rate of 2 . Black enamel has more adherence than expected from actual radioactive contamination. In these tests ASTM No. 2B surfaces were harder to clean than either ASTM No. 1 or electropolished surfaces, which had similar cleaning properties. A 90° blast angle was more effective than a 45° blast angle. In these tests there was no discernible effect of blast distance between 1 and 3 ft

  7. High-throughput GPU-based LDPC decoding

    Science.gov (United States)

    Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin

    2010-08-01

    Low-density parity-check (LDPC) code is a linear block code known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems such as DVB-S2, WiMAX, Wi-Fi and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing for LDPC decoding. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphic-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize this practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared to its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.
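
    For context on the iterative decoding that such GPU implementations parallelize, the sketch below runs a few min-sum iterations (a common approximation of the sum-product algorithm) on a toy parity-check matrix. The (7,4) Hamming code and channel LLRs are illustrative only; a production LDPC decoder uses much larger, sparser matrices.

```python
# Minimal min-sum (an approximation of the sum-product algorithm) decoder for a
# toy parity-check matrix, illustrating the iterative message passing that GPU
# implementations parallelize. Toy code and LLRs are illustrative only.
import numpy as np

H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])      # (7,4) Hamming code as a small example

def min_sum_decode(llr_ch, H, max_iter=20):
    """Return hard-decision bits from channel LLRs (positive LLR favours bit 0)."""
    m, n = H.shape
    v2c = H * llr_ch                        # variable-to-check messages
    for _ in range(max_iter):
        c2v = np.zeros_like(v2c, dtype=float)
        for i in range(m):
            idx = np.flatnonzero(H[i])
            msgs = v2c[i, idx]
            for k, j in enumerate(idx):
                others = np.delete(msgs, k)
                c2v[i, j] = np.prod(np.sign(others)) * np.min(np.abs(others))
        posterior = llr_ch + c2v.sum(axis=0)
        bits = (posterior < 0).astype(int)
        if not np.any(H @ bits % 2):        # all parity checks satisfied
            return bits
        v2c = H * (posterior - c2v)         # exclude own check's contribution
    return bits

# Noisy LLRs for the all-zero codeword with one unreliable position (illustrative).
llr = np.array([2.0, 1.5, -0.4, 1.8, 2.2, 1.7, 1.9])
print("decoded bits:", min_sum_decode(llr, H))
```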

  8. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high dimensional neuroinformatic representations, an index containing O(10^3-10^4) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  9. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas; Castro, David; Foulds, Ian G.

    2013-01-01

    This work demonstrates the detection method for a high-throughput droplet-based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads, we avoid the need for magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card-based agglutination assay.
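
    One plausible way to turn droplet images into an agglutination-strength score, sketched under stated assumptions: threshold the bead signal, label connected components, and report the fraction of bead pixels found in large aggregates. The threshold and minimum aggregate size are hypothetical, and this is not necessarily the exact metric used by the authors.

```python
# Sketch of one way to score agglutination strength from a droplet image:
# threshold the bead channel, label connected components, and report the fraction
# of bead pixels that sit in large aggregates. Thresholds are illustrative; this
# is not necessarily the exact metric used by the authors.
import numpy as np
from scipy import ndimage

def agglutination_score(image: np.ndarray,
                        intensity_threshold: float,
                        min_aggregate_px: int = 50) -> float:
    """Fraction of above-threshold pixels belonging to large connected aggregates."""
    mask = image > intensity_threshold
    if mask.sum() == 0:
        return 0.0
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    large = np.isin(labels, np.flatnonzero(sizes >= min_aggregate_px) + 1)
    return float(large.sum() / mask.sum())

# Synthetic example: scattered single beads vs one large aggregate.
rng = np.random.default_rng(1)
img = rng.normal(10, 2, size=(200, 200))
img[80:120, 80:120] += 100                                       # aggregate
img[rng.integers(0, 200, 40), rng.integers(0, 200, 40)] += 100   # single beads
print(f"agglutination score ≈ {agglutination_score(img, 60):.2f}")
```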

  10. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates the detection method for a high-throughput droplet-based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads, we avoid the need for magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card-based agglutination assay.

  11. High-throughput kinase assays with protein substrates using fluorescent polymer superquenching

    Directory of Open Access Journals (Sweden)

    Weatherford Wendy

    2005-05-01

    Full Text Available Background: High-throughput screening is used by the pharmaceutical industry for identifying lead compounds that interact with targets of pharmacological interest. Because of the key role that aberrant regulation of protein phosphorylation plays in diseases such as cancer, diabetes and hypertension, kinases have become one of the main drug targets. With the exception of antibody-based assays, methods to screen for specific kinase activity are generally restricted to the use of small synthetic peptides as substrates. However, the use of natural protein substrates has the advantage that potential inhibitors can be detected that affect enzyme activity by binding to a site other than the catalytic site. We have previously reported a non-radioactive and non-antibody-based fluorescence quench assay for detection of phosphorylation or dephosphorylation using synthetic peptide substrates. The aim of this work is to develop an assay for detection of phosphorylation of chemically unmodified proteins based on this polymer superquenching platform. Results: Using a modified QTL Lightspeed™ assay, phosphorylation of native protein was quantified by the interaction of the phosphorylated proteins with metal-ion coordinating groups co-located with fluorescent polymer deposited onto microspheres. The binding of phospho-protein inhibits a dye-labeled "tracer" peptide from associating to the phosphate-binding sites present on the fluorescent microspheres. The resulting inhibition of quench generates a "turn on" assay, in which the signal correlates with the phosphorylation of the substrate. The assay was tested on three different proteins: Myelin Basic Protein (MBP), Histone H1 and Phosphorylated heat- and acid-stable protein (PHAS-1). Phosphorylation of the proteins was detected by Protein Kinase Cα (PKCα) and by the Interleukin-1 Receptor-associated Kinase 4 (IRAK4). Enzyme inhibition yielded IC50 values that were comparable to those obtained using

  12. High-throughput kinase assays with protein substrates using fluorescent polymer superquenching.

    Science.gov (United States)

    Rininsland, Frauke; Stankewicz, Casey; Weatherford, Wendy; McBranch, Duncan

    2005-05-31

    High-throughput screening is used by the pharmaceutical industry for identifying lead compounds that interact with targets of pharmacological interest. Because of the key role that aberrant regulation of protein phosphorylation plays in diseases such as cancer, diabetes and hypertension, kinases have become one of the main drug targets. With the exception of antibody-based assays, methods to screen for specific kinase activity are generally restricted to the use of small synthetic peptides as substrates. However, the use of natural protein substrates has the advantage that potential inhibitors can be detected that affect enzyme activity by binding to a site other than the catalytic site. We have previously reported a non-radioactive and non-antibody-based fluorescence quench assay for detection of phosphorylation or dephosphorylation using synthetic peptide substrates. The aim of this work is to develop an assay for detection of phosphorylation of chemically unmodified proteins based on this polymer superquenching platform. Using a modified QTL Lightspeed assay, phosphorylation of native protein was quantified by the interaction of the phosphorylated proteins with metal-ion coordinating groups co-located with fluorescent polymer deposited onto microspheres. The binding of phospho-protein inhibits a dye-labeled "tracer" peptide from associating to the phosphate-binding sites present on the fluorescent microspheres. The resulting inhibition of quench generates a "turn on" assay, in which the signal correlates with the phosphorylation of the substrate. The assay was tested on three different proteins: Myelin Basic Protein (MBP), Histone H1 and Phosphorylated heat- and acid-stable protein (PHAS-1). Phosphorylation of the proteins was detected by Protein Kinase Cα (PKCα) and by the Interleukin-1 Receptor-associated Kinase 4 (IRAK4). Enzyme inhibition yielded IC50 values that were comparable to those obtained using peptide substrates. Statistical parameters that
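
    The record is truncated at "Statistical parameters that ...". A statistical parameter routinely reported to qualify high-throughput assays is the Z'-factor, Z' = 1 - 3(sd_pos + sd_neg)/|mean_pos - mean_neg|; the sketch below computes it from hypothetical control wells and makes no claim about the values reported in the paper.

```python
# Z'-factor, a standard HTS assay-quality statistic:
# Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
# The control readings below are hypothetical, purely to illustrate the formula.
import numpy as np

def z_prime(pos_controls, neg_controls):
    pos, neg = np.asarray(pos_controls, float), np.asarray(neg_controls, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

pos = [980, 1010, 995, 1020, 990, 1005]   # fully phosphorylated substrate (hypothetical)
neg = [110, 95, 120, 100, 105, 115]       # no-kinase controls (hypothetical)
print(f"Z' = {z_prime(pos, neg):.2f}")    # > 0.5 is generally considered a robust assay
```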

  13. New Insights into the in situ Microscopic Visualization and Quantification of Inorganic Polyphosphate Stores by 4’,6-Diamidino-2-Phenylindole (DAPI)-Staining

    Science.gov (United States)

    Gomes, F.M.; Ramos, I.B.; Wendt, C.; Girard-Dias, W.; De Souza, W.; Machado, E.A.; Miranda, K.

    2013-01-01

    Inorganic polyphosphate (PolyP) is a biological polymer that plays important roles in the cell physiology of both prokaryotic and eukaryotic organisms. Among the available methods for PolyP localization and quantification, a 4’,6-diamidino-2-phenylindole (DAPI)-based assay has been used for visualization of PolyP-rich organelles. Due to differences in DAPI permeability to different compartments and/or PolyP retention after fixation, a general protocol for DAPI-PolyP staining has not yet been established. Here, we tested different protocols for DAPI-PolyP detection in a range of samples with different levels of DAPI permeability, including subcellular fractions, free-living cells and cryosections of fixed tissues. Subcellular fractions of PolyP-rich organelles yielded DAPI-PolyP fluorescence, although those with a complex external layer usually required longer incubation times, previous aldehyde fixation and/or detergent permeabilization. DAPI-PolyP was also detected in cryosections of OCT-embedded tissues analyzed by multiphoton microscopy. In addition, a semi-quantitative fluorimetric analysis of DAPI-stained fractions showed PolyP mobilization in a similar fashion to what has been demonstrated with the use of enzyme-based quantitative protocols. Taken together, our results support the use of DAPI for both PolyP visualization and quantification, although specific steps are suggested as a general guideline for DAPI-PolyP staining in biological samples with different degrees of DAPI and PolyP permeability. PMID:24441187

  14. Operational evaluation of high-throughput community-based mass prophylaxis using Just-in-time training.

    Science.gov (United States)

    Spitzer, James D; Hupert, Nathaniel; Duckart, Jonathan; Xiong, Wei

    2007-01-01

    Community-based mass prophylaxis is a core public health operational competency, but staffing needs may overwhelm the local trained health workforce. Just-in-time (JIT) training of emergency staff and computer modeling of workforce requirements represent two complementary approaches to address this logistical problem. Multnomah County, Oregon, conducted a high-throughput point of dispensing (POD) exercise to test JIT training and computer modeling to validate POD staffing estimates. The POD had 84% non-health-care worker staff and processed 500 patients per hour. Post-exercise modeling replicated observed staff utilization levels and queue formation, including development and amelioration of a large medical evaluation queue caused by lengthy processing times and understaffing in the first half-hour of the exercise. The exercise confirmed the feasibility of using JIT training for high-throughput antibiotic dispensing clinics staffed largely by nonmedical professionals. Patient processing times varied over the course of the exercise, with important implications for both staff reallocation and future POD modeling efforts. Overall underutilization of staff revealed the opportunity for greater efficiencies and even higher future throughputs.
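
    The staffing logic behind such POD models can be illustrated with back-of-envelope capacity arithmetic: each station processes staff / service-time patients per hour, and the slowest station sets the POD throughput, consistent with the medical-evaluation queue observed in the exercise. The staffing levels and processing times below are hypothetical, not the exercise's data.

```python
# Back-of-envelope capacity model of a point-of-dispensing (POD) clinic:
# each station's hourly throughput is staff / mean processing time, and the POD
# as a whole is limited by its bottleneck station. Staffing levels and processing
# times below are hypothetical, not the Multnomah County exercise values.

stations = {
    # station: (staff on duty, mean processing time per patient, minutes)
    "greeting/triage":     (4, 1.0),
    "medical evaluation":  (6, 4.0),
    "dispensing":          (8, 2.0),
}

def station_rate(staff: int, minutes_per_patient: float) -> float:
    """Patients per hour a station can process."""
    return staff * 60.0 / minutes_per_patient

rates = {name: station_rate(*cfg) for name, cfg in stations.items()}
bottleneck = min(rates, key=rates.get)

for name, rate in rates.items():
    print(f"{name:20s} {rate:6.0f} patients/hour")
print(f"POD throughput limited by '{bottleneck}' at {rates[bottleneck]:.0f} patients/hour")
```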

  15. High-content, high-throughput screening for the identification of cytotoxic compounds based on cell morphology and cell proliferation markers.

    Directory of Open Access Journals (Sweden)

    Heather L Martin

    Full Text Available Toxicity is a major cause of failure in drug discovery and development, and whilst robust toxicological testing occurs, efficiency could be improved if compounds with cytotoxic characteristics were identified during primary compound screening. The use of high-content imaging in primary screening is becoming more widespread, and by utilising phenotypic approaches it should be possible to incorporate cytotoxicity counter-screens into primary screens. Here we present a novel phenotypic assay that can be used as a counter-screen to identify compounds with adverse cellular effects. This assay has been developed using U2OS cells, the PerkinElmer Operetta high-content/high-throughput imaging system and Columbus image analysis software. In Columbus, algorithms were devised to identify changes in nuclear morphology, cell shape and proliferation using DAPI, TOTO-3 and phosphohistone H3 staining, respectively. The algorithms were developed and tested on cells treated with doxorubicin, taxol and nocodazole. The assay was then used to screen a novel chemical library of over 300 compounds, rich in natural product-like molecules, 13.6% of which were identified as having adverse cellular effects. This assay provides a relatively cheap and rapid approach for identifying compounds with adverse cellular effects during screening assays, potentially reducing compound rejection due to toxicity in subsequent in vitro and in vivo assays.

  16. Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime

    Science.gov (United States)

    Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie

    2017-09-01

    Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting in high-throughput conditions. Droplet microfluidics is used as a promising platform for the very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently-labeled biomolecules. Fluorescence lifetime detection by time-correlated single photon counting is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.
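
    The lifetime readout underlying such an assay is usually obtained by fitting an exponential decay to the TCSPC photon-arrival histogram. The sketch below does this on simulated photons and ignores the instrument response function; it illustrates the principle rather than the authors' analysis chain.

```python
# Sketch of extracting a fluorescence lifetime from a TCSPC histogram by fitting a
# single-exponential decay plus constant background. Synthetic photon data; the
# instrument response function is ignored for simplicity.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
true_tau_ns = 3.5
arrival_ns = rng.exponential(true_tau_ns, size=50_000)        # photon arrival times
counts, edges = np.histogram(arrival_ns, bins=256, range=(0, 25))
centers = 0.5 * (edges[:-1] + edges[1:])

def decay(t, amplitude, tau, background):
    return amplitude * np.exp(-t / tau) + background

popt, _ = curve_fit(decay, centers, counts, p0=[counts.max(), 2.0, 1.0])
print(f"fitted lifetime ≈ {popt[1]:.2f} ns (simulated {true_tau_ns} ns)")
```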

  17. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    Science.gov (United States)

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.

  18. Fluorescence-based high-throughput screening of dicer cleavage activity.

    Science.gov (United States)

    Podolska, Katerina; Sedlak, David; Bartunek, Petr; Svoboda, Petr

    2014-03-01

    Production of small RNAs by ribonuclease III Dicer is a key step in microRNA and RNA interference pathways, which employ Dicer-produced small RNAs as sequence-specific silencing guides. Further studies and manipulations of microRNA and RNA interference pathways would benefit from identification of small-molecule modulators. Here, we report a study of a fluorescence-based in vitro Dicer cleavage assay, which was adapted for high-throughput screening. The kinetic assay can be performed under single-turnover conditions (35 nM substrate and 70 nM Dicer) in a small volume (5 µL), which makes it suitable for high-throughput screening in a 1536-well format. As a proof of principle, a small library of bioactive compounds was analyzed, demonstrating potential of the assay.

  19. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequences analysis.

  20. High-throughput screening of filamentous fungi using nanoliter-range droplet-based microfluidics

    Science.gov (United States)

    Beneyton, Thomas; Wijaya, I. Putu Mahendra; Postros, Prexilia; Najah, Majdi; Leblond, Pascal; Couvent, Angélique; Mayot, Estelle; Griffiths, Andrew D.; Drevelle, Antoine

    2016-06-01

    Filamentous fungi are an extremely important source of industrial enzymes because of their capacity to secrete large quantities of proteins. Currently, functional screening of fungi is associated with low throughput and high costs, which severely limits the discovery of novel enzymatic activities and better production strains. Here, we describe a nanoliter-range droplet-based microfluidic system specially adapted for the high-throughput screening (HTS) of large filamentous fungi libraries for secreted enzyme activities. The platform allowed (i) compartmentalization of single spores in ~10 nl droplets, (ii) germination and mycelium growth and (iii) high-throughput sorting of fungi based on enzymatic activity. A 10^4-clone UV-mutated library of Aspergillus niger was screened based on α-amylase activity in just 90 minutes. Active clones were enriched 196-fold after a single round of microfluidic HTS. The platform is a powerful tool for the development of new production strains with low cost, space and time footprint and should bring enormous benefit for improving the viability of biotechnological processes.

  1. Effectiveness of a high-throughput genetic analysis in the identification of responders/non-responders to CYP2D6-metabolized drugs.

    Science.gov (United States)

    Savino, Maria; Seripa, Davide; Gallo, Antonietta P; Garrubba, Maria; D'Onofrio, Grazia; Bizzarro, Alessandra; Paroni, Giulia; Paris, Francesco; Mecocci, Patrizia; Masullo, Carlo; Pilotto, Alberto; Santini, Stefano A

    2011-01-01

    Recent studies investigating the single cytochrome P450 (CYP) 2D6 allele *2A reported an association with the response to drug treatments. More genetic data can be obtained, however, by high-throughput-based technologies. The aim of this study is the high-throughput analysis of CYP2D6 polymorphisms to evaluate its effectiveness in the identification of patient responders/non-responders to CYP2D6-metabolized drugs. An attempt to compare our results with those previously obtained with the standard analysis of CYP2D6 allele *2A was also made. Sixty blood samples from patients treated with CYP2D6-metabolized drugs, previously genotyped for the allele CYP2D6*2A, were analyzed for CYP2D6 polymorphisms with the AutoGenomics INFINITI CYP4502D6-I assay on the AutoGenomics INFINITI analyzer. A higher frequency of mutated alleles was observed in responder than in non-responder patients (75.38% vs 43.48%; p = 0.015). Thus, the presence of a mutated allele of CYP2D6 was associated with a response to CYP2D6-metabolized drugs (OR = 4.044 [1.348-12.154]). No difference was observed in the distribution of allele *2A (p = 0.320). The high-throughput genetic analysis of CYP2D6 polymorphisms better discriminates responders/non-responders than the standard analysis of the CYP2D6 allele *2A. A high-throughput genetic assay of CYP2D6 may be useful to identify patients with different clinical responses to CYP2D6-metabolized drugs.
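
    The reported association can be checked from the percentages given above: the odds ratio follows directly from the proportions of mutated alleles in responders and non-responders. Group sizes are not stated in the record, so the calculation below works from the rounded percentages and is expected to land near, not exactly on, the reported OR = 4.044.

```python
# Reconstructing the reported odds ratio from the stated percentages of mutated
# alleles (75.38% in responders vs 43.48% in non-responders). Group sizes are not
# given in the record, so this uses the proportions only; small differences from
# the reported OR = 4.044 are expected from rounding.
p_resp, p_nonresp = 0.7538, 0.4348

odds_resp = p_resp / (1.0 - p_resp)
odds_nonresp = p_nonresp / (1.0 - p_nonresp)
odds_ratio = odds_resp / odds_nonresp

print(f"odds (responders)     = {odds_resp:.3f}")
print(f"odds (non-responders) = {odds_nonresp:.3f}")
print(f"odds ratio            ≈ {odds_ratio:.2f}")   # close to the reported 4.044
```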

  2. Identification of fluorescent compounds with non-specific binding property via high throughput live cell microscopy.

    Directory of Open Access Journals (Sweden)

    Sangeeta Nath

    Full Text Available INTRODUCTION: Compounds exhibiting low non-specific intracellular binding or non-stickiness are concomitant with rapid clearing and in high demand for live-cell imaging assays because they allow for intracellular receptor localization with a high signal/noise ratio. The non-stickiness property is particularly important for imaging intracellular receptors due to the equilibria involved. METHOD: Three mammalian cell lines with diverse genetic backgrounds were used to screen a combinatorial fluorescence library via high throughput live cell microscopy for potential ligands with high in- and out-flux properties. The binding properties of ligands identified from the first screen were subsequently validated on plant root hair. A correlative analysis was then performed between each ligand and its corresponding physiochemical and structural properties. RESULTS: The non-stickiness property of each ligand was quantified as a function of the temporal uptake and retention on a cell-by-cell basis. Our data show that (i) mammalian systems can serve as a pre-screening tool for complex plant species that are not amenable to high-throughput imaging; (ii) retention and spatial localization of chemical compounds vary within and between each cell line; and (iii) the structural similarities of compounds can infer their non-specific binding properties. CONCLUSION: We have validated a protocol for identifying chemical compounds with non-specific binding properties that is testable across diverse species. Further analysis reveals an overlap between the non-stickiness property and the structural similarity of compounds. The net result is a more robust screening assay for identifying desirable ligands that can be used to monitor intracellular localization. Several new applications of the screening protocol and results are also presented.

  3. High-throughput Sequencing Based Immune Repertoire Study during Infectious Disease

    Directory of Open Access Journals (Sweden)

    Dongni Hou

    2016-08-01

    Full Text Available The selectivity of the adaptive immune response is based on the enormous diversity of T and B cell antigen-specific receptors. The immune repertoire, the collection of T and B cells with functional diversity in the circulatory system at any given time, is dynamic and reflects the essence of immune selectivity. In this article, we review the recent advances in immune repertoire studies of infectious diseases achieved by traditional techniques and by high-throughput sequencing techniques. High-throughput sequencing techniques enable the determination of complementarity-determining regions of lymphocyte receptors with unprecedented efficiency and scale. This progress in methodology enhances the understanding of immunologic changes during pathogen challenge, and also provides a basis for further development of novel diagnostic markers, immunotherapies and vaccines.

  4. Measuring topology of low-intensity DNA methylation sites for high-throughput assessment of epigenetic drug-induced effects in cancer cells

    International Nuclear Information System (INIS)

    Gertych, Arkadiusz; Farkas, Daniel L.; Tajbakhsh, Jian

    2010-01-01

    Epigenetic anti-cancer drugs with demethylating effects have been shown to alter genome organization in mammalian cell nuclei. The interest in the development of novel epigenetic drugs has increased the demand for cell-based assays to evaluate drug performance in pre-clinical studies. An imaging-based cytometrical approach that can measure demethylation effects as changes in the spatial nuclear distributions of methylated cytosine and global DNA in cancer cells is introduced in this paper. The cells were studied by immunofluorescence with a specific antibody against 5-methylcytosine (MeC), and 4′,6-diamidino-2-phenylindole (DAPI) for delineation of methylated sites and global DNA in nuclei. In the preprocessing step, the segmentation of nuclei in three-dimensional (3-D) images is followed by an automated assessment of nuclear DAPI/MeC patterns to exclude dissimilar entities. Next, low-intensity MeC (LIM) and low-intensity DNA (LID) sites of similar nuclei are localized and processed to obtain specific nuclear density profiles. These profiles, sampled at half of the total nuclear volume, yielded two parameters: LIM0.5 and LID0.5. The analysis shows that zebularine and 5-azacytidine, the two tested epigenetic drugs, introduce changes in the spatial distribution of low-intensity DNA and MeC signals. LIM0.5 and LID0.5 were significantly different (p < 0.001) in 5-azacytidine-treated (n = 660) and zebularine-treated (n = 496) vs. untreated (n = 649) DU145 human prostate cancer cells. In the latter case the LIM sites were predominantly found at the nuclear border, whereas treated populations showed different degrees of increase in LIMs towards the interior nuclear space, in which a large portion of heterochromatin is located. The cell-by-cell evaluation of changes in the spatial reorganization of MeC/DAPI signals revealed that zebularine is a more gentle demethylating agent than 5-azacytidine. Measuring changes in the topology of low-intensity sites can potentially be a

  5. A multilayer microdevice for cell-based high-throughput drug screening

    International Nuclear Information System (INIS)

    Liu, Chong; Wang, Lei; Li, Jingmin; Ding, Xiping; Chunyu, Li; Xu, Zheng; Wang, Qi

    2012-01-01

    A multilayer polydimethylsiloxane microdevice for cell-based high-throughput drug screening is described in this paper. This established microdevice was based on a modularization method and integrated a drug/medium concentration gradient generator (CGG), pneumatic microvalves and a cell culture microchamber array. The CGG was able to generate five steps of linear concentrations with the same outlet flow rate. The medium/drug flowed through the CGG and then vertically into the pear-shaped cell culture microchambers. This vertical perfusion mode was used to reduce the impact of flow-induced shear stress on the physiology of cells in the microchambers. Pear-shaped microchambers with two arrays of micropillars at each outlet were adopted in this microdevice, which were beneficial to cell distribution. Apoptosis experiments in which the Cisplatin-resistant cell line A549/DDP was exposed to the chemotherapeutic Cisplatin (DDP) were performed successfully on this platform. The results showed that this novel microdevice could not only provide well-defined and stable conditions for cell culture, but was also useful for cell-based high-throughput drug screening with lower reagent and time consumption. (paper)

  6. High-throughput characterization methods for lithium batteries

    Directory of Open Access Journals (Sweden)

    Yingchun Lyu

    2017-09-01

    Full Text Available The development of high-performance lithium ion batteries requires the discovery of new materials and the optimization of key components. In contrast to the traditional one-by-one method, high-throughput methods can synthesize and characterize a large number of compositionally varying samples, accelerating the pace of discovery, development and optimization of materials. Because of rapid progress in thin-film and automatic control technologies, thousands of compounds with different compositions can now be synthesized rapidly, even in a single experiment. However, the lack of rapid or combinatorial characterization technologies to match high-throughput synthesis methods limits the application of high-throughput technology. Here, we review a series of representative high-throughput characterization methods used in lithium batteries, including high-throughput structural and electrochemical characterization methods and rapid measuring technologies based on synchrotron light sources.

  7. High-throughput full-automatic synchrotron-based tomographic microscopy

    International Nuclear Information System (INIS)

    Mader, Kevin; Marone, Federica; Hintermueller, Christoph; Mikuljan, Gordan; Isenegger, Andreas; Stampanoni, Marco

    2011-01-01

    At the TOMCAT (TOmographic Microscopy and Coherent rAdiology experimenTs) beamline of the Swiss Light Source, with an energy range of 8-45 keV and voxel size from 0.37 µm to 7.4 µm, full tomographic datasets are typically acquired in 5 to 10 min. To exploit the speed of the system and enable high-throughput studies to be performed in a fully automatic manner, a package of automation tools has been developed. The samples are automatically exchanged, aligned, moved to the correct region of interest, and scanned. This task is accomplished through the coordination of Python scripts, a robot-based sample-exchange system, sample positioning motors and a CCD camera. The tools are suited for any samples that can be mounted on a standard SEM stub, and require no specific environmental conditions. Up to 60 samples can be analyzed at a time without user intervention. The throughput of the system is dependent on resolution, energy and sample size, but rates of four samples per hour have been achieved with 0.74 µm voxel size at 17.5 keV. The maximum intervention-free scanning time is theoretically unlimited, and in practice experiments have been running unattended as long as 53 h (the average beam time allocation at TOMCAT is 48 h per user). The system is the first fully automated high-throughput tomography station: mounting samples, finding regions of interest, scanning and reconstructing can be performed without user intervention. The system also includes many features which accelerate and simplify the process of tomographic microscopy.

  8. Classification of solid wastes as non-radioactive wastes

    International Nuclear Information System (INIS)

    Suzuki, Masahiro; Tomioka, Hideo; Kamike, Kozo; Komatu, Junji

    1995-01-01

    Radioactive wastes generally include nuclear fuels and materials contaminated by radioactive substances or neutron activation that are to be discarded. In Japan, solid wastes arising from radiation-controlled areas in nuclear facilities are treated and stored as radioactive solid wastes during the operation of those facilities. However, these wastes include many non-radioactive wastes. In particular, a large amount of waste is expected to be generated during the decommissioning of nuclear facilities in the near future. It is important to classify these wastes into non-radioactive and radioactive wastes. The exemption and recycling criteria for radioactive solid wastes are still under discussion and have not yet been decided in Japan. Under these circumstances, the Nuclear Safety Committee recently decided on the concept of the non-radioactive waste category for wastes arising from the decommissioning of nuclear facilities. The concept is based on the separation and removal of the radioactively contaminated parts from radioactive solid wastes. The residual parts of these solid wastes will be treated as non-radioactive waste if there is no significant difference in radioactivity between comparable natural materials and the materials from which the radioactive contaminants have been removed. This paper describes the procedures for classifying solid wastes as non-radioactive wastes. (author)

  9. A high-throughput surface plasmon resonance biosensor based on differential interferometric imaging

    International Nuclear Information System (INIS)

    Wang, Daqian; Ding, Lili; Zhang, Wei; Zhang, Enyao; Yu, Xinglong; Luo, Zhaofeng; Ou, Huichao

    2012-01-01

    A new high-throughput surface plasmon resonance (SPR) biosensor based on differential interferometric imaging is reported. The two SPR interferograms of the sensing surface are imaged on two CCD cameras. The phase difference between the two interferograms is 180°. The refractive index related factor (RIRF) of the sensing surface is calculated from the two simultaneously acquired interferograms. The simulation results indicate that the RIRF exhibits a linear relationship with the refractive index of the sensing surface and is unaffected by the noise, drift and intensity distribution of the light source. The affinity and kinetic information can be extracted in real time from continuously acquired RIRF distributions. The results of refractometry experiments show that the dynamic detection range of SPR differential interferometric imaging system can be over 0.015 refractive index unit (RIU). High refractive index resolution is down to 0.45 RU (1 RU = 1 × 10⁻⁶ RIU). Imaging and protein microarray experiments demonstrate the ability of high-throughput detection. The aptamer experiments demonstrate that the SPR sensor based on differential interferometric imaging has a great capability to be implemented for high-throughput aptamer kinetic evaluation. These results suggest that this biosensor has the potential to be utilized in proteomics and drug discovery after further improvement. (paper)
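
    The record does not give the formula for the refractive-index-related factor (RIRF), but the benefit of two interferograms 180° out of phase can be illustrated with an assumed normalized-difference readout, which cancels the common source intensity and is consistent with the reported insensitivity to source noise and drift. The fringe model below is a simplification, not the authors' expression.

```python
# Illustration of why two interferograms 180 deg out of phase allow a source-
# intensity-independent readout: I1 = I0*(1 + m*cos(phi)), I2 = I0*(1 - m*cos(phi)),
# so (I1 - I2)/(I1 + I2) = m*cos(phi) regardless of I0. The exact refractive-index-
# related factor (RIRF) used by the authors is not given in the record; this is an
# assumed, simplified form for illustration.
import numpy as np

def normalized_difference(i1, i2):
    return (i1 - i2) / (i1 + i2)

phi = np.linspace(0, np.pi, 5)            # interferometric phase (radians)
m = 0.8                                   # fringe modulation depth (assumed)
for source_intensity in (1.0, 2.5):       # arbitrary source drift
    i1 = source_intensity * (1 + m * np.cos(phi))
    i2 = source_intensity * (1 - m * np.cos(phi))
    print(np.round(normalized_difference(i1, i2), 3))
# Both rows are identical: the readout does not depend on the source intensity.
```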

  10. Novel high-throughput cell-based hybridoma screening methodology using the Celigo Image Cytometer.

    Science.gov (United States)

    Zhang, Haohai; Chan, Leo Li-Ying; Rice, William; Kassam, Nasim; Longhi, Maria Serena; Zhao, Haitao; Robson, Simon C; Gao, Wenda; Wu, Yan

    2017-08-01

    Hybridoma screening is a critical step for antibody discovery, which necessitates prompt identification of potential clones from hundreds to thousands of hybridoma cultures against the desired immunogen. Technical issues associated with ELISA- and flow cytometry-based screening limit accuracy and diminish high-throughput capability, increasing time and cost. Conventional ELISA screening with coated antigen is also impractical for difficult-to-express hydrophobic membrane antigens or multi-chain protein complexes. Here, we demonstrate novel high-throughput screening methodology employing the Celigo Image Cytometer, which avoids nonspecific signals by contrasting antibody binding signals directly on living cells, with and without recombinant antigen expression. The image cytometry-based high-throughput screening method was optimized by detecting the binding of hybridoma supernatants to the recombinant antigen CD39 expressed on Chinese hamster ovary (CHO) cells. Next, the sensitivity of the image cytometer was demonstrated by serial dilution of purified CD39 antibody. Celigo was used to measure antibody affinities of commercial and in-house antibodies to membrane-bound CD39. This cell-based screening procedure can be completely accomplished within one day, significantly improving throughput and efficiency of hybridoma screening. Furthermore, measuring direct antibody binding to living cells eliminated both false positive and false negative hits. The image cytometry method was highly sensitive and versatile, and could detect positive antibody in supernatants at concentrations as low as ~5 ng/mL, with concurrent Kd binding affinity coefficient determination. We propose that this screening method will greatly facilitate antibody discovery and screening technologies. Copyright © 2017 Elsevier B.V. All rights reserved.
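
    The Kd determination mentioned above is typically done by fitting a one-site specific-binding curve to signal versus antibody concentration. The sketch below shows that generic fit with hypothetical titration data; it is not the Celigo software's internal routine.

```python
# Sketch of Kd estimation from an antibody titration on antigen-expressing cells
# via a one-site specific binding model, B = Bmax * c / (Kd + c). Concentrations
# and signals are hypothetical; this is the generic approach, not the Celigo
# software's internal routine.
import numpy as np
from scipy.optimize import curve_fit

def one_site(c, bmax, kd):
    return bmax * c / (kd + c)

conc_nM = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])    # hypothetical
signal = np.array([120, 330, 870, 1700, 2400, 2800, 2950])      # hypothetical

popt, _ = curve_fit(one_site, conc_nM, signal, p0=[3000.0, 5.0])
print(f"Bmax ≈ {popt[0]:.0f} a.u., Kd ≈ {popt[1]:.1f} nM")
```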

  11. High Throughput WAN Data Transfer with Hadoop-based Storage

    Science.gov (United States)

    Amin, A.; Bockelman, B.; Letts, J.; Levshina, T.; Martin, T.; Pi, H.; Sfiligoi, I.; Thomas, M.; Wüerthwein, F.

    2011-12-01

    Hadoop distributed file system (HDFS) is becoming more popular in recent years as a key building block of integrated grid storage solution in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process datasets of PetaBytes scale in a highly distributed grid computing environment. In this paper, we present the experience of high throughput WAN data transfer with HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  12. High Throughput WAN Data Transfer with Hadoop-based Storage

    International Nuclear Information System (INIS)

    Amin, A; Thomas, M; Bockelman, B; Letts, J; Martin, T; Pi, H; Sfiligoi, I; Wüerthwein, F; Levshina, T

    2011-01-01

    Hadoop distributed file system (HDFS) is becoming more popular in recent years as a key building block of integrated grid storage solution in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process datasets of PetaBytes scale in a highly distributed grid computing environment. In this paper, we present the experience of high throughput WAN data transfer with HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  13. A continuous high-throughput bioparticle sorter based on 3D traveling-wave dielectrophoresis.

    Science.gov (United States)

    Cheng, I-Fang; Froude, Victoria E; Zhu, Yingxi; Chang, Hsueh-Chia; Chang, Hsien-Chang

    2009-11-21

    We present a high throughput (maximum flow rate approximately 10 µl/min or linear velocity approximately 3 mm/s) continuous bio-particle sorter based on 3D traveling-wave dielectrophoresis (twDEP) at an optimum AC frequency of 500 kHz. The high throughput sorting is achieved with a sustained twDEP particle force normal to the continuous through-flow, which is applied over the entire chip by a single 3D electrode array. The design allows continuous fractionation of micron-sized particles into different downstream sub-channels based on differences in their twDEP mobility on both sides of the cross-over. Conventional DEP is integrated upstream to focus the particles into a single levitated queue to allow twDEP sorting by mobility difference and to minimize sedimentation and field-induced lysis. The 3D electrode array design minimizes the offsetting effect of nDEP (negative DEP with particle force towards regions with weak fields) on twDEP such that both forces increase monotonically with voltage to further increase the throughput. Effective focusing and separation of red blood cells from debris-filled heterogeneous samples are demonstrated, as well as size-based separation of poly-dispersed liposome suspensions into two distinct bands at 2.3 to 4.6 µm and 1.5 to 2.7 µm, at the highest throughput recorded in hand-held chips of 6 µl/min.

  14. SeqAPASS to evaluate conservation of high-throughput screening targets across non-mammalian species

    Science.gov (United States)

    Cell-based high-throughput screening (HTS) and computational technologies are being applied as tools for toxicity testing in the 21st century. The U.S. Environmental Protection Agency (EPA) embraced these technologies and created the ToxCast Program in 2007, which has served as a...

  15. A High-Throughput Antibody-Based Microarray Typing Platform

    Directory of Open Access Journals (Sweden)

    Ashan Perera

    2013-05-01

    Full Text Available Many rapid methods have been developed for screening foods for the presence of pathogenic microorganisms. Rapid methods that have the additional ability to identify microorganisms via multiplexed immunological recognition have the potential for classification or typing of microbial contaminants thus facilitating epidemiological investigations that aim to identify outbreaks and trace back the contamination to its source. This manuscript introduces a novel, high throughput typing platform that employs microarrayed multiwell plate substrates and laser-induced fluorescence of the nucleic acid intercalating dye/stain SYBR Gold for detection of antibody-captured bacteria. The aim of this study was to use this platform for comparison of different sets of antibodies raised against the same pathogens as well as demonstrate its potential effectiveness for serotyping. To that end, two sets of antibodies raised against each of the “Big Six” non-O157 Shiga toxin-producing E. coli (STEC) as well as E. coli O157:H7 were array-printed into microtiter plates, and serial dilutions of the bacteria were added and subsequently detected. Though antibody specificity was not sufficient for the development of an STEC serotyping method, the STEC antibody sets performed reasonably well exhibiting that specificity increased at lower capture antibody concentrations or, conversely, at lower bacterial target concentrations. The favorable results indicated that with sufficiently selective and ideally concentrated sets of biorecognition elements (e.g., antibodies or aptamers), this high-throughput platform can be used to rapidly type microbial isolates derived from food samples within ca. 80 min of total assay time. It can also potentially be used to detect the pathogens from food enrichments and at least serve as a platform for testing antibodies.

  16. Distamycin A/DAPI bands and the effects of 5-azacytidine on the chromosomes of the chimpanzee, Pan troglodytes.

    Science.gov (United States)

    Schmid, M; Haaf, T

    1984-01-01

    The chromosomes of the chimpanzee were stained with distamycin A/DAPI, which labels specific C-bands. Bright distamycin A/DAPI fluorescence was found in the heterochromatic regions of chromosomes 6, 11, 14 to 16, 18 to 20, and 23 and the Y. Lymphocyte cultures from chimpanzees were treated with low doses of 5-azacytidine during the last hours of culture. This cytosine analog induces highly distinct undercondensations in 28 heterochromatic regions of 19 chromosomes. These 5-azacytidine-sensitive regions are predominantly located in the terminal C-bands of the chromosomes. In vitro treatment with 5-azacytidine also preserves into the metaphase stage somatic pairings between the 5-azacytidine-sensitive heterochromatic regions in interphase nuclei. The homologies and differences regarding the chromosomal localization of distamycin A/DAPI-bright C-bands, 5-azacytidine-sensitive heterochromatin, 5-methylcytosine-rich DNA sequences, and satellite DNAs in the chimpanzee and man are discussed.

  17. ASSESSMENT OF RADIOACTIVE AND NON-RADIOACTIVE CONTAMINANTS FOUND IN LOW LEVEL RADIOACTIVE WASTE STREAMS

    International Nuclear Information System (INIS)

    R.H. Little, P.R. Maul, J.S.S. Penfold

    2003-01-01

    This paper describes and presents the findings from two studies undertaken for the European Commission to assess the long-term impact upon the environment and human health of non-radioactive contaminants found in various low level radioactive waste streams. The initial study investigated the application of safety assessment approaches developed for radioactive contaminants to the assessment of nonradioactive contaminants in low level radioactive waste. It demonstrated how disposal limits could be derived for a range of non-radioactive contaminants and generic disposal facilities. The follow-up study used the same approach but undertook more detailed, disposal system specific calculations, assessing the impacts of both the non-radioactive and radioactive contaminants. The calculations undertaken indicated that it is prudent to consider non-radioactive, as well as radioactive contaminants, when assessing the impacts of low level radioactive waste disposal. For some waste streams with relatively low concentrations of radionuclides, the potential post-closure disposal impacts from non-radioactive contaminants can be comparable with the potential radiological impacts. For such waste streams there is therefore an added incentive to explore options for recycling the materials involved wherever possible

  18. Accurate CpG and non-CpG cytosine methylation analysis by high-throughput locus-specific pyrosequencing in plants.

    Science.gov (United States)

    How-Kit, Alexandre; Daunay, Antoine; Mazaleyrat, Nicolas; Busato, Florence; Daviaud, Christian; Teyssier, Emeline; Deleuze, Jean-François; Gallusci, Philippe; Tost, Jörg

    2015-07-01

    Pyrosequencing permits accurate quantification of DNA methylation of specific regions, where the proportion of the C/T polymorphism induced by sodium bisulfite treatment of DNA reflects the DNA methylation level. The commercially available high-throughput locus-specific pyrosequencing instruments allow the simultaneous analysis of 96 samples, but restrict DNA methylation analysis to CpG dinucleotide sites, which can be limiting in many biological systems. In contrast to mammals, where DNA methylation occurs nearly exclusively on CpG dinucleotides, plant genomes harbor DNA methylation also in other sequence contexts, including CHG and CHH motifs, which cannot be evaluated by these pyrosequencing instruments due to software limitations. Here, we present a complete pipeline for accurate CpG and non-CpG cytosine methylation analysis at single-base resolution using high-throughput locus-specific pyrosequencing. The devised approach includes the design and validation of PCR amplification on bisulfite-treated DNA and of pyrosequencing assays, as well as the quantification of the methylation level at every cytosine from the raw peak intensities of the pyrograms by two newly developed Visual Basic Applications. Our method gives accurate and reproducible results, as exemplified by the cytosine methylation analysis of the promoter regions of two tomato genes (NOR and CNR) encoding transcription regulators of fruit ripening during different stages of fruit development. Our results confirmed a significant and temporally coordinated loss of DNA methylation on specific cytosines during the early stages of fruit development in both promoters, as previously shown by WGBS. The manuscript thus describes the first high-throughput locus-specific DNA methylation analysis in plants using pyrosequencing.
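
    The per-cytosine quantification described above reduces to the C/(C+T) proportion at each interrogated position of the bisulfite-converted strand, since methylated cytosines read as C and unmethylated ones as T. The authors implemented this in Visual Basic Applications; the Python sketch below illustrates the same calculation with hypothetical peak intensities.

```python
# For each interrogated cytosine on bisulfite-converted DNA,
# methylation = C_peak / (C_peak + T_peak): methylated cytosines read as C,
# unmethylated ones as T. The peak heights below are hypothetical, and this
# Python sketch stands in for the authors' Visual Basic Applications.

def methylation_level(c_peak: float, t_peak: float) -> float:
    """Fraction methylated at one cytosine from C and T pyrogram peak intensities."""
    total = c_peak + t_peak
    if total == 0:
        raise ValueError("no signal at this position")
    return c_peak / total

# Hypothetical peak intensities for three cytosines (CpG, CHG, CHH contexts).
positions = {"CpG_+12": (820.0, 180.0), "CHG_+27": (150.0, 850.0), "CHH_+41": (40.0, 960.0)}
for name, (c, t) in positions.items():
    print(f"{name}: {100 * methylation_level(c, t):.1f}% methylated")
```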

  19. Consequences of Stoichiometric Error on Nuclear DNA Content Evaluation in Coffea liberica var. dewevrei using DAPI and Propidium Iodide

    OpenAIRE

    NOIROT, MICHEL; BARRE, PHILIPPE; LOUARN, JACQUES; DUPERRAY, CHRISTOPHE; HAMON, SERGE

    2002-01-01

The genome size of coffee trees (Coffea sp.) was assessed using flow cytometry. Nuclear DNA was stained with two dyes [4′,6-diamidino-2-phenylindole dihydrochloride hydrate (DAPI) and propidium iodide (PI)]. Fluorescence in coffee tree nuclei (C-PI or C-DAPI) was compared with that of the standard, petunia (P-PI or P-DAPI). If there is no stoichiometric error, then the ratio between fluorescence of the target nuclei and that of the standard nuclei (R-PI or R-DAPI) is expected to be proportional...

  20. Microengineering methods for cell-based microarrays and high-throughput drug-screening applications

    International Nuclear Information System (INIS)

    Xu Feng; Wu Jinhui; Wang Shuqi; Gurkan, Umut Atakan; Demirci, Utkan; Durmus, Naside Gozde

    2011-01-01

    Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often faces concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility.

  1. Microengineering methods for cell-based microarrays and high-throughput drug-screening applications

    Energy Technology Data Exchange (ETDEWEB)

    Xu Feng; Wu Jinhui; Wang Shuqi; Gurkan, Umut Atakan; Demirci, Utkan [Department of Medicine, Demirci Bio-Acoustic-MEMS in Medicine (BAMM) Laboratory, Center for Biomedical Engineering, Brigham and Women' s Hospital, Harvard Medical School, Boston, MA (United States); Durmus, Naside Gozde, E-mail: udemirci@rics.bwh.harvard.edu [School of Engineering and Division of Biology and Medicine, Brown University, Providence, RI (United States)

    2011-09-15

    Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often faces concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility.

  2. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    Science.gov (United States)

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...

  3. Optimization and high-throughput screening of antimicrobial peptides.

    Science.gov (United States)

    Blondelle, Sylvie E; Lohner, Karl

    2010-01-01

While a well-established process for lead compound discovery in for-profit companies, high-throughput screening is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries, combined with easy and less expensive access to new technologies, has greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole cell-based assays. In particular, the urgent need for new antimicrobial compounds that would overcome the rapid rise of drug-resistant microorganisms, where multiple target assays or cell-based assays are often required, has forced scientists to focus on high-throughput technologies. Based on their existence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope in discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed a rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability nowadays of high-quantity and high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences as well as the latest structural knowledge and optimization processes aimed at improving peptide selectivity.

  4. A cell-based high-throughput screening assay for radiation susceptibility using automated cell counting

    International Nuclear Information System (INIS)

    Hodzic, Jasmina; Dingjan, Ilse; Maas, Mariëlle JP; Meulen-Muileman, Ida H van der; Menezes, Renee X de; Heukelom, Stan; Verheij, Marcel; Gerritsen, Winald R; Geldof, Albert A; Triest, Baukelien van; Beusechem, Victor W van

    2015-01-01

    Radiotherapy is one of the mainstays in the treatment for cancer, but its success can be limited due to inherent or acquired resistance. Mechanisms underlying radioresistance in various cancers are poorly understood and available radiosensitizers have shown only modest clinical benefit. There is thus a need to identify new targets and drugs for more effective sensitization of cancer cells to irradiation. Compound and RNA interference high-throughput screening technologies allow comprehensive enterprises to identify new agents and targets for radiosensitization. However, the gold standard assay to investigate radiosensitivity of cancer cells in vitro, the colony formation assay (CFA), is unsuitable for high-throughput screening. We developed a new high-throughput screening method for determining radiation susceptibility. Fast and uniform irradiation of batches up to 30 microplates was achieved using a Perspex container and a clinically employed linear accelerator. The readout was done by automated counting of fluorescently stained nuclei using the Acumen eX3 laser scanning cytometer. Assay performance was compared to that of the CFA and the CellTiter-Blue homogeneous uniform-well cell viability assay. The assay was validated in a whole-genome siRNA library screening setting using PC-3 prostate cancer cells. On 4 different cancer cell lines, the automated cell counting assay produced radiation dose response curves that followed a linear-quadratic equation and that exhibited a better correlation to the results of the CFA than did the cell viability assay. Moreover, the cell counting assay could be used to detect radiosensitization by silencing DNA-PKcs or by adding caffeine. In a high-throughput screening setting, using 4 Gy irradiated and control PC-3 cells, the effects of DNA-PKcs siRNA and non-targeting control siRNA could be clearly discriminated. We developed a simple assay for radiation susceptibility that can be used for high-throughput screening. This will aid
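
    The abstract notes that the automated-counting dose-response curves follow a linear-quadratic equation. A minimal sketch, using made-up surviving-fraction data, of fitting that model SF(D) = exp(-(alpha*D + beta*D^2)) to such a curve:

```python
import numpy as np
from scipy.optimize import curve_fit

def lq_model(dose, alpha, beta):
    """Linear-quadratic survival model."""
    return np.exp(-(alpha * dose + beta * dose**2))

dose = np.array([0.0, 2.0, 4.0, 6.0, 8.0])            # Gy (hypothetical)
surviving = np.array([1.0, 0.55, 0.22, 0.07, 0.015])  # normalized counts (hypothetical)

(alpha, beta), _ = curve_fit(lq_model, dose, surviving, p0=(0.2, 0.02))
print(f"alpha = {alpha:.3f} 1/Gy, beta = {beta:.4f} 1/Gy^2")
```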

  5. High-throughput single nucleotide polymorphism genotyping using nanofluidic Dynamic Arrays

    Directory of Open Access Journals (Sweden)

    Crenshaw Andrew

    2009-01-01

Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) have emerged as the genetic marker of choice for mapping disease loci and candidate gene association studies, because of their high density and relatively even distribution in the human genome. There is a need for systems allowing medium multiplexing (ten to hundreds of SNPs) with high throughput, which can efficiently and cost-effectively generate genotypes for a very large sample set (thousands of individuals). Methods that are flexible, fast, accurate and cost-effective are urgently needed. This is also important for those who work on high throughput genotyping in non-model systems where off-the-shelf assays are not available and a flexible platform is needed. Results We demonstrate the use of a nanofluidic Integrated Fluidic Circuit (IFC)-based genotyping system for medium-throughput multiplexing known as the Dynamic Array, by genotyping 994 individual human DNA samples on 47 different SNP assays, using nanoliter volumes of reagents. Call rates of greater than 99.5% and call accuracies of greater than 99.8% were achieved from our study, which demonstrates that this is a formidable genotyping platform. The experimental set up is very simple, with a time-to-result for each sample of about 3 hours. Conclusion Our results demonstrate that the Dynamic Array is an excellent genotyping system for medium-throughput multiplexing (30-300 SNPs), which is simple to use and combines rapid throughput with excellent call rates, high concordance and low cost. The exceptional call rates and call accuracy obtained may be of particular interest to those working on validation and replication of genome-wide association (GWA) studies.
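
    A minimal sketch, with invented genotype calls, of the two figures of merit quoted above (call rate and concordance against a reference genotype set):

```python
def call_rate(calls):
    """Fraction of attempted genotypes that returned a call (not 'NC' = no call)."""
    return sum(c != "NC" for c in calls) / len(calls)

def concordance(calls, reference):
    """Agreement with reference genotypes, over positions called in both sets."""
    paired = [(c, r) for c, r in zip(calls, reference) if c != "NC" and r != "NC"]
    return sum(c == r for c, r in paired) / len(paired)

calls     = ["AA", "AG", "GG", "NC", "AG", "AA"]   # hypothetical platform calls
reference = ["AA", "AG", "GG", "GG", "AG", "AA"]   # hypothetical reference genotypes
print(f"call rate:   {call_rate(calls):.1%}")
print(f"concordance: {concordance(calls, reference):.1%}")
```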

  6. Defining the taxonomic domain of applicability for mammalian-based high-throughput screening assays

    Science.gov (United States)

    Cell-based high throughput screening (HTS) technologies are becoming mainstream in chemical safety evaluations. The US Environmental Protection Agency (EPA) Toxicity Forecaster (ToxCastTM) and the multi-agency Tox21 Programs have been at the forefront in advancing this science, m...

  7. Caveats and limitations of plate reader-based high-throughput kinetic measurements of intracellular calcium levels

    International Nuclear Information System (INIS)

    Heusinkveld, Harm J.; Westerink, Remco H.S.

    2011-01-01

Calcium plays a crucial role in virtually all cellular processes, including neurotransmission. The intracellular Ca2+ concentration ([Ca2+]i) is therefore an important readout in neurotoxicological and neuropharmacological studies. Consequently, there is an increasing demand for high-throughput measurements of [Ca2+]i, e.g. using multi-well microplate readers, in hazard characterization, human risk assessment and drug development. However, changes in [Ca2+]i are highly dynamic, thereby creating challenges for high-throughput measurements. Nonetheless, several protocols are now available for real-time kinetic measurement of [Ca2+]i in plate reader systems, though the results of such plate reader-based measurements have been questioned. In view of the increasing use of plate reader systems for measurements of [Ca2+]i a careful evaluation of current technologies is warranted. We therefore performed an extensive set of experiments, using two cell lines (PC12 and B35) and two fluorescent calcium-sensitive dyes (Fluo-4 and Fura-2), for comparison of a linear plate reader system with single cell fluorescence microscopy. Our data demonstrate that the use of plate reader systems for high-throughput real-time kinetic measurements of [Ca2+]i is associated with many pitfalls and limitations, including erroneous sustained increases in fluorescence, limited sensitivity and lack of single cell resolution. Additionally, our data demonstrate that probenecid, which is often used to prevent dye leakage, effectively inhibits the depolarization-evoked increase in [Ca2+]i. Overall, the data indicate that the use of current plate reader-based strategies for high-throughput real-time kinetic measurements of [Ca2+]i is associated with caveats and limitations that require further investigation. - Research highlights: → The use of plate readers for high-throughput screening of intracellular Ca2+ is associated with many pitfalls and limitations. → Single cell

  8. High Throughput Facility

    Data.gov (United States)

Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  9. Throughput rate study

    International Nuclear Information System (INIS)

    Ford, L.; Bailey, W.; Gottlieb, P.; Emami, F.; Fleming, M.; Robertson, D.

    1993-01-01

The Civilian Radioactive Waste Management System (CRWMS) Management and Operating (M&O) Contractor has completed a study to analyze system-wide impacts of operating the CRWMS at varying throughput rates, including the 3000 MTU/year rate which has been assumed in the past. Impacts of throughput rate on all phases of the CRWMS operations (acceptance, transportation, storage and disposal) were evaluated. The results of the study indicate that a range from 3000 to 5000 MTU/year is preferred, based on system cost per MTU of SNF emplaced and logistics constraints.

  10. Electron capture detector based on a non-radioactive electron source: operating parameters vs. analytical performance

    Directory of Open Access Journals (Sweden)

    E. Bunert

    2017-12-01

Full Text Available Gas chromatographs with electron capture detectors are widely used for the analysis of electron affine substances such as pesticides or chlorofluorocarbons. With detection limits in the low pptv range, electron capture detectors are the most sensitive detectors available for such compounds. Based on their operating principle, they require free electrons at atmospheric pressure, which are usually generated by a β− decay. However, the use of radioactive materials leads to regulatory restrictions regarding purchase, operation, and disposal. Here, we present a novel electron capture detector based on a non-radioactive electron source that shows similar detection limits compared to radioactive detectors but that is not subject to these limitations and offers further advantages such as adjustable electron densities and energies. In this work we show first experimental results using 1,1,2-trichloroethane and sevoflurane, and investigate the effect of several operating parameters on the analytical performance of this new non-radioactive electron capture detector (ECD).

  11. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format

    OpenAIRE

    Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E

    2011-01-01

    Abstract The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated p...

  12. High throughput imaging cytometer with acoustic focussing.

    Science.gov (United States)

    Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter

    2015-10-31

    We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint.

  13. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    Science.gov (United States)

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
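
    A minimal sketch of the kind of batch export described above, writing a set of per-sample peak lists (peak number, area, glucose unit value) to a single JSON file and a single CSV file; the field names and values are illustrative only, not GlycoExtractor's actual schema.

```python
import csv, json

samples = {
    "sample_01": [{"peak": 1, "area": 1520.4, "gu": 5.12},
                  {"peak": 2, "area": 830.9,  "gu": 6.47}],
    "sample_02": [{"peak": 1, "area": 990.1,  "gu": 5.10}],
}

# One connected JSON document for the whole sample set
with open("profiles.json", "w") as fh:
    json.dump(samples, fh, indent=2)

# Flat CSV for spreadsheet-style downstream analysis
with open("profiles.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["sample", "peak", "area", "gu"])
    for name, peaks in samples.items():
        for p in peaks:
            writer.writerow([name, p["peak"], p["area"], p["gu"]])
```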

  14. Management of High-Throughput DNA Sequencing Projects: Alpheus.

    Science.gov (United States)

    Miller, Neil A; Kingsmore, Stephen F; Farmer, Andrew; Langley, Raymond J; Mudge, Joann; Crow, John A; Gonzalez, Alvaro J; Schilkey, Faye D; Kim, Ryan J; van Velkinburgh, Jennifer; May, Gregory D; Black, C Forrest; Myers, M Kathy; Utsey, John P; Frost, Nicholas S; Sugarbaker, David J; Bueno, Raphael; Gullans, Stephen R; Baxter, Susan M; Day, Steve W; Retzel, Ernest F

    2008-12-26

High-throughput DNA sequencing has enabled systems biology to begin to address areas in health, agricultural and basic biological research. Concomitant with the opportunities is an absolute necessity to manage significant volumes of high-dimensional and inter-related data and analysis. Alpheus is an analysis pipeline, database and visualization software for use with massively parallel DNA sequencing technologies that feature multi-gigabase throughput characterized by relatively short reads, such as Illumina-Solexa (sequencing-by-synthesis), Roche-454 (pyrosequencing) and Applied Biosystems' SOLiD (sequencing-by-ligation). Alpheus enables alignment to reference sequence(s), detection of variants and enumeration of sequence abundance, including expression levels in transcriptome sequence. Alpheus is able to detect several types of variants, including non-synonymous and synonymous single nucleotide polymorphisms (SNPs), insertions/deletions (indels), premature stop codons, and splice isoforms. Variant detection is aided by the ability to filter variant calls based on consistency, expected allele frequency, sequence quality, coverage, and variant type in order to minimize false positives while maximizing the identification of true positives. Alpheus also enables comparisons of genes with variants between cases and controls or bulk segregant pools. Sequence-based differential expression comparisons can be developed, with data export to SAS JMP Genomics for statistical analysis.
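
    A minimal sketch, not Alpheus itself, of the filtering idea described above: variant calls are retained only if they pass coverage, allele-frequency, and quality thresholds. The field names and threshold values are hypothetical.

```python
variants = [
    {"pos": 101, "type": "SNP",   "coverage": 42, "allele_freq": 0.48, "mean_qual": 32},
    {"pos": 257, "type": "indel", "coverage": 6,  "allele_freq": 0.15, "mean_qual": 18},
    {"pos": 389, "type": "SNP",   "coverage": 25, "allele_freq": 0.97, "mean_qual": 35},
]

def passes(v, min_cov=10, min_freq=0.25, min_qual=20):
    """Keep a call only if it meets all minimum evidence thresholds."""
    return (v["coverage"] >= min_cov
            and v["allele_freq"] >= min_freq
            and v["mean_qual"] >= min_qual)

kept = [v for v in variants if passes(v)]
print(f"{len(kept)} of {len(variants)} variant calls retained")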

  15. Affinity selection-mass spectrometry and its emerging application to the high throughput screening of G protein-coupled receptors.

    Science.gov (United States)

    Whitehurst, Charles E; Annis, D Allen

    2008-07-01

    Advances in combinatorial chemistry and genomics have inspired the development of novel affinity selection-based screening techniques that rely on mass spectrometry to identify compounds that preferentially bind to a protein target. Of the many affinity selection-mass spectrometry techniques so far documented, only a few solution-based implementations that separate target-ligand complexes away from unbound ligands persist today as routine high throughput screening platforms. Because affinity selection-mass spectrometry techniques do not rely on radioactive or fluorescent reporters or enzyme activities, they can complement traditional biochemical and cell-based screening assays and enable scientists to screen targets that may not be easily amenable to other methods. In addition, by employing mass spectrometry for ligand detection, these techniques enable high throughput screening of massive library collections of pooled compound mixtures, vastly increasing the chemical space that a target can encounter during screening. Of all drug targets, G protein coupled receptors yield the highest percentage of therapeutically effective drugs. In this manuscript, we present the emerging application of affinity selection-mass spectrometry to the high throughput screening of G protein coupled receptors. We also review how affinity selection-mass spectrometry can be used as an analytical tool to guide receptor purification, and further used after screening to characterize target-ligand binding interactions, enabling the classification of orthosteric and allosteric binders.

  16. A high throughput DNA extraction method with high yield and quality

    Directory of Open Access Journals (Sweden)

    Xin Zhanguo

    2012-07-01

Full Text Available Abstract Background Preparation of large quantity and high quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as genetic mapping, TILLING (Targeting Induced Local Lesions IN Genomes), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available. However, they are either low throughput, low yield, or costly. Here, we describe a method for high throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. Results We developed a high throughput DNA isolation method by combining a high yield CTAB extraction method with an improved cleanup procedure based on the MagAttract kit. The method yielded large quantities of high quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30-fold with 4 times less consumption of MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. Conclusion A high throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample of magnetic beads plus other consumables that other methods will also need.

  17. Karyotype analysis of three Solanum plants using combined PI-DAPI ...

    African Journals Online (AJOL)

    The chromosomes were distinguished by combined PI-DAPI (CPD) staining and double fluorescence in situ hybridization (FISH) with 45S and 5S rDNA probes and their molecular cytogenetic karyotypes were established. Although, the karyotype of S. surattense Burm. and S. photeinocarpum Nakam was first established, ...

  18. Risk-based high-throughput chemical screening and prioritization using exposure models and in vitro bioactivity assays

    DEFF Research Database (Denmark)

    Shin, Hyeong-Moo; Ernstoff, Alexi; Arnot, Jon

    2015-01-01

    We present a risk-based high-throughput screening (HTS) method to identify chemicals for potential health concerns or for which additional information is needed. The method is applied to 180 organic chemicals as a case study. We first obtain information on how the chemical is used and identify....../oral contact, or dermal exposure. The method provides high-throughput estimates of exposure and important input for decision makers to identify chemicals of concern for further evaluation with additional information or more refined models....

  19. Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening

    Science.gov (United States)

    William R. Kenealy; Thomas W. Jeffries

    2003-01-01

High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml−1. The assay is linear for sugar...

  20. Study on a digital pulse processing algorithm based on template-matching for high-throughput spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wen, Xianfei; Yang, Haori

    2015-06-01

    A major challenge in utilizing spectroscopy techniques for nuclear safeguards is to perform high-resolution measurements at an ultra-high throughput rate. Traditionally, piled-up pulses are rejected to ensure good energy resolution. To improve throughput rate, high-pass filters are normally implemented to shorten pulses. However, this reduces signal-to-noise ratio and causes degradation in energy resolution. In this work, a pulse pile-up recovery algorithm based on template-matching was proved to be an effective approach to achieve high-throughput gamma ray spectroscopy. First, a discussion of the algorithm was given in detail. Second, the algorithm was then successfully utilized to process simulated piled-up pulses from a scintillator detector. Third, the algorithm was implemented to analyze high rate data from a NaI detector, a silicon drift detector and a HPGe detector. The promising results demonstrated the capability of this algorithm to achieve high-throughput rate without significant sacrifice in energy resolution. The performance of the template-matching algorithm was also compared with traditional shaping methods. - Highlights: • A detailed discussion on the template-matching algorithm was given. • The algorithm was tested on data from a NaI and a Si detector. • The algorithm was successfully implemented on high rate data from a HPGe detector. • The performance of the algorithm was compared with traditional shaping methods. • The advantage of the algorithm in active interrogation was discussed.
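
    A minimal sketch of the template-matching idea, not the authors' implementation: a piled-up waveform is modeled as a sum of time-shifted copies of a known pulse template, and the individual pulse amplitudes are recovered by linear least squares. Arrival times are assumed known here (e.g., from a trigger stage), and the pulse shape and data are synthetic.

```python
import numpy as np

n = 200
t = np.arange(n)
template = np.exp(-t / 30.0) * (1 - np.exp(-t / 5.0))    # generic detector pulse shape

def shifted(tmpl, shift, length):
    """Copy of the template delayed by `shift` samples, zero-padded."""
    out = np.zeros(length)
    out[shift:] = tmpl[: length - shift]
    return out

true_amps, arrivals = (1.0, 0.6), (20, 45)                # two overlapping pulses
waveform = sum(a * shifted(template, s, n) for a, s in zip(true_amps, arrivals))
waveform += np.random.default_rng(0).normal(0, 0.01, n)   # measurement noise

design = np.column_stack([shifted(template, s, n) for s in arrivals])
amps, *_ = np.linalg.lstsq(design, waveform, rcond=None)
print("recovered amplitudes:", np.round(amps, 3))          # close to [1.0, 0.6]
```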

  1. High-throughput continuous cryopump

    International Nuclear Information System (INIS)

    Foster, C.A.

    1986-01-01

A cryopump with a unique method of regeneration which allows continuous operation at high throughput has been constructed and tested. Deuterium was pumped continuously at a throughput of 30 Torr.L/s at a speed of 2000 L/s and a compression ratio of 200. Argon was pumped at a throughput of 60 Torr.L/s at a speed of 1275 L/s. To produce continuous operation of the pump, a method of regeneration that does not thermally cycle the pump is employed. A small chamber (the ''snail'') passes over the pumping surface and removes the frost from it either by mechanical action with a scraper or by local heating. The material removed is topologically in a secondary vacuum system with low conductance into the primary vacuum; thus, the exhaust can be pumped at pressures up to an effective compression ratio determined by the ratio of the pumping speed to the leakage conductance of the snail. The pump, which is all-metal-sealed and dry and which regenerates every 60 s, would be an ideal system for pumping tritium. Potential fusion applications are for pump limiters, for repeating pneumatic pellet injection lines, and for the centrifuge pellet injector spin tank, all of which will require pumping tritium at high throughput. Industrial applications requiring ultraclean pumping of corrosive gases at high throughput, such as the reactive ion etch semiconductor process, may also be feasible.
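
    A short worked check of the quoted deuterium figures, assuming the usual definitions of throughput Q, pumping speed S, inlet pressure, and compression ratio, and the stated limit set by the snail leakage conductance:

```latex
\[
  p_\mathrm{in} = \frac{Q}{S} = \frac{30\ \mathrm{Torr\,L/s}}{2000\ \mathrm{L/s}}
                = 1.5\times10^{-2}\ \mathrm{Torr},
  \qquad
  K = \frac{p_\mathrm{ex}}{p_\mathrm{in}} \lesssim \frac{S}{C_\mathrm{leak}},
\]
```

    so a compression ratio of 200 implies an exhaust pressure of roughly 3 Torr and a snail leakage conductance on the order of S/K ≈ 10 L/s.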

  2. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    Science.gov (United States)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the high-efficiency video coding video coding standard is presented. First, an implementation-friendly and simplified bitrate estimation method of rate-distortion cost calculation is proposed to reduce the computational complexity in the mode decision of SAO. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, multiparallel VLSI architecture for in-loop filters, which integrates both deblocking filter and SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filters architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filters architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high-operating clock frequency of 297 MHz with TSMC 65-nm library and meet the real-time requirement of the in-loop filters for 8 K × 4 K video format at 132 fps.

  3. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Because blood is easily accessible, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many non-significant species at orders of magnitude higher concentrations. The extreme requirements of measurement sensitivity, dynamic range and specificity make the method development extremely challenging. Current clinical protein biomarker measurement primarily relies on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of high-quality protein antibodies is both expensive and time consuming. The limited capability of assay multiplexing also makes the measurement an extremely low-throughput one, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high throughput and quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  4. Proton radioactivity at non-collective prolate shape in high spin state of 94Ag

    International Nuclear Information System (INIS)

    Aggarwal, Mamta

    2010-01-01

We predict proton radioactivity and structural transitions in the high spin state of an excited exotic nucleus near the proton drip line in a theoretical framework and investigate the nature and the consequences of the structural transitions on separation energy as a function of temperature and spin. It reveals that the rotation of the excited exotic nucleus 94Ag at excitation energies around 6.7 MeV and angular momentum near 21ħ generates a rarely seen prolate non-collective shape, and the proton separation energy becomes negative, which indicates proton radioactivity in agreement with the experimental results of Mukha et al. for 94Ag.

  5. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.
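
    A minimal sketch of the final step implied above, assuming the free and total concentrations have already been quantified from the SPME calibration: the fraction bound to protein follows directly from the two readings. The numbers are hypothetical.

```python
def fraction_bound(c_free: float, c_total: float) -> float:
    """Protein-bound fraction from non-depletive free and total concentration measurements."""
    return 1.0 - c_free / c_total

# Hypothetical example: 0.6 uM free analyte out of 20 uM total
print(f"protein binding: {100 * fraction_bound(0.6, 20.0):.1f}%")   # ~97%
```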

  6. A high-throughput microtiter plate based method for the determination of peracetic acid and hydrogen peroxide.

    Science.gov (United States)

    Putt, Karson S; Pugh, Randall B

    2013-01-01

Peracetic acid is gaining usage in numerous industries that have found a myriad of uses for its antimicrobial activity. However, rapid high-throughput quantitation methods for peracetic acid and hydrogen peroxide are lacking. Herein, we describe the development of a high-throughput microtiter plate based assay built upon well-known and trusted titration chemistries. The adaptation of these titration chemistries to rapid plate-based absorbance methods for the sequential determination of hydrogen peroxide specifically and the total amount of peroxides present in solution is described. The results of these methods were compared to those of a standard titration and found to be in good agreement. Additionally, the utility of the developed method is demonstrated through the generation of degradation curves of both peracetic acid and hydrogen peroxide in a mixed solution.
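
    A minimal sketch, with invented numbers, of how such plate-reader absorbances are typically converted to concentrations: a linear calibration from standards, after which peracetic acid can be taken as total peroxide minus the hydrogen-peroxide-specific reading. This illustrates the general approach, not the paper's exact chemistry or values.

```python
import numpy as np

std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])        # mM standards (hypothetical)
std_abs  = np.array([0.02, 0.14, 0.27, 0.53, 1.05])   # blank-corrected absorbance

slope, intercept = np.polyfit(std_conc, std_abs, 1)    # linear calibration

def to_conc(absorbance: float) -> float:
    """Invert the calibration line to get concentration in mM."""
    return (absorbance - intercept) / slope

h2o2_specific  = to_conc(0.31)   # H2O2-selective reaction well
total_peroxide = to_conc(0.78)   # total-peroxide reaction well
print(f"H2O2 ~ {h2o2_specific:.2f} mM, peracetic acid ~ {total_peroxide - h2o2_specific:.2f} mM")
```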

  7. Non-invasive high throughput approach for protein hydrophobicity determination based on surface tension.

    Science.gov (United States)

    Amrhein, Sven; Bauer, Katharina Christin; Galm, Lara; Hubbuch, Jürgen

    2015-12-01

The surface hydrophobicity of a protein is an important factor for its interactions in solution and thus the outcome of its production process. Yet most of the methods are not able to evaluate the influence of these hydrophobic interactions under natural conditions. In the present work we have established a high resolution stalagmometric method for surface tension determination on a liquid handling station, which can cope with accuracy as well as high throughput requirements. Surface tensions could be derived with a low sample consumption (800 μL) and a high reproducibility (content. The protein influence on the solutions' surface tension was correlated to the hydrophobicity of lysozyme, human lysozyme, BSA, and α-lactalbumin. Differences in proteins' hydrophobic character depending on pH and species could be resolved. Within this work we have developed a pH-dependent hydrophobicity ranking, which was found to be in good agreement with literature. For the studied pH range of 3-9, lysozyme from chicken egg white was identified to be the most hydrophilic. α-lactalbumin at pH 3 exhibited the most pronounced hydrophobic character. The stalagmometric method proved to outclass the widely used spectrophotometric method with bromophenol blue sodium salt as it gave reasonable results without restrictions on pH and protein species. © 2015 Wiley Periodicals, Inc.
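
    For context, a minimal sketch of the classical drop-count (stalagmometric) comparison against a reference liquid, which is the textbook form of the measurement; it is not necessarily the exact calculation used on the authors' liquid-handling station, and the counts and densities below are hypothetical.

```python
def surface_tension(n_ref, n_sample, rho_ref, rho_sample, gamma_ref=72.8):
    """Comparative stalagmometric estimate in mN/m (gamma_ref: water at ~20 C)."""
    return gamma_ref * (n_ref * rho_sample) / (n_sample * rho_ref)

# Hypothetical drop counts for the same dispensed volume of reference and sample
print(f"{surface_tension(n_ref=40, n_sample=47, rho_ref=0.998, rho_sample=1.002):.1f} mN/m")
```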

  8. MGI-oriented High-throughput Measurement of Interdiffusion Coefficient Matrices in Ni-based Superalloys

    Directory of Open Access Journals (Sweden)

    TANG Ying

    2017-01-01

Full Text Available One of the research hotspots in the field of high-temperature alloys is the search for substitutional elements for Re, in order to prepare single-crystal Ni-based superalloys with less or even no Re addition. Finding elements with similar or even lower diffusion coefficients than Re is one of the effective strategies. In multicomponent alloys, the interdiffusivity matrices are used to comprehensively characterize the diffusion ability of any alloying element. Therefore, accurate determination of the composition-dependent and temperature-dependent interdiffusivity matrices of different elements in the γ and γ' phases of Ni-based superalloys is a high priority. This paper briefly introduces the status of interdiffusivity matrix determination in Ni-based superalloys, and the methods for determining interdiffusivities in multicomponent alloys, including the traditional Matano-Kirkaldy method and the recently proposed numerical inverse method. Because the traditional Matano-Kirkaldy method is of low efficiency, experimental reports on interdiffusivity matrices in ternary and higher-order sub-systems of Ni-based superalloys are very scarce in the literature. In contrast, the numerical inverse method newly proposed in our research group based on Fick's second law can be utilized for high-throughput measurement of accurate interdiffusivity matrices in alloys with any number of components. After that, the successful application of the numerical inverse method in the high-throughput measurement of interdiffusivity matrices in alloys is demonstrated in the fcc (γ) phase of the ternary Ni-Al-Ta system. Moreover, the validation of the resulting composition-dependent and temperature-dependent interdiffusivity matrices is also comprehensively made. Then, this paper summarizes the recent progress in the measurement of interdiffusivity matrices in γ and γ' phases of a series of core ternary Ni-based superalloys achieved in
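
    For reference, the governing relation that such inverse methods fit is the standard multicomponent form of Fick's second law (one-dimensional diffusion couple, component n taken as the solvent); this is the textbook formulation, not a reproduction of the authors' specific equations:

```latex
\[
  \frac{\partial c_i}{\partial t}
  = \frac{\partial}{\partial x}\!\left( \sum_{j=1}^{n-1} \tilde{D}_{ij}^{\,n}\,
      \frac{\partial c_j}{\partial x} \right),
  \qquad i = 1,\dots,n-1,
\]
```

    where the interdiffusion coefficients D̃ij are the composition-dependent quantities to be extracted from measured diffusion-couple concentration profiles.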

  9. Non-radioactive waste management in a Nuclear Energy Research Institution

    Energy Technology Data Exchange (ETDEWEB)

Furusawa, Helio A.; Martins, Elaine A.J.; Cotrim, Marycel E.B.; Pires, Maria A. F., E-mail: helioaf@ipen.br, E-mail: elaine@ipen.br, E-mail: mecotrim@ipen.br, E-mail: mapires@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Quimica e Meio Ambiente]

    2013-07-01

For more than 50 years, non-radioactive materials have been used in processes at IPEN to support nuclear fuel development and all related activities. Reagents, raw materials, products and by-products have been stored. Many of these are hazardous, highly toxic or reactive materials. Some years ago, actions were taken to send part of these non-radioactive waste materials for proper disposal (technical incineration), resulting in an Institutional Non-Radioactive Waste Management Program. In 2005, an internal set of procedures and information entitled - Guia de Procedimentos para Armazenamento, Tratamento e Descarte de Residuos de Laboratorio Quimico - (Guide of Procedures for Storage, Treatment, and Disposal of Chemistry Laboratory Wastes) - was published to be used at IPEN's facilities. A database managed by software was created in order to allow the Units to input data and information about the routinely generated wastes and those already existing. Even after disposing of such a huge amount of waste, a latent demand still exists. Several goals were achieved, notably a well-organized and roomy space; safer storage places; enforcement of local, state, and nationwide laws (for radioactive and non-radioactive materials); and improvement in the control of chemicals, as hazardous and aged materials are more frequently disposed of. Special emphasis was placed on knowing and following laws, regulations, and technical norms, as the entire process is very detailed and is not a day-to-day routine for IPEN's technical personnel. The immediate consequence is that the safer the workplace, the more safely the nuclear-related activities are performed. (author)

  10. Non-radioactive waste management in a Nuclear Energy Research Institution

    International Nuclear Information System (INIS)

    Furusawa, Helio A.; Martins, Elaine A.J.; Cotrim, Marycel E.B.; Pires, Maria A. F.

    2013-01-01

For more than 50 years, non-radioactive materials have been used in processes at IPEN to support nuclear fuel development and all related activities. Reagents, raw materials, products and by-products have been stored. Many of these are hazardous, highly toxic or reactive materials. Some years ago, actions were taken to send part of these non-radioactive waste materials for proper disposal (technical incineration), resulting in an Institutional Non-Radioactive Waste Management Program. In 2005, an internal set of procedures and information entitled - Guia de Procedimentos para Armazenamento, Tratamento e Descarte de Residuos de Laboratorio Quimico - (Guide of Procedures for Storage, Treatment, and Disposal of Chemistry Laboratory Wastes) - was published to be used at IPEN's facilities. A database managed by software was created in order to allow the Units to input data and information about the routinely generated wastes and those already existing. Even after disposing of such a huge amount of waste, a latent demand still exists. Several goals were achieved, notably a well-organized and roomy space; safer storage places; enforcement of local, state, and nationwide laws (for radioactive and non-radioactive materials); and improvement in the control of chemicals, as hazardous and aged materials are more frequently disposed of. Special emphasis was placed on knowing and following laws, regulations, and technical norms, as the entire process is very detailed and is not a day-to-day routine for IPEN's technical personnel. The immediate consequence is that the safer the workplace, the more safely the nuclear-related activities are performed. (author)

  11. UAV-Based Thermal Imaging for High-Throughput Field Phenotyping of Black Poplar Response to Drought

    Directory of Open Access Journals (Sweden)

    Riccardo Ludovisi

    2017-09-01

Full Text Available Poplars are fast-growing, high-yielding forest tree species, whose cultivation as second-generation biofuel crops is of increasing interest and can efficiently meet emission reduction goals. Yet, breeding elite poplar trees for drought resistance remains a major challenge. Worldwide breeding programs are largely focused on intra/interspecific hybridization, whereby Populus nigra L. is a fundamental parental pool. While high-throughput genotyping has resulted in unprecedented capabilities to rapidly decode complex genetic architecture of plant stress resistance, linking genomics to phenomics is hindered by technically challenging phenotyping. Relying on unmanned aerial vehicle (UAV)-based remote sensing and imaging techniques, high-throughput field phenotyping (HTFP) aims at enabling highly precise and efficient, non-destructive screening of genotype performance in large populations. To efficiently support forest-tree breeding programs, ground-truthing observations should be complemented with standardized HTFP. In this study, we develop a high-resolution (leaf level) HTFP approach to investigate the response to drought of a full-sib F2 partially inbred population (termed here ‘POP6’), whose F1 was obtained from an intraspecific P. nigra controlled cross between genotypes with highly divergent phenotypes. We assessed the effects of two water treatments (well-watered and moderate drought) on a population of 4603 trees (503 genotypes) hosted in two adjacent experimental plots (1.67 ha) by conducting low-elevation (25 m) flights with an aerial drone and capturing 7836 thermal infrared (TIR) images. TIR images were undistorted, georeferenced, and orthorectified to obtain radiometric mosaics. Canopy temperature (Tc) was extracted using two independent semi-automated segmentation techniques, eCognition- and Matlab-based, to avoid the mixed-pixel problem. Overall, results showed that the UAV platform-based thermal imaging enables to effectively assess genotype

  12. UAV-Based Thermal Imaging for High-Throughput Field Phenotyping of Black Poplar Response to Drought.

    Science.gov (United States)

    Ludovisi, Riccardo; Tauro, Flavia; Salvati, Riccardo; Khoury, Sacha; Mugnozza Scarascia, Giuseppe; Harfouche, Antoine

    2017-01-01

Poplars are fast-growing, high-yielding forest tree species, whose cultivation as second-generation biofuel crops is of increasing interest and can efficiently meet emission reduction goals. Yet, breeding elite poplar trees for drought resistance remains a major challenge. Worldwide breeding programs are largely focused on intra/interspecific hybridization, whereby Populus nigra L. is a fundamental parental pool. While high-throughput genotyping has resulted in unprecedented capabilities to rapidly decode complex genetic architecture of plant stress resistance, linking genomics to phenomics is hindered by technically challenging phenotyping. Relying on unmanned aerial vehicle (UAV)-based remote sensing and imaging techniques, high-throughput field phenotyping (HTFP) aims at enabling highly precise and efficient, non-destructive screening of genotype performance in large populations. To efficiently support forest-tree breeding programs, ground-truthing observations should be complemented with standardized HTFP. In this study, we develop a high-resolution (leaf level) HTFP approach to investigate the response to drought of a full-sib F2 partially inbred population (termed here 'POP6'), whose F1 was obtained from an intraspecific P. nigra controlled cross between genotypes with highly divergent phenotypes. We assessed the effects of two water treatments (well-watered and moderate drought) on a population of 4603 trees (503 genotypes) hosted in two adjacent experimental plots (1.67 ha) by conducting low-elevation (25 m) flights with an aerial drone and capturing 7836 thermal infrared (TIR) images. TIR images were undistorted, georeferenced, and orthorectified to obtain radiometric mosaics. Canopy temperature (Tc) was extracted using two independent semi-automated segmentation techniques, eCognition- and Matlab-based, to avoid the mixed-pixel problem. Overall, results showed that the UAV platform-based thermal imaging enables to effectively assess genotype
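
    A minimal sketch of the segmentation idea described above (not the eCognition/Matlab pipelines used in the study): keep only canopy pixels of a thermal mosaic before averaging canopy temperature, so mixed soil/canopy pixels do not bias Tc. The mosaic here is synthetic and the threshold is an assumption.

```python
import numpy as np

# Fake radiometric mosaic in degrees C; in practice this comes from the orthorectified TIR imagery
tir = np.random.default_rng(1).normal(loc=32.0, scale=4.0, size=(100, 100))

canopy_mask = tir < 30.0          # canopy assumed cooler than sunlit soil (illustrative threshold)
t_canopy = tir[canopy_mask].mean()
print(f"mean canopy temperature: {t_canopy:.2f} C over {canopy_mask.sum()} pixels")
```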

  13. HDAT: web-based high-throughput screening data analysis tools

    International Nuclear Information System (INIS)

    Liu, Rong; Hassan, Taimur; Rallo, Robert; Cohen, Yoram

    2013-01-01

    The increasing utilization of high-throughput screening (HTS) in toxicity studies of engineered nano-materials (ENMs) requires tools for rapid and reliable processing and analyses of large HTS datasets. In order to meet this need, a web-based platform for HTS data analyses tools (HDAT) was developed that provides statistical methods suitable for ENM toxicity data. As a publicly available computational nanoinformatics infrastructure, HDAT provides different plate normalization methods, various HTS summarization statistics, self-organizing map (SOM)-based clustering analysis, and visualization of raw and processed data using both heat map and SOM. HDAT has been successfully used in a number of HTS studies of ENM toxicity, thereby enabling analysis of toxicity mechanisms and development of structure–activity relationships for ENM toxicity. The online approach afforded by HDAT should encourage standardization of and future advances in HTS as well as facilitate convenient inter-laboratory comparisons of HTS datasets. (paper)
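
    A minimal sketch of one common plate-normalization choice (z-scoring wells against in-plate negative controls); HDAT offers several normalization methods, and this example with invented readouts is only illustrative of the general idea.

```python
import numpy as np

raw = np.array([0.92, 0.88, 0.40, 0.15, 0.95, 0.55])   # hypothetical well readouts
neg_ctrl = np.array([0.90, 0.93, 0.89])                 # negative-control wells on the same plate

z = (raw - neg_ctrl.mean()) / neg_ctrl.std(ddof=1)
print(np.round(z, 2))   # strongly negative z-scores flag putative toxicity hits
```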

  14. A DVD-ROM based high-throughput cantilever sensing platform

    DEFF Research Database (Denmark)

    Bosco, Filippo

    and October 2011. The project was part of the Xsense research network, funded by the Strategic Danish Research Council, and supervised by Prof. Anja Boisen. The goal of the Xsense project is to design and fabricate a compact and cheap device for explosive sensing in air and liquid. Four different technologies...... of a high-throughput label-free sensor platform utilizing cantilever based sensors. These sensors have often been acclaimed to facilitate highly parallelized operation. Unfortunately, so far no concept has been presented which offers large data sets as well as easy liquid sample handling. We use optics...... and mechanics from a DVD player to handle liquid samples and to read-out cantilever deflection and resonant frequency. In a few minutes, several liquid samples can be analyzed in parallel, measuring over several hundreds of individual cantilevers. Three generations of systems have been developed and tested...

  15. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Directory of Open Access Journals (Sweden)

    Ankush Prashar

    Full Text Available The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.

  16. A high-throughput microtiter plate based method for the determination of peracetic acid and hydrogen peroxide.

    Directory of Open Access Journals (Sweden)

    Karson S Putt

Full Text Available Peracetic acid is gaining usage in numerous industries that have found a myriad of uses for its antimicrobial activity. However, rapid high-throughput quantitation methods for peracetic acid and hydrogen peroxide are lacking. Herein, we describe the development of a high-throughput microtiter plate based assay built upon well-known and trusted titration chemistries. The adaptation of these titration chemistries to rapid plate-based absorbance methods for the sequential determination of hydrogen peroxide specifically and the total amount of peroxides present in solution is described. The results of these methods were compared to those of a standard titration and found to be in good agreement. Additionally, the utility of the developed method is demonstrated through the generation of degradation curves of both peracetic acid and hydrogen peroxide in a mixed solution.

  17. High-Throughput Network Communication with NetIO

    CERN Document Server

Schumacher, Jörn; The ATLAS collaboration; Vandelli, Wainer

    2016-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well specified number of systems. Unfortunately traditional network communication APIs for HPC clusters like MPI or PGAS target exclusively the HPC community and are not suited well for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs (and this has been done), but it requires a non negligible effort and expert knowledge. On the other hand, message services like 0MQ have gained popularity in the HEP community. Such APIs allow to build distributed applications with a high-level approach and provide good performance. Unfortunately their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on to...

  18. A high-throughput colorimetric assay for glucose detection based on glucose oxidase-catalyzed enlargement of gold nanoparticles

    Science.gov (United States)

    Xiong, Yanmei; Zhang, Yuyan; Rong, Pengfei; Yang, Jie; Wang, Wei; Liu, Dingbin

    2015-09-01

We developed a simple high-throughput colorimetric assay to detect glucose based on the glucose oxidase (GOx)-catalysed enlargement of gold nanoparticles (AuNPs). Compared with the currently available glucose kit method, the AuNP-based assay provides higher clinical sensitivity at lower cost, indicating its great potential to be a powerful tool for clinical screening of glucose.

  19. Survey on non-nuclear radioactive waste

    International Nuclear Information System (INIS)

    2003-11-01

    On request from the Swedish Radiation Protection Authority (SSI), the Swedish government in May 2002 set up a non-standing committee on non-nuclear radioactive waste. The objective was to elaborate proposals for a national system for the management of all types of non-nuclear radioactive waste, with special consideration of, inter alia, the polluter pays principle and the responsibility of the producers. The committee will deliver its proposals to the government on 1 December 2003. SSI has assisted the committee to the extent necessary to complete the investigation. This report is a summary of SSI's background material concerning non-nuclear radioactive waste in Sweden.

  20. Rhizoslides: paper-based growth system for non-destructive, high throughput phenotyping of root development by means of image analysis.

    Science.gov (United States)

    Le Marié, Chantal; Kirchgessner, Norbert; Marschall, Daniela; Walter, Achim; Hund, Andreas

    2014-01-01

    and precise evaluation of root lengths in diameter classes, but had weaknesses with respect to image segmentation and analysis of root system architecture. A new technique has been established for non-destructive root growth studies and quantification of architectural traits beyond seedling stages. However, automation of the scanning process and appropriate software remain the bottleneck for high-throughput analysis.

  1. High-resolution and high-throughput multichannel Fourier transform spectrometer with two-dimensional interferogram warping compensation

    Science.gov (United States)

    Watanabe, A.; Furukawa, H.

    2018-04-01

    The resolution of multichannel Fourier transform (McFT) spectroscopy is insufficient for many applications despite its extreme advantage of high throughput. We propose an improved configuration that realises both high resolution and high throughput using a two-dimensional area sensor. For the spectral resolution, we obtained an interferogram with a larger optical path difference by shifting the area sensor without altering any optical components. The non-linear phase error of the interferometer was successfully corrected using a phase-compensation calculation. Warping compensation was also applied to realise a higher throughput by accumulating the signal across vertical pixels. Our approach significantly improved the resolution and signal-to-noise ratio by factors of 1.7 and 34, respectively. This high-resolution and high-sensitivity McFT spectrometer will be useful for detecting weak light signals such as those in non-invasive diagnosis.

  2. PRIMEGENSw3: a web-based tool for high-throughput primer and probe design.

    Science.gov (United States)

    Kushwaha, Garima; Srivastava, Gyan Prakash; Xu, Dong

    2015-01-01

    Highly specific and efficient primer and probe design has been a major hurdle in many high-throughput techniques. Successful implementation of any PCR or probe hybridization technique depends on the quality of the primers and probes used, in terms of their specificity and cross-hybridization. Here we describe PRIMEGENSw3, a set of web-based utilities for high-throughput primer and probe design. These utilities allow users to select genomic regions and to design primers/probes for the selected regions in an interactive, user-friendly, and automatic fashion. The system runs the PRIMEGENS algorithm in the back-end on a high-performance server with the stored genomic database or a user-provided custom database for the cross-hybridization check. Cross-hybridization is checked not only using BLAST but also by checking mismatch positions and energy calculation of potential hybridization hits. The results can be visualized online and can also be downloaded. The average success rate of primer design using PRIMEGENSw3 is ~90%. The web server also supports primer design for methylated sequences, which is used in epigenetic studies. A stand-alone version of the software is also available for download at the website.

  3. Crop 3D-a LiDAR based platform for 3D high-throughput crop phenotyping.

    Science.gov (United States)

    Guo, Qinghua; Wu, Fangfang; Pang, Shuxin; Zhao, Xiaoqian; Chen, Linhai; Liu, Jin; Xue, Baolin; Xu, Guangcai; Li, Le; Jing, Haichun; Chu, Chengcai

    2018-03-01

    With a growing population and shrinking arable land, breeding has been considered an effective way to solve the food crisis. As an important part of breeding, high-throughput phenotyping can accelerate the breeding process effectively. Light detection and ranging (LiDAR) is an active remote sensing technology that is capable of acquiring three-dimensional (3D) data accurately, and has great potential in crop phenotyping. Given that crop phenotyping based on LiDAR technology is not common in China, we developed a high-throughput crop phenotyping platform, named Crop 3D, which integrated a LiDAR sensor, a high-resolution camera, a thermal camera and a hyperspectral imager. Compared with traditional crop phenotyping techniques, Crop 3D can acquire multi-source phenotypic data over the whole crop growing period and extract plant height, plant width, leaf length, leaf width, leaf area, leaf inclination angle and other parameters for plant biology and genomics analysis. In this paper, we described the designs, functions and testing results of the Crop 3D platform, and briefly discussed the potential applications and future development of the platform in phenotyping. We concluded that platforms integrating LiDAR and traditional remote sensing techniques might be the future trend of crop high-throughput phenotyping.

  4. Throughput-Based Traffic Steering in LTE-Advanced HetNet Deployments

    DEFF Research Database (Denmark)

    Gimenez, Lucas Chavarria; Kovacs, Istvan Z.; Wigard, Jeroen

    2015-01-01

    The objective of this paper is to propose traffic steering solutions that aim at optimizing the end-user throughput. Two different implementations of an active mode throughput-based traffic steering algorithm for Heterogeneous Networks (HetNet) are introduced. One that always forces handover of t...... throughput is generally higher, reaching values of 36% and 18% for the medium- and high-load conditions....

  5. A high-throughput fluorescence resonance energy transfer (FRET)-based endothelial cell apoptosis assay and its application for screening vascular disrupting agents

    International Nuclear Information System (INIS)

    Zhu, Xiaoming; Fu, Afu; Luo, Kathy Qian

    2012-01-01

    Highlights: ► An endothelial cell apoptosis assay using a FRET-based biosensor was developed. ► The fluorescence of the cells changed from green to blue during apoptosis. ► This method was developed into a high-throughput assay in 96-well plates. ► This assay was applied to screen vascular disrupting agents. -- Abstract: In this study, we developed a high-throughput endothelial cell apoptosis assay using a fluorescence resonance energy transfer (FRET)-based biosensor. After exposure to the apoptotic inducer UV irradiation or to anticancer drugs such as paclitaxel, the fluorescence of the cells changed from green to blue. We developed this method into a high-throughput assay in 96-well plates by measuring the emission ratio of yellow fluorescent protein (YFP) to cyan fluorescent protein (CFP) to monitor the activation of a key protease, caspase-3, during apoptosis. The Z′ factor for this assay was above 0.5, which indicates that this assay is suitable for high-throughput analysis. Finally, we applied this functional high-throughput assay to screen for vascular disrupting agents (VDAs) that could induce endothelial cell apoptosis from our in-house compound library, and dioscin was identified as a hit. As this assay allows real-time and sensitive detection of cell apoptosis, it will be a useful tool for monitoring endothelial cell apoptosis in living cells and for identifying new VDA candidates via high-throughput screening.
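
    The Z′ factor quoted above is computed directly from replicate positive- and negative-control wells. A minimal sketch using made-up YFP/CFP emission ratios (caspase-3 activation cleaves the sensor, so apoptotic cells show a lower FRET ratio):

    ```python
    # Sketch: Z' factor from replicate control wells; well values are invented.
    import numpy as np

    uv_treated = np.array([0.95, 1.02, 0.98, 1.00, 0.97])   # apoptotic controls (YFP/CFP ratio)
    untreated  = np.array([2.10, 2.05, 1.98, 2.15, 2.08])   # healthy controls (YFP/CFP ratio)

    z_prime = 1 - 3 * (uv_treated.std(ddof=1) + untreated.std(ddof=1)) \
                  / abs(untreated.mean() - uv_treated.mean())
    print(f"Z' = {z_prime:.2f}")   # values above 0.5 indicate an HTS-ready assay
    ```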

  6. Investigation of radioactive contamination at non-radioactive drains of the Tsuruga Nuclear Power Station

    International Nuclear Information System (INIS)

    Koide, Hiroaki; Imanaka, Tetsuji; Ebisawa, Toru; Kawano, Shinji; Kobayashi, Keiji.

    1982-05-01

    In April 1981, it was disclosed that a drainage area at the Tsuruga Nuclear Power Station was heavily contaminated with radioactivity. Although the Ministry of International Trade and Industry (MITI) officially provided an explanation of the process that resulted in the contamination, many problems remain unsolved on account of insufficient and limited investigations. The authors collected mud samples from contaminated manholes and examined the radioactivity in them through the measurement of β- and γ-spectra. Chemical separation of the samples was carried out in order to obtain precise concentrations of radioactive cesium. The results are as follows: i) the concentration of radioactivity does not show a monotonic decrease along the stream line but an anomalous peak at downstream manholes, ii) at the manhole designated No. 6, located rather downstream, the 137Cs concentration is significantly high and the composition of radioactive nuclides is quite different from that in the other manholes, and iii) additional radioactive contamination was observed in other manholes of non-radioactive drains which would not be influenced by the accident explained by MITI. Our present work has provided much more data than MITI's and made it clear that the overall data cannot be consistent with the simple MITI explanation that a single radioactive release accident caused the disclosed contamination. It is concluded that non-radioactive water drains at the Tsuruga Nuclear Power Station had been under continual contamination. (author)

  7. Drug uptake (DAPI) of trypanosomes (T. brucei) and antitrypanosomal activity in vitro, in culture and in vivo studied by microscope fluorometry, chromatogram spectrophotometry and radiotracer techniques

    International Nuclear Information System (INIS)

    Kratzer, R.D.

    1982-01-01

    The present study had the following objectives: 1) Investigation of the specific binding and location of the diamidine DAPI within trypanosomes by fluorescence microscopy. 2) Development and standardization of a microscope fluorometry technique for measuring DAPI uptake of single trypanosomes. 3) Determination of the effect of incubation media, exposure time, and drug concentration on DAPI uptake of single trypanosomes. 4) Development of a technique applicable for quantitative fluorescence chemical analysis of DAPI uptake of trypanosomes. 5) Determination of drug uptake of trypanosomes using 14C-labelled DAPI. 6) Comparison of the values obtained by the three methods. (orig./MG)

  8. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Sobral Romao, J.I.; Baiao Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for

  9. High-Throughput Particle Manipulation Based on Hydrodynamic Effects in Microchannels

    Directory of Open Access Journals (Sweden)

    Chao Liu

    2017-03-01

    Full Text Available Microfluidic techniques are effective tools for precise manipulation of particles and cells, whose enrichment and separation are crucial for a wide range of applications in biology, medicine, and chemistry. Recently, lateral particle migration induced by the intrinsic hydrodynamic effects in microchannels, such as inertia and elasticity, has shown its promise for high-throughput and label-free particle manipulation. The particle migration can be engineered to realize controllable focusing and separation of particles based on differences in size. The widespread use of inertial and viscoelastic microfluidics depends on the understanding of hydrodynamic effects on particle motion. This review will summarize the progress in the fundamental mechanisms and key applications of inertial and viscoelastic particle manipulation.
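
    A first-pass check when engineering the size-based focusing described above is to estimate the channel Reynolds number, the particle Reynolds number and the particle-to-channel confinement ratio, since inertial effects grow with all three. The numbers below are illustrative assumptions, not values from the review:

    ```python
    # Sketch: order-of-magnitude numbers often checked when designing inertial
    # focusing channels. Dimensions, flow speed and particle size are made up.
    rho, mu = 1000.0, 1e-3           # water density (kg/m^3) and viscosity (Pa*s)
    U       = 0.1                    # mean flow velocity (m/s)
    w, h    = 100e-6, 50e-6          # channel width and height (m)
    a       = 10e-6                  # particle diameter (m)

    Dh = 2 * w * h / (w + h)         # hydraulic diameter of a rectangular channel
    Re = rho * U * Dh / mu           # channel Reynolds number
    Rp = Re * (a / Dh) ** 2          # particle Reynolds number
    print(f"Dh = {Dh*1e6:.0f} um, Re = {Re:.1f}, Rp = {Rp:.2f}, a/Dh = {a/Dh:.2f}")
    ```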

  10. Fluorescence-based high-throughput screening of dicer cleavage activity

    Czech Academy of Sciences Publication Activity Database

    Podolská, Kateřina; Sedlák, David; Bartůněk, Petr; Svoboda, Petr

    2014-01-01

    Vol. 19, No. 3 (2014), pp. 417-426. ISSN 1087-0571. R&D Projects: GA ČR GA13-29531S; GA MŠk(CZ) LC06077; GA MŠk LM2011022. Grant - others: EMBO(DE) 1483. Institutional support: RVO:68378050. Keywords: Dicer; siRNA; high-throughput screening. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 2.423, year: 2014

  11. Development of novel, 384-well high-throughput assay panels for human drug transporters: drug interaction and safety assessment in support of discovery research.

    Science.gov (United States)

    Tang, Huaping; Shen, Ding Ren; Han, Yong-Hae; Kong, Yan; Balimane, Praveen; Marino, Anthony; Gao, Mian; Wu, Sophie; Xie, Dianlin; Soars, Matthew G; O'Connell, Jonathan C; Rodrigues, A David; Zhang, Litao; Cvijic, Mary Ellen

    2013-10-01

    Transporter proteins are known to play a critical role in affecting the overall absorption, distribution, metabolism, and excretion characteristics of drug candidates. In addition to efflux transporters (P-gp, BCRP, MRP2, etc.) that limit absorption, there has been a renewed interest in influx transporters at the renal (OATs, OCTs) and hepatic (OATPs, BSEP, NTCP, etc.) organ level that can cause significant clinical drug-drug interactions (DDIs). Several of these transporters are also critical for hepatobiliary disposition of bilirubin and bile acids/salts, and their inhibition is directly implicated in hepatic toxicities. Regulatory agencies took action to address transporter-mediated DDIs with the goal of ensuring drug safety in the clinic and on the market. To meet regulatory requirements, advanced bioassay technology and automation solutions were implemented for high-throughput transporter screening to provide structure-activity relationships within lead optimization. To enhance capacity, several functional assay formats were miniaturized to 384-well throughput, including novel fluorescence-based uptake and efflux inhibition assays using high-content image analysis as well as cell-based radioactive uptake and vesicle-based efflux inhibition assays. This high-throughput capability enabled a paradigm shift from studying transporter-related issues in the development space to identifying and dialing out these concerns early on in discovery for enhanced mechanism-based efficacy while circumventing DDIs and transporter toxicities.

  12. Particle size of radioactive aerosols generated during machine operation in high-energy proton accelerators

    International Nuclear Information System (INIS)

    Oki, Yuichi; Kanda, Yukio; Kondo, Kenjiro; Endo, Akira

    2000-01-01

    In high-energy accelerators, non-radioactive aerosols are abundantly generated due to high radiation doses during machine operation. Under such conditions, radioactive atoms, which are produced through various nuclear reactions in the air of accelerator tunnels, form radioactive aerosols. These aerosols might be inhaled by workers who enter the tunnel just after the beam stops. Their particle size is very important information for the estimation of internal exposure doses. In this work, focusing on typical radionuclides such as 7Be and 24Na, their particle size distributions are studied. An aluminum chamber was placed in the EP2 beam line of the 12-GeV proton synchrotron at the High Energy Accelerator Research Organization (KEK). Aerosol-free air was introduced into the chamber, and aerosols formed in the chamber were sampled during machine operation. A screen-type diffusion battery was employed in the aerosol-size analysis. Assuming that the aerosols have log-normal size distributions, their size distributions were obtained from the radioactivity concentrations at the entrance and exit of the diffusion battery. The radioactivity of the aerosols was measured with a Ge detector system, and the concentrations of non-radioactive aerosols were obtained using a condensation particle counter (CPC). The aerosol size (radius) for 7Be and 24Na was found to be 0.01-0.04 μm, and was always larger than that of the non-radioactive aerosols. The concentration of non-radioactive aerosols was found to be 10⁶-10⁷ particles/cm³. The size of the radioactive aerosols was much smaller than that of ordinary atmospheric aerosols. Internal doses due to inhalation of the radioactive aerosols were estimated, based on the respiratory tract model of ICRP Pub. 66. (author)

  13. Structure-based methods to predict mutational resistance to diarylpyrimidine non-nucleoside reverse transcriptase inhibitors.

    Science.gov (United States)

    Azeem, Syeda Maryam; Muwonge, Alecia N; Thakkar, Nehaben; Lam, Kristina W; Frey, Kathleen M

    2018-01-01

    Resistance to non-nucleoside reverse transcriptase inhibitors (NNRTIs) is a leading cause of HIV treatment failure. Often included in antiviral therapy, NNRTIs are chemically diverse compounds that bind an allosteric pocket of the enzyme target reverse transcriptase (RT). Several new NNRTIs incorporate flexibility in order to compensate for interactions lost to resistance-conferring amino acid mutations in RT. Unfortunately, even successful inhibitors such as the diarylpyrimidine (DAPY) inhibitor rilpivirine are affected by mutations in RT that confer resistance. In order to aid drug design efforts, it would be efficient and cost-effective to pre-evaluate NNRTI compounds in development using a structure-based computational approach. As proof of concept, we applied a residue scan and molecular dynamics strategy using RT crystal structures to predict mutations that confer resistance to the DAPYs rilpivirine and etravirine and the investigational microbicide dapivirine. Our predictive values, changes in affinity and stability, correlate with fold-resistance data for several RT mutants. Consistent with previous studies, mutation K101P is predicted to confer high-level resistance to DAPYs. These findings were further validated using structural analysis, molecular dynamics, and an enzymatic reverse transcription assay. Our results confirm that changes in affinity and stability for mutant complexes are predictive parameters of resistance, as validated by experimental and clinical data. In future work, we believe that this computational approach may be useful to predict resistance mutations for inhibitors in development. Published by Elsevier Inc.

  14. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    Science.gov (United States)

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform, "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons", enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). DIRT is a high-volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots.

  15. Benchmarking Ligand-Based Virtual High-Throughput Screening with the PubChem Database

    Directory of Open Access Journals (Sweden)

    Mariusz Butkiewicz

    2013-01-01

    Full Text Available With the rapidly increasing availability of High-Throughput Screening (HTS) data in the public domain, such as the PubChem database, methods for ligand-based computer-aided drug discovery (LB-CADD) have the potential to accelerate and reduce the cost of probe development and drug discovery efforts in academia. We assemble nine data sets from realistic HTS campaigns representing major families of drug target proteins for benchmarking LB-CADD methods. Each data set is public domain through PubChem and carefully collated through confirmation screens validating active compounds. These data sets provide the foundation for benchmarking a new cheminformatics framework, BCL::ChemInfo, which is freely available for non-commercial use. Quantitative structure-activity relationship (QSAR) models are built using Artificial Neural Networks (ANNs), Support Vector Machines (SVMs), Decision Trees (DTs), and Kohonen networks (KNs). Problem-specific descriptor optimization protocols are assessed, including Sequential Feature Forward Selection (SFFS) and various information content measures. Measures of predictive power and confidence are evaluated through cross-validation, and a consensus prediction scheme is tested that combines orthogonal machine learning algorithms into a single predictor. Enrichments ranging from 15 to 101 for a TPR cutoff of 25% are observed.
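
    One way to reproduce enrichment figures of this kind is to rank compounds by predicted score and compare the precision among the top-ranked selection (at the chosen true-positive-rate cutoff) with the fraction of actives in the whole library. Definitions of enrichment vary between papers, so the sketch below is one plausible reading, shown with toy data:

    ```python
    # Sketch: enrichment at a fixed TPR cutoff from ranked QSAR scores.
    import numpy as np

    def enrichment_at_tpr(scores, labels, tpr_cutoff=0.25):
        order = np.argsort(scores)[::-1]                 # best-scoring compounds first
        labels = np.asarray(labels)[order]
        n_active = labels.sum()
        target = int(np.ceil(tpr_cutoff * n_active))     # actives we want to recover
        n_selected = np.searchsorted(np.cumsum(labels), target) + 1
        precision = labels[:n_selected].mean()           # actives among selected
        baseline = labels.mean()                         # actives in the whole library
        return precision / baseline

    rng = np.random.default_rng(0)
    scores = rng.random(1000)
    labels = (rng.random(1000) < 0.02).astype(int)       # ~2% actives, toy data
    print(enrichment_at_tpr(scores, labels))
    ```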

  16. Development of a high-throughput real time PCR based on a hot-start alternative for Pfu mediated by quantum dots

    Science.gov (United States)

    Sang, Fuming; Yang, Yang; Yuan, Lin; Ren, Jicun; Zhang, Zhizhou

    2015-09-01

    Hot start (HS) PCR is an excellent alternative for high-throughput real time PCR due to its ability to prevent nonspecific amplification at low temperature. Development of a cost-effective and simple HS PCR technique to guarantee high-throughput PCR specificity and consistency still remains a great challenge. In this study, we systematically investigated the HS characteristics of QDs triggered in real time PCR with EvaGreen and SYBR Green I dyes by the analysis of amplification curves, standard curves and melting curves. Two different kinds of DNA polymerases, Pfu and Taq, were employed. Here we showed that high specificity and efficiency of real time PCR were obtained in a plasmid DNA and an error-prone two-round PCR assay using QD-based HS PCR, even after an hour preincubation at 50 °C before real time PCR. Moreover, the results obtained by QD-based HS PCR were comparable to a commercial Taq antibody DNA polymerase. However, no obvious HS effect of QDs was found in real time PCR using Taq DNA polymerase. The findings of this study demonstrated that a cost-effective high-throughput real time PCR based on QD triggered HS PCR could be established with high consistency, sensitivity and accuracy.

  17. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High-throughput modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High-throughput methods save time and money, but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high-throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  18. Proton radioactivity at non-collective prolate shape in high spin state of {sup 94}Ag

    Energy Technology Data Exchange (ETDEWEB)

    Aggarwal, Mamta, E-mail: mamta.a4@gmail.co [UM-DAE Centre for Excellence in Basic Sciences, University of Mumbai, Kalina Campus, Mumbai 400 098 (India)

    2010-10-11

    We predict proton radioactivity and structural transitions in the high-spin states of an excited exotic nucleus near the proton drip line in a theoretical framework, and investigate the nature and the consequences of the structural transitions on the separation energy as a function of temperature and spin. It reveals that the rotation of the excited exotic nucleus {sup 94}Ag at excitation energies around 6.7 MeV and angular momentum near 21ℏ generates a rarely seen non-collective prolate shape, and the proton separation energy becomes negative, which indicates proton radioactivity in agreement with the experimental results of Mukha et al. for {sup 94}Ag.

  19. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts are presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.

  20. A high-throughput readout architecture based on PCI-Express Gen3 and DirectGMA technology

    International Nuclear Information System (INIS)

    Rota, L.; Vogelgesang, M.; Perez, L.E. Ardila; Caselle, M.; Chilingaryan, S.; Dritschler, T.; Zilio, N.; Kopmann, A.; Balzer, M.; Weber, M.

    2016-01-01

    Modern physics experiments produce multi-GB/s data rates. Fast data links and high-performance computing stages are required for continuous data acquisition and processing. Because of their intrinsic parallelism and computational power, GPUs emerged as an ideal solution to process this data in high-performance computing applications. In this paper we present a high-throughput platform based on direct FPGA-GPU communication. The architecture consists of a Direct Memory Access (DMA) engine compatible with the Xilinx PCI-Express core, a Linux driver for register access, and high-level software to manage direct memory transfers using AMD's DirectGMA technology. Measurements with a Gen3 x8 link show a throughput of 6.4 GB/s for transfers to GPU memory and 6.6 GB/s to system memory. We also assess the possibility of using the architecture in low-latency systems: preliminary measurements show a round-trip latency as low as 1 μs for data transfers to system memory, while the additional latency introduced by OpenCL scheduling is the current limitation for GPU-based systems. Our implementation is suitable for real-time DAQ system applications ranging from photon science and medical imaging to High Energy Physics (HEP) systems
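
    For context, the reported 6.4 GB/s can be compared with the theoretical payload bandwidth of a PCIe Gen3 x8 link (8 GT/s per lane with 128b/130b encoding). The back-of-the-envelope sketch below is illustrative; attributing the remainder to protocol overhead is an assumption, not a measurement:

    ```python
    # Sketch: theoretical PCIe Gen3 x8 bandwidth versus the measured throughput.
    lanes       = 8
    raw_gtps    = 8.0                    # Gen3: 8 GT/s per lane
    encoding    = 128 / 130              # 128b/130b line code
    theoretical = lanes * raw_gtps * encoding / 8   # GB/s available to payload
    measured    = 6.4                    # GB/s reported for transfers to GPU memory

    print(f"theoretical ~ {theoretical:.2f} GB/s")             # ~7.88 GB/s
    print(f"link efficiency ~ {measured / theoretical:.0%}")   # ~81%; rest lost to TLP/DLLP overhead etc.
    ```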

  1. A high-throughput liquid bead array-based screening technology for Bt presence in GMO manipulation.

    Science.gov (United States)

    Fu, Wei; Wang, Huiyu; Wang, Chenguang; Mei, Lin; Lin, Xiangmei; Han, Xueqing; Zhu, Shuifang

    2016-03-15

    The number of species and the planting area of genetically modified organisms (GMOs) have grown rapidly during the past ten years. For the purpose of GMO inspection, quarantine and manipulation, we have now devised a high-throughput Bt-based GMO screening method based on a liquid bead array. This novel method is based on the direct competitive recognition between biotinylated antibodies and bead-coupled antigens, detecting Bt presence in samples that contain Bt Cry1Aa, Bt Cry1Ab, Bt Cry1Ac, Bt Cry1Ah, Bt Cry1B, Bt Cry1C, Bt Cry1F, Bt Cry2A, Bt Cry3 or Bt Cry9C. Our method has wide GMO species coverage, such that more than 90% of the GMO species commercialized throughout the world can be identified. After optimization and validation of specificity, sensitivity, repeatability and availability, the method shows high specificity and a quantification sensitivity of 10-50 ng/mL. We then assessed more than 1800 samples from the field and the food market to demonstrate the capacity of our method to perform high-throughput screening for GMO manipulation. Our method offers an applicable platform for further inspection and research on GMO plants. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  3. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMOs) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxon endogenous reference genes, from GMO constructs, screening targets, construct-specific and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  4. New high-throughput material-exploration system based on combinatorial chemistry and electrostatic atomization

    International Nuclear Information System (INIS)

    Fujimoto, K.; Takahashi, H.; Ito, S.; Inoue, S.; Watanabe, M.

    2006-01-01

    As a tool to facilitate future material explorations, our group has developed a new combinatorial system for the high-throughput preparation of compounds made up of more than three components. The system works in two steps: the atomization of a liquid by a high electric field followed by deposition onto a grounded substrate. The combinatorial system based on this method has multiple syringe pumps. The starting materials are fed through the syringe pumps into a manifold, thoroughly mixed as they pass through it, and atomized from the tip of a stainless steel nozzle onto the grounded substrate.

  5. Machine learning in computational biology to accelerate high-throughput protein expression

    DEFF Research Database (Denmark)

    Sastry, Anand; Monk, Jonathan M.; Tegel, Hanna

    2017-01-01

    and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. Availability and implementation: We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template...
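
    The three sequence properties highlighted above are easy to compute with Biopython's ProtParam module. A minimal sketch; the toy sequence is an arbitrary example, and any thresholding of these values into favourable or unfavourable fragments would be an assumption, not the authors' trained model:

    ```python
    # Sketch: compute aromaticity, hydropathy (GRAVY) and isoelectric point
    # for a protein fragment with Biopython.
    from Bio.SeqUtils.ProtParam import ProteinAnalysis

    seq = "MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHRFKDLGE"   # toy fragment
    pa = ProteinAnalysis(seq)

    features = {
        "aromaticity": pa.aromaticity(),
        "gravy": pa.gravy(),                     # hydropathy index
        "isoelectric_point": pa.isoelectric_point(),
    }
    print(features)
    ```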

  6. Identification of adiponectin receptor agonist utilizing a fluorescence polarization based high throughput assay.

    Directory of Open Access Journals (Sweden)

    Yiyi Sun

    Full Text Available Adiponectin, the adipose-derived hormone, plays an important role in the suppression of metabolic disorders that can result in type 2 diabetes, obesity, and atherosclerosis. It has been shown that up-regulation of adiponectin or the adiponectin receptor has a number of therapeutic benefits. Given that it is hard to convert the full-size adiponectin protein into a viable drug, adiponectin receptor agonists could be designed or identified using high-throughput screening. Here, we report on the development of a two-step screening process to identify adiponectin agonists. In the first step, we developed a high-throughput screening assay based on fluorescence polarization to identify adiponectin ligands. The fluorescence polarization assay reported here could be adapted to screening against larger small-molecule compound libraries. A natural product library containing 10,000 compounds was screened and 9 hits were selected for validation. These compounds were taken into the second-step in vitro tests to confirm their agonistic activity. The most active adiponectin receptor 1 agonists are matairesinol, arctiin, (−)-arctigenin and gramine. The most active adiponectin receptor 2 agonists are parthenolide, taxifoliol, deoxyschizandrin, and syringin. These compounds may be useful drug candidates for hypoadiponectin-related diseases.

  7. AlphaScreen-based homogeneous assay using a pair of 25-residue artificial proteins for high-throughput analysis of non-native IgG.

    Science.gov (United States)

    Senga, Yukako; Imamura, Hiroshi; Miyafusa, Takamitsu; Watanabe, Hideki; Honda, Shinya

    2017-09-29

    Therapeutic IgG becomes unstable under various stresses in the manufacturing process. The resulting non-native IgG molecules tend to associate with each other and form aggregates. Because such aggregates not only decrease the pharmacological effect but also become a potential risk factor for immunogenicity, rapid analysis of aggregation is required for quality control of therapeutic IgG. In this study, we developed a homogeneous assay using AlphaScreen and AF.2A1. AF.2A1 is a 25-residue artificial protein that binds specifically to non-native IgG generated under chemical and physical stresses. This assay is performed in a short period of time. Our results show that AF.2A1-AlphaScreen may be used to evaluate various types of IgG, as AF.2A1 recognizes the non-native structure in the constant region (Fc region) of IgG. The assay was effective for detection of non-native IgG, with particle sizes up to ca. 500 nm, generated under acid, heat, and stirring conditions. In addition, this technique is suitable for analyzing non-native IgG in CHO cell culture supernatant and in mixtures with large amounts of native IgG. These results indicate the potential of AF.2A1-AlphaScreen to be used as a high-throughput evaluation method for process monitoring as well as quality testing in the manufacturing of therapeutic IgG.

  8. Intuitive web-based experimental design for high-throughput biomedical data.

    Science.gov (United States)

    Friedrich, Andreas; Kenar, Erhan; Kohlbacher, Oliver; Nahnsen, Sven

    2015-01-01

    Big data bioinformatics aims at drawing biological conclusions from huge and complex biological datasets. Added value from the analysis of big data, however, is only possible if the data is accompanied by accurate metadata annotation. Particularly in high-throughput experiments, intelligent approaches are needed to keep track of the experimental design, including the conditions that are studied as well as information that might be interesting for failure analysis or further experiments in the future. In addition to the management of this information, means for an integrated design and interfaces for structured data annotation are urgently needed by researchers. Here, we propose a factor-based experimental design approach that enables scientists to easily create large-scale experiments with the help of a web-based system. We present a novel implementation of a web-based interface allowing the collection of arbitrary metadata. To exchange and edit information we provide a spreadsheet-based, human-readable format. Subsequently, sample sheets with identifiers and metainformation for data generation facilities can be created. Data files created after measurement of the samples can be uploaded to a datastore, where they are automatically linked to the previously created experimental design model.

  9. High-Throughput Scoring of Seed Germination.

    Science.gov (United States)

    Ligterink, Wilco; Hilhorst, Henk W M

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor-intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very informative as it lacks information about start, rate, and uniformity of germination, which are highly indicative of such traits as dormancy, stress tolerance, and seed longevity. The calculation of cumulative germination curves requires information about germination percentage at various time points. We developed the GERMINATOR package: a simple, highly cost-efficient, and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The GERMINATOR package contains three modules: (I) design of the experimental setup with various options to replicate and randomize samples; (II) automatic scoring of germination based on the color contrast between the protruding radicle and the seed coat on a single image; and (III) curve fitting of cumulative germination data and the extraction, recap, and visualization of the various germination parameters. GERMINATOR is a freely available package that allows the monitoring and analysis of several thousand germination tests, several times a day, by a single person.
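
    The curve-fitting step in module (III) can be approximated by fitting a sigmoidal function to the scored time course and reading off summary parameters (maximum germination, time to 50%, steepness). The sketch below uses a plain logistic curve and invented counts as a stand-in; GERMINATOR's own fitting model may differ:

    ```python
    # Sketch: fit a cumulative germination time course and extract parameters.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, gmax, t50, k):
        # gmax: final germination (%), t50: time to half-maximum, k: steepness
        return gmax / (1.0 + np.exp(-k * (t - t50)))

    t = np.array([6, 12, 24, 36, 48, 60, 72, 96], dtype=float)   # hours after sowing
    g = np.array([0, 2, 15, 55, 80, 90, 93, 95], dtype=float)    # % germinated (toy data)

    (gmax, t50, k), _ = curve_fit(logistic, t, g, p0=[95, 40, 0.1])
    print(f"Gmax ~ {gmax:.0f}%, t50 ~ {t50:.1f} h, steepness ~ {k:.2f} /h")
    ```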

  10. A pocket device for high-throughput optofluidic holographic microscopy

    Science.gov (United States)

    Mandracchia, B.; Bianco, V.; Wang, Z.; Paturzo, M.; Bramanti, A.; Pioggia, G.; Ferraro, P.

    2017-06-01

    Here we introduce a compact holographic microscope embedded onboard a Lab-on-a-Chip (LoC) platform. A wavefront-division interferometer is realized by writing a polymer grating onto the channel to extract a reference wave from the object wave impinging on the LoC. A portion of the beam reaches the samples flowing along the channel path, carrying their information content to the recording device, while one of the diffraction orders from the grating acts as an off-axis reference wave. Polymeric micro-lenses are delivered onto the chip by Pyro-ElectroHydroDynamic (Pyro-EHD) inkjet printing techniques. Thus, all the required optical components are embedded onboard a pocket device, and fast, non-iterative reconstruction algorithms can be used. We use our device in combination with a novel high-throughput technique, named Space-Time Digital Holography (STDH). STDH exploits the motion of samples inside microfluidic channels to obtain a synthetic hologram, mapped in a hybrid space-time domain, with intrinsically useful features. Indeed, a single Linear Sensor Array (LSA) is sufficient to build up a synthetic representation of the entire experiment (i.e. the STDH) with unlimited Field of View (FoV) along the scanning direction, independently of the magnification factor. The throughput of the imaging system is dramatically increased as STDH provides unlimited FoV and refocusable imaging of samples inside the liquid volume, with no need for hologram stitching. To test our embedded STDH microscopy module, we counted, imaged and tracked in 3D, with high throughput, red blood cells moving inside the channel volume under non-ideal flow conditions.

  11. High-Throughput Lipolysis in 96-Well Plates for Rapid Screening of Lipid-Based Drug Delivery Systems

    DEFF Research Database (Denmark)

    Mosgaard, Mette D; Sassene, Philip J; Mu, Huiling

    2017-01-01

    The high-throughput in vitro intestinal lipolysis model (HTP), applicable for rapid and low-scale screening of lipid-based drug delivery systems (LbDDSs), was optimized and adjusted so as to be conducted in 96-well plates (HTP-96). Three different LbDDSs (I-III) loaded with danazol or cinnarizine were...

  12. Molecular characterization of constitutive heterochromatin in three species of Trypoxylon (Hymenoptera, Crabronidae, Trypoxylini by CMA3/DAPI staining

    Directory of Open Access Journals (Sweden)

    Rodolpho Menezes

    2011-07-01

    Full Text Available Previous cytogenetic analyses in Trypoxylon Latreille, 1796 have been basically restricted to C-banding. In the present study, base-specific CMA3 and DAPI fluorochrome staining were used to characterize the constitutive heterochromatin in three Trypoxylon species. The heterochromatin was GC-rich in all the species studied; however, in Trypoxylon nitidum F. Smith, 1856 the molecular composition of the heterochromatin differed among chromosome pairs. Conversely, the euchromatin was AT-rich in the three species. These results suggest high conservatism in the euchromatic regions, as opposed to the heterochromatic regions, which have a high rate of change. In this study, we report the karyotype of Trypoxylon rugifrons F. Smith, 1873, which has the lowest chromosome number in the genus and other characteristics of the likely ancestral Trypoxylon karyotype.

  13. Influence of non-radioactive payload parameters on radioactive shipping packages

    International Nuclear Information System (INIS)

    Drez, P.E.; Murthy, D.V.S.; Temus, C.J.; Quinn, G.J.; Ozaki, C.

    1989-01-01

    The transport of radioactive waste materials in radioactive material (RAM) packages involves two components: the packaging used for transportation, and the waste which forms the payload. The payload is usually composed of non-radioactive materials contaminated with radionuclides. The non-radionuclide payload characteristics can often be a controlling factor in determining the restrictions imposed on the certification of the package. This paper describes these package/payload interactions and the limiting parameters for the Transuranic Package Transporter-II (TRUPACT-II), designed for the transportation of Contact Handled Transuranic (CH-TRU) waste. The parameters discussed include the physical and chemical form of the payload, the configuration of the waste, and the resulting gas generation and gas release phenomena. Brief descriptions of the TRUPACT-II package and its payload are presented initially.

  14. High-Throughput Automatic Training System for Odor-Based Learned Behaviors in Head-Fixed Mice

    Directory of Open Access Journals (Sweden)

    Zhe Han

    2018-02-01

    Full Text Available Understanding the neuronal mechanisms of learned behaviors requires efficient behavioral assays. We designed a high-throughput automatic training system (HATS) for olfactory behaviors in head-fixed mice. The hardware and software were constructed to enable automatic training with minimal human intervention. The integrated system was composed of customized 3D-printed supporting components, an odor-delivery unit with fast response, and an Arduino-based hardware-control and data-acquisition unit. Furthermore, the customized software was designed to enable automatic training in all training phases, including lick-teaching, shaping and learning. Using HATS, we trained mice to perform delayed non-match to sample (DNMS), delayed paired association (DPA), Go/No-go (GNG), and GNG reversal tasks. These tasks probed cognitive functions including sensory discrimination, working memory, decision making and cognitive flexibility. Mice reached stable levels of performance within several days in these tasks. HATS enabled an experimenter to train eight mice simultaneously, thereby greatly enhancing experimental efficiency. Combined with causal perturbation and activity recording techniques, HATS can greatly facilitate our understanding of the neural-circuitry mechanisms underlying learned behaviors.

  15. High-throughput electrical characterization for robust overlay lithography control

    Science.gov (United States)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high-throughput and robust overlay measurement is a challenge in the current 14 nm and upcoming advanced nodes with the transition to 300 mm and upcoming 450 mm semiconductor manufacturing, where a slight deviation in overlay has a significant impact on reliability and yield 1). The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications 2). Here, we discuss the limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements, due to sensitivity, throughput and low contrast 3). We demonstrate a new electrical-measurement-based technique in which resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by fitting a parabolic model to the resistance, from which the minimum and inflection points are extracted to characterize overlay control and process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and the overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss the challenges of integrating an electrical-measurement-based approach into semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for accessing overlay as well as process window and margins simultaneously from a robust, high-throughput, electrical measurement approach.
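
    The overlay-extraction step described above reduces to fitting a parabola to resistance measured on macros with intentionally programmed offsets and taking its vertex as the overlay error. A minimal sketch with made-up values:

    ```python
    # Sketch: estimate overlay from a parabolic fit of resistance versus
    # programmed misalignment; offsets and resistances are illustrative.
    import numpy as np

    offset_nm  = np.array([-30, -20, -10, 0, 10, 20, 30], dtype=float)       # programmed offsets
    resistance = np.array([152, 138, 129, 125, 127, 134, 148], dtype=float)  # ohms (toy values)

    a, b, c = np.polyfit(offset_nm, resistance, deg=2)
    overlay = -b / (2 * a)          # vertex of the parabola = best alignment
    print(f"estimated overlay error ~ {overlay:.1f} nm")
    ```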

  16. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  17. Application of ToxCast High-Throughput Screening and ...

    Science.gov (United States)

    Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.

  18. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  19. Space Link Extension Protocol Emulation for High-Throughput, High-Latency Network Connections

    Science.gov (United States)

    Tchorowski, Nicole; Murawski, Robert

    2014-01-01

    New space missions require higher data rates and new protocols to meet these requirements. These high-data-rate space communication links push the limitations not only of the space communication links, but also of the ground communication networks and protocols which forward user data to remote ground stations (GS) for transmission. The Consultative Committee for Space Data Systems (CCSDS) Space Link Extension (SLE) standard protocol is one protocol that has been proposed for use by the NASA Space Network (SN) Ground Segment Sustainment (SGSS) program. New protocol implementations must be carefully tested to ensure that they provide the required functionality, especially because of the remote nature of spacecraft. The SLE protocol standard has been tested in the NASA Glenn Research Center's SCENIC Emulation Lab in order to observe its operation under realistic network delay conditions. More specifically, the delay between the NASA Integrated Services Network (NISN) and spacecraft has been emulated. The round-trip time (RTT) delay for the continental NISN network has been shown to be up to 120 ms; as such, the SLE protocol was tested with network delays ranging from 0 ms to 200 ms. Both a base network condition and an SLE connection were tested with these RTT delays, and the reaction of both network tests to the delay conditions was recorded. Throughput for both of these links was set at 1.2 Gbps. The results show that, in the presence of realistic network delay, the SLE link throughput is significantly reduced, while the base network throughput remained at the 1.2 Gbps specification. The decrease in SLE throughput has been attributed to the implementation's use of blocking calls. The decrease in throughput is not acceptable for high-data-rate links, as the link requires constant data flow in order for spacecraft and ground radios to stay synchronized, unless significant data is queued at the ground station. In cases where queuing the data is not an option
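
    The throughput collapse attributed to blocking calls follows from simple stop-and-wait arithmetic: if each buffer must be acknowledged over the full RTT before the next one is sent, the effective rate is bounded by buffer size divided by RTT. The sketch below assumes an illustrative 1 MB buffer per blocking call:

    ```python
    # Sketch: effective throughput of a blocking (stop-and-wait) transfer
    # over a high-latency link; the buffer size is an assumption.
    line_rate_gbps = 1.2                 # configured link data rate (from the abstract)
    buffer_mbytes  = 1.0                 # data sent per blocking call (assumed)

    for rtt_ms in (0.1, 50, 120, 200):
        eff_gbps = (buffer_mbytes * 8 / 1000) / (rtt_ms / 1000)   # Gb/s if latency-bound
        eff_gbps = min(eff_gbps, line_rate_gbps)                  # cannot exceed the line rate
        print(f"RTT {rtt_ms:6.1f} ms -> ~{eff_gbps:.3f} Gb/s effective")
    ```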

  20. Radioactive and non-radioactive polychlorinated biphenyl (PCB) management at Hanford

    International Nuclear Information System (INIS)

    Leonard, W.W.; Gretzinger, R.F.; Cox, G.R.

    1986-01-01

    Conformance to all state and federal regulations is the goal of Rockwell in the management of both radioactive and non-radioactive PCBs at Hanford. A continuing effort is being made to locate, remove, and properly dispose of all PCBs. As improved methods of management are developed, consideration will be given to their adoption into the Hanford Site PCB Management Plan.

  1. Fluorescence-based high-throughput functional profiling of ligand-gated ion channels at the level of single cells.

    Directory of Open Access Journals (Sweden)

    Sahil Talwar

    Full Text Available Ion channels are involved in many physiological processes and are attractive targets for therapeutic intervention. Their functional properties vary according to their subunit composition, which in turn varies in a developmental and tissue-specific manner and as a consequence of pathophysiological events. Understanding this diversity requires functional analysis of ion channel properties in large numbers of individual cells. Functional characterisation of ligand-gated channels involves quantitating agonist and drug dose-response relationships using electrophysiological or fluorescence-based techniques. Electrophysiology is limited by low throughput, and high-throughput fluorescence-based functional evaluation generally does not enable the characterization of the functional properties of each individual cell. Here we describe a fluorescence-based assay that characterizes functional channel properties at single-cell resolution in high-throughput mode. It is based on progressive receptor activation and iterative fluorescence imaging and delivers >100 dose-responses in a single well of a 384-well plate, using α1-3 homomeric and αβ heteromeric glycine receptor (GlyR) chloride channels as a model system. We applied this assay with transiently transfected HEK293 cells co-expressing halide-sensitive yellow fluorescent protein and different GlyR subunit combinations. Glycine EC50 values of different GlyR isoforms were highly correlated with published electrophysiological data and confirm previously reported pharmacological profiles for the GlyR inhibitors picrotoxin, strychnine and lindane. We show that inter- and intra-well variability is low and that clustering of functional phenotypes permits identification of drugs with subunit-specific pharmacological profiles. As this method dramatically improves the efficiency with which ion channel populations can be characterized in the context of cellular heterogeneity, it should facilitate systems
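
    Dose-response outputs of this kind are typically summarised by fitting a Hill (log-logistic) equation to each cell's cumulative response and reporting the EC50 and slope. A minimal sketch with invented data points:

    ```python
    # Sketch: fit a Hill equation to a glycine dose-response and extract EC50.
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(log_c, bottom, top, log_ec50, n):
        return bottom + (top - bottom) / (1.0 + 10.0 ** ((log_ec50 - log_c) * n))

    glycine_uM = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
    response   = np.array([0.02, 0.08, 0.25, 0.55, 0.85, 0.96, 1.00])   # normalised signal (toy)

    popt, _ = curve_fit(hill, np.log10(glycine_uM), response, p0=[0.0, 1.0, 1.5, 1.0])
    print(f"EC50 ~ {10 ** popt[2]:.1f} uM, Hill slope ~ {popt[3]:.2f}")
    ```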

  2. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analy...

  3. A direct comparison of remote sensing approaches for high-throughput phenotyping in plant breeding

    Directory of Open Access Journals (Sweden)

    Maria Tattaris

    2016-08-01

    Full Text Available Remote sensing (RS) of plant canopies permits non-intrusive, high-throughput monitoring of plant physiological characteristics. This study compared three RS approaches: a low-flying UAV (unmanned aerial vehicle), proximal sensing, and satellite-based imagery. Two physiological traits were considered, canopy temperature (CT) and a vegetation index (NDVI), to determine the most viable approaches for large-scale crop genetic improvement. The UAV-based platform achieves plot-level resolution while measuring several hundred plots in one mission via high-resolution thermal and multispectral imagery measured at altitudes of 30-100 m. The satellite measures multispectral imagery from an altitude of 770 km. Information was compared with proximal measurements using IR thermometers and an NDVI sensor at a distance of 0.5-1 m above plots. For robust comparisons, CT and NDVI were assessed on panels of elite cultivars under irrigated and drought conditions, in different thermal regimes, and on un-adapted genetic resources under water deficit. Correlations between airborne data and yield/biomass at maturity were generally higher than equivalent proximal correlations. NDVI was derived from high-resolution satellite imagery for only larger sized plots (8.5 x 2.4 m) due to restricted pixel density. Results support the use of UAV-based RS techniques for high-throughput phenotyping for both precision and efficiency.
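    For reference, the vegetation index compared across the three platforms is computed from near-infrared and red reflectance; the standard definition (a general formula, not specific to the sensors used in this study) is:

```latex
\mathrm{NDVI} \;=\; \frac{\rho_{\mathrm{NIR}} - \rho_{\mathrm{Red}}}{\rho_{\mathrm{NIR}} + \rho_{\mathrm{Red}}}
```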

  4. Multiplex enrichment quantitative PCR (ME-qPCR): a high-throughput, highly sensitive detection method for GMO identification.

    Science.gov (United States)

    Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang

    2017-04-01

    Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible methodologies compared with the next-generation sequencing or ChIP-based methods. However, the PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by the multiplex primer interactions. The detection throughput cannot meet the demands of high-throughput detection, such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, which showed that ME-qPCR sensitivity is higher than the original qPCR. The absolute limit of detection for ME-qPCR could achieve levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract For the first-step amplification, four primers (A, B, C, and D) have been added into the reaction volume. In this manner, four kinds of amplicons have been generated. All of these four amplicons could be regarded as the target of second-step PCR. For the second-step amplification, three parallels have been taken for

  5. A high-throughput screening strategy for nitrile-hydrolyzing enzymes based on ferric hydroxamate spectrophotometry.

    Science.gov (United States)

    He, Yu-Cai; Ma, Cui-Luan; Xu, Jian-He; Zhou, Li

    2011-02-01

    Nitrile-hydrolyzing enzymes (nitrilase or nitrile hydratase/amidase) have been widely used in the pharmaceutical industry for the production of carboxylic acids and their derivatives, and it is important to build a method for screening for nitrile-hydrolyzing enzymes. In this paper, a simple, rapid, and high-throughput screening method based on the ferric hydroxamate spectrophotometry has been proposed. To validate the accuracy of this screening strategy, the nitrilases from Rhodococcus erythropolis CGMCC 1.2362 and Alcaligenes sp. ECU0401 were used for evaluating the method. As a result, the accuracy for assaying aliphatic and aromatic carboxylic acids was as high as the HPLC-based method. Therefore, the method may be potentially used in the selection of microorganisms or engineered proteins with nitrile-hydrolyzing enzymes.

  6. A bead-based western for high-throughput cellular signal transduction analyses

    Science.gov (United States)

    Treindl, Fridolin; Ruprecht, Benjamin; Beiter, Yvonne; Schultz, Silke; Döttinger, Anette; Staebler, Annette; Joos, Thomas O.; Kling, Simon; Poetz, Oliver; Fehm, Tanja; Neubauer, Hans; Kuster, Bernhard; Templin, Markus F.

    2016-01-01

    Dissecting cellular signalling requires the analysis of large number of proteins. The DigiWest approach we describe here transfers the western blot to a bead-based microarray platform. By combining gel-based protein separation with immobilization on microspheres, hundreds of replicas of the initial blot are created, thus enabling the comprehensive analysis of limited material, such as cells collected by laser capture microdissection, and extending traditional western blotting to reach proteomic scales. The combination of molecular weight resolution, sensitivity and signal linearity on an automated platform enables the rapid quantification of hundreds of specific proteins and protein modifications in complex samples. This high-throughput western blot approach allowed us to identify and characterize alterations in cellular signal transduction that occur during the development of resistance to the kinase inhibitor Lapatinib, revealing major changes in the activation state of Ephrin-mediated signalling and a central role for p53-controlled processes. PMID:27659302

  7. High-Throughput Non-destructive Phenotyping of Traits that Contribute to Salinity Tolerance in Arabidopsis thaliana

    KAUST Repository

    Awlia, Mariam

    2016-09-28

    Reproducible and efficient high-throughput phenotyping approaches, combined with advances in genome sequencing, are facilitating the discovery of genes affecting plant performance. Salinity tolerance is a desirable trait that can be achieved through breeding, where most have aimed at selecting for plants that perform effective ion exclusion from the shoots. To determine overall plant performance under salt stress, it is helpful to investigate several plant traits collectively in one experimental setup. Hence, we developed a quantitative phenotyping protocol using a high-throughput phenotyping system, with RGB and chlorophyll fluorescence (ChlF) imaging, which captures the growth, morphology, color and photosynthetic performance of Arabidopsis thaliana plants in response to salt stress. We optimized our salt treatment by controlling the soil-water content prior to introducing salt stress. We investigated these traits over time in two accessions in soil at 150, 100, or 50 mM NaCl to find that the plants subjected to 100 mM NaCl showed the most prominent responses in the absence of symptoms of severe stress. In these plants, salt stress induced significant changes in rosette area and morphology, but less prominent changes in rosette coloring and photosystem II efficiency. Clustering of ChlF traits with plant growth of nine accessions maintained at 100 mM NaCl revealed that in the early stage of salt stress, salinity tolerance correlated with non-photochemical quenching processes and during the later stage, plant performance correlated with quantum yield. This integrative approach allows the simultaneous analysis of several phenotypic traits. In combination with various genetic resources, the phenotyping protocol described here is expected to increase our understanding of plant performance and stress responses, ultimately identifying genes that improve plant performance in salt stress conditions.

  8. Quantum dots for a high-throughput Pfu polymerase based multi-round polymerase chain reaction (PCR).

    Science.gov (United States)

    Sang, Fuming; Zhang, Zhizhou; Yuan, Lin; Liu, Deli

    2018-02-26

    Multi-round PCR is an important technique for obtaining enough target DNA from rare DNA resources, and is commonly used in many fields including forensic science, ancient DNA analysis and cancer research. However, multi-round PCR often fails, largely due to the accumulation of non-specific amplification during repeated amplifications. Here, we developed a Pfu polymerase-based multi-round PCR technique assisted by quantum dots (QDs). Different PCR assays, DNA polymerases (Pfu and Taq), DNA sizes and GC contents were compared in this study. In the presence of QDs, PCR specificity could be retained even in the ninth round of amplification. Moreover, the longer and more complex the targets were, the earlier the failure occurred in multi-round PCR. However, no obvious enhancement of specificity was found in multi-round PCR using Taq DNA polymerase. Significantly, the fidelity of Pfu polymerase-based multi-round PCR was not sacrificed in the presence of QDs. In addition, pre-incubation at 50 °C for an hour had no impact on multi-round PCR performance, further confirming the hot-start effect of the QDs in multi-round PCR. The findings of this study demonstrated that a cost-effective and promising multi-round PCR technique for large-scale and high-throughput sample analysis could be established with high specificity, sensitivity and accuracy.

  9. Centroid based clustering of high throughput sequencing reads based on n-mer counts.

    Science.gov (United States)

    Solovyov, Alexander; Lipkin, W Ian

    2013-09-08

    Many problems in computational biology require alignment-free sequence comparisons. One of the common tasks involving sequence comparison is sequence clustering. Here we apply methods of alignment-free comparison (in particular, comparison using sequence composition) to the challenge of sequence clustering. We study several centroid based algorithms for clustering sequences based on word counts. Study of their performance shows that using k-means algorithm with or without the data whitening is efficient from the computational point of view. A higher clustering accuracy can be achieved using the soft expectation maximization method, whereby each sequence is attributed to each cluster with a specific probability. We implement an open source tool for alignment-free clustering. It is publicly available from github: https://github.com/luscinius/afcluster. We show the utility of alignment-free sequence clustering for high throughput sequencing analysis despite its limitations. In particular, it allows one to perform assembly with reduced resources and a minimal loss of quality. The major factor affecting performance of alignment-free read clustering is the length of the read.
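    A minimal sketch of the core idea, composition vectors from n-mer (k-mer) counts fed to k-means, assuming numpy and scikit-learn are available; a per-feature standardisation stands in for the whitening step, and this illustrates the approach rather than the afcluster implementation itself:

```python
from itertools import product
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def kmer_vector(seq: str, k: int = 3) -> np.ndarray:
    """Count every k-mer over the A/C/G/T alphabet and normalise to a composition."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    counts = np.zeros(len(kmers))
    for i in range(len(seq) - k + 1):
        j = index.get(seq[i:i + k])
        if j is not None:              # skip k-mers containing N or other symbols
            counts[j] += 1
    return counts / max(counts.sum(), 1)

# Toy reads standing in for high-throughput sequencing data.
reads = ["ACGTACGTGGCC", "ACGTACGTGGCA", "TTTTAAAACCCC", "TTTAAAACCCCG"]
X = np.array([kmer_vector(r) for r in reads])
X = StandardScaler().fit_transform(X)       # optional "whitening"-like step
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```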

  10. High Level Radioactive Waste Management

    International Nuclear Information System (INIS)

    1991-01-01

    The proceedings of the second annual international conference on High Level Radioactive Waste Management, held April 28--May 3, 1991, in Las Vegas, Nevada, provide information on current technical issues related to international high level radioactive waste management activities and how they relate to society as a whole. Besides discussing technical topics such as the best form of the waste, the integrity of storage containers, and the design and construction of a repository, the proceedings explore the broader social aspects of these issues in papers on subjects such as conformance to regulations, transportation safety, and public education. By providing this wider perspective of high level radioactive waste management, it becomes apparent that the various disciplines involved in this field are interrelated and that they should work to integrate their waste management activities. Individual records are processed separately for the databases

  11. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  12. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  13. glbase: a framework for combining, analyzing and displaying heterogeneous genomic and high-throughput sequencing data

    Directory of Open Access Journals (Sweden)

    Andrew Paul Hutchins

    2014-01-01

    Full Text Available Genomic datasets and the tools to analyze them have proliferated at an astonishing rate. However, such tools are often poorly integrated with each other: each program typically produces its own custom output in a variety of non-standard file formats. Here we present glbase, a framework that uses a flexible set of descriptors that can quickly parse non-binary data files. glbase includes many functions to intersect two lists of data, including operations on genomic interval data and support for the efficient random access to huge genomic data files. Many glbase functions can produce graphical outputs, including scatter plots, heatmaps, boxplots and other common analytical displays of high-throughput data such as RNA-seq, ChIP-seq and microarray expression data. glbase is designed to rapidly bring biological data into a Python-based analytical environment to facilitate analysis and data processing. In summary, glbase is a flexible and multifunctional toolkit that allows the combination and analysis of high-throughput data (especially next-generation sequencing and genome-wide data), and which has been instrumental in the analysis of complex data sets. glbase is freely available at http://bitbucket.org/oaxiom/glbase/.

  14. glbase: a framework for combining, analyzing and displaying heterogeneous genomic and high-throughput sequencing data.

    Science.gov (United States)

    Hutchins, Andrew Paul; Jauch, Ralf; Dyla, Mateusz; Miranda-Saavedra, Diego

    2014-01-01

    Genomic datasets and the tools to analyze them have proliferated at an astonishing rate. However, such tools are often poorly integrated with each other: each program typically produces its own custom output in a variety of non-standard file formats. Here we present glbase, a framework that uses a flexible set of descriptors that can quickly parse non-binary data files. glbase includes many functions to intersect two lists of data, including operations on genomic interval data and support for the efficient random access to huge genomic data files. Many glbase functions can produce graphical outputs, including scatter plots, heatmaps, boxplots and other common analytical displays of high-throughput data such as RNA-seq, ChIP-seq and microarray expression data. glbase is designed to rapidly bring biological data into a Python-based analytical environment to facilitate analysis and data processing. In summary, glbase is a flexible and multifunctional toolkit that allows the combination and analysis of high-throughput data (especially next-generation sequencing and genome-wide data), and which has been instrumental in the analysis of complex data sets. glbase is freely available at http://bitbucket.org/oaxiom/glbase/.

  15. Spectrophotometric Enzyme Assays for High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Jean-Louis Reymond

    2004-01-01

    Full Text Available This paper reviews high-throughput screening enzyme assays developed in our laboratory over the last ten years. These enzyme assays were initially developed for the purpose of discovering catalytic antibodies by screening cell culture supernatants, but have proved generally useful for testing enzyme activities. Examples include TLC-based screening using acridone-labeled substrates, fluorogenic assays based on the β-elimination of umbelliferone or nitrophenol, and indirect assays such as the back-titration method with adrenaline and the copper-calcein fluorescence assay for amino acids.

  16. Developing a novel fiber optic fluorescence device for multiplexed high-throughput cytotoxic screening.

    Science.gov (United States)

    Lee, Dennis; Barnes, Stephen

    2010-01-01

    The need for new pharmacological agents is unending. Yet the drug discovery process has changed substantially over the past decade and continues to evolve in response to new technologies. There is presently a high demand to reduce discovery time by improving specific lab disciplines and developing new technology platforms in the area of cell-based assay screening. Here we present the developmental concept and early stage testing of the Ab-Sniffer, a novel fiber optic fluorescence device for high-throughput cytotoxicity screening using an immobilized whole cell approach. The fused silica fibers are chemically functionalized with biotin to provide interaction with fluorescently labeled, streptavidin functionalized alginate-chitosan microspheres. The microspheres are also functionalized with Concanavalin A to facilitate binding to living cells. By using lymphoma cells and rituximab in an adaptation of a well-known cytotoxicity protocol we demonstrate the utility of the Ab-Sniffer for functional screening of potential drug compounds rather than indirect, non-functional screening via binding assay. The platform can be extended to any assay capable of being tied to a fluorescence response including multiple target cells in each well of a multi-well plate for high-throughput screening.

  17. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    Technological advances in mass spectrometry and meticulous method development have produced several shotgun lipidomic approaches capable of characterizing lipid species by direct analysis of total lipid extracts. Shotgun lipidomics by hybrid quadrupole time-of-flight mass spectrometry allows the absolute quantification of hundreds of molecular glycerophospholipid species, glycerolipid species, sphingolipid species and sterol lipids. Future applications in clinical cohort studies demand detailed lipid molecule information and the application of high-throughput lipidomics platforms. In this review we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by microfluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer...

  18. PUFKEY: A High-Security and High-Throughput Hardware True Random Number Generator for Sensor Networks

    Directory of Open Access Journals (Sweden)

    Dongfang Li

    2015-10-01

    Full Text Available Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard National Institute of Standards and Technology (NIST) randomness tests and is resilient to a wide range of security attacks.

  19. PUFKEY: a high-security and high-throughput hardware true random number generator for sensor networks.

    Science.gov (United States)

    Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin

    2015-10-16

    Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks.
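    The conditioning algorithm used in PUFKEY is not detailed in these records; as a generic, hedged illustration of what a conditioning (debiasing) step does to raw biased bits such as an SRAM start-up pattern, a classical von Neumann extractor can be written as follows (illustrative only, not the PUFKEY design):

```python
def von_neumann_extract(bits):
    """Debias a raw bit stream: map pairs 01 -> 0 and 10 -> 1, discard 00 and 11."""
    out = []
    for b0, b1 in zip(bits[0::2], bits[1::2]):
        if b0 != b1:
            out.append(b0)
    return out

# Toy stand-in for a biased SRAM start-up pattern (made-up values).
raw = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 1]
print(von_neumann_extract(raw))   # -> [0, 1, 1, 0, 0]
```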

  20. High-throughput identification of potential minor histocompatibility antigens by MHC tetramer-based screening

    DEFF Research Database (Denmark)

    Hombrink, Pleun; Hadrup, Sine R; Bakker, Arne

    2011-01-01

    the technical feasibility of high-throughput analysis of antigen-specific T-cell responses in small patient samples. However, the high-sensitivity of this approach requires the use of potential epitope sets that are not solely based on MHC binding, to prevent the frequent detection of T-cell responses that lack......T-cell recognition of minor histocompatibility antigens (MiHA) plays an important role in the graft-versus-tumor (GVT) effect of allogeneic stem cell transplantation (allo-SCT). However, the number of MiHA identified to date remains limited, making clinical application of MiHA reactive T......MHC-tetramer-based enrichment and multi-color flow cytometry. Using this approach, 71 peptide-reactive T-cell populations were generated. The isolation of a T-cell line specifically recognizing target cells expressing the MAP4K1(IMA) antigen demonstrates that identification of MiHA through this approach is in principle...

  1. Machine learning in computational biology to accelerate high-throughput protein expression.

    Science.gov (United States)

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for analysis of further expression and solubility datasets. ebrunk@ucsd.edu or johanr@biotech.kth.se. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
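    As a hedged illustration of the kind of workflow described, the three named sequence properties (aromaticity, hydropathy and isoelectric point) can be computed with Biopython and fed to an off-the-shelf classifier. The toy sequences, labels and model choice below are assumptions for illustration only; the published workflow itself is at the GitHub link above.

```python
from Bio.SeqUtils.ProtParam import ProteinAnalysis
from sklearn.ensemble import RandomForestClassifier

def features(seq: str):
    """Aromaticity, GRAVY hydropathy and isoelectric point of a protein fragment."""
    pa = ProteinAnalysis(seq)
    return [pa.aromaticity(), pa.gravy(), pa.isoelectric_point()]

# Toy training data: (fragment sequence, expressed-successfully label) - made up.
fragments = [("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", 1),
             ("MWWFWWPLLLLLLPPPGGGG", 0),
             ("MSTNPKPQRKTKRNTNRRPQDVKFPGG", 1),
             ("MFFFYYYWWWLLLIIIVVV", 0)]
X = [features(seq) for seq, _ in fragments]
y = [label for _, label in fragments]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([features("MKKLVLSLSLVLAFSSATAAF")]))
```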

  2. Non-destructive nuclear forensics of radioactive samples

    Energy Technology Data Exchange (ETDEWEB)

    Rogge, R.B. [Canadian Neutron Beam Centre, Chalk River, ON (Canada); Alexander, Q.; Bentoumi, G.; Dimayuga, F. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Flacau, R. [Canadian Neutron Beam Centre, Chalk River, ON (Canada); Li, G.; Li, L.; Sur, B. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    It is a matter of public safety and security to be able to examine suspicious packages of unknown origin. If the package is radioactive and sealed (i.e., the radioactive materials contained in the package, including their chemical and physical forms, are unknown), there is a significant risk on how to handle the package and eventually safely dispose of its contents. Within the context of nuclear security, nuclear forensics helps address the key issue of identifying the nature and origin of radioactive and nuclear material in order to improve physical protection measures and prevent future theft or diversion of these materials. Nuclear forensics utilizes analytical techniques, destructive and non-destructive, developed for applications related to nuclear fuel cycles. This paper demonstrates the non-destructive examination techniques that can be used to inspect encapsulated radioactive samples. Results of γ spectroscopy, X-ray spectroscopy, neutron imaging, neutron diffraction, and delayed neutron analysis as applied to an examination of sealed capsules containing unknown radioactive materials are presented. The paper also highlights the value of these techniques to the overall nuclear forensic investigation to determine the origin of these unknown radioactive materials. (author)

  3. Effects of non-radioactive material around radioactive material on PET image quality

    International Nuclear Information System (INIS)

    Toshimitsu, Shinya; Yamane, Azusa; Hirokawa, Yutaka; Kangai, Yoshiharu

    2015-01-01

    Subcutaneous fat is a non-radioactive material surrounding the radioactive material. We developed a phantom and examined the effect of subcutaneous fat on PET image quality. We created a cylindrical non-radioactive mimic of subcutaneous fat and placed it around a cylindrical phantom in up to three layers, each 20 mm thick, to reproduce the obesity caused by subcutaneous fat. In the cylindrical phantom, hot spheres and cold spheres were arranged. The radioactivity concentration ratio between the hot spheres and the background (B.G.) was 4:1. The radioactivity concentration of the B.G. was varied as follows: 1.33, 2.65, 4.00, and 5.30 kBq/mL. 3D-PET images were collected over 10 minutes. When the thickness of the mimicked subcutaneous fat increased from 0 mm to 60 mm, the noise equivalent count decreased by 58.9-60.9% at each radioactivity concentration. On the other hand, the percentage of background variability increased 2.2-5.2 times. The mimicked subcutaneous fat did not decrease the percentage contrast of the hot spheres and did not affect the cold spheres. Subcutaneous fat decreases the noise equivalent count and increases the percentage of background variability, which degrades PET image quality. (author)
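    For context, the noise equivalent count (NEC) figure of merit reported here is conventionally computed from the true (T), scattered (S) and random (R) coincidence rates; in its usual form (a standard relation not given in this record, with k depending on how randoms are estimated):

```latex
\mathrm{NEC} \;=\; \frac{T^{2}}{T + S + kR}, \qquad k \in \{1, 2\}
```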

  4. High-Throughput Cloning and Expression Library Creation for Functional Proteomics

    Science.gov (United States)

    Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua

    2013-01-01

    The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. Such collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side-by-side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This Tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047

  5. High-throughput screening to enhance oncolytic virus immunotherapy

    Directory of Open Access Journals (Sweden)

    Allan KJ

    2016-04-01

    Full Text Available KJ Allan,1,2 David F Stojdl,1–3 SL Swift1 1Children's Hospital of Eastern Ontario (CHEO) Research Institute, 2Department of Biology, Microbiology and Immunology, 3Department of Pediatrics, University of Ottawa, Ottawa, ON, Canada Abstract: High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs) are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets to improve therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. Keywords: oncolytic, virus, screen, high-throughput, cancer, chemical, genomic, immunotherapy

  6. [New-generation high-throughput technologies based 'omics' research strategy in human disease].

    Science.gov (United States)

    Yang, Xu; Jiao, Rui; Yang, Lin; Wu, Li-Ping; Li, Ying-Rui; Wang, Jun

    2011-08-01

    In recent years, new-generation high-throughput technologies, including next-generation sequencing and mass spectrometry, have been widely applied to biological problems, especially in the human disease field. This data-driven, large-scale and industrialized research model enables the multi-level study of human diseases from the genomics, transcriptomics and proteomics perspectives. In this paper, the latest developments in high-throughput technologies applied to DNA, RNA, epigenomics, metagenomics and proteomics, as well as some applications in translational medicine, are reviewed. At the genomics level, exome sequencing has been a recent research hot spot, but the advantage of whole-genome resequencing for detecting large structural variants across the whole genome is becoming more prominent as sequencing costs drop, which also makes personalized, genome-based medicine feasible. At the transcriptomics level, for example, small RNA sequencing can be used to detect known miRNAs and predict unknown ones; these small RNAs can serve not only as biomarkers for disease diagnosis and prognosis but also as potential therapeutic agents. At the proteomics level, for example, targeted proteomics can be used to detect possible disease-related proteins or peptides, which can serve as useful indices for clinical staging and typing. Furthermore, the application and development of trans-omics studies in disease research are briefly introduced. By applying bioinformatics to integrate multi-omics data, disease mechanisms, diagnosis and therapy are likely to be explained and realized systematically, providing powerful tools for disease diagnosis and treatment.

  7. High-throughput and low-latency network communication with NetIO

    CERN Document Server

    AUTHOR|(CDS)2088631; The ATLAS collaboration

    2017-01-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well specified number of systems. Unfortunately traditional network communication APIs for HPC clusters like MPI or PGAS target exclusively the HPC community and are not suited well for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs, but it requires a non-negligible effort and expert knowledge. At the same time, message services like ZeroMQ have gained popularity in the HEP community. They allow building distributed applications with a high-level approach and provide good performance. Unfortunately their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on top of Infiniband and OmniPath...

  8. High throughput screening method for assessing heterogeneity of microorganisms

    NARCIS (Netherlands)

    Ingham, C.J.; Sprenkels, A.J.; van Hylckama Vlieg, J.E.T.; Bomer, Johan G.; de Vos, W.M.; van den Berg, Albert

    2006-01-01

    The invention relates to the field of microbiology. Provided is a method which is particularly powerful for High Throughput Screening (HTS) purposes. More specific a high throughput method for determining heterogeneity or interactions of microorganisms is provided.

  9. Screening for methicillin-resistant Staphylococcus aureus in clinical swabs using a high-throughput real-time PCR-based method

    DEFF Research Database (Denmark)

    Ornskov, D; Kolmos, B; Bendix Horn, P

    2008-01-01

    2005, all patients and healthcare personnel have been screened for MRSA colonisation, involving analysis of 300-400 samples daily. To deal with this number of samples, a PCR-based method customised for high-throughput analysis and a system for fast reporting of MRSA carrier status were developed. Swab samples were incubated overnight in a selective tryptone soya broth and were analysed by PCR the following day. Using this strategy, non-colonised individuals were identified within 24 h, while MRSA-positive samples were analysed further by traditional microbiological methods to determine the resistance pattern. This is a cost-effective approach, as the greatest expense in hospitals involves the isolation of patients of unknown MRSA status. The method was evaluated by testing 2194 clinical samples, with a sensitivity and specificity of 100% and 94%, respectively. The analytical sensitivity was 97...

  10. Low Complexity Approach for High Throughput Belief-Propagation based Decoding of LDPC Codes

    Directory of Open Access Journals (Sweden)

    BOT, A.

    2013-11-01

    Full Text Available The paper proposes a low complexity belief propagation (BP) based decoding algorithm for LDPC codes. In spite of the iterative nature of the decoding process, the proposed algorithm provides both reduced complexity and improved BER performance as compared with the classic min-sum (MS) algorithm, which is generally used for hardware implementations. Linear approximations of the check-node update function are used in order to reduce the complexity of the BP algorithm. Considering this decoding approach, an FPGA-based hardware architecture is proposed for implementing the decoding algorithm, aiming to increase the decoder throughput. FPGA technology was chosen for the LDPC decoder implementation, due to its parallel computation and reconfiguration capabilities. The obtained results show improvements in decoding throughput and BER performance compared with state-of-the-art approaches.
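    For reference, the baseline rule that such linear approximations replace is the min-sum check-node update: each outgoing check-to-variable message takes the sign product and the minimum magnitude of the other incoming messages. A small numpy sketch of that baseline update (not the decoder proposed in the paper):

```python
import numpy as np

def min_sum_check_update(llrs: np.ndarray) -> np.ndarray:
    """Min-sum check-node update.

    llrs: incoming variable-to-check LLR messages for one check node.
    Returns the outgoing check-to-variable message on each edge, computed
    from all *other* incoming messages (extrinsic information only).
    """
    n = len(llrs)
    out = np.empty(n)
    for i in range(n):
        others = np.delete(llrs, i)
        sign = np.prod(np.sign(others))
        out[i] = sign * np.min(np.abs(others))
    return out

print(min_sum_check_update(np.array([2.5, -0.7, 1.2, -3.1])))
# e.g. the first output is sign(-0.7 * 1.2 * -3.1) * min(0.7, 1.2, 3.1) = +0.7
```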

  11. Evaluation of a New Remote Handling Design for High Throughput Annular Centrifugal Contactors

    International Nuclear Information System (INIS)

    Meikrantz, David H.; Garn, Troy G.; Law, Jack D.; Macaluso, Lawrence L.

    2009-01-01

    Advanced designs of nuclear fuel recycling plants are expected to include more ambitious goals for aqueous-based separations, including higher separation efficiency, high-level waste minimization, and a greater focus on continuous processes to minimize cost and footprint. Therefore, Annular Centrifugal Contactors (ACCs) are destined to play a more important role in such future processing schemes. Previous efforts defined and characterized the performance of commercial 5 cm and 12.5 cm single-stage ACCs in a 'cold' environment. The next logical step, the design and evaluation of remote-capable pilot-scale ACCs in a 'hot' (radioactive) environment, was reported earlier. This report includes the development of remote designs for ACCs that can process the large throughput rates needed in future nuclear fuel recycling plants. Novel designs were developed for the remote interconnection of contactor units, clean-in-place and drain connections, and a new solids removal collection chamber. A three-stage, 12.5 cm diameter rotor module has been constructed and evaluated for operational function and remote handling in highly radioactive environments. This design is scalable to commercial CINC ACC models from V-05 to V-20 with total throughput rates ranging from 20 to 650 liters per minute. The V-05R three-stage prototype was manufactured by CINC mfg., the commercial vendor of ACCs in the U.S. It employs three standard V-05 clean-in-place (CIP) units modified for remote service and replacement via new methods of connection for solution inlets, outlets, drain and CIP. Hydraulic testing and functional checks were successfully conducted and then the prototype was evaluated for remote handling and maintenance suitability. Removal and replacement of the center-position V-05R ACC unit in the three-stage prototype was demonstrated using an overhead rail-mounted PaR manipulator. This evaluation confirmed the efficacy of this innovative design for interconnecting and cleaning

  12. A method for high throughput bioelectrochemical research based on small scale microbial electrolysis cells

    KAUST Repository

    Call, Douglas F.

    2011-07-01

    There is great interest in studying exoelectrogenic microorganisms, but existing methods can require expensive electrochemical equipment and specialized reactors. We developed a simple system for conducting high throughput bioelectrochemical research using multiple inexpensive microbial electrolysis cells (MECs) built with commercially available materials and operated using a single power source. MECs were small crimp top serum bottles (5 mL) with a graphite plate anode (92 m²/m³) and a cathode of stainless steel (SS) mesh (86 m²/m³), graphite plate, SS wire, or platinum wire. The highest volumetric current density (240 A/m³, applied potential of 0.7 V) was obtained using an SS mesh cathode and a wastewater inoculum (acetate electron donor). Parallel operated MECs (single power source) did not lead to differences in performance compared to non-parallel operated MECs, which can allow for high throughput reactor operation (>1000 reactors) using a single power supply. The utility of this method for cultivating exoelectrogenic microorganisms was demonstrated through comparison of buffer effects on pure (Geobacter sulfurreducens and Geobacter metallireducens) and mixed cultures. Mixed cultures produced current densities equal to or higher than pure cultures in the different media, and current densities for all cultures were higher using a 50 mM phosphate buffer than a 30 mM bicarbonate buffer. Only the mixed culture was capable of sustained current generation with a 200 mM phosphate buffer. These results demonstrate the usefulness of this inexpensive method for conducting in-depth examinations of pure and mixed exoelectrogenic cultures. © 2011 Elsevier B.V.

  13. Reverse Phase Protein Arrays for High-throughput Toxicity Screening

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    High-throughput screening is extensively applied for identification of drug targets and drug discovery, and it has recently found entry into toxicity testing. Reverse phase protein arrays (RPPAs) are widely used for quantification of protein markers. We reasoned that RPPAs can also be utilized beneficially in automated high-throughput toxicity testing. An advantage of using RPPAs is that, in addition to the baseline toxicity readout, they allow testing of multiple markers of toxicity, such as inflammatory responses, which do not necessarily culminate in cell death. We used transfection of siRNAs with known killing effects as a model system to demonstrate that RPPA-based protein quantification can serve as a substitute readout of cell viability, thereby reliably reflecting toxicity. In terms of automation, cell exposure, protein harvest, serial dilution and sample reformatting were performed using

  14. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Full Text Available Abstract Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: (1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; (2) the Scheduler, which forms the functional core of the system, tracking what data enter the system and determining which jobs must be scheduled for execution; and (3) the Executor, which searches for scheduled jobs and executes them on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines.
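    A minimal sketch of the Scheduler/Executor split described above, reduced to a shared in-memory job queue; the command and file name are placeholders, and a real deployment such as Cyrille2 tracks jobs in a database and submits them to a compute cluster rather than running them locally:

```python
import queue
import subprocess

jobs: "queue.Queue[list[str]]" = queue.Queue()

def schedule(new_data_files):
    """Scheduler role: decide which jobs must run for newly arrived data."""
    for path in new_data_files:
        jobs.put(["wc", "-l", path])   # placeholder for a real analysis tool

def execute():
    """Executor role: pull scheduled jobs and run them (here, locally)."""
    while not jobs.empty():
        cmd = jobs.get()
        result = subprocess.run(cmd, capture_output=True, text=True)
        print(cmd, "->", result.stdout.strip())

schedule(["/etc/hosts"])   # hypothetical input file
execute()
```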

  15. High-throughput fragment screening by affinity LC-MS.

    Science.gov (United States)

    Duong-Thi, Minh-Dao; Bergström, Maria; Fex, Tomas; Isaksson, Roland; Ohlson, Sten

    2013-02-01

    Fragment screening, an emerging approach for hit finding in drug discovery, has recently been proven effective by its first approved drug, vemurafenib, for cancer treatment. Techniques such as nuclear magnetic resonance, surface plasmon resonance, and isothermal titration calorimetry, with their own pros and cons, have been employed for screening fragment libraries. As an alternative approach, screening based on high-performance liquid chromatography separation has been developed. In this work, we present weak affinity LC/MS as a method to screen fragments under high-throughput conditions. Affinity-based capillary columns with immobilized thrombin were used to screen a collection of 590 compounds from a fragment library. The collection was divided into 11 mixtures (each containing 35 to 65 fragments) and screened by MS detection. The primary screening was performed at a throughput of over 3500 fragments per day. Thirty hits were defined, which subsequently entered a secondary screening using an active site-blocked thrombin column for confirmation of specificity. One hit showed selective binding to thrombin with an estimated dissociation constant (KD) in the 0.1 mM range. This study shows that affinity LC/MS is characterized by high throughput, ease of operation, and low consumption of target and fragments, and therefore it promises to be a valuable method for fragment screening.

  16. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have been in rapid progress. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  17. Functional characterisation of homomeric ionotropic glutamate receptors GluR1-GluR6 in a fluorescence-based high throughput screening assay

    DEFF Research Database (Denmark)

    Strange, Mette; Bräuner-Osborne, Hans; Jensen, Anders A.

    2006-01-01

    We have constructed stable HEK293 cell lines expressing the rat ionotropic glutamate receptor subtypes GluR1(i), GluR2Q(i), GluR3(i), GluR4(i), GluR5Q and GluR6Q and characterised the pharmacological profiles of the six homomeric receptors in a fluorescence-based high throughput screening assay...... assay reported to date. We propose that high throughput screening of compound libraries at the six GluR-HEK293 cell lines could be helpful in the search for structurally and pharmacologically novel ligands acting at the receptors....

  18. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations

    Science.gov (United States)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-12-01

    Li-ion batteries are a key technology for addressing the global challenge of clean renewable energy and environment pollution. Their contemporary applications, for portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize currently known material performances. Most cathode materials screened by the previous high-throughput calculations cannot meet the requirement of practical applications because only capacity, voltage and volume change of bulk were considered. It is important to include more structure-property relationships, such as point defects, surface and interface, doping and metal-mixture and nanosize effects, in high-throughput calculations. In this review, we established quantitative description of structure-property relationships in Li-ion battery materials by the intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.
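    As an example of such a quantitative structure-property relationship, the average intercalation voltage screened for in these calculations follows from the total energies of the lithiated and delithiated phases; in its standard form (a general relation, not a result specific to this review, neglecting volume and entropy terms):

```latex
\bar{V} \;=\; -\,\frac{E\!\left(\mathrm{Li}_{x_2}\mathrm{Host}\right) - E\!\left(\mathrm{Li}_{x_1}\mathrm{Host}\right) - (x_2 - x_1)\,E\!\left(\mathrm{Li}_{\mathrm{metal}}\right)}{(x_2 - x_1)\,e}
```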

  19. Development of a Rapid Fluorescence-Based High-Throughput Screening Assay to Identify Novel Kynurenine 3-Monooxygenase Inhibitor Scaffolds.

    Science.gov (United States)

    Jacobs, K R; Guillemin, G J; Lovejoy, D B

    2018-02-01

    Kynurenine 3-monooxygenase (KMO) is a well-validated therapeutic target for the treatment of neurodegenerative diseases, including Alzheimer's disease (AD) and Huntington's disease (HD). This work reports a facile fluorescence-based KMO assay optimized for high-throughput screening (HTS) that achieves a throughput approximately 20-fold higher than the fastest KMO assay currently reported. The screen was run with excellent performance (average Z' value of 0.80) from 110,000 compounds across 341 plates and exceeded all statistical parameters used to describe a robust HTS assay. A subset of molecules was selected for validation by ultra-high-performance liquid chromatography, resulting in the confirmation of a novel hit with an IC50 comparable to that of the well-described KMO inhibitor Ro-61-8048. A medicinal chemistry program is currently underway to further develop our novel KMO inhibitor scaffolds.
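    For reference, the Z' value quoted here is the standard plate-quality statistic computed from positive and negative control wells; a quick sketch of the calculation with made-up control data:

```python
import numpy as np

def z_prime(pos: np.ndarray, neg: np.ndarray) -> float:
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    return 1 - 3 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

rng = np.random.default_rng(0)
pos = rng.normal(100, 4, 32)   # uninhibited-enzyme control wells (made-up values)
neg = rng.normal(10, 3, 32)    # fully inhibited control wells (made-up values)
print(round(z_prime(pos, neg), 2))   # values >= 0.5 indicate an excellent assay
```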

  20. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    Science.gov (United States)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including diseases pathogenesis.
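    A compact sketch of the analysis pattern described, embedding many biophysical features per cell with t-SNE and classifying cell-cycle phase with a discriminant model; the data below are synthetic stand-ins, and scikit-learn's LinearDiscriminantAnalysis is used here in place of the MANOVA-based discriminant step:

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Synthetic stand-in for 24 biophysical features per cell, three cell-cycle phases.
X = np.vstack([rng.normal(mu, 1.0, size=(200, 24)) for mu in (0.0, 0.6, 1.2)])
y = np.repeat(["G1", "S", "G2"], 200)

embedding = TSNE(n_components=2, random_state=1).fit_transform(X)  # for visualisation
clf = LinearDiscriminantAnalysis().fit(X, y)                       # phase prediction
print(embedding.shape, round(clf.score(X, y), 2))
```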

  1. High-throughput screening (HTS) and modeling of the retinoid ...

    Science.gov (United States)

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system

  2. An RNA-Based Fluorescent Biosensor for High-Throughput Analysis of the cGAS-cGAMP-STING Pathway.

    Science.gov (United States)

    Bose, Debojit; Su, Yichi; Marcus, Assaf; Raulet, David H; Hammond, Ming C

    2016-12-22

    In mammalian cells, the second messenger (2'-5',3'-5') cyclic guanosine monophosphate-adenosine monophosphate (2',3'-cGAMP), is produced by the cytosolic DNA sensor cGAMP synthase (cGAS), and subsequently bound by the stimulator of interferon genes (STING) to trigger interferon response. Thus, the cGAS-cGAMP-STING pathway plays a critical role in pathogen detection, as well as pathophysiological conditions including cancer and autoimmune disorders. However, studying and targeting this immune signaling pathway has been challenging due to the absence of tools for high-throughput analysis. We have engineered an RNA-based fluorescent biosensor that responds to 2',3'-cGAMP. The resulting "mix-and-go" cGAS activity assay shows excellent statistical reliability as a high-throughput screening (HTS) assay and distinguishes between direct and indirect cGAS inhibitors. Furthermore, the biosensor enables quantitation of 2',3'-cGAMP in mammalian cell lysates. We envision this biosensor-based assay as a resource to study the cGAS-cGAMP-STING pathway in the context of infectious diseases, cancer immunotherapy, and autoimmune diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. A method for high throughput bioelectrochemical research based on small scale microbial electrolysis cells

    KAUST Repository

    Call, Douglas F.; Logan, Bruce E.

    2011-01-01

    There is great interest in studying exoelectrogenic microorganisms, but existing methods can require expensive electrochemical equipment and specialized reactors. We developed a simple system for conducting high throughput bioelectrochemical

  4. SUGAR: graphical user interface-based data refiner for high-throughput DNA sequencing.

    Science.gov (United States)

    Sato, Yukuto; Kojima, Kaname; Nariai, Naoki; Yamaguchi-Kabata, Yumi; Kawai, Yosuke; Takahashi, Mamoru; Mimori, Takahiro; Nagasaki, Masao

    2014-08-08

    Next-generation sequencers (NGSs) have become one of the main tools of current biology. To obtain useful insights from NGS data, it is essential to control low-quality portions of the data affected by technical errors such as air bubbles in sequencing fluidics. We developed SUGAR (subtile-based GUI-assisted refiner), software that can handle ultra-high-throughput data with a user-friendly graphical user interface (GUI) and interactive analysis capability. SUGAR generates high-resolution quality heatmaps of the flowcell, enabling users to find possible signatures of technical errors during sequencing. The sequencing data generated from the error-affected regions of a flowcell can be selectively removed by automated analysis or GUI-assisted operations implemented in SUGAR. The automated data-cleaning function based on sequence read quality (Phred) scores was applied to public whole human genome sequencing data, and we showed that the overall mapping quality was improved. The detailed data evaluation and cleaning enabled by SUGAR should reduce technical problems in sequence read mapping, improving subsequent variant analyses that require high-quality sequence data and mapping results. Therefore, the software will be especially useful for controlling the quality of variant calls for low-abundance cell populations, e.g., cancer cells, in samples affected by technical sequencing errors.
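    As an illustration of quality-score-driven cleaning of this kind (not SUGAR's own code), reads can be dropped when their mean Phred score falls below a threshold; the sketch assumes Biopython is installed and uses hypothetical file names:

```python
from Bio import SeqIO

def clean_fastq(path_in: str, path_out: str, min_mean_phred: float = 20.0) -> int:
    """Write only reads whose mean Phred quality meets the threshold; return count kept."""
    kept = 0
    with open(path_out, "w") as out:
        for rec in SeqIO.parse(path_in, "fastq"):
            quals = rec.letter_annotations["phred_quality"]
            if quals and sum(quals) / len(quals) >= min_mean_phred:
                SeqIO.write(rec, out, "fastq")
                kept += 1
    return kept

# Example (hypothetical file names):
# kept = clean_fastq("run1.fastq", "run1.clean.fastq", min_mean_phred=25)
```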

  5. Radioactive airborne species formed in the air in high energy accelerator tunnels

    International Nuclear Information System (INIS)

    Kondo, K.

    2005-01-01

    Many radioactive airborne species have been observed in the air of high-energy accelerator tunnels during machine operation. Radiation protection against these induced airborne radioactivities is one of the key issues for radiation safety, especially at high-energy and high-intensity proton accelerators such as J-PARC (Japan Proton Accelerator Research Complex, a joint project of KEK and JAERI), which is now under construction at the TOKAI site of JAERI. Information on the chemical forms and particle sizes of airborne radioactivities is essential for the estimation of internal doses. For that purpose, the study of radioactive airborne species formed in the air of beam-line tunnels at high-energy accelerators has been extensively conducted by our group. For Be-7, Na-24, S-38, Cl-38, Cl-39, C-11, and N-13, formed by various types of nuclear reactions including nuclear spallation reactions, their aerosol and gaseous fractions are determined by a filter technique. A parallel plate diffusion battery is used for the measurement of aerosol size distributions, and the formation of radioactive aerosols is explained by the attachment of radionuclides to ambient non-radioactive aerosols which are formed through radiation-induced reactions. The chemical forms of gaseous species are also determined by using a selective collection method based on a filter technique. A review is given of the physico-chemical properties of these airborne radionuclides produced in the air of accelerator beam-line tunnels.

  6. Controlling high-throughput manufacturing at the nano-scale

    Science.gov (United States)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high-quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.

  7. Classification of large circulating tumor cells isolated with ultra-high throughput microfluidic Vortex technology

    Science.gov (United States)

    Che, James; Yu, Victor; Dhar, Manjima; Renier, Corinne; Matsumoto, Melissa; Heirich, Kyra; Garon, Edward B.; Goldman, Jonathan; Rao, Jianyu; Sledge, George W.; Pegram, Mark D.; Sheth, Shruti; Jeffrey, Stefanie S.; Kulkarni, Rajan P.; Sollier, Elodie; Di Carlo, Dino

    2016-01-01

    Circulating tumor cells (CTCs) are emerging as rare but clinically significant non-invasive cellular biomarkers for cancer patient prognosis, treatment selection, and treatment monitoring. Current CTC isolation approaches, such as immunoaffinity, filtration, or size-based techniques, are often limited by throughput, purity, large output volumes, or inability to obtain viable cells for downstream analysis. For all technologies, traditional immunofluorescent staining alone has been employed to distinguish and confirm the presence of isolated CTCs among contaminating blood cells, although cells isolated by size may express vastly different phenotypes. Consequently, CTC definitions have been non-trivial, researcher-dependent, and evolving. Here we describe a complete set of objective criteria, leveraging well-established cytomorphological features of malignancy, by which we identify large CTCs. We apply the criteria to CTCs enriched from stage IV lung and breast cancer patient blood samples using the High Throughput Vortex Chip (Vortex HT), an improved microfluidic technology for the label-free, size-based enrichment and concentration of rare cells. We achieve improved capture efficiency (up to 83%), high speed of processing (8 mL/min of 10x diluted blood, or 800 μL/min of whole blood), and high purity (avg. background of 28.8±23.6 white blood cells per mL of whole blood). We show markedly improved performance of CTC capture (84% positive test rate) in comparison to previous Vortex designs and the current FDA-approved gold standard CellSearch assay. The results demonstrate the ability to quickly collect viable and pure populations of abnormal large circulating cells unbiased by molecular characteristics, which helps uncover further heterogeneity in these cells. PMID:26863573

  8. A versatile, high through-put, bead-based phagocytosis assay for Plasmodium falciparum

    DEFF Research Database (Denmark)

    Lloyd, Yukie M.; Ngati, Elise P.; Salanti, Ali

    2017-01-01

    Antibody-mediated phagocytosis is an important immune effector mechanism against Plasmodium falciparum-infected erythrocytes (IE); however, current phagocytosis assays use IE collected from infected individuals or from in vitro cultures of P. falciparum, making them prone to high variation....... A simple, high-throughput flow cytometric assay was developed that uses THP-1 cells and fluorescent beads covalently-coupled with the malarial antigen VAR2CSA. The assay is highly repeatable, provides both the overall percent phagocytosis and semi-quantitates the number of antigen-coupled beads...

  9. A mobile, high-throughput semi-automated system for testing cognition in large non-primate animal models of Huntington disease.

    Science.gov (United States)

    McBride, Sebastian D; Perentos, Nicholas; Morton, A Jennifer

    2016-05-30

    For reasons of cost and ethical concerns, models of neurodegenerative disorders such as Huntington disease (HD) are currently being developed in farm animals, as an alternative to non-human primates. Developing reliable methods of testing cognitive function is essential to determining the usefulness of such models. Nevertheless, cognitive testing of farm animal species presents a unique set of challenges. The primary aims of this study were to develop and validate a mobile operant system suitable for high throughput cognitive testing of sheep. We designed a semi-automated testing system with the capability of presenting stimuli (visual, auditory) and reward at six spatial locations. Fourteen normal sheep were used to validate the system using a two-choice visual discrimination task (2CVDT). Four stages of training devised to acclimatise animals to the system are also presented. All sheep progressed rapidly through the training stages, over eight sessions. All sheep learned the 2CVDT and performed at least one reversal stage. The mean number of trials the sheep took to reach criterion was 13.9±1.5 for the first acquisition learning and 19.1±1.8 for the reversal learning. This is the first mobile semi-automated operant system developed for testing cognitive function in sheep. We have designed and validated an automated operant behavioural testing system suitable for high throughput cognitive testing in sheep and other medium-sized quadrupeds, such as pigs and dogs. Sheep performance in the two-choice visual discrimination task was very similar to that reported for non-human primates and strongly supports the use of farm animals as pre-clinical models for the study of neurodegenerative diseases. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    Science.gov (United States)

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  11. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  12. Applications of high-throughput clonogenic survival assays in high-LET particle microbeams

    Directory of Open Access Journals (Sweden)

    Antonios eGeorgantzoglou

    2016-01-01

    Full Text Available Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-LET particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells’ clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a high-LET microbeam with a micron-sized field. Post-irradiation, time-lapse imaging follows the cells’ response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on the cells’ capacity to divide at least 4-5 times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.

  13. Applications of High-Throughput Clonogenic Survival Assays in High-LET Particle Microbeams.

    Science.gov (United States)

    Georgantzoglou, Antonios; Merchant, Michael J; Jeynes, Jonathan C G; Mayhead, Natalie; Punia, Natasha; Butler, Rachel E; Jena, Rajesh

    2015-01-01

    Charged particle therapy is increasingly becoming a valuable tool in cancer treatment, mainly due to the favorable interaction of particle radiation with matter. Its application is still limited due, in part, to lack of data regarding the radiosensitivity of certain cell lines to this radiation type, especially to high-linear energy transfer (LET) particles. From the earliest days of radiation biology, the clonogenic survival assay has been used to provide radiation response data. This method produces reliable data but it is not optimized for high-throughput microbeam studies with high-LET radiation where high levels of cell killing lead to a very low probability of maintaining cells' clonogenic potential. A new method, therefore, is proposed in this paper, which could potentially allow these experiments to be conducted in a high-throughput fashion. Cells are seeded in special polypropylene dishes and bright-field illumination provides cell visualization. Digital images are obtained and cell detection is applied based on corner detection, generating individual cell targets as x-y points. These points in the dish are then irradiated individually by a high-LET microbeam with a micron-sized field. Post-irradiation, time-lapse imaging follows the cells' response. All irradiated cells are tracked by linking trajectories in all time-frames, based on finding their nearest position. Cell divisions are detected based on cell appearance and individual cell temporary corner density. The number of divisions anticipated is low due to the high probability of cell killing from high-LET irradiation. Survival curves are produced based on the cells' capacity to divide at least four to five times. The process is repeated for a range of doses of radiation. Validation shows the efficiency of the proposed cell detection and tracking method in finding cell divisions.

  14. Integration of an In Situ MALDI-Based High-Throughput Screening Process: A Case Study with Receptor Tyrosine Kinase c-MET.

    Science.gov (United States)

    Beeman, Katrin; Baumgärtner, Jens; Laubenheimer, Manuel; Hergesell, Karlheinz; Hoffmann, Martin; Pehl, Ulrich; Fischer, Frank; Pieck, Jan-Carsten

    2017-12-01

    Mass spectrometry (MS) is known for its label-free detection of substrates and products from a variety of enzyme reactions. Recent hardware improvements have increased interest in the use of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) MS for high-throughput drug discovery. Despite interest in this technology, several challenges remain and must be overcome before MALDI-MS can be integrated as an automated "in-line reader" for high-throughput drug discovery. Two such hurdles include in situ sample processing and deposition, as well as integration of MALDI-MS for enzymatic screening assays that usually contain high levels of MS-incompatible components. Here we adapt our c-MET kinase assay to optimize for MALDI-MS compatibility and test its feasibility for compound screening. The pros and cons of the Echo (Labcyte) as a transfer system for in situ MALDI-MS sample preparation are discussed. We demonstrate that this method generates robust data in a 1536-grid format. We use the MALDI-MS to directly measure the ratio of c-MET substrate and phosphorylated product to acquire IC50 curves and demonstrate that the pharmacology is unaffected. The resulting IC50 values correlate well between the common label-based capillary electrophoresis and the label-free MALDI-MS detection method. We predict that label-free MALDI-MS-based high-throughput screening will become increasingly important and more widely used for drug discovery.
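
    As a worked illustration of turning the measured product/substrate ratio into potency values, the sketch below fits a four-parameter logistic curve to hypothetical per-concentration ratios; the functional form chosen and all numbers are assumptions for illustration, not data from the study.

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(conc, bottom, top, ic50, hill):
          """Four-parameter logistic dose-response curve."""
          return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

      # Hypothetical inhibitor concentrations (uM) and phosphorylated-product
      # fractions, product / (product + substrate), as read out by MALDI-MS
      conc = np.array([0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
      frac = np.array([0.78, 0.77, 0.74, 0.65, 0.48, 0.30, 0.15, 0.08, 0.05])

      params, _ = curve_fit(four_pl, conc, frac, p0=[0.05, 0.8, 0.1, 1.0])
      print(f"IC50 ~ {params[2]:.3f} uM, Hill slope ~ {params[3]:.2f}")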

  15. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format

    Directory of Open Access Journals (Sweden)

    Salvo-Chirnside Eliane

    2011-12-01

    Full Text Available Abstract The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments.

  16. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format.

    Science.gov (United States)

    Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E

    2011-12-02

    The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments.

  17. Digital Biomass Accumulation Using High-Throughput Plant Phenotype Data Analysis.

    Science.gov (United States)

    Rahaman, Md Matiur; Ahsan, Md Asif; Gillani, Zeeshan; Chen, Ming

    2017-09-01

    Biomass is an important phenotypic trait in functional ecology and growth analysis. The typical methods for measuring biomass are destructive, and they require numerous individuals to be cultivated for repeated measurements. With the advent of image-based high-throughput plant phenotyping facilities, non-destructive biomass measuring methods have attempted to overcome this problem. Thus, the estimation of the biomass of individual plants from their digital images is becoming more important. In this paper, we propose an approach to biomass estimation based on image-derived phenotypic traits. Several image-based biomass studies state that plant biomass is simply a linear function of the projected plant area in images. However, we modeled the plant volume as a function of plant area, plant compactness, and plant age to generalize the linear biomass model. The obtained results confirm the proposed model and can explain most of the observed variance during image-derived biomass estimation. Moreover, a small difference was observed between actual and estimated digital biomass, which indicates that our proposed approach can be used to estimate digital biomass accurately.
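
    A minimal sketch of such a generalized linear biomass model is given below, fitting biomass to projected area, compactness and age by ordinary least squares; the trait values are hypothetical and the exact model form used by the authors may differ.

      import numpy as np

      # Hypothetical image-derived traits per plant: projected area (px),
      # compactness (area / convex-hull area) and age (days after sowing),
      # plus a destructively measured reference biomass (g).
      area        = np.array([1200., 1800., 2500., 3300., 4200., 5200.])
      compactness = np.array([0.55, 0.58, 0.61, 0.63, 0.66, 0.68])
      age         = np.array([10., 14., 18., 22., 26., 30.])
      biomass     = np.array([0.9, 1.6, 2.6, 3.8, 5.1, 6.7])

      # Generalized linear model: biomass ~ b0 + b1*area + b2*compactness + b3*age
      X = np.column_stack([np.ones_like(area), area, compactness, age])
      coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)
      pred = X @ coef
      r2 = 1 - ((biomass - pred) ** 2).sum() / ((biomass - biomass.mean()) ** 2).sum()
      print("coefficients:", np.round(coef, 4), " R^2 =", round(float(r2), 3))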

  18. Management of radioactive waste from non-power applications in the Netherlands

    International Nuclear Information System (INIS)

    Codee, H.D.K.

    2002-01-01

    Radioactive waste results from the use of radioactive materials in hospitals, research establishments, industry and nuclear power plants. The Netherlands is a good example of a country with a small nuclear power programme that will end in the near future. The radioactive waste from non-power applications therefore strongly influences the management choices. A dedicated waste management company, COVRA (the Central Organisation for Radioactive Waste), manages all radioactive waste produced in the Netherlands. For this small volume but broad spectrum of radioactive waste, a management system was developed based on the principle of isolating, controlling and monitoring the waste. Long-term storage is an important element in this management strategy. It is not seen as a 'wait and see' option but as a necessary step in the strategy that will ultimately result in final removal of the waste. Since the waste will remain retrievable for a long time, new technologies and new disposal options can be applied when available and feasible. (author)

  19. Engineering materials for high level radioactive waste repository

    International Nuclear Information System (INIS)

    Wen Zhijian

    2009-01-01

    Radioactive wastes can arise from a wide range of human activities and have different physical and chemical forms with various levels of radioactivity. High-level radioactive wastes (HLW) are characterized by nuclides of very high initial radioactivity, high heat output and very long lifetimes. HLW disposal is of great concern to scientists and the public worldwide. At present, deep geological disposal is regarded as the most reasonable and effective way to safely dispose of high-level radioactive wastes. The conceptual model of HLW geological disposal in China is based on a multi-barrier system that combines an isolating geological environment with an engineered barrier system (EBS). The engineering materials in the EBS include the vitrified HLW, canister, overpack, buffer materials and backfill materials. Referring to progress worldwide, this paper presents the function, the requirements for material selection and design, and the main scientific projects of R&D of engineering materials for the HLW repository. (authors)

  20. Can the same principles be used for the management of radioactive and non-radioactive waste?

    International Nuclear Information System (INIS)

    Bengtsson, Gunnar.

    1989-01-01

    Non-radioactive waste has a much more complex composition than radioactive waste and appears in much larger quantities. The two types of waste have, however, some properties in common when it comes to their long-term impact on health and the environment. The occurrence in both of substances that may exist for generations and may cause cancer provides one example. Both types of waste also always occur together. It is therefore proposed that the same basic principles could be applied to the management of radioactive and non-radioactive waste. By doing so, one may increase the efficiency of policy development, research and practical management. This is particularly important for the very costly restoration of old disposal sites which have earlier been poorly managed. (author)

  1. High-throughput ab-initio dilute solute diffusion database.

    Science.gov (United States)

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-07-19

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world.
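
    The quoted 0.176 eV figure is a weighted RMS error over activation barriers. A minimal sketch of such a weighted comparison between calculated and experimental barriers is shown below; the barrier values and weights are hypothetical, not taken from the database.

      import numpy as np

      def weighted_rmse(predicted, observed, weights):
          """Weighted root-mean-square error between calculated and experimental
          activation barriers (eV)."""
          predicted, observed, weights = map(np.asarray, (predicted, observed, weights))
          return float(np.sqrt(np.sum(weights * (predicted - observed) ** 2) / np.sum(weights)))

      # Hypothetical solute-diffusion activation barriers (eV) and weights
      dft = np.array([1.32, 0.95, 2.10, 1.75])
      exp = np.array([1.25, 1.02, 2.02, 1.80])
      w   = np.array([1.0, 1.0, 0.5, 1.0])   # e.g. down-weight a less reliable experiment
      print(f"weighted RMS error = {weighted_rmse(dft, exp, w):.3f} eV")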

  2. High-throughput Screening for Protein-based Inheritance in S. cerevisiae.

    Science.gov (United States)

    Byers, James S; Jarosz, Daniel F

    2017-08-08

    The encoding of biological information that is accessible to future generations is generally achieved via changes to the DNA sequence. Long-lived inheritance encoded in protein conformation (rather than sequence) has long been viewed as paradigm-shifting but rare. The best characterized examples of such epigenetic elements are prions, which possess a self-assembling behavior that can drive the heritable manifestation of new phenotypes. Many archetypal prions display a striking N/Q-rich sequence bias and assemble into an amyloid fold. These unusual features have informed most screening efforts to identify new prion proteins. However, at least three known prions (including the founding prion, PrPSc) do not harbor these biochemical characteristics. We therefore developed an alternative method to probe the scope of protein-based inheritance based on a property of mass action: the transient overexpression of prion proteins increases the frequency at which they acquire a self-templating conformation. This paper describes a method for analyzing the capacity of the yeast ORFeome to elicit protein-based inheritance. Using this strategy, we previously found that >1% of yeast proteins could fuel the emergence of biological traits that were long-lived, stable, and arose more frequently than genetic mutation. This approach can be employed in high throughput across entire ORFeomes or as a targeted screening paradigm for specific genetic networks or environmental stimuli. Just as forward genetic screens define numerous developmental and signaling pathways, these techniques provide a methodology to investigate the influence of protein-based inheritance in biological processes.

  3. High-throughput characterization for solar fuels materials discovery

    Science.gov (United States)

    Mitrovic, Slobodan; Becerra, Natalie; Cornell, Earl; Guevarra, Dan; Haber, Joel; Jin, Jian; Jones, Ryan; Kan, Kevin; Marcin, Martin; Newhouse, Paul; Soedarmadji, Edwin; Suram, Santosh; Xiang, Chengxiang; Gregoire, John; High-Throughput Experimentation Team

    2014-03-01

    In this talk I will present the status of the High-Throughput Experimentation (HTE) project of the Joint Center for Artificial Photosynthesis (JCAP). JCAP is an Energy Innovation Hub of the U.S. Department of Energy with a mandate to deliver a solar fuel generator based on an integrated photoelectrochemical cell (PEC). However, efficient and commercially viable catalysts or light absorbers for the PEC do not exist. The mission of HTE is to provide the accelerated discovery through combinatorial synthesis and rapid screening of material properties. The HTE pipeline also features high-throughput material characterization using x-ray diffraction and x-ray photoemission spectroscopy (XPS). In this talk I present the currently operating pipeline and focus on our combinatorial XPS efforts to build the largest free database of spectra from mixed-metal oxides, nitrides, sulfides and alloys. This work was performed at Joint Center for Artificial Photosynthesis, a DOE Energy Innovation Hub, supported through the Office of Science of the U.S. Department of Energy under Award No. DE-SC0004993.

  4. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    Science.gov (United States)

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  5. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  6. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    Science.gov (United States)

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
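
    A minimal sketch of the superpixel-plus-Random-Forest segmentation idea is given below: the image is split into superpixels, each superpixel is summarized by a simple colour feature, and a classifier separates plant from background so that the plant area can be tracked over time. The file name, feature choice and stand-in labels are assumptions, not the authors' pipeline.

      import numpy as np
      from skimage import io
      from skimage.segmentation import slic
      from sklearn.ensemble import RandomForestClassifier

      def superpixel_features(image, segments):
          """Mean RGB per superpixel as a simple per-region feature vector."""
          return np.array([image[segments == label].mean(axis=0)
                           for label in np.unique(segments)])

      # Hypothetical top-view tray image; any RGB image of plants would do.
      image = io.imread("plant_tray.png")[..., :3] / 255.0
      segments = slic(image, n_segments=400, compactness=10, start_label=0)
      X = superpixel_features(image, segments)

      # Stand-in labels from a crude green-channel heuristic; a real pipeline would
      # train on hand-annotated superpixels instead.
      labels = (X[:, 1] > X[:, 1].mean()).astype(int)

      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
      plant_labels = np.unique(segments)[clf.predict(X) == 1]
      plant_mask = np.isin(segments, plant_labels)
      print("projected plant area (px):", int(plant_mask.sum()))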

  7. Space Link Extension (SLE) Emulation for High-Throughput Network Communication

    Science.gov (United States)

    Murawski, Robert W.; Tchorowski, Nicole; Golden, Bert

    2014-01-01

    As the data rate requirements for space communications increases, significant stress is placed not only on the wireless satellite communication links, but also on the ground networks which forward data from end-users to remote ground stations. These wide area network (WAN) connections add delay and jitter to the end-to-end satellite communication link, effects which can have significant impacts on the wireless communication link. It is imperative that any ground communication protocol can react to these effects such that the ground network does not become a bottleneck in the communication path to the satellite. In this paper, we present our SCENIC Emulation Lab testbed which was developed to test the CCSDS SLE protocol implementations proposed for use on future NASA communication networks. Our results show that in the presence of realistic levels of network delay, high-throughput SLE communication links can experience significant data rate throttling. Based on our observations, we present some insight into why this data throttling happens, and trace the probable issue back to non-optimal blocking communication which is supported by the CCSDS SLE API recommended practices. These issues were presented as well to the SLE implementation developers who, based on our reports, developed a new release for SLE which we show fixes the SLE blocking issue and greatly improves the protocol throughput. In this paper, we also discuss future developments for our end-to-end emulation lab and how these improvements can be used to develop and test future space communication technologies.

  8. Reverse Phase Protein Arrays for High-Throughput Protein Measurements in Mammospheres

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    Protein Array (RPPA)-based readout format integrated into robotic siRNA screening. This technique would allow post-screening high-throughput quantification of protein changes. Recently, breast cancer stem cells (BCSCs) have attracted much attention, as a tumor- and metastasis-driving subpopulation...

  9. High throughput nanoimprint lithography for semiconductor memory applications

    Science.gov (United States)

    Ye, Zhengmao; Zhang, Wei; Khusnatdinov, Niyaz; Stachowiak, Tim; Irving, J. W.; Longsine, Whitney; Traub, Matthew; Fletcher, Brian; Liu, Weijun

    2017-03-01

    Imprint lithography is a promising technology for replication of nano-scale features. For semiconductor device applications, Canon deposits a low-viscosity resist on a field-by-field basis using jetting technology. A patterned mask is lowered into the resist fluid which then quickly flows into the relief patterns in the mask by capillary action. Following this filling step, the resist is crosslinked under UV radiation, and then the mask is removed, leaving a patterned resist on the substrate. There are two critical components to meeting throughput requirements for imprint lithography. Using a similar approach to what is already done for many deposition and etch processes, imprint stations can be clustered to enhance throughput. The FPA-1200NZ2C is a four station cluster system designed for high volume manufacturing. For a single station, throughput includes overhead, resist dispense, resist fill time (or spread time), exposure and separation. Resist exposure time and mask/wafer separation are well understood processing steps with typical durations on the order of 0.10 to 0.20 seconds. To achieve a total process throughput of 17 wafers per hour (wph) for a single station, it is necessary to complete the fluid fill step in 1.2 seconds. For a throughput of 20 wph, fill time must be reduced to only 1.1 seconds. There are several parameters that can impact resist filling. Key parameters include resist drop volume (smaller is better), system controls (which address drop spreading after jetting), Design for Imprint or DFI (to accelerate drop spreading) and material engineering (to promote wetting between the resist and underlying adhesion layer). In addition, it is mandatory to maintain fast filling, even for edge field imprinting. In this paper, we address the improvements made in all of these parameters to first enable a 1.20 second filling process for a device-like pattern and have demonstrated this capability for both full fields and edge fields. Non
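
    The throughput targets quoted above amount to a simple per-field time budget. The sketch below illustrates that arithmetic; only the 0.10-0.20 s exposure/separation range comes from the text, while the field count, dispense time and wafer overhead are illustrative assumptions.

      # Hypothetical single-station throughput budget for jet-and-flash imprint.
      fields_per_wafer = 100          # assumption
      dispense_s       = 0.4          # assumption
      fill_s           = 1.2          # the fluid fill (spread) time under discussion
      exposure_s       = 0.15         # within the quoted 0.10-0.20 s range
      separation_s     = 0.15         # within the quoted 0.10-0.20 s range
      wafer_overhead_s = 30.0         # wafer exchange and alignment, assumption

      per_field_s = dispense_s + fill_s + exposure_s + separation_s
      wafers_per_hour = 3600.0 / (fields_per_wafer * per_field_s + wafer_overhead_s)
      print(f"~{wafers_per_hour:.1f} wafers per hour per station")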

  10. Development of high-throughput SNP-based genotyping in Acacia auriculiformis x A. mangium hybrids using short-read transcriptome data

    Directory of Open Access Journals (Sweden)

    Wong Melissa ML

    2012-12-01

    Full Text Available Abstract Background Next Generation Sequencing has provided comprehensive, affordable and high-throughput DNA sequences for Single Nucleotide Polymorphism (SNP) discovery in Acacia auriculiformis and Acacia mangium. Like other non-model species, SNP detection and genotyping in Acacia are challenging due to lack of genome sequences. The main objective of this study is to develop the first high-throughput SNP genotyping assay for linkage map construction of A. auriculiformis x A. mangium hybrids. Results We identified a total of 37,786 putative SNPs by aligning short read transcriptome data from four parents of two Acacia hybrid mapping populations using Bowtie against 7,839 de novo transcriptome contigs. Given a set of 10 validated SNPs from two lignin genes, our in silico SNP detection approach is highly accurate (100%) compared to the traditional in vitro approach (44%). Further validation of 96 SNPs using Illumina GoldenGate Assay gave an overall assay success rate of 89.6% and conversion rate of 37.5%. We explored possible factors lowering assay success rate by predicting exon-intron boundaries and paralogous genes of Acacia contigs using the Medicago truncatula genome as reference. This assessment revealed that the presence of an exon-intron boundary is the main cause (50%) of assay failure. Subsequent SNP filtering and improved assay design resulted in assay success and conversion rates of 92.4% and 57.4%, respectively, based on genotyping of 768 SNPs. Analysis of clustering patterns revealed that 27.6% of the assays were not reproducible and flanking sequence might play a role in determining cluster compression. In addition, we identified a total of 258 and 319 polymorphic SNPs in A. auriculiformis and A. mangium natural germplasms, respectively. Conclusion We have successfully discovered a large number of SNP markers in A. auriculiformis x A. mangium hybrids using next generation transcriptome sequencing. By using a reference genome from the most closely

  11. High-level radioactive wastes

    International Nuclear Information System (INIS)

    Grissom, M.C.

    1982-10-01

    This bibliography contains 812 citations on high-level radioactive wastes included in the Department of Energy's Energy Data Base from January 1981 through July 1982. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number

  12. A high-throughput pipeline for the design of real-time PCR signatures

    Directory of Open Access Journals (Sweden)

    Reifman Jaques

    2010-06-01

    Full Text Available Abstract Background Pathogen diagnostic assays based on polymerase chain reaction (PCR) technology provide high sensitivity and specificity. However, the design of these diagnostic assays is computationally intensive, requiring high-throughput methods to identify unique PCR signatures in the presence of an ever-increasing availability of sequenced genomes. Results We present the Tool for PCR Signature Identification (TOPSI), a high-performance computing pipeline for the design of PCR-based pathogen diagnostic assays. The TOPSI pipeline efficiently designs PCR signatures common to multiple bacterial genomes by obtaining the shared regions through pairwise alignments between the input genomes. TOPSI successfully designed PCR signatures common to 18 Staphylococcus aureus genomes in less than 14 hours using 98 cores on a high-performance computing system. Conclusions TOPSI is a computationally efficient, fully integrated tool for high-throughput design of PCR signatures common to multiple bacterial genomes. TOPSI is freely available for download at http://www.bhsai.org/downloads/topsi.tar.gz.
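
    TOPSI itself derives shared regions from pairwise genome alignments. As a much-simplified stand-in for that idea, the sketch below intersects k-mers across genomes to list candidate sequence stretches common to all inputs; it is an illustration only, not the TOPSI algorithm, and the toy sequences are hypothetical.

      def shared_kmers(genomes, k=21):
          """Return the k-mers present in every genome -- a crude proxy for the
          shared regions TOPSI derives from pairwise alignments."""
          def kmers(seq):
              return {seq[i:i + k] for i in range(len(seq) - k + 1)}
          common = kmers(genomes[0])
          for g in genomes[1:]:
              common &= kmers(g)
          return common

      # Hypothetical toy sequences; real inputs would be complete bacterial genomes.
      genomes = [
          "ACGTACGTGGTTAACCGGATCGATCGATTACGCAGT",
          "TTGACGTACGTGGTTAACCGGATCGATCGAGGGCAT",
          "ACGTACGTGGTTAACCGGATCGATCGATTTTTTCAG",
      ]
      print(len(shared_kmers(genomes, k=12)), "shared 12-mers")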

  13. Sensitive non-radioactive detection of HIV-1

    DEFF Research Database (Denmark)

    Teglbjærg, Lars Stubbe; Nielsen, C; Hansen, J E

    1992-01-01

    This report describes the use of the polymerase chain reaction (PCR) for the non-radioactive detection of HIV-1 proviral genomic sequences in HIV-1 infected cells. We have developed a sensitive assay, using three different sets of nested primers and our results show that this method is superior...... to standard PCR for the detection of HIV-1 DNA. The assay described features the use of a simple and inexpensive sample preparation technique and a non-radioactive hybridization procedure for confirmation of results. To test the suitability of the assay for clinical purposes, we tested cell samples from 76...

  14. High-Throughput and Low-Latency Network Communication with NetIO

    Science.gov (United States)

    Schumacher, Jörn; Plessl, Christian; Vandelli, Wainer

    2017-10-01

    HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately traditional network communication APIs for HPC clusters like MPI or PGAS exclusively target the HPC community and are not suited well for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs, but it requires a non-negligible effort and expert knowledge. At the same time, message services like ZeroMQ have gained popularity in the HEP community. They make it possible to build distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on top of Infiniband and OmniPath, this approach may not be very efficient compared to a direct use of native APIs. NetIO is a simple, novel asynchronous message service that can operate on Ethernet, Infiniband and similar network fabrics. In this paper the design and implementation of NetIO is presented and described, and its use is evaluated in comparison to other approaches. NetIO supports different high-level programming models and typical workloads of HEP applications. The ATLAS FELIX project [1] successfully uses NetIO as its central communication platform. The architecture of NetIO is described in this paper, including the user-level API and the internal data-flow design. The paper includes a performance evaluation of NetIO including throughput and latency measurements. The performance is compared against the state-of-the-art ZeroMQ message service. Performance measurements are performed in a lab environment with Ethernet and FDR Infiniband networks.

  15. Separation of non-hazardous, non-radioactive components from ICPP calcine via chlorination

    International Nuclear Information System (INIS)

    Nelson, L.O.

    1995-05-01

    A pyrochemical treatment method for separating non-radioactive from radioactive components in solid granular waste accumulated at the Idaho Chemical Processing Plant was investigated. The goal of this study was to obtain kinetic and chemical separation data on the reaction products of the chlorination of the solid waste, known as calcine. Thermodynamic equilibrium calculations were completed to verify that a separation of radioactive and non-radioactive calcine components was possible. Bench-scale chlorination experiments were completed subsequently in a variety of reactor configurations including: a fixed-bed reactor (reactive gases flowed around and not through the particle bed), a packed/fluidized-bed reactor, and a packed-bed reactor (reactive gases flowed through the particle bed). Chemical analysis of the reaction products generated during the chlorination experiments verified the predictions made by the equilibrium calculations. An empirical first-order kinetic rate expression was developed for each of the reactor configurations. 20 refs., 16 figs., 21 tabs

  16. Proposal of threshold levels for the definition of non-radioactive wastes

    International Nuclear Information System (INIS)

    Yoshida, Yoshikazu

    1979-01-01

    As the amount of radioactive waste increases with the advance of nuclear power generation and the use of radioactive materials, the need for management cost reduction and resource saving has arisen. In this situation, threshold levels for the definition of non-radioactive solid wastes are required. The problem has been studied by an ad hoc committee of the Nuclear Safety Research Association at the request of the Science and Technology Agency. The matters described are the procedures for deriving the threshold levels, feasibility studies, carried out with several enterprises, of managing waste at the threshold level, and future subjects of study. The threshold levels are grouped into two, i.e. the unconditional level and the conditional level. According to the unconditional threshold level, solid wastes are separated definitely into radioactive and non-radioactive ones. According to the conditional threshold level, under certain conditions, some wastes that are radioactive according to the unconditional level are regarded as non-radioactive ones. (J.P.N.)

  17. High resolution light-sheet based high-throughput imaging cytometry system enables visualization of intra-cellular organelles

    Science.gov (United States)

    Regmi, Raju; Mohan, Kavya; Mondal, Partha Pratim

    2014-09-01

    Visualization of intracellular organelles is achieved using a newly developed high throughput imaging cytometry system. This system interrogates the microfluidic channel using a sheet of light rather than the existing point-based scanning techniques. The advantages of the developed system are many, including single-shot scanning of specimens flowing through the microfluidic channel at flow rates ranging from microlitres to nanolitres per minute. Moreover, this opens up in vivo imaging of sub-cellular structures and simultaneous cell counting in an imaging cytometry system. We recorded a maximum count of 2400 cells/min at a flow rate of 700 nl/min, and simultaneous visualization of the fluorescently-labeled mitochondrial network in HeLa cells during flow. The developed imaging cytometry system may find immediate application in biotechnology, fluorescence microscopy and nano-medicine.

  18. Receptor-based high-throughput screening and identification of estrogens in dietary supplements using bioaffinity liquid-chromatography ion mobility mass spectrometry.

    Science.gov (United States)

    Aqai, Payam; Blesa, Natalia Gómez; Major, Hilary; Pedotti, Mattia; Varani, Luca; Ferrero, Valentina E V; Haasnoot, Willem; Nielen, Michel W F

    2013-11-01

    A high-throughput bioaffinity liquid chromatography-mass spectrometry (BioMS) approach was developed and applied for the screening and identification of recombinant human estrogen receptor α (ERα) ligands in dietary supplements. For screening, a semi-automated mass spectrometric ligand binding assay was developed applying (13)C2,(15)N-tamoxifen as a non-radioactive label and fast ultra-high-performance-liquid chromatography-electrospray ionisation-triple-quadrupole-MS (UPLC-QqQ-MS), operated in the single reaction monitoring mode, as a readout system. Binding of the label to ERα-coated paramagnetic microbeads was inhibited by competing estrogens in the sample extract, yielding decreased levels of the label in UPLC-QqQ-MS. The label showed high ionisation efficiency in positive electrospray ionisation (ESI) mode, so the developed BioMS approach is able to screen for estrogens in dietary supplements despite their poor ionisation efficiency in both positive and negative ESI modes. The assay was performed in a 96-well plate, and all wells could be measured within 3 h. Estrogens in suspect extracts were identified by full-scan accurate mass and collision-cross section (CCS) values from a UPLC-ion mobility-Q-time-of-flight-MS (UPLC-IM-Q-ToF-MS) equipped with a novel atmospheric pressure ionisation source. Thanks to the novel ion source, this instrument provided picogram sensitivity for estrogens in the negative ion mode and an additional identification point (experimental CCS values) next to retention time, accurate mass and tandem mass spectrometry data. The developed combination of bioaffinity screening with UPLC-QqQ-MS and identification with UPLC-IM-Q-ToF-MS provides an extremely powerful analytical tool for early warning of ERα bioactive compounds in dietary supplements, as demonstrated by analysis of selected dietary supplements in which different estrogens were identified.

  19. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Full Text Available Abstract Background The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.

  20. SNP high-throughput screening in grapevine using the SNPlex™ genotyping system

    Directory of Open Access Journals (Sweden)

    Velasco Riccardo

    2008-01-01

    Full Text Available Abstract Background Until recently, only a small number of low- and mid-throughput methods have been used for single nucleotide polymorphism (SNP) discovery and genotyping in grapevine (Vitis vinifera L.). However, following completion of the sequence of the highly heterozygous genome of Pinot Noir, it has been possible to identify millions of electronic SNPs (eSNPs), thus providing a valuable source for high-throughput genotyping methods. Results Herein we report the first application of the SNPlex™ genotyping system in grapevine, aiming at the anchoring of a eukaryotic genome. This approach combines robust SNP detection with automated assay readout and data analysis. 813 candidate eSNPs were developed from non-repetitive contigs of the assembled genome of Pinot Noir and tested in 90 progeny of a Syrah × Pinot Noir cross. 563 new SNP-based markers were obtained and mapped. The efficiency rate of 69% was enhanced to 80% when multiple displacement amplification (MDA) methods were used for preparation of genomic DNA for the SNPlex assay. Conclusion Unlike other SNP genotyping methods used to investigate thousands of SNPs in a few genotypes, or a few SNPs in around a thousand genotypes, the SNPlex genotyping system represents a good compromise to investigate several hundred SNPs in a hundred or more samples simultaneously. Therefore, the use of the SNPlex assay, coupled with whole genome amplification (WGA), is a good solution for future applications in well-equipped laboratories.

  1. Blood group genotyping: from patient to high-throughput donor screening.

    Science.gov (United States)

    Veldhuisen, B; van der Schoot, C E; de Haas, M

    2009-10-01

    Blood group antigens, present on the cell membrane of red blood cells and platelets, can be either defined serologically or predicted based on the genotypes of the genes encoding blood group antigens. At present, the molecular basis of many antigens of the 30 blood group systems and 17 human platelet antigens is known. In many laboratories, blood group genotyping assays are routinely used for diagnostics in cases where patient red cells cannot be used for serological typing due to the presence of auto-antibodies or after recent transfusions. In addition, DNA genotyping is used to support (un)expected serological findings. Fetal genotyping is routinely performed when there is a risk of alloimmune-mediated red cell or platelet destruction. In the case of patient blood group antigen typing, it is important that a genotyping result is quickly available to support the selection of donor blood, and high throughput of the genotyping method is not a prerequisite. In addition, genotyping of blood donors will be extremely useful to obtain donor blood with rare phenotypes, for example lacking a high-frequency antigen, and to obtain a fully typed donor database to be used for a better matching between recipient and donor to prevent adverse transfusion reactions. Serological typing of large cohorts of donors is a labour-intensive and expensive exercise and is hampered by the lack of sufficient amounts of approved typing reagents for all blood group systems of interest. Currently, high-throughput genotyping based on DNA micro-arrays is a very feasible method to obtain a large pool of well-typed blood donors. Several systems for high-throughput blood group genotyping have been developed and will be discussed in this review.

  2. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    Science.gov (United States)

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...

  3. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases, on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
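
    The quantile-residual diagnostic can be illustrated in a few lines: if the estimated distribution is correct, the values of its CDF at the sorted sample points behave like uniform order statistics with expectation i/(n+1). The sketch below uses a maximum-likelihood normal fit as a stand-in for the estimated density and a sqrt(n) scaling as an assumption; the paper's exact scaling and scoring function are not reproduced.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      sample = rng.normal(loc=2.0, scale=1.5, size=500)

      # Stand-in for the estimated CDF: a maximum-likelihood normal fit.
      mu, sigma = sample.mean(), sample.std(ddof=1)
      u = stats.norm.cdf(np.sort(sample), loc=mu, scale=sigma)

      # If the fitted CDF is correct, u behaves like uniform order statistics with
      # expectation i/(n+1); large scaled residuals flag a poor estimate.
      n = len(u)
      expected = np.arange(1, n + 1) / (n + 1)
      scaled_residual = np.sqrt(n) * (u - expected)   # sqrt(n) scaling is an assumption
      print("max |scaled quantile residual| =", float(np.abs(scaled_residual).max()))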

  4. A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting

    Science.gov (United States)

    Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.

    2016-01-01

    Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945

  5. Roche genome sequencer FLX based high-throughput sequencing of ancient DNA

    DEFF Research Database (Denmark)

    Alquezar-Planas, David E; Fordyce, Sarah Louise

    2012-01-01

    Since the development of so-called "next generation" high-throughput sequencing in 2005, this technology has been applied to a variety of fields. Such applications include disease studies, evolutionary investigations, and ancient DNA. Each application requires a specialized protocol to ensure...... that the data produced is optimal. Although much of the procedure can be followed directly from the manufacturer's protocols, the key differences lie in the library preparation steps. This chapter presents an optimized protocol for the sequencing of fossil remains and museum specimens, commonly referred...

  6. poolHiTS: A Shifted Transversal Design based pooling strategy for high-throughput drug screening

    Directory of Open Access Journals (Sweden)

    Woolf Peter J

    2008-05-01

    Full Text Available Abstract Background A key goal of drug discovery is to increase the throughput of small molecule screens without sacrificing screening accuracy. High-throughput screening (HTS) in drug discovery involves testing a large number of compounds in a biological assay to identify active compounds. Normally, molecules from a large compound library are tested individually to identify the activity of each molecule. Usually a small number of compounds are found to be active, however the presence of false positive and negative testing errors suggests that this one-drug one-assay screening strategy can be significantly improved. Pooling designs are testing schemes that test mixtures of compounds in each assay, thereby generating a screen of the whole compound library in fewer tests. By repeatedly testing compounds in different combinations, pooling designs also allow for error-correction. These pooled designs, for specific experiment parameters, can be simply and efficiently created using the Shifted Transversal Design (STD) pooling algorithm. However, drug screening contains a number of key constraints that require specific modifications if this pooling approach is to be useful for practical screen designs. Results In this paper, we introduce a pooling strategy called poolHiTS (Pooled High-Throughput Screening), which is based on the STD algorithm. In poolHiTS, we implement a limit on the number of compounds that can be mixed in a single assay. In addition, we show that the STD-based pooling strategy is limited in the error-correction that it can achieve. Due to the mixing constraint, we show that it is more efficient to split a large library into smaller blocks of compounds, which are then tested using an optimized strategy repeated for each block. We package the optimal block selection algorithm into poolHiTS. The MATLAB codes for the poolHiTS algorithm and the corresponding decoding strategy are also provided. Conclusion We have produced a practical version

  7. Salinity tolerance loci revealed in rice using high-throughput non-invasive phenotyping

    KAUST Repository

    Al-Tamimi, Nadia Ali

    2016-11-17

    High-throughput phenotyping produces multiple measurements over time, which require new methods of analyses that are flexible in their quantification of plant growth and transpiration, yet are computationally economic. Here we develop such analyses and apply this to a rice population genotyped with a 700k SNP high-density array. Two rice diversity panels, indica and aus, containing a total of 553 genotypes, are phenotyped in waterlogged conditions. Using cubic smoothing splines to estimate plant growth and transpiration, we identify four time intervals that characterize the early responses of rice to salinity. Relative growth rate, transpiration rate and transpiration use efficiency (TUE) are analysed using a new association model that takes into account the interaction between treatment (control and salt) and genetic marker. This model allows the identification of previously undetected loci affecting TUE on chromosome 11, providing insights into the early responses of rice to salinity, in particular into the effects of salinity on plant growth and transpiration.
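
    The spline-based growth analysis described in this record can be sketched roughly as follows: smooth the logged biomass measurements with a cubic smoothing spline and take its derivative as the relative growth rate. The time points and areas below are hypothetical, and scipy's UnivariateSpline is used as a stand-in for whatever smoothing-spline implementation the authors employed.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical time series for one plant: days after salting vs. projected shoot area
days = np.array([0, 2, 4, 6, 8, 10, 12, 14], dtype=float)
area = np.array([12.1, 14.0, 16.5, 18.9, 22.0, 24.8, 27.1, 29.0])  # arbitrary units

# Smooth log(area) with a cubic spline; the derivative of log(area) is the relative growth rate
spline = UnivariateSpline(days, np.log(area), k=3, s=0.01)
rgr = spline.derivative()(days)               # per-day relative growth rate at each time point

for d, r in zip(days, rgr):
    print(f"day {d:4.1f}  RGR ~ {r:.3f} per day")
```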

  8. Salinity tolerance loci revealed in rice using high-throughput non-invasive phenotyping

    KAUST Repository

    Al-Tamimi, Nadia Ali; Brien, Chris; Oakey, Helena; Berger, Bettina; Saade, Stephanie; Ho, Yung Shwen; Schmöckel, Sandra M.; Tester, Mark A.; Negrão, Sónia

    2016-01-01

    High-throughput phenotyping produces multiple measurements over time, which require new methods of analyses that are flexible in their quantification of plant growth and transpiration, yet are computationally economic. Here we develop such analyses and apply this to a rice population genotyped with a 700k SNP high-density array. Two rice diversity panels, indica and aus, containing a total of 553 genotypes, are phenotyped in waterlogged conditions. Using cubic smoothing splines to estimate plant growth and transpiration, we identify four time intervals that characterize the early responses of rice to salinity. Relative growth rate, transpiration rate and transpiration use efficiency (TUE) are analysed using a new association model that takes into account the interaction between treatment (control and salt) and genetic marker. This model allows the identification of previously undetected loci affecting TUE on chromosome 11, providing insights into the early responses of rice to salinity, in particular into the effects of salinity on plant growth and transpiration.

  9. A High-Throughput UHPLC-QqQ-MS Method for Polyphenol Profiling in Rosé Wines

    Directory of Open Access Journals (Sweden)

    Marine Lambert

    2015-04-01

    Full Text Available A rapid, sensitive and selective analysis method using Ultra High Performance Liquid Chromatography coupled to triple-quadrupole Mass Spectrometry (UHPLC-QqQ-MS) has been developed for the quantification of polyphenols in rosé wines. The compound detection being based on specific MS transitions in Multiple Reaction Monitoring (MRM) mode, the present method allows the selective quantification of up to 152 phenolic and two additional non-phenolic wine compounds in 30 min without sample purification or pre-concentration, even at low concentration levels. This method was repeatably applied to a set of 12 rosé wines and thus proved to be suitable for high-throughput and large-scale metabolomics studies.

  10. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce
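
    As a rough illustration of the maximum-likelihood parameter tuning described in this record, the sketch below optimizes the cofactor of a hyperbolic-arcsine transformation by maximizing a normal log-likelihood on the transformed values (including the change-of-variables Jacobian). The data, the cofactor bounds and the normality assumption are all illustrative; the criteria implemented in flowCore differ in detail.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def neg_log_likelihood(b, x):
    """Negative normal log-likelihood of arcsinh(x / b), including the Jacobian
    of the transformation so that different cofactors b are comparable."""
    y = np.arcsinh(x / b)
    mu, sigma = y.mean(), y.std(ddof=1)
    jacobian = -0.5 * np.log(x**2 + b**2)     # d/dx arcsinh(x/b) = 1 / sqrt(x^2 + b^2)
    return -(norm.logpdf(y, mu, sigma) + jacobian).sum()

# Hypothetical fluorescence channel: log-normal "signal" plus instrument noise
rng = np.random.default_rng(1)
x = rng.lognormal(mean=6.0, sigma=1.0, size=5000) + rng.normal(0, 50, size=5000)

res = minimize_scalar(neg_log_likelihood, bounds=(1.0, 5000.0), args=(x,), method="bounded")
print("optimised arcsinh cofactor b ~", round(res.x, 1))
```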

  11. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter

  12. High-Throughput Screening of Heterogeneous Catalysts for the Conversion of Furfural to Bio-Based Fuel Components

    Directory of Open Access Journals (Sweden)

    Roberto Pizzi

    2015-12-01

    Full Text Available The one-pot catalytic reductive etherification of furfural to 2-methoxymethylfuran (furfuryl methyl ether, FME), a valuable bio-based chemical or fuel, is reported. A large number of commercially available heterogeneous hydrogenation catalysts based on nickel, copper, cobalt, iridium, palladium and platinum on various supports were evaluated using a high-throughput screening approach. The reaction was carried out in the liquid phase with a 10% w/w solution of furfural in methanol at 50 bar of hydrogen. Among all the samples tested, carbon-supported noble metal catalysts were found to be the most promising in terms of productivity and selectivity. In particular, palladium on charcoal catalysts show high selectivity (up to 77%) to FME. Significant amounts of furfuryl alcohol (FA) and 2-methylfuran (2-MF) are observed as the major by-products.

  13. Simultaneous measurements of auto-immune and infectious disease specific antibodies using a high throughput multiplexing tool.

    Directory of Open Access Journals (Sweden)

    Atul Asati

    Full Text Available Considering the importance of ganglioside antibodies as biomarkers in various immune-mediated neuropathies and neurological disorders, we developed a high throughput multiplexing tool for the assessment of ganglioside-specific antibodies based on the Bio-Plex/Luminex platform. In this report, we demonstrate that the ganglioside high throughput multiplexing tool is robust, highly specific and demonstrates ∼100-fold higher concentration sensitivity for IgG detection than ELISA. In addition to the ganglioside-coated array, the high throughput multiplexing tool contains beads coated with influenza hemagglutinins derived from the H1N1 A/Brisbane/59/07 and H1N1 A/California/07/09 strains. Influenza beads provided an added advantage of simultaneous detection of ganglioside- and influenza-specific antibodies, a capacity important for the assay of both infectious antigen-specific and autoimmune antibodies following vaccination or disease. Taken together, these results support the potential adoption of the ganglioside high throughput multiplexing tool for measuring ganglioside antibodies in various neuropathic and neurological disorders.

  14. Argentine project for the final disposal of high-level radioactive wastes

    International Nuclear Information System (INIS)

    Palacios, E.; Ciallella, N.R.; Petraitis, E.J.

    1989-01-01

    Since 1980 Argentina has been carrying out a research program on the final disposal of high-level radioactive wastes. The quantity of wastes produced will become significant in the next century; however, it was decided to start the studies well in advance in order to demonstrate that high-level wastes can be disposed of safely. The option of direct disposal of irradiated fuel elements was discarded, not only because of the energy value of the plutonium, but also for ecological reasons. In fact, the presence of the total inventory of actinides in non-reprocessed fuel would imply a greater radiological impact than that caused if the plutonium is recycled to produce energy. The decision to solve the technological aspects connected with the elimination of high-level radioactive wastes well in advance was made to avoid transferring the problem to future generations. This decision is based not only on technical evaluations but also on ethical premises. (Author)

  15. High-throughput measurement of polymer film thickness using optical dyes

    Science.gov (United States)

    Grunlan, Jaime C.; Mehrabi, Ali R.; Ly, Tien

    2005-01-01

    Optical dyes were added to polymer solutions in an effort to create a technique for high-throughput screening of dry polymer film thickness. Arrays of polystyrene films, cast from a toluene solution and containing methyl red or solvent green, were used to demonstrate the feasibility of this technique. Measurements of the peak visible absorbance of each film were converted to thickness using the Beer-Lambert relationship. These absorbance-based thickness calculations agreed to within 10% of the thickness measured using a micrometer for polystyrene films that were 10-50 µm thick. At these thicknesses it is believed that the absorbance values are actually more accurate. At least for this solvent-based system, thickness was shown to be accurately measured in a high-throughput manner that could potentially be applied to other equivalent systems. Similar water-based films made with poly(sodium 4-styrenesulfonate) dyed with malachite green oxalate or Congo red did not show the same level of agreement with the micrometer measurements. Extensive phase separation between polymer and dye resulted in inflated absorbance values and calculated thicknesses that were often more than 25% greater than those measured with the micrometer. Only at thicknesses below 15 µm could reasonable accuracy be achieved for the water-based films.
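
    The Beer-Lambert conversion used in this record is a one-line calculation: A = epsilon * c * t, so t = A / (epsilon * c). The sketch below shows the arithmetic; the molar absorptivity, dye concentration and absorbance are hypothetical placeholders rather than values from the study.

```python
# Beer-Lambert estimate of dry film thickness from peak dye absorbance:
#   A = epsilon * c * t   =>   t = A / (epsilon * c)
# All numbers below are hypothetical placeholders, not values from the study.

epsilon = 3.0e4        # molar absorptivity of the dye in the film, L mol^-1 cm^-1
c = 2.5e-3             # dye concentration in the dry film, mol L^-1
A = 0.26               # measured peak visible absorbance of the cast film

thickness_cm = A / (epsilon * c)
print(f"estimated film thickness: {thickness_cm * 1e4:.1f} um")   # 1 cm = 1e4 um
```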

  16. High-throughput optical system for HDES hyperspectral imager

    Science.gov (United States)

    Václavík, Jan; Melich, Radek; Pintr, Pavel; Pleštil, Jan

    2015-01-01

    Affordable, long-wave infrared hyperspectral imaging calls for the use of an uncooled FPA with high-throughput optics. This paper describes the design of the optical part of a stationary hyperspectral imager in the spectral range of 7-14 µm with a field of view of 20°×10°. The imager employs a push-broom method realized by a scanning mirror. High throughput and the demand for simplicity and rigidity led to a fully refractive design with highly aspheric surfaces and off-axis positioning of the detector array. The design was optimized to exploit the machinability of infrared materials by the SPDT method and simple assembly.

  17. CRISPR-Cas9 epigenome editing enables high-throughput screening for functional regulatory elements in the human genome.

    Science.gov (United States)

    Klann, Tyler S; Black, Joshua B; Chellappan, Malathi; Safi, Alexias; Song, Lingyun; Hilton, Isaac B; Crawford, Gregory E; Reddy, Timothy E; Gersbach, Charles A

    2017-06-01

    Large genome-mapping consortia and thousands of genome-wide association studies have identified non-protein-coding elements in the genome as having a central role in various biological processes. However, decoding the functions of the millions of putative regulatory elements discovered in these studies remains challenging. CRISPR-Cas9-based epigenome editing technologies have enabled precise perturbation of the activity of specific regulatory elements. Here we describe CRISPR-Cas9-based epigenomic regulatory element screening (CERES) for improved high-throughput screening of regulatory element activity in the native genomic context. Using dCas9 KRAB repressor and dCas9 p300 activator constructs and lentiviral single guide RNA libraries to target DNase I hypersensitive sites surrounding a gene of interest, we carried out both loss- and gain-of-function screens to identify regulatory elements for the β-globin and HER2 loci in human cells. CERES readily identified known and previously unidentified regulatory elements, some of which were dependent on cell type or direction of perturbation. This technology allows the high-throughput functional annotation of putative regulatory elements in their native chromosomal context.

  18. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification, and are hence limited to counting a low number of foci per cell (about 5 foci per nucleus) because the quantification process is extremely labour intensive. We have therefore developed a high throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged in 3D with submicron resolution, using an in-house developed high throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended maxima transform based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
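
    A minimal sketch of an extended-maxima style foci count is shown below, using scikit-image's h-maxima transform as a stand-in for the authors' 3D algorithm. The smoothing sigma, the h threshold and the synthetic stack are illustrative assumptions only.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.morphology import h_maxima
from skimage.measure import label

def count_foci(nucleus_stack, h=50):
    """Count bright foci in a 3D (z, y, x) intensity stack of one nucleus.
    h is the minimum height a local maximum must rise above its surroundings."""
    smoothed = ndi.gaussian_filter(nucleus_stack.astype(float), sigma=1)
    maxima = h_maxima(smoothed, h)            # 3D h-maxima (extended maxima) transform
    return label(maxima).max()                # number of connected maxima = foci count

# Toy example: an empty stack with two small synthetic foci
stack = np.zeros((20, 64, 64))
stack[9:12, 19:22, 19:22] = 200.0
stack[11:14, 39:42, 44:47] = 180.0
print("foci counted:", count_foci(stack, h=50))
```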

  19. Patterning cell using Si-stencil for high-throughput assay

    KAUST Repository

    Wu, Jinbo

    2011-01-01

    In this communication, we report a newly developed cell patterning methodology using a silicon-based stencil, which exhibits advantages such as easy handling, reusability, a hydrophilic surface and mature fabrication technologies. Cell arrays obtained by this method were used to investigate cell growth under a temperature gradient, which demonstrated the possibility of studying cell behavior in a high-throughput assay. This journal is © The Royal Society of Chemistry 2011.

  20. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751: High Performance Computing Modernization Program Kerberos Throughput Test Report, by Daniel G. Gdula and colleagues.

  1. A high throughput biochemical fluorometric method for measuring lipid peroxidation in HDL.

    Directory of Open Access Journals (Sweden)

    Theodoros Kelesidis

    Full Text Available Current cell-based assays for determining the functional properties of high-density lipoproteins (HDL) have limitations. We report here the development of a new, robust fluorometric cell-free biochemical assay that measures HDL lipid peroxidation (HDLox) based on the oxidation of the fluorochrome Amplex Red. HDLox correlated with previously validated cell-based (r = 0.47, p<0.001) and cell-free assays (r = 0.46, p<0.001). HDLox distinguished dysfunctional HDL in established animal models of atherosclerosis and Human Immunodeficiency Virus (HIV) patients. Using an immunoaffinity method for capturing HDL, we demonstrate the utility of this novel assay for measuring HDLox in a high throughput format. Furthermore, HDLox correlated significantly with measures of cardiovascular diseases including carotid intima media thickness (r = 0.35, p<0.01) and subendocardial viability ratio (r = -0.21, p = 0.05) and physiological parameters such as metabolic and anthropometric parameters (p<0.05). In conclusion, we report the development of a new fluorometric method that offers a reproducible and rapid means for determining HDL function/quality that is suitable for high throughput implementation.

  2. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy

    DEFF Research Database (Denmark)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H

    2017-01-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analy...

  3. Development and validation of a quantitative, high-throughput, fluorescent-based bioassay to detect schistosoma viability.

    Directory of Open Access Journals (Sweden)

    Emily Peak

    2010-07-01

    Full Text Available Schistosomiasis, caused by infection with the blood fluke Schistosoma, is responsible for greater than 200,000 human deaths per annum. Objective high-throughput screens for detecting novel anti-schistosomal targets will drive 'genome to drug' lead translational science at an unprecedented rate. Current methods for detecting schistosome viability rely on qualitative microscopic criteria, which require an understanding of parasite morphology, and most importantly, must be subjectively interpreted. These limitations, in the current state of the art, have significantly impeded progress into whole schistosome screening for next generation chemotherapies. We present here a microtiter plate-based method for reproducibly detecting schistosomula viability that takes advantage of the differential uptake of fluorophores (propidium iodide and fluorescein diacetate) by living organisms. We validate this high-throughput system in detecting schistosomula viability using auranofin (a known inhibitor of thioredoxin glutathione reductase), praziquantel and a range of small compounds with previously-described (gambogic acid, sodium salinomycin, ethinyl estradiol, fluoxetine hydrochloride, miconazole nitrate, chlorpromazine hydrochloride, amphotericin B, niclosamide) or suggested (bepridil, ciclopirox, rescinnamine, flucytosine, vinblastine and carbidopa) anti-schistosomal activities. This developed method is sensitive (200 schistosomula/well can be assayed), relevant to industrial (384-well microtiter plate compatibility) and academic (96-well microtiter plate compatibility) settings, translatable to functional genomics screens and drug assays, does not require a priori knowledge of schistosome biology and is quantitative. The wide-scale application of this fluorescence-based bioassay will greatly accelerate the objective identification of novel therapeutic lead targets/compounds to combat schistosomiasis. Adapting this bioassay for use with other parasitic worm species

  4. A high-throughput direct fluorescence resonance energy transfer-based assay for analyzing apoptotic proteases using flow cytometry and fluorescence lifetime measurements.

    Science.gov (United States)

    Suzuki, Miho; Sakata, Ichiro; Sakai, Takafumi; Tomioka, Hiroaki; Nishigaki, Koichi; Tramier, Marc; Coppey-Moisan, Maïté

    2015-12-15

    Cytometry is a versatile and powerful method applicable to different fields, particularly pharmacology and biomedical studies. Based on the data obtained, cytometric studies are classified into high-throughput (HTP) or high-content screening (HCS) groups. However, assays combining the advantages of both are required to facilitate research. In this study, we developed a high-throughput system to profile cellular populations in terms of time- or dose-dependent responses to apoptotic stimulations because apoptotic inducers are potent anticancer drugs. We previously established assay systems involving protease to monitor live cells for apoptosis using tunable fluorescence resonance energy transfer (FRET)-based bioprobes. These assays can be used for microscopic analyses or fluorescence-activated cell sorting. In this study, we developed FRET-based bioprobes to detect the activity of the apoptotic markers caspase-3 and caspase-9 via changes in bioprobe fluorescence lifetimes using a flow cytometer for direct estimation of FRET efficiencies. Different patterns of changes in the fluorescence lifetimes of these markers during apoptosis were observed, indicating a relationship between discrete steps in the apoptosis process. The findings demonstrate the feasibility of evaluating collective cellular dynamics during apoptosis. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high throughput instrumentation to analyze huge sets of clinical samples quantitatively and reproducibly in a minimum of time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry based proteomics, needs to be more accurate with rapid processing times. The present study focuses on establishing a high throughput automated online system for proteolytic digestion and desalting of proteins from biological samples quantitatively and qualitatively in a reproducible manner. The present study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validates the system with a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The present study shows that the online system, capable of handling high throughput samples in a 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase of peptide quantities with increasing concentration, with much better linearity compared to the off-line method. Hence we suggest that inclusion of this online system in the proteomic pipeline will be effective for the quantification of proteins in comparative proteomics, where quantification is crucial.

  6. High-throughput gene expression profiling of memory differentiation in primary human T cells

    Directory of Open Access Journals (Sweden)

    Russell Kate

    2008-08-01

    Full Text Available Abstract Background The differentiation of naive T and B cells into memory lymphocytes is essential for immunity to pathogens. Therapeutic manipulation of this cellular differentiation program could improve vaccine efficacy and the in vitro expansion of memory cells. However, chemical screens to identify compounds that induce memory differentiation have been limited by 1) the lack of reporter-gene or functional assays that can distinguish naive and memory-phenotype T cells at high throughput and 2) the lack of a suitable cell line representative of naive T cells. Results Here, we describe a method for gene-expression based screening that allows primary naive and memory-phenotype lymphocytes to be discriminated based on complex gene signatures corresponding to these differentiation states. We used ligation-mediated amplification and a fluorescent, bead-based detection system to quantify simultaneously 55 transcripts representing naive and memory-phenotype signatures in purified populations of human T cells. The use of a multi-gene panel allowed better resolution than any constituent single gene. The method was precise, correlated well with Affymetrix microarray data, and could be easily scaled up for high throughput. Conclusion This method provides a generic solution for high-throughput differentiation screens in primary human T cells where no single-gene or functional assay is available. This screening platform will allow the identification of small molecules, genes or soluble factors that direct memory differentiation in naive human lymphocytes.

  7. EVpedia: an integrated database of high-throughput data for systemic analyses of extracellular vesicles

    Directory of Open Access Journals (Sweden)

    Dae-Kyum Kim

    2013-03-01

    Full Text Available Secretion of extracellular vesicles is a general cellular activity that spans the range from simple unicellular organisms (e.g. archaea; Gram-positive and Gram-negative bacteria) to complex multicellular ones, suggesting that this extracellular vesicle-mediated communication is evolutionarily conserved. Extracellular vesicles are spherical bilayered proteolipids with a mean diameter of 20–1,000 nm, which are known to contain various bioactive molecules including proteins, lipids, and nucleic acids. Here, we present EVpedia, which is an integrated database of high-throughput datasets from prokaryotic and eukaryotic extracellular vesicles. EVpedia provides high-throughput datasets of vesicular components (proteins, mRNAs, miRNAs, and lipids) present on prokaryotic, non-mammalian eukaryotic, and mammalian extracellular vesicles. In addition, EVpedia also provides an array of tools, such as the search and browse of vesicular components, Gene Ontology enrichment analysis, network analysis of vesicular proteins and mRNAs, and a comparison of vesicular datasets by ortholog identification. Moreover, publications on extracellular vesicle studies are listed in the database. This free web-based database of EVpedia (http://evpedia.info) might serve as a fundamental repository to stimulate the advancement of extracellular vesicle studies and to elucidate the novel functions of these complex extracellular organelles.

  8. High-throughput theoretical design of lithium battery materials

    International Nuclear Information System (INIS)

    Ling Shi-Gang; Gao Jian; Xiao Rui-Juan; Chen Li-Quan

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flows, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next-generation lithium batteries are of great significance for achieving better performance, and some representative criteria are: higher energy density, better safety, and faster charge/discharge speed. (topical review)

  9. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.

  10. Screening for Antifibrotic Compounds Using High Throughput System Based on Fluorescence Polarization

    Directory of Open Access Journals (Sweden)

    Branko Stefanovic

    2014-04-01

    Full Text Available Fibroproliferative diseases are one of the leading causes of death worldwide. They are characterized by reactive fibrosis caused by uncontrolled synthesis of type I collagen. There is no cure for fibrosis and development of therapeutics that can inhibit collagen synthesis is urgently needed. Collagen α1(I) mRNA and α2(I) mRNA encode type I collagen and they have a unique 5' stem-loop structure in their 5' untranslated regions (5'SL). Collagen 5'SL binds the protein LARP6 with high affinity and specificity. The interaction between LARP6 and the 5'SL is critical for biosynthesis of type I collagen and development of fibrosis in vivo. Therefore, this interaction represents an ideal target for the development of antifibrotic drugs. A high throughput system to screen for chemical compounds that can dissociate LARP6 from 5'SL has been developed. It is based on fluorescence polarization and can be adapted to screen for inhibitors of other protein-RNA interactions. Screening of 50,000 chemical compounds yielded a lead compound that can inhibit type I collagen synthesis at nanomolar concentrations. The development, characteristics, and critical appraisal of this assay are presented.
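
    The readout underlying such a screen is the standard fluorescence polarization calculation, sketched below in millipolarization (mP) units. This is a generic formula, not the authors' screening pipeline; the intensity values and the default G-factor are hypothetical.

```python
def polarization_mP(i_parallel, i_perpendicular, g_factor=1.0):
    """Fluorescence polarization in millipolarization (mP) units.
    g_factor corrects for instrument sensitivity differences between channels."""
    i_perp = g_factor * i_perpendicular
    return 1000.0 * (i_parallel - i_perp) / (i_parallel + i_perp)

# Free (fast-tumbling) labelled RNA probe vs. probe bound to a protein (slow tumbling):
print(polarization_mP(12000, 10500))   # low mP: probe displaced / unbound
print(polarization_mP(15000, 9000))    # high mP: probe bound in the complex
```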

  11. Disposal of high level and intermediate level radioactive wastes

    International Nuclear Information System (INIS)

    Flowers, R.H.

    1991-01-01

    The waste products from the nuclear industry are relatively small in volume. Apart from a few minor gaseous and liquid waste streams containing readily dispersible elements of low radiotoxicity, all these products are processed into stable solid packages for disposal in underground repositories. Because the volumes are small, and because radioactive wastes are latecomers on the industrial scene, a whole new industry with a world-wide technological infrastructure has grown up alongside the nuclear power industry to carry out the waste processing and disposal to very high standards. Some of the technical approaches used, and the regulatory controls which have been developed, will undoubtedly find application in the future to the management of non-radioactive toxic wastes. In the repository concept outlined, even high-level radioactive wastes and spent fuels would be contained without significant radiation dose rates to the public. Water pathway dose rates are likely to be lowest for vitrified high-level wastes, with spent PWR fuel and intermediate-level wastes being somewhat higher. (author)

  12. High-throughput purification of recombinant proteins using self-cleaving intein tags.

    Science.gov (United States)

    Coolbaugh, M J; Shakalli Tang, M J; Wood, D W

    2017-01-01

    High throughput methods for recombinant protein production using E. coli typically involve the use of affinity tags for simple purification of the protein of interest. One drawback of these techniques is the occasional need for tag removal before study, which can be hard to predict. In this work, we demonstrate two high throughput purification methods for untagged protein targets based on simple and cost-effective self-cleaving intein tags. Two model proteins, E. coli beta-galactosidase (βGal) and superfolder green fluorescent protein (sfGFP), were purified using self-cleaving versions of the conventional chitin-binding domain (CBD) affinity tag and the nonchromatographic elastin-like-polypeptide (ELP) precipitation tag in a 96-well filter plate format. Initial tests with shake flask cultures confirmed that the intein purification scheme could be scaled down, with >90% pure product generated in a single step using both methods. The scheme was then validated in a high throughput expression platform using 24-well plate cultures followed by purification in 96-well plates. For both tags and with both target proteins, the purified product was consistently obtained in a single-step, with low well-to-well and plate-to-plate variability. This simple method thus allows the reproducible production of highly pure untagged recombinant proteins in a convenient microtiter plate format. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. High-Throughput Block Optical DNA Sequence Identification.

    Science.gov (United States)

    Sagar, Dodderi Manjunatha; Korshoj, Lee Erik; Hanson, Katrina Bethany; Chowdhury, Partha Pratim; Otoupal, Peter Britton; Chatterjee, Anushree; Nagpal, Prashant

    2018-01-01

    Optical techniques for molecular diagnostics or DNA sequencing generally rely on small molecule fluorescent labels, which utilize light with a wavelength of several hundred nanometers for detection. Developing a label-free optical DNA sequencing technique will require nanoscale focusing of light, a high-throughput and multiplexed identification method, and a data compression technique to rapidly identify sequences and analyze genomic heterogeneity for big datasets. Such a method should identify characteristic molecular vibrations using optical spectroscopy, especially in the "fingerprinting region" from ≈400-1400 cm-1. Here, surface-enhanced Raman spectroscopy is used to demonstrate label-free identification of DNA nucleobases with multiplexed 3D plasmonic nanofocusing. While nanometer-scale mode volumes prevent identification of single nucleobases within a DNA sequence, the block optical technique can identify A, T, G, and C content in DNA k-mers. The content of each nucleotide in a DNA block can be a unique and high-throughput method for identifying sequences, genes, and other biomarkers as an alternative to single-letter sequencing. Additionally, coupling two complementary vibrational spectroscopy techniques (infrared and Raman) can improve block characterization. These results pave the way for developing a novel, high-throughput block optical sequencing method with lossy genomic data compression using k-mer identification from multiplexed optical data acquisition. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
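
    The "block content" idea in this record can be illustrated with a few lines of code: instead of reading bases in order, each k-mer block is reduced to its A/T/G/C counts. The sequence and block length below are arbitrary illustrations, not data from the study.

```python
from collections import Counter

def kmer_base_content(sequence, k=10):
    """Content of A, T, G and C in each non-overlapping k-mer block of a DNA sequence.
    Block content (rather than base order) is the signature read out optically."""
    blocks = []
    for i in range(0, len(sequence) - k + 1, k):
        counts = Counter(sequence[i:i + k])
        blocks.append({base: counts.get(base, 0) for base in "ATGC"})
    return blocks

for content in kmer_base_content("ATGCGGATCCTTAGGCAATT", k=10):
    print(content)
```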

  14. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale up. In this work, we investigate a potentially scalable, high throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  15. X-ray phase microtomography with a single grating for high-throughput investigations of biological tissue.

    Science.gov (United States)

    Zdora, Marie-Christine; Vila-Comamala, Joan; Schulz, Georg; Khimchenko, Anna; Hipp, Alexander; Cook, Andrew C; Dilg, Daniel; David, Christian; Grünzweig, Christian; Rau, Christoph; Thibault, Pierre; Zanette, Irene

    2017-02-01

    The high-throughput 3D visualisation of biological specimens is essential for studying diseases and developmental disorders. It requires imaging methods that deliver high-contrast, high-resolution volumetric information at short sample preparation and acquisition times. Here we show that X-ray phase-contrast tomography using a single grating can provide a powerful alternative to commonly employed techniques, such as high-resolution episcopic microscopy (HREM). We present the phase tomography of a mouse embryo in paraffin obtained with an X-ray single-grating interferometer at I13-2 Beamline at Diamond Light Source and discuss the results in comparison with HREM measurements. The excellent contrast and quantitative density information achieved non-destructively and without staining using a simple, robust setup make X-ray single-grating interferometry an optimum candidate for high-throughput imaging of biological specimens as an alternative for existing methods like HREM.

  16. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Full Text Available Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next generation, high throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard polymerase chain reactions (qPCR) and high throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments--regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency corrected analysis. When compared to other commonly used algorithms the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
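
    A schematic of such an efficiency-corrected, input-normalized quantity is sketched below: the relative quantity against a universal reference sample is computed from the Cq difference and the amplification efficiency, then divided by the number of input cells. The function name and numbers are illustrative and the exact published formula may differ in detail.

```python
def expression_per_cell(cq_sample, cq_reference, efficiency, n_cells):
    """Efficiency-corrected transcript quantity relative to a universal reference
    sample (e.g. commercial cDNA), normalised to the number of input cells.
    efficiency is the amplification efficiency (2.0 = perfect doubling per cycle)."""
    relative_quantity = efficiency ** (cq_reference - cq_sample)
    return relative_quantity / n_cells

# Hypothetical numbers: the same transcript measured in two blood samples
print(expression_per_cell(cq_sample=24.1, cq_reference=22.0, efficiency=1.95, n_cells=5.0e4))
print(expression_per_cell(cq_sample=26.3, cq_reference=22.0, efficiency=1.95, n_cells=3.2e4))
```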

  17. High throughput salt separation from uranium deposits

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, S.W.; Park, K.M.; Kim, J.G.; Kim, I.T.; Park, S.B., E-mail: swkwon@kaeri.re.kr [Korea Atomic Energy Research Inst. (Korea, Republic of)

    2014-07-01

    It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites in pyroprocessing. A multilayer porous crucible system was proposed in this study to increase the throughput of the salt distiller. An integrated sieve-crucible assembly was also investigated for the practical use of the porous crucible system. The salt evaporation behaviours were compared between the conventional nonporous crucible and the porous crucible. Two weight-reduction steps took place in the porous crucible, whereas the salt weight was reduced only at high temperature by distillation in the nonporous crucible. The first weight reduction in the porous crucible was caused by liquid salt penetrating out through the perforated crucible as the temperature was raised to the distillation temperature. Multilayer porous crucibles have the benefit of an expanded evaporation surface area. (author)

  18. High-throughput screening assay of hepatitis C virus helicase inhibitors using fluorescence-quenching phenomenon

    International Nuclear Information System (INIS)

    Tani, Hidenori; Akimitsu, Nobuyoshi; Fujita, Osamu; Matsuda, Yasuyoshi; Miyata, Ryo; Tsuneda, Satoshi; Igarashi, Masayuki; Sekiguchi, Yuji; Noda, Naohiro

    2009-01-01

    We have developed a novel high-throughput screening assay of hepatitis C virus (HCV) nonstructural protein 3 (NS3) helicase inhibitors using the fluorescence-quenching phenomenon via photoinduced electron transfer between fluorescent dyes and guanine bases. We prepared double-stranded DNA (dsDNA) with a 5'-fluorescent-dye (BODIPY FL)-labeled strand hybridized with a complementary strand, the 3'-end of which has guanine bases. When dsDNA is unwound by helicase, the dye emits fluorescence owing to its release from the guanine bases. Our results demonstrate that this assay is suitable for quantitative assay of HCV NS3 helicase activity and useful for high-throughput screening for inhibitors. Furthermore, we applied this assay to the screening for NS3 helicase inhibitors from cell extracts of microorganisms, and found several cell extracts containing potential inhibitors.

  19. High-throughput tri-colour flow cytometry technique to assess Plasmodium falciparum parasitaemia in bioassays

    DEFF Research Database (Denmark)

    Tiendrebeogo, Regis W; Adu, Bright; Singh, Susheel K

    2014-01-01

    BACKGROUND: Unbiased flow cytometry-based methods have become the technique of choice in many laboratories for high-throughput, accurate assessments of malaria parasites in bioassays. A method to quantify live parasites based on mitotracker red CMXRos was recently described but consistent...... distinction of early ring stages of Plasmodium falciparum from uninfected red blood cells (uRBC) remains a challenge. METHODS: Here, a high-throughput, three-parameter (tri-colour) flow cytometry technique based on mitotracker red dye, the nucleic acid dye coriphosphine O (CPO) and the leucocyte marker CD45...... for enumerating live parasites in bioassays was developed. The technique was applied to estimate the specific growth inhibition index (SGI) in the antibody-dependent cellular inhibition (ADCI) assay and compared to parasite quantification by microscopy and mitotracker red staining. The Bland-Altman analysis...
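
    The specific growth inhibition index (SGI) estimated with this technique is typically computed from four parasitaemia readouts. The form below is a commonly quoted textbook definition, stated here as an assumption; it is not necessarily the exact formula used in the cited study, and the parasitaemia values are hypothetical.

```python
def specific_growth_inhibition(para_mn_ab, para_ab, para_mn_ctrl, para_ctrl):
    """Specific growth inhibition index (SGI, %) for an ADCI-type assay, from
    parasitaemia with monocytes (MN) + test antibody, test antibody alone,
    MN + control IgG, and control IgG alone (assumed, commonly used form)."""
    return 100.0 * (1.0 - (para_mn_ab / para_ab) / (para_mn_ctrl / para_ctrl))

# Hypothetical parasitaemia values (%) from one experiment
print(round(specific_growth_inhibition(1.2, 4.0, 3.6, 4.1), 1), "% inhibition")
```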

  20. High-throughput microfluidics automated cytogenetic processing for effectively lowering biological process time and aid triage during radiation accidents

    International Nuclear Information System (INIS)

    Ramakumar, Adarsh

    2016-01-01

    Nuclear or radiation mass casualties require individual, rapid, and accurate dose-based triage of exposed subjects for cytokine therapy and supportive care, to save life. Radiation mass casualties will demand high-throughput individual diagnostic dose assessment for the medical management of exposed subjects. Cytogenetic techniques are widely used for triage and definitive radiation biodosimetry. A prototype platform has been developed to demonstrate high-throughput microfluidic micro-incubation, supporting the logistics of samples in miniaturized incubators from the site of the accident to analytical labs. Efforts have been made both at the level of developing concepts and advanced systems for higher-throughput sample processing, and in implementing better and more efficient methods of logistics leading to the performance of lab-on-chip analyses. An automated high-throughput platform with automated feature extraction, storage, cross-platform data linkage, cross-platform validation and the inclusion of multi-parametric biomarker approaches will provide the first generation of high-throughput platform systems for effective medical management, particularly during radiation mass casualty events

  1. Crystal Symmetry Algorithms in a High-Throughput Framework for Materials

    Science.gov (United States)

    Taylor, Richard

    The high-throughput framework AFLOW that has been developed and used successfully over the last decade is improved to include fully-integrated software for crystallographic symmetry characterization. The standards used in the symmetry algorithms conform with the conventions and prescriptions given in the International Tables of Crystallography (ITC). A standard cell choice with standard origin is selected, and the space group, point group, Bravais lattice, crystal system, lattice system, and representative symmetry operations are determined. Following the conventions of the ITC, the Wyckoff sites are also determined and their labels and site symmetry are provided. The symmetry code makes no assumptions on the input cell orientation, origin, or reduction and has been integrated in the AFLOW high-throughput framework for materials discovery by adding to the existing code base and making use of existing classes and functions. The software is written in object-oriented C++ for flexibility and reuse. A performance analysis and examination of the algorithms scaling with cell size and symmetry is also reported.

  2. Development of Control Applications for High-Throughput Protein Crystallography Experiments

    International Nuclear Information System (INIS)

    Gaponov, Yurii A.; Matsugaki, Naohiro; Honda, Nobuo; Sasajima, Kumiko; Igarashi, Noriyuki; Hiraki, Masahiko; Yamada, Yusuke; Wakatsuki, Soichi

    2007-01-01

    An integrated client-server control system (PCCS) with a unified relational database (PCDB) has been developed for high-throughput protein crystallography experiments on synchrotron beamlines. The major steps in protein crystallographic experiments (purification, crystallization, crystal harvesting, data collection, and data processing) are integrated into the software. All information necessary for performing protein crystallography experiments is stored in the PCDB database (except raw X-ray diffraction data, which is stored in the Network File Server). To allow all members of a protein crystallography group to participate in experiments, the system was developed as a multi-user system with secure network access based on TCP/IP secure UNIX sockets. Secure remote access to the system is possible from any operating system with X-terminal and SSH/X11 (Secure Shell with graphical user interface) support. Currently, the system covers the high-throughput X-ray data collection stages and is being commissioned at BL5A and NW12A (PF, PF-AR, KEK, Tsukuba, Japan)

  3. High throughput, low set-up time reconfigurable linear feedback shift registers

    NARCIS (Netherlands)

    Nas, R.J.M.; Berkel, van C.H.

    2010-01-01

    This paper presents a hardware design for a scalable, high throughput, configurable LFSR. High throughput is achieved by producing L consecutive outputs per clock cycle with a clock cycle period that, for practical cases, increases only logarithmically with the block size L and the length of the
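
    The block-output idea described in this record can be modelled in software as an LFSR that returns L consecutive output bits per call. The sketch below is a behavioural model only (a real implementation would compute the L-step state update combinationally in hardware), and the register length and tap positions are hypothetical.

```python
def lfsr_block_outputs(state, taps, L):
    """Behavioural model of a Fibonacci LFSR that delivers L consecutive output
    bits per call, mimicking a block-parallel hardware implementation.
    state: list of bits (index 0 is the output end); taps: feedback tap positions."""
    out = []
    for _ in range(L):
        out.append(state[0])
        feedback = 0
        for t in taps:
            feedback ^= state[t]
        state = state[1:] + [feedback]
    return out, state

# 16-bit LFSR with (hypothetical) taps, producing 8 output bits per block
state = [1] + [0] * 15
block, state = lfsr_block_outputs(state, taps=[0, 2, 3, 5], L=8)
print(block)
```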

  4. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

    Insight into the epitopic recognition pattern for polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individua...... toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families.......Insight into the epitopic recognition pattern for polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual...

  5. DESIGN OF LOW EPI AND HIGH THROUGHPUT CORDIC CELL TO IMPROVE THE PERFORMANCE OF MOBILE ROBOT

    Directory of Open Access Journals (Sweden)

    P. VELRAJKUMAR

    2014-04-01

    Full Text Available This paper mainly focuses on a pass-logic based design, which gives a low Energy Per Instruction (EPI) and high-throughput COordinate Rotation DIgital Computer (CORDIC) cell for robotic exploration applications. The basic components of the CORDIC cell, namely the register, multiplexer and proposed adder, are designed using pass transistor logic (PTL) design. The proposed adder is implemented in a bit-parallel iterative CORDIC circuit, which is designed using the DSCH2 VLSI CAD tool, and the layouts are generated by the Microwind 3 VLSI CAD tool. The propagation delay, area and power dissipation are calculated from the simulated results for the proposed adder-based CORDIC cell. The EPI, throughput and effect of temperature are calculated from the generated layout. The output parameters of the generated layout are analysed using the BSIM4 advanced analyzer. The simulated results of the proposed adder-based CORDIC circuit are compared with those of other adder-based CORDIC circuits. From the analysis of these simulated results, it was found that the proposed adder-based CORDIC circuit dissipates less power, gives a faster response, and offers lower EPI and higher throughput.
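
    For reference, the iterative rotation that such a CORDIC cell implements can be modelled in a few lines of software. This is a generic rotation-mode CORDIC sketch, not the paper's pass-transistor hardware; the iteration count is arbitrary.

```python
import math

def cordic_sin_cos(angle, iterations=16):
    """Software model of CORDIC rotation mode: rotate the vector (1, 0) by `angle`
    (radians, |angle| < pi/2) using only shift-and-add style updates; returns (cos, sin)."""
    # Pre-computed elementary rotation angles atan(2^-i) and the aggregate gain K
    atans = [math.atan(2.0 ** -i) for i in range(iterations)]
    K = 1.0
    for i in range(iterations):
        K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))

    x, y, z = 1.0, 0.0, angle
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0           # rotate towards the residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atans[i]
    return x * K, y * K

print(cordic_sin_cos(math.pi / 6))            # approximately (0.8660, 0.5000)
```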

  6. Chromatographic Monoliths for High-Throughput Immunoaffinity Isolation of Transferrin from Human Plasma

    Directory of Open Access Journals (Sweden)

    Irena Trbojević-Akmačić

    2016-06-01

    Full Text Available Changes in protein glycosylation are related to different diseases and have potential as diagnostic and prognostic disease biomarkers. Transferrin (Tf) glycosylation changes are a common marker for congenital disorders of glycosylation. However, the biological interindividual variability of Tf N-glycosylation and the genes involved in its regulation are not known. Therefore, a high-throughput Tf isolation method and large-scale glycosylation studies are needed in order to address these questions. Due to their unique chromatographic properties, chromatographic monoliths enable a very fast analysis cycle, thus significantly increasing sample preparation throughput. Here, we describe the characterization of novel immunoaffinity-based monolithic columns in a 96-well plate format for specific high-throughput purification of human Tf from blood plasma. We optimized the isolation and glycan preparation procedure for subsequent ultra performance liquid chromatography (UPLC) analysis of Tf N-glycosylation and increased the sensitivity approximately three-fold compared with the initial experimental conditions, with very good reproducibility.

  7. High-throughput selection for cellulase catalysts using chemical complementation.

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T; Lin, Hening; Tao, Haiyan; Cornish, Virginia W

    2008-12-24

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because of the large number of enzyme variants that selections can now test as compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity.

  8. Correction of Microplate Data from High-Throughput Screening.

    Science.gov (United States)

    Wang, Yuhong; Huang, Ruili

    2016-01-01

    High-throughput screening (HTS) makes it possible to collect cellular response data from a large number of cell lines and small molecules in a timely and cost-effective manner. The errors and noises in the microplate-formatted data from HTS have unique characteristics, and they can generally be grouped into three categories: run-wise (temporal, multiple plates), plate-wise (background pattern, single plate), and well-wise (single well). In this chapter, we describe a systematic solution for identifying and correcting such errors and noises, based mainly on pattern recognition and digital signal processing technologies.
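
    As an illustration of one common plate-wise (background pattern) correction, the sketch below applies a B-score-style two-way median polish to a single plate; this specific method is an assumption chosen for illustration and is not necessarily the algorithm described in the chapter.

        import numpy as np

        # B-score-style correction of one microplate: iterative median polish removes
        # row/column background effects, then residuals are scaled by the plate MAD.
        def bscore(plate, n_iter=10):
            resid = plate.astype(float)
            for _ in range(n_iter):
                resid -= np.median(resid, axis=1, keepdims=True)   # remove row effects
                resid -= np.median(resid, axis=0, keepdims=True)   # remove column effects
            mad = np.median(np.abs(resid - np.median(resid)))
            return resid / (1.4826 * mad + 1e-12)

        # Example: an 8x12 plate with an artificial column gradient plus noise.
        rng = np.random.default_rng(0)
        plate = rng.normal(100, 5, size=(8, 12)) + np.arange(12) * 3.0
        print(np.round(bscore(plate), 2))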

  9. Filter Paper-based Nucleic Acid Storage in High-throughput Solid Tumor Genotyping.

    Science.gov (United States)

    Stachler, Matthew; Jia, Yonghui; Sharaf, Nematullah; Wade, Jacqueline; Longtine, Janina; Garcia, Elizabeth; Sholl, Lynette M

    2015-01-01

    Molecular testing of tumors from formalin-fixed paraffin-embedded (FFPE) tissue blocks is central to clinical practice; however, it requires histology support and increases test turnaround time. Prospective fresh frozen tissue collection requires special handling, additional storage space, and may not be feasible for small specimens. Filter paper-based collection of tumor DNA reduces the need for histology support, requires little storage space, and preserves high-quality nucleic acid. We investigated the performance of tumor smears on filter paper in solid tumor genotyping, as compared with paired FFPE samples. Whatman FTA Micro Card (FTA preps) smears were prepared from 21 fresh tumor samples. A corresponding cytology smear was used to assess tumor cellularity and necrosis. DNA was isolated from FTA preps and FFPE core samples using automated methods and quantified using SYBR green dsDNA detection. Samples were genotyped for 471 mutations on a mass spectrometry-based platform (Sequenom). DNA concentrations from FTA preps and FFPE correlated for untreated carcinomas but not for mesenchymal tumors (Spearman ρ=0.39 and ρ=-0.1, respectively). Average DNA concentrations were lower from FTA preps as compared with FFPE, but DNA quality was higher with less fragmentation. Seventy-six percent of FTA preps and 86% of FFPE samples generated adequate DNA for genotyping. FTA preps tended to perform poorly for collection of DNA from pretreated carcinomas and mesenchymal neoplasms. Of the 16 paired DNA samples that were genotyped, 15 (94%) gave entirely concordant results. Filter paper-based sample preservation is a feasible alternative to FFPE for use in automated, high-throughput genotyping of carcinomas.

  10. Toward reliable and repeatable automated STEM-EDS metrology with high throughput

    Science.gov (United States)

    Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii

    2018-03-01

    New materials and designs with complex 3D architectures in logic and memory devices have raised the complexity of S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based energy dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable and efficient automated STEM-EDS metrology with high throughput are presented: we introduce the best known auto-EDS acquisition and quantification methods for robust and reliable metrology and show how electron exposure dose impacts EDS metrology reproducibility, either through poor signal-to-noise ratio (SNR) at low dose or through sample modification at high-dose conditions. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process in terms of both throughput and metrology reliability.

  11. Non radioactive precursor import into chloroplasts

    International Nuclear Information System (INIS)

    Lombardo, V.A.; Ottado, J.

    2003-01-01

    Full text: Eukaryotic cells have a subcellular organization based on organelles. Protein transport to these organelles is quantitatively important because the majority of cellular proteins are encoded by nuclear genes and must then be delivered to their final destination. Most chloroplast proteins are translated on cytoplasmic ribosomes as larger precursors with an amino-terminal transit peptide that is necessary and sufficient to direct the precursor to the chloroplast. Once inside the organelle, the transit peptide is cleaved and the mature protein adopts its folded form. In this work we developed a system for the expression and purification of the pea ferredoxin-NADP+ reductase precursor (preFNR) for import into chloroplasts under non-radioactive conditions. We constructed a preFNR fused at its carboxy terminus to a six-histidine peptide (preFNR-6xHis) that allows its identification using a commercial specific antibody. The construct was expressed, purified, processed and precipitated, yielding a soluble and active preFNR-6xHis that was used in chloroplast binding and import experiments. The re-isolated chloroplasts were analyzed by SDS-PAGE and electro-blotting and revealed by immuno-detection using either colorimetric or chemiluminescent reagents. As controls, we also performed import experiments labeling preFNR and preFNR-6xHis with radioactive methionine. We conclude that preFNR-6xHis is bound and imported into chloroplasts like the wild-type preFNR and that both colorimetric and chemiluminescent detection methods are useful for avoiding the manipulation of radioactive material. (author)

  12. National policy for control of radioactive sources and radioactive waste from non-power applications in Lithuania

    International Nuclear Information System (INIS)

    Klevinskas, G.; Mastauskas, A.

    2001-01-01

    According to the Law on Radiation Protection of the Republic of Lithuania (passed in 1999), the Radiation Protection Centre of the Ministry of Health is the regulatory authority responsible for the radiation protection of the public and of workers using sources of ionizing radiation in Lithuania. One of its responsibilities is the control of radioactive sources throughout their life cycle, from import through use and transport until they are placed as spent sources into radioactive waste storage facilities. For the effective control of sources, a national authorization system (notification-registration-licensing) based on international requirements and recommendations has been introduced. It also includes keeping and maintaining the Register of Sources, controlling and investigating events involving the illegal carriage or possession of radioactive material, decision making and performing state radiation protection supervision and control of users of radioactive sources, and controlling, within the limits of its competence, radioactive waste management activities in nuclear and non-nuclear power applications. According to the requirements set out in the Law on Radiation Protection, the Government Resolution 'On Establishment of the State Register of the Sources of Ionizing Radiation and Exposure of Workers' (1999) and supplementary legal acts, all licence holders conducting activities with sources of ionizing radiation have to present the necessary data to the State Register after the annual inventory of sources, after installation of new sources, after decommissioning of sources, after disposal of spent sources, and after finishing activities with generators of ionizing radiation. The Customs Department of the Ministry of Finance has to provide the Radiation Protection Centre with weekly information about all sources of ionizing radiation imported to or exported from Lithuania and about the companies that performed these

  13. Systems biology of bacterial nitrogen fixation: High-throughput technology and its integrative description with constraint-based modeling

    Directory of Open Access Journals (Sweden)

    Resendis-Antonio Osbaldo

    2011-07-01

    Full Text Available Abstract Background: Bacterial nitrogen fixation is the biological process by which atmospheric nitrogen is taken up by bacteroids located in plant root nodules and converted into ammonium through the enzymatic activity of nitrogenase. In practice, this biological process serves as a natural form of fertilization and its optimization has significant implications in sustainable agricultural programs. Currently, the advent of high-throughput technology supplies valuable data that contribute to understanding the metabolic activity during bacterial nitrogen fixation. This undertaking is not trivial, and the development of computational methods useful in accomplishing an integrative, descriptive and predictive framework is a crucial issue in decoding the principles that regulate the metabolic activity of this biological process. Results: In this work we present a systems biology description of the metabolic activity in bacterial nitrogen fixation. This was accomplished by an integrative analysis involving high-throughput data and constraint-based modeling to characterize the metabolic activity in Rhizobium etli bacteroids located at the root nodules of Phaseolus vulgaris (bean plant). Proteome and transcriptome technologies led us to identify 415 proteins and 689 up-regulated genes that orchestrate this biological process. Taking into account these data, we: (1) extended the metabolic reconstruction reported for R. etli; (2) simulated the metabolic activity during symbiotic nitrogen fixation; and (3) evaluated the in silico results in terms of bacterial phenotype. Notably, constraint-based modeling simulated nitrogen fixation activity in such a way that 76.83% of the enzymes and 69.48% of the genes were experimentally justified. Finally, to further assess the predictive scope of the computational model, gene deletion analysis was carried out on nine metabolic enzymes. Our model concluded that an altered metabolic activity on these enzymes induced
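
    To illustrate the constraint-based modeling step mentioned above, the sketch below runs flux balance analysis on a three-reaction toy network with scipy; the network, bounds and objective are invented for illustration and are not the R. etli reconstruction.

        import numpy as np
        from scipy.optimize import linprog

        # Flux balance analysis on a toy network: maximize an objective flux subject to
        # steady-state mass balance S.v = 0 and flux bounds (constraint-based modeling).
        # Reactions: R1 uptake -> A, R2: A -> B, R3: B -> objective (e.g. a fixation proxy).
        S = np.array([
            [1, -1,  0],   # metabolite A balance
            [0,  1, -1],   # metabolite B balance
        ])
        bounds = [(0, 10), (0, 10), (0, 10)]          # flux bounds for R1..R3
        c = np.array([0, 0, -1])                      # maximize v3 (linprog minimizes c.v)

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print("optimal fluxes:", res.x)               # expected: [10, 10, 10]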

  14. A high-throughput screening system for barley/powdery mildew interactions based on automated analysis of light micrographs.

    Science.gov (United States)

    Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo

    2008-01-23

    To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.

  15. High-throughput screening of small molecule libraries using SAMDI mass spectrometry.

    Science.gov (United States)

    Gurard-Levin, Zachary A; Scholle, Michael D; Eisenberg, Adam H; Mrksich, Milan

    2011-07-11

    High-throughput screening is a common strategy used to identify compounds that modulate biochemical activities, but many approaches depend on cumbersome fluorescent reporters or antibodies and often produce false-positive hits. The development of "label-free" assays addresses many of these limitations, but current approaches still lack the throughput needed for applications in drug discovery. This paper describes a high-throughput, label-free assay that combines self-assembled monolayers with mass spectrometry, in a technique called SAMDI, as a tool for screening libraries of 100,000 compounds in one day. This method is fast, has high discrimination, and is amenable to a broad range of chemical and biological applications.

  16. Risk-based high-throughput chemical screening and prioritization using exposure models and in vitro bioactivity assays

    International Nuclear Information System (INIS)

    Shin, Hyeong-Moo; Ernstoff, Alexi; Csiszar, Susan A.

    2015-01-01

    We present a risk-based high-throughput screening (HTS) method to identify chemicals for potential health concerns or for which additional information is needed. The method is applied to 180 organic chemicals as a case study. We first obtain information on how the chemical is used and identify relevant use scenarios (e.g., dermal application, indoor emissions). For each chemical and use scenario, exposure models are then used to calculate a chemical intake fraction, or a product intake fraction, accounting for chemical properties and the exposed population. We then combine these intake fractions with use scenario-specific estimates of chemical quantity to calculate daily intake rates (iR; mg/kg/day). These intake rates are compared to oral equivalent doses (OED; mg/kg/day), calculated from a suite of ToxCast in vitro bioactivity assays using in vitro-to-in vivo extrapolation and reverse dosimetry. Bioactivity quotients (BQs) are calculated as iR/OED to obtain estimates of potential impact associated with each relevant use scenario. Of the 180 chemicals considered, 38 had maximum iRs exceeding minimum OEDs (i.e., BQs > 1). For most of these compounds, exposures are associated with direct intake, food/oral contact, or dermal exposure. The method provides high-throughput estimates of exposure and important input for decision makers to identify chemicals of concern for further evaluation with additional information or more refined models
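
    As a minimal numerical illustration of the screening calculation described above (intake rate from an intake fraction and use quantity, then BQ = iR/OED), the sketch below uses placeholder values only; the chemicals, intake fractions and OEDs are invented, not values from the study.

        # Bioactivity quotient (BQ) screening sketch; all numbers are placeholders.
        def daily_intake_rate(intake_fraction, quantity_mg_per_day, body_weight_kg=70.0):
            """iR (mg/kg/day) = intake fraction * quantity used / body weight."""
            return intake_fraction * quantity_mg_per_day / body_weight_kg

        def bioactivity_quotient(intake_rate, oed):
            """BQ = iR / OED; BQ > 1 flags a chemical/use scenario for follow-up."""
            return intake_rate / oed

        scenarios = {
            "chem_A dermal": dict(iF=0.10, qty=50.0, oed=0.30),
            "chem_B indoor": dict(iF=0.01, qty=200.0, oed=5.00),
        }
        for name, s in scenarios.items():
            iR = daily_intake_rate(s["iF"], s["qty"])
            print(f"{name}: iR={iR:.3f} mg/kg/day, BQ={bioactivity_quotient(iR, s['oed']):.2f}")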

  17. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  18. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-01-01

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  19. The JCSG high-throughput structural biology pipeline

    International Nuclear Information System (INIS)

    Elsliger, Marc-André; Deacon, Ashley M.; Godzik, Adam; Lesley, Scott A.; Wooley, John; Wüthrich, Kurt; Wilson, Ian A.

    2010-01-01

    The Joint Center for Structural Genomics (JCSG) high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has thereby made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  20. Multiplex High-Throughput Targeted Proteomic Assay To Identify Induced Pluripotent Stem Cells.

    Science.gov (United States)

    Baud, Anna; Wessely, Frank; Mazzacuva, Francesca; McCormick, James; Camuzeaux, Stephane; Heywood, Wendy E; Little, Daniel; Vowles, Jane; Tuefferd, Marianne; Mosaku, Olukunbi; Lako, Majlinda; Armstrong, Lyle; Webber, Caleb; Cader, M Zameel; Peeters, Pieter; Gissen, Paul; Cowley, Sally A; Mills, Kevin

    2017-02-21

    Induced pluripotent stem cells have great potential as a human model system in regenerative medicine, disease modeling, and drug screening. However, their use in medical research is hampered by laborious reprogramming procedures that yield low numbers of induced pluripotent stem cells. For further applications in research, only the best, competent clones should be used. The standard assays for pluripotency are based on genomic approaches, which take up to 1 week to perform and incur significant cost. Therefore, there is a need for a rapid and cost-effective assay able to distinguish between pluripotent and nonpluripotent cells. Here, we describe a novel multiplexed, high-throughput, and sensitive peptide-based multiple reaction monitoring mass spectrometry assay, allowing for the identification and absolute quantitation of multiple core transcription factors and pluripotency markers. This assay provides simpler and high-throughput classification into either pluripotent or nonpluripotent cells in 7 min analysis while being more cost-effective than conventional genomic tests.

  1. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    Science.gov (United States)

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high throughput method to simultaneously detect 48 targets in 48 samples on a Fludigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for 48 pooled targets was added to enrich the amount of template before performing dynamic chip assays. This dynamic chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.

  2. Criteria and Processes for the Certification of Non-Radioactive Hazardous and Non-Hazardous Wastes

    International Nuclear Information System (INIS)

    Dominick, J.

    2008-01-01

    This document details Lawrence Livermore National Laboratory's (LLNL) criteria and processes for determining if potentially volumetrically contaminated or potentially surface contaminated wastes are to be managed as material containing residual radioactivity or as non-radioactive. This document updates and replaces UCRL-AR-109662, Criteria and Procedures for the Certification of Nonradioactive Hazardous Waste (Reference 1), also known as 'The Moratorium', and follows the guidance found in the U.S. Department of Energy (DOE) document, Performance Objective for Certification of Non-Radioactive Hazardous Waste (Reference 2). The 1992 Moratorium document (UCRL-AR-109662) is three volumes and 703 pages. The first volume provides an overview of the certification process and lists the key radioanalytical methods and their associated Limits of Sensitivities. Volumes Two and Three contain supporting documents and include over 30 operating procedures, QA plans, training documents and organizational charts that describe the hazardous and radioactive waste management system in place in 1992. This current document is intended to update the previous Moratorium documents and to serve as the top-tier LLNL institutional Moratorium document. The 1992 Moratorium document was restricted to certification of Resource Conservation and Recovery Act (RCRA), State and Toxic Substances Control Act (TSCA) hazardous waste from Radioactive Material Management Areas (RMMA). This still remains the primary focus of the Moratorium; however, this document increases the scope to allow use of this methodology to certify other LLNL wastes and materials destined for off-site disposal, transfer, and re-use including non-hazardous wastes and wastes generated outside of RMMAs with the potential for DOE added radioactivity. The LLNL organization that authorizes off-site transfer/disposal of a material or waste stream is responsible for implementing the requirements of this document. The LLNL Radioactive and

  3. Non-Gaussian Distribution of DNA Barcode Extension In Nanochannels Using High-throughput Imaging

    Science.gov (United States)

    Sheats, Julian; Reinhart, Wesley; Reifenberger, Jeff; Gupta, Damini; Muralidhar, Abhiram; Cao, Han; Dorfman, Kevin

    2015-03-01

    We present experimental data for the extension of internal segments of highly confined DNA using a high-throughput experimental setup. Barcode-labeled E. coli genomic DNA molecules were imaged at a high areal density in square nanochannels with sizes ranging from 40 nm to 51 nm in width. Over 25,000 molecules were used to obtain more than 1,000,000 measurements for genomic distances between 2,500 bp and 100,000 bp. The distribution of extensions has positive excess kurtosis and is skew-left due to weak backfolding in the channel. As a result, the two Odijk theories for the chain extension and variance bracket the experimental data. We also compared the data to predictions of a harmonic approximation for the confinement free energy and show that it produces a substantial error in the variance. These results suggest an inherent error associated with any statistical analysis of barcoded DNA that relies on harmonic models for chain extension. Present address: Department of Chemical and Biological Engineering, Princeton University.
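
    For reference, the short sketch below computes the two shape statistics quoted here, skewness and excess kurtosis, using scipy.stats; the data are synthetic placeholders and do not reproduce the nanochannel measurements.

        import numpy as np
        from scipy import stats

        # Summarize an extension-like distribution by skewness and excess kurtosis.
        rng = np.random.default_rng(1)
        extensions = rng.gamma(shape=50.0, scale=10.0, size=100_000)   # synthetic toy data

        print("skewness       :", stats.skew(extensions))
        print("excess kurtosis:", stats.kurtosis(extensions))          # Fisher definition: 0 for a Gaussian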

  4. AOPs and Biomarkers: Bridging High Throughput Screening ...

    Science.gov (United States)

    As high throughput screening (HTS) plays a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models designed to quantify potential adverse effects based on HTS data will benefit from additional data sources that connect the magnitude of perturbation of the in vitro system to a level of concern at the organism or population level. The adverse outcome pathway (AOP) concept provides an ideal framework for combining these complementary data. Recent international efforts under the auspices of the Organisation for Economic Co-operation and Development (OECD) have resulted in an AOP wiki designed to house formal descriptions of AOPs suitable for use in regulatory decision making. Recent efforts have built upon this to include an ontology describing the AOP with linkages to biological pathways, physiological terminology, and taxonomic applicability domains. Incorporation of an AOP network tool developed by the U.S. Army Corps of Engineers also allows consideration of cumulative risk from chemical and non-chemical stressors. Biomarkers are an important complement to formal AOP descriptions, particularly when dealing with susceptible subpopulations or lifestages in human health risk assessment. To address the issue of non-chemical stressors that may modify the effects of criteria air pollutants, a novel method was used to integrate blood gene expression data with hema

  5. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    Directory of Open Access Journals (Sweden)

    Guangbo Liu

    Full Text Available Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long-established method. However, a reliable and optimized high-throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid-handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high-throughput molecular and/or genetic analysis of yeast.

  6. Engineering customized TALE nucleases (TALENs) and TALE transcription factors by fast ligation-based automatable solid-phase high-throughput (FLASH) assembly.

    Science.gov (United States)

    Reyon, Deepak; Maeder, Morgan L; Khayter, Cyd; Tsai, Shengdar Q; Foley, Jonathan E; Sander, Jeffry D; Joung, J Keith

    2013-07-01

    Customized DNA-binding domains made using transcription activator-like effector (TALE) repeats are rapidly growing in importance as widely applicable research tools. TALE nucleases (TALENs), composed of an engineered array of TALE repeats fused to the FokI nuclease domain, have been used successfully for directed genome editing in various organisms and cell types. TALE transcription factors (TALE-TFs), consisting of engineered TALE repeat arrays linked to a transcriptional regulatory domain, have been used to up- or downregulate expression of endogenous genes in human cells and plants. This unit describes a detailed protocol for the recently described fast ligation-based automatable solid-phase high-throughput (FLASH) assembly method. FLASH enables automated high-throughput construction of engineered TALE repeats using an automated liquid handling robot or manually using a multichannel pipet. Using the automated approach, a single researcher can construct up to 96 DNA fragments encoding TALE repeat arrays of various lengths in a single day, and then clone these to construct sequence-verified TALEN or TALE-TF expression plasmids in a week or less. Plasmids required for FLASH are available by request from the Joung lab (http://eGenome.org). This unit also describes improvements to the Zinc Finger and TALE Targeter (ZiFiT Targeter) web server (http://ZiFiT.partners.org) that facilitate the design and construction of FLASH TALE repeat arrays in high throughput. © 2013 by John Wiley & Sons, Inc.

  7. Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice

    Science.gov (United States)

    Yang, Wanneng; Guo, Zilong; Huang, Chenglong; Duan, Lingfeng; Chen, Guoxing; Jiang, Ni; Fang, Wei; Feng, Hui; Xie, Weibo; Lian, Xingming; Wang, Gongwei; Luo, Qingming; Zhang, Qifa; Liu, Qian; Xiong, Lizhong

    2014-01-01

    Even as the study of plant genomics rapidly develops through the use of high-throughput sequencing techniques, traditional plant phenotyping lags far behind. Here we develop a high-throughput rice phenotyping facility (HRPF) to monitor 13 traditional agronomic traits and 2 newly defined traits during the rice growth period. Using genome-wide association studies (GWAS) of the 15 traits, we identify 141 associated loci, 25 of which contain known genes such as the Green Revolution semi-dwarf gene, SD1. Based on a performance evaluation of the HRPF and GWAS results, we demonstrate that high-throughput phenotyping has the potential to replace traditional phenotyping techniques and can provide valuable gene identification information. The combination of the multifunctional phenotyping tools HRPF and GWAS provides deep insights into the genetic architecture of important traits. PMID:25295980

  8. 3D material cytometry (3DMaC): a very high-replicate, high-throughput analytical method using microfabricated, shape-specific, cell-material niches.

    Science.gov (United States)

    Parratt, Kirsten; Jeong, Jenny; Qiu, Peng; Roy, Krishnendu

    2017-08-08

    Studying cell behavior within 3D material niches is key to understanding cell biology in health and diseases, and developing biomaterials for regenerative medicine applications. Current approaches to studying these cell-material niches have low throughput and can only analyze a few replicates per experiment, resulting in reduced measurement assurance and analytical power. Here, we report 3D material cytometry (3DMaC), a novel high-throughput method based on microfabricated, shape-specific 3D cell-material niches and imaging cytometry. 3DMaC achieves rapid and highly multiplexed analyses of very high replicate numbers ("n" of 10⁴-10⁶) of 3D biomaterial constructs. 3DMaC overcomes current limitations of low "n", low-throughput, and "noisy" assays, to provide rapid and simultaneous analyses of potentially hundreds of parameters in 3D biomaterial cultures. The method is demonstrated here for a set of 85,000 events containing twelve distinct cell-biomaterial micro-niches, along with robust, customized computational methods for high-throughput analytics with potentially unprecedented statistical power.

  9. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    Science.gov (United States)

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with highest product purity requiring suboptimal values for other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers. Biotechnol. Prog., 34:130-140, 2018.

  10. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blanks and carry-over, have been demonstrated for samples in a variety of matrices, giving accurate and precise isotope ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.

  11. Development and operation of a high-throughput accurate-wavelength lens-based spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Bell, Ronald E., E-mail: rbell@pppl.gov [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543 (United States)

    2014-11-15

    A high-throughput spectrometer for the 400–820 nm wavelength range has been developed for charge exchange recombination spectroscopy or general spectroscopy. A large 2160 mm⁻¹ grating is matched with fast f/1.8 200 mm lenses, which provide stigmatic imaging. A precision optical encoder measures the grating angle with an accuracy ≤0.075 arc sec. A high quantum efficiency, low-etaloning CCD detector allows operation at longer wavelengths. A patch panel allows input fibers to interface with interchangeable fiber holders that attach to a kinematic mount at the entrance slit. Computer-controlled hardware allows automated control of wavelength, timing and f-number, automated data collection, and wavelength calibration.

  12. Proceedings of the NEA Workshop on the Management of Non-Nuclear Radioactive Waste

    International Nuclear Information System (INIS)

    Zafiropoulos, Demetre; Dilday, Daniel; Siemann, Michael; Ciambrella, Massimo; Lazo, Edward; Sartori, Enrico; ); Dionisi, Mario; Long, Juliet; Nicholson, David; Chambers, Douglas; Garcia Alves, Joao Henrique; McMahon, Ciara; Bruno, Gerard; Fan, Zhiwen; ); Ripani, Marco; Nielsen, Mette; Solente, Nicolas; Templeton, John; Paratore, Angelo; Feinhals, Joerg; Pandolfi, Dana; Sarchiapone, Lucia; Picentino, Bruno; Simms, Helen; Beer, Hans-Frieder; Deryabin, Sergey; Ulrici, Luisa; Bergamaschi, Carlo; Nottestad, Stacy; Anagnostakis, Marios

    2017-05-01

    All NEA member countries, whether or not they have nuclear power plants, are faced with appropriately managing non-nuclear radioactive waste produced through industrial, research and medical activities. Sources of such waste can include national laboratory and university research activities, used and lost industrial gauges and radiography sources, hospital nuclear medicine activities and, in some circumstances, naturally occurring radioactive material (NORM) activities. Although many of these wastes are not long-lived, the sheer variety of sources makes it difficult to generically assess their physical (e.g. volume, chemical form, mixed waste) or radiological (e.g. activity, half-life, concentration) characteristics. Additionally, the source-specific nature of these wastes poses questions and challenges to their regulatory and practical management at a national level. This has generated interest from both the radiological protection and radioactive waste management communities, and prompted the Committee on Radiological Protection and Public Health (CRPPH) to organise, in collaboration with the Radioactive Waste Management Committee (RWMC), a workshop tackling some of the key issues of this challenging topic. The key objectives of the NEA Workshop on the Management of Non-Nuclear Radioactive Waste were to address the particularities of managing non-nuclear waste in all its sources and forms and to share and exchange national experiences. Presentations and discussions addressed both technical aspects and national frameworks. Technical aspects included: - the range of non-nuclear waste sources, activities, volumes and other relevant characteristics; - waste storage and repository capacities and life cycles; - safety considerations for mixed waste management; - human resources and knowledge management; - legal, regulatory and financial assurance, and liability issues. Taking into account the entire non-nuclear waste life-cycle, the workshop covered planning and

  13. Genecentric: a package to uncover graph-theoretic structure in high-throughput epistasis data

    OpenAIRE

    Gallant, Andrew; Leiserson, Mark DM; Kachalov, Maxim; Cowen, Lenore J; Hescott, Benjamin J

    2013-01-01

    Background New technology has resulted in high-throughput screens for pairwise genetic interactions in yeast and other model organisms. For each pair in a collection of non-essential genes, an epistasis score is obtained, representing how much sicker (or healthier) the double-knockout organism will be compared to what would be expected from the sickness of the component single knockouts. Recent algorithmic work has identified graph-theoretic patterns in this data that can indicate functional ...

  14. High-Throughput Next-Generation Sequencing of Polioviruses

    Science.gov (United States)

    Montmayeur, Anna M.; Schmidt, Alexander; Zhao, Kun; Magaña, Laura; Iber, Jane; Castro, Christina J.; Chen, Qi; Henderson, Elizabeth; Ramos, Edward; Shaw, Jing; Tatusov, Roman L.; Dybdahl-Sissoko, Naomi; Endegue-Zanga, Marie Claire; Adeniji, Johnson A.; Oberste, M. Steven; Burns, Cara C.

    2016-01-01

    ABSTRACT The poliovirus (PV) is currently targeted for worldwide eradication and containment. Sanger-based sequencing of the viral protein 1 (VP1) capsid region is currently the standard method for PV surveillance. However, the whole-genome sequence is sometimes needed for higher resolution global surveillance. In this study, we optimized whole-genome sequencing protocols for poliovirus isolates and FTA cards using next-generation sequencing (NGS), aiming for high sequence coverage, efficiency, and throughput. We found that DNase treatment of poliovirus RNA followed by random reverse transcription (RT), amplification, and the use of the Nextera XT DNA library preparation kit produced significantly better results than other preparations. The average viral reads per total reads, a measurement of efficiency, was as high as 84.2% ± 15.6%. PV genomes covering >99 to 100% of the reference length were obtained and validated with Sanger sequencing. A total of 52 PV genomes were generated, multiplexing as many as 64 samples in a single Illumina MiSeq run. This high-throughput, sequence-independent NGS approach facilitated the detection of a diverse range of PVs, especially for those in vaccine-derived polioviruses (VDPV), circulating VDPV, or immunodeficiency-related VDPV. In contrast to results from previous studies on other viruses, our results showed that filtration and nuclease treatment did not discernibly increase the sequencing efficiency of PV isolates. However, DNase treatment after nucleic acid extraction to remove host DNA significantly improved the sequencing results. This NGS method has been successfully implemented to generate PV genomes for molecular epidemiology of the most recent PV isolates. Additionally, the ability to obtain full PV genomes from FTA cards will aid in facilitating global poliovirus surveillance. PMID:27927929

  15. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale L

    2005-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  16. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale

    2004-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  17. Creation of a small high-throughput screening facility.

    Science.gov (United States)

    Flak, Tod

    2009-01-01

    The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.

  18. A CRISPR CASe for High-Throughput Silencing

    Directory of Open Access Journals (Sweden)

    Jacob eHeintze

    2013-10-01

    Full Text Available Manipulation of gene expression on a genome-wide level is one of the most important systematic tools in the post-genome era. Such manipulations have largely been enabled by expression cloning approaches using sequence-verified cDNA libraries, large-scale RNA interference libraries (shRNA or siRNA) and zinc finger nuclease technologies. More recently, the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) and CRISPR-associated (Cas9)-mediated gene editing technology has been described that holds great promise for future use of this technology in genomic manipulation. It was suggested that the CRISPR system has the potential to be used in high-throughput, large-scale loss-of-function screening. Here we discuss some of the challenges in engineering of CRISPR/Cas genomic libraries and some of the aspects that need to be addressed in order to use this technology on a high-throughput scale.

  19. The distribution of radioiodine administrated to pregnant mice and the effect of non radioactive iodide

    International Nuclear Information System (INIS)

    Okui, Toyo; Kobayashi, Satoshi

    1987-01-01

    Radioiodine, ¹³¹I, which has a high fission yield in nuclear reactors, is easily taken into the human body when released to the environment, accumulating in the thyroid gland. ¹³¹I was administered orally to pregnant mice, and its transport to the tissues, particularly the fetus, was examined closely. In addition, non-radioactive iodide (KI) was administered to assess its radiation protection effect. The transport of ¹³¹I to the fetus is the second highest, after the thyroid gland of the mother mouse, and it becomes larger the later in gestation the ¹³¹I administration is made. Administration of the non-radioactive iodide has a large radiation protection effect on the thyroid gland of the mother mouse and of the fetus. Depending on its concentration, however, the non-radioactive iodide may conversely increase the overall exposure of the fetus. (Mori, K.)

  20. A standardized framework for accurate, high-throughput genotyping of recombinant and non-recombinant viral sequences.

    Science.gov (United States)

    Alcantara, Luiz Carlos Junior; Cassol, Sharon; Libin, Pieter; Deforche, Koen; Pybus, Oliver G; Van Ranst, Marc; Galvão-Castro, Bernardo; Vandamme, Anne-Mieke; de Oliveira, Tulio

    2009-07-01

    Human immunodeficiency virus type-1 (HIV-1), hepatitis B and C and other rapidly evolving viruses are characterized by extremely high levels of genetic diversity. To facilitate diagnosis and the development of prevention and treatment strategies that efficiently target the diversity of these viruses, and other pathogens such as human T-lymphotropic virus type-1 (HTLV-1), human herpes virus type-8 (HHV8) and human papillomavirus (HPV), we developed a rapid high-throughput genotyping system. The method involves the alignment of a query sequence with a carefully selected set of pre-defined reference strains, followed by phylogenetic analysis of multiple overlapping segments of the alignment using a sliding window. Each segment of the query sequence is assigned the genotype and sub-genotype of the reference strain with the highest bootstrap (>70%) and bootscanning (>90%) scores. Results from all windows are combined and displayed graphically using color-coded genotypes. The new Virus-Genotyping Tools provide accurate classification of recombinant and non-recombinant viruses and are currently being assessed for their diagnostic utility. They have been incorporated into several HIV drug resistance resources, including the Stanford database (http://hivdb.stanford.edu) and two European databases (http://www.umcutrecht.nl/subsite/spread-programme/ and http://www.hivrdb.org.uk/), and have been successfully used to genotype a large number of sequences in these and other databases. The tools are a PHP/JAVA web application and are freely accessible on a number of servers including: http://bioafrica.mrc.ac.za/rega-genotype/html/, http://lasp.cpqgm.fiocruz.br/virus-genotype/html/, http://jose.med.kuleuven.be/genotypetool/html/.
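
    As a schematic illustration of the sliding-window assignment logic described above, the sketch below assigns each window of an aligned query to the best-scoring reference; the percent-identity scoring function is a simple stand-in for the actual phylogenetic bootstrap and bootscanning analysis, and the window size, step and support threshold are placeholders.

        # Sliding-window genotype assignment sketch; score_window() is a stand-in
        # for the real bootstrap/bootscanning analysis (percent identity only).
        def score_window(query_seg, ref_seg):
            matches = sum(q == r for q, r in zip(query_seg, ref_seg) if q != "-" and r != "-")
            return 100.0 * matches / max(len(query_seg), 1)

        def assign_genotypes(query, references, window=400, step=50, support=70.0):
            """Assign each window of an aligned query to the best-supported reference genotype."""
            calls = []
            for start in range(0, len(query) - window + 1, step):
                seg = query[start:start + window]
                best = max(references,
                           key=lambda name: score_window(seg, references[name][start:start + window]))
                score = score_window(seg, references[best][start:start + window])
                calls.append((start, best if score >= support else "unassigned", score))
            return calls

        refs = {"subtype_B": "ACGT" * 250, "subtype_C": "ACGA" * 250}
        print(assign_genotypes("ACGT" * 250, refs)[:3])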

  1. A high throughput single nucleotide polymorphism multiplex assay for parentage assignment in New Zealand sheep.

    Directory of Open Access Journals (Sweden)

    Shannon M Clarke

    Full Text Available Accurate pedigree information is critical to animal breeding systems to ensure the highest rate of genetic gain and management of inbreeding. The abundance of available genomic data, together with the development of high-throughput genotyping platforms, means that single nucleotide polymorphisms (SNPs) are now the DNA marker of choice for genomic selection studies. Furthermore, the superior qualities of SNPs compared to microsatellite markers allow for standardization between laboratories, a property that is crucial for developing an international set of markers for traceability studies. The objective of this study was to develop a high-throughput SNP assay for use in the New Zealand sheep industry that gives accurate pedigree assignment and will allow a reduction in breeder input over lambing. This required two phases of development: firstly, a method of extracting quality DNA from ear-punch tissue in a high-throughput, cost-efficient manner, and secondly a SNP assay able to assign paternity to progeny resulting from mob mating. A likelihood-based approach to infer paternity was used, in which sires with the highest LOD score (log of the ratio of the likelihood given parentage to the likelihood given non-parentage) are assigned. An 84-SNP "parentage panel" was developed that assigned, on average, 99% of progeny to a sire in a problem where there were 3,000 progeny from 120 mob-mated sires that included numerous half-sib sires. In only 6% of those cases was there another sire with at least a 0.02 probability of paternity. Furthermore, dam information (either recorded or obtained by genotyping possible dams) was absent, highlighting the SNP test's suitability for paternity testing. Utilization of this parentage SNP assay will allow implementation of progeny testing into large commercial farms, where the improved accuracy of sire assignment and genetic evaluations will increase genetic gain in the sheep industry.
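
    To make the LOD calculation concrete, the sketch below sums per-SNP log10 likelihood ratios (likelihood given parentage over likelihood given non-parentage) for each candidate sire; it assumes biallelic SNPs, an unknown dam and no genotyping-error model, so it is a simplification of the published method, and all genotypes and allele frequencies are invented.

        import math

        # Simplified LOD-based paternity scoring.  Genotypes are coded 0/1/2 copies
        # of the reference allele; p is the reference-allele frequency per SNP.
        def transmit_prob(sire_geno):
            """Probability the sire transmits the reference allele."""
            return {0: 0.0, 1: 0.5, 2: 1.0}[sire_geno]

        def snp_likelihood(off_geno, p, sire_geno=None):
            q = 1.0 - p
            if sire_geno is None:                         # unrelated sire: HWE genotype frequencies
                return {0: q * q, 1: 2 * p * q, 2: p * p}[off_geno]
            t = transmit_prob(sire_geno)                  # sire allele + random dam allele
            return {0: (1 - t) * q, 1: t * q + (1 - t) * p, 2: t * p}[off_geno]

        def lod_score(offspring, sire, freqs):
            lod = 0.0
            for off_g, sire_g, p in zip(offspring, sire, freqs):
                num = snp_likelihood(off_g, p, sire_g)
                if num == 0.0:                            # opposing homozygotes: exclusion
                    return float("-inf")
                lod += math.log10(num / snp_likelihood(off_g, p))
            return lod

        freqs = [0.5, 0.3, 0.7, 0.5]
        offspring = [2, 1, 2, 0]
        candidates = {"sire_1": [2, 1, 2, 1], "sire_2": [0, 0, 1, 2]}
        print({s: round(lod_score(offspring, g, freqs), 2) for s, g in candidates.items()})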

  2. High throughput route selection in multi-rate wireless mesh networks

    Institute of Scientific and Technical Information of China (English)

    WEI Yi-fei; GUO Xiang-li; SONG Mei; SONG Jun-de

    2008-01-01

    Most existing Ad-hoc routing protocols use the shortest path algorithm with a hop count metric to select paths. It is appropriate in single-rate wireless networks, but has a tendency to select paths containing long-distance links that have low data rates and reduced reliability in multi-rate networks. This article introduces a high throughput routing algorithm utilizing the multi-rate capability and some mesh characteristics in wireless fidelity (WiFi) mesh networks. It uses the medium access control (MAC) transmission time as the routing metric, which is estimated by the information passed up from the physical layer. When the proposed algorithm is adopted, the Ad-hoc on-demand distance vector (AODV) routing can be improved as high throughput AODV (HT-AODV). Simulation results show that HT-AODV is capable of establishing a route that has high data-rate, short end-to-end delay and great network throughput.
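
    As a schematic illustration of route selection by cumulative transmission time rather than hop count, the sketch below runs a centralized shortest-path search over invented per-link transmission times; it is a stand-in for the metric idea only, not an implementation of the distributed HT-AODV protocol.

        import heapq

        # Choose routes by cumulative MAC transmission time (seconds per packet,
        # derived from link data rate) instead of hop count.  Link weights invented.
        def best_path(graph, src, dst):
            dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
            while heap:
                d, u = heapq.heappop(heap)
                if u == dst:
                    break
                if d > dist.get(u, float("inf")):
                    continue
                for v, t in graph[u].items():
                    nd = d + t
                    if nd < dist.get(v, float("inf")):
                        dist[v], prev[v] = nd, u
                        heapq.heappush(heap, (nd, v))
            path, node = [dst], dst
            while node != src:
                node = prev[node]
                path.append(node)
            return list(reversed(path)), dist[dst]

        # One long low-rate link (A-C) vs. two short high-rate hops (A-B-C).
        graph = {
            "A": {"B": 0.002, "C": 0.020},
            "B": {"A": 0.002, "C": 0.002},
            "C": {"A": 0.020, "B": 0.002},
        }
        print(best_path(graph, "A", "C"))   # -> (['A', 'B', 'C'], 0.004), fewer hops would be slower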

  3. Small zeolite column tests for removal of cesium from high radioactive contaminated water in Fukushima Daiichi Nuclear Power Station

    International Nuclear Information System (INIS)

    Hijikata, Takatoshi; Uozumi, Koichi; Tukada, Takeshi; Koyama, Tadafumi; Ishikawa, Keiji; Ono, Shoichi; Suzuki, Shunichi; Denton, Mark; Raymont, John

    2011-01-01

    After the earthquake on March 11th 2011, a large amount (more than 0.12 million m³) of highly radioactive contaminated water had pooled in the Fukushima Daiichi nuclear power station. As an urgent issue, highly radioactive nuclides should be removed from this contaminated water to reduce radioactivity in the turbine buildings and nuclear reactor buildings. Removal of Cs from this contaminated water is a key issue, because ¹³⁴Cs and ¹³⁷Cs are highly radioactive γ-emitting nuclides. The zeolite column system was used for Cs and Sr removal from the radioactive water of Three Mile Island Unit 2, and modified columns were then developed as a Cs removal method for high-level radioactive water in US national laboratories (WRSC, ORNL, PNNL, Hanford, etc.). In order to treat Fukushima's highly contaminated water with a similar system, it was necessary to understand the properties of zeolite for removing Cs in the presence of sea salt, as well as the applicability of the column system to a high throughput of around 1200 m³/d. The kinetic characteristics of the column were another property to be understood before actual operation. Hence, a functional small-scale zeolite column system was installed at CRIEPI for conducting experiments to understand decontamination behavior. Each column has a 2- or 3-cm inner diameter and a 12-cm height, and 12 g of zeolite-type media was packed into the column. The column experiments were carried out with Kurion zeolite, Herschelite, at different feed rates of simulated water with different concentrations of Cs and sea salt. For the water with 4 ppm Cs and 0 ppm sea salt, only a 10% Cs concentration was observed in the effluent after 20,000 bed volumes were fed at a rate of 33 cm/min, which corresponds to the actual system. On the other hand, a 40% Cs concentration was observed in the effluent after only 50 bed volumes were passed for water with 2 ppm Cs and 3.4 wt.% sea salt at a feed rate of 34 cm/min. As the absorption of Cs is hampered by the

  4. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    Science.gov (United States)

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays DE DeGroot, RS Thomas, and SO SimmonsNational Center for Computational Toxicology, US EPA, Research Triangle Park, NC USAThe EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  5. High-throughput phenotyping and genomic selection: the frontiers of crop breeding converge.

    Science.gov (United States)

    Cabrera-Bosquet, Llorenç; Crossa, José; von Zitzewitz, Jarislav; Serret, María Dolors; Araus, José Luis

    2012-05-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, most powerful techniques for high-throughput field phenotyping are empirical rather than analytical and comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), they both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them as a set of secondary traits (i.e. physiological) putatively related to the target trait. Both GS and high-throughput phenotyping have in common their empirical approach enabling breeders to use genome profile or phenotype without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits and near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield. © 2012 Institute of Botany, Chinese Academy of Sciences.

  6. HTTK: R Package for High-Throughput Toxicokinetics

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concent...

  7. A High-Throughput SU-8 Microfluidic Magnetic Bead Separator

    DEFF Research Database (Denmark)

    Bu, Minqiang; Christensen, T. B.; Smistrup, Kristian

    2007-01-01

    We present a novel microfluidic magnetic bead separator based on an SU-8 fabrication technique for high-throughput applications. The experimental results show that magnetic beads can be captured at an efficiency of 91 % and 54 % at flow rates of 1 mL/min and 4 mL/min, respectively. Integration...... of soft magnetic elements in the chip leads to a slightly higher capturing efficiency and a more uniform distribution of captured beads over the separation chamber than the system without soft magnetic elements....

  8. High-throughput genotyping of single nucleotide polymorphisms with rolling circle amplification

    Directory of Open Access Journals (Sweden)

    Sun Zhenyu

    2001-08-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) are the foundation of powerful complex trait and pharmacogenomic analyses. The availability of large SNP databases, however, has emphasized a need for inexpensive SNP genotyping methods of commensurate simplicity, robustness, and scalability. We describe a solution-based, microtiter plate method for SNP genotyping of human genomic DNA. The method is based upon allele discrimination by ligation of open circle probes followed by rolling circle amplification of the signal using fluorescent primers. Only the probe with a 3' base complementary to the SNP is circularized by ligation. Results SNP scoring by ligation was optimized to a 100,000-fold discrimination against probe mismatched to the SNP. The assay was used to genotype 10 SNPs from a set of 192 genomic DNA samples in a high-throughput format. Assaying directly from genomic DNA eliminates the need to preamplify the target, as is done for many other genotyping methods. The sensitivity of the assay was demonstrated by genotyping from 1 ng of genomic DNA. We demonstrate that the assay can detect a single molecule of the circularized probe. Conclusions Compatibility with homogeneous formats and the ability to assay small amounts of genomic DNA meets the exacting requirements of automated, high-throughput SNP scoring.

  9. Galaxy Workflows for Web-based Bioinformatics Analysis of Aptamer High-throughput Sequencing Data

    Directory of Open Access Journals (Sweden)

    William H Thiel

    2016-01-01

    Full Text Available Development of RNA and DNA aptamers for diagnostic and therapeutic applications is a rapidly growing field. Aptamers are identified through iterative rounds of selection in a process termed SELEX (Systematic Evolution of Ligands by EXponential enrichment). High-throughput sequencing (HTS) revolutionized the modern SELEX process by identifying millions of aptamer sequences across multiple rounds of aptamer selection. However, these vast aptamer HTS datasets necessitated bioinformatics techniques. Herein, we describe a semiautomated approach to analyze aptamer HTS datasets using the Galaxy Project, a web-based open source collection of bioinformatics tools that were originally developed to analyze genome, exome, and transcriptome HTS data. Using a series of Workflows created in the Galaxy webserver, we demonstrate efficient processing of aptamer HTS data and compilation of a database of unique aptamer sequences. Additional Workflows were created to characterize the abundance and persistence of aptamer sequences within a selection and to filter sequences based on these parameters. A key advantage of this approach is that the online nature of the Galaxy webserver and its graphical interface allow for the analysis of HTS data without the need to compile code or install multiple programs.
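
    Outside of Galaxy, the abundance/persistence bookkeeping that these Workflows perform can be approximated in a few lines; the sketch below is a simplified stand-in with hypothetical function names (no read quality filtering or sequence clustering), not the published Workflows.

```python
from collections import Counter, defaultdict

def round_counts(reads):
    """Collapse one selection round's reads (iterable of sequence strings) into counts."""
    return Counter(seq.strip().upper() for seq in reads)

def abundance_and_persistence(rounds):
    """rounds: {round_name: Counter of sequence -> read count}.

    Returns, per unique sequence, its relative abundance in each round and the
    number of rounds in which it appears (persistence).
    """
    per_seq = defaultdict(dict)
    for rname, counts in rounds.items():
        total = sum(counts.values())
        for seq, n in counts.items():
            per_seq[seq][rname] = n / total
    return {seq: {"abundance": ab, "persistence": len(ab)} for seq, ab in per_seq.items()}

def filter_candidates(summary, min_persistence=2, min_abundance=1e-4):
    """Keep sequences seen in enough rounds and at a high enough peak abundance."""
    return {seq: info for seq, info in summary.items()
            if info["persistence"] >= min_persistence
            and max(info["abundance"].values()) >= min_abundance}

rounds = {"R4": round_counts(["ACGTACGT", "ACGTACGT", "GGGTTTAA"]),
          "R6": round_counts(["ACGTACGT", "ACGTACGT", "ACGTACGT", "TTTTCCCC"])}
print(filter_candidates(abundance_and_persistence(rounds)))
```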

  10. Some legal aspects on high level radioactive waste disposal in Japan

    International Nuclear Information System (INIS)

    Tanabe, Tomoyuki

    1997-01-01

    In Japan, it is considered to be an urgent problem to prepare the system for the research and execution of high level radioactive waste disposal. Under what regulation scheme the disposal should be done has not been sufficiently examined. In this research, the examination was carried out on the legal aspects of high level radioactive waste disposal as follows. First, the current legislation on the disposal in Japan was analyzed, and it was made clear that high level radioactive waste disposal has not been stipulated clearly. Next, on the legal choices which are conceivable on the way the legislation for high level radioactive waste disposal should be, from the aspects of applying the law on regulating nuclear reactors and others, applying the law on nuclear power damage reparation, and industrialization by changing the government ordinances, those were arranged in six choices, and the examination was carried out for each choice from the viewpoints of the relation with the base stipulation for waste-burying business, the speciality of high level radioactive waste disposal as compared with other actions of nuclear power business, the coordination with existing nuclear power of nuclear power business, the coordination with existing nuclear power law system and the formation of national consensus. In this research, it was shown that the execution of high level radioactive waste disposal as the business based on the separate legislation is the realistic choice. (K.I.)

  11. A proposed classification system for high-level and other radioactive wastes

    International Nuclear Information System (INIS)

    Kocher, D.C.; Croff, A.G.

    1987-06-01

    This report presents a proposal for quantitative and generally applicable risk-based definitions of high-level and other radioactive wastes. On the basis of historical descriptions and definitions of high-level waste (HLW), in which HLW has been defined in terms of its source as waste from reprocessing of spent nuclear fuel, we propose a more general definition based on the concept that HLW has two distinct attributes: HLW is (1) highly radioactive and (2) requires permanent isolation. This concept leads to a two-dimensional waste classification system in which one axis, related to "requires permanent isolation," is associated with long-term risks from waste disposal and the other axis, related to "highly radioactive," is associated with shorter-term risks due to high levels of decay heat and external radiation. We define wastes that require permanent isolation as wastes with concentrations of radionuclides exceeding the Class-C limits that are generally acceptable for near-surface land disposal, as specified in the US Nuclear Regulatory Commission's rulemaking 10 CFR Part 61 and its supporting documentation. HLW then is waste requiring permanent isolation that also is highly radioactive, and we define "highly radioactive" as a decay heat (power density) in the waste greater than 50 W/m³ or an external radiation dose rate at a distance of 1 m from the waste greater than 100 rem/h (1 Sv/h), whichever is the more restrictive. This proposal also results in a definition of Transuranic (TRU) Waste and Equivalent as waste that requires permanent isolation but is not highly radioactive and a definition of low-level waste (LLW) as waste that does not require permanent isolation, without regard to whether or not it is highly radioactive.
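
    The proposed two-axis classification reduces to a simple decision rule. The sketch below encodes the thresholds quoted in this record (Class-C concentration limits for "requires permanent isolation"; 50 W/m³ decay heat or 100 rem/h at 1 m for "highly radioactive"), with the Class-C comparison abstracted into a boolean flag for brevity; it is an illustrative restatement, not a regulatory tool.

```python
def classify_waste(exceeds_class_c_limits, decay_heat_w_per_m3, dose_rate_rem_per_h_at_1m):
    """Illustrative decision rule following the proposed two-axis scheme.

    'Requires permanent isolation' is approximated by a boolean flag for
    radionuclide concentrations above the 10 CFR 61 Class-C limits; the
    'highly radioactive' thresholds come from the report's proposal.
    """
    highly_radioactive = (decay_heat_w_per_m3 > 50.0
                          or dose_rate_rem_per_h_at_1m > 100.0)
    if exceeds_class_c_limits and highly_radioactive:
        return "High-level waste (HLW)"
    if exceeds_class_c_limits:
        return "Transuranic (TRU) waste and equivalent"
    return "Low-level waste (LLW)"

print(classify_waste(True, 120.0, 30.0))    # High-level waste (HLW)
print(classify_waste(True, 5.0, 0.2))       # Transuranic (TRU) waste and equivalent
print(classify_waste(False, 200.0, 500.0))  # Low-level waste (LLW)
```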

  12. High-throughput cell-based screening reveals a role for ZNF131 as a repressor of ERalpha signaling

    Directory of Open Access Journals (Sweden)

    Du Peige

    2008-10-01

    Full Text Available Abstract Background Estrogen receptor α (ERα) is a transcription factor whose activity is affected by multiple regulatory cofactors. In an effort to identify the human genes involved in the regulation of ERα, we constructed a high-throughput, cell-based, functional screening platform by linking a response element (ERE) with a reporter gene. This allowed the cellular activity of ERα, in cells cotransfected with the candidate gene, to be quantified in the presence or absence of its cognate ligand E2. Results From a library of 570 human cDNA clones, we identified zinc finger protein 131 (ZNF131) as a repressor of ERα-mediated transactivation. ZNF131 is a typical member of the BTB/POZ family of transcription factors, and shows both ubiquitous expression and a high degree of sequence conservation. The luciferase reporter gene assay revealed that ZNF131 inhibits ligand-dependent transactivation by ERα in a dose-dependent manner. Electrophoretic mobility shift assay clearly demonstrated that the interaction between ZNF131 and ERα interrupts or prevents ERα binding to the estrogen response element (ERE). In addition, ZNF131 was able to suppress the expression of pS2, an ERα target gene. Conclusion We suggest that the functional screening platform we constructed can be applied to high-throughput genomic screening of candidate ERα-related genes. This in turn may provide new insights into the underlying molecular mechanisms of ERα regulation in mammalian cells.

  13. Meta-Analysis of High-Throughput Datasets Reveals Cellular Responses Following Hemorrhagic Fever Virus Infection

    Directory of Open Access Journals (Sweden)

    Gavin C. Bowick

    2011-05-01

    Full Text Available The continuing use of high-throughput assays to investigate cellular responses to infection is providing a large repository of information. Due to the large number of differentially expressed transcripts, often running into the thousands, the majority of these data have not been thoroughly investigated. Advances in techniques for the downstream analysis of high-throughput datasets are providing new methods for generating hypotheses for further investigation. The large number of experimental observations, combined with databases that correlate particular genes and proteins with canonical pathways, functions and diseases, allows for the bioinformatic exploration of functional networks that may be implicated in replication or pathogenesis. Herein, we provide an example of how analysis of published high-throughput datasets of cellular responses to hemorrhagic fever virus infection can generate additional functional data. We describe enrichment of genes involved in metabolism, post-translational modification and cardiac damage; potential roles for specific transcription factors and a conserved involvement of a pathway based around cyclooxygenase-2. We believe that these types of analyses can provide virologists with additional hypotheses for continued investigation.

  14. New approach for high-throughput screening of drug activity on Plasmodium liver stages.

    NARCIS (Netherlands)

    Gego, A.; Silvie, O.; Franetich, J.F.; Farhati, K.; Hannoun, L.; Luty, A.J.F.; Sauerwein, R.W.; Boucheix, C.; Rubinstein, E.; Mazier, D.

    2006-01-01

    Plasmodium liver stages represent potential targets for antimalarial prophylactic drugs. Nevertheless, there is a lack of molecules active on these stages. We have now developed a new approach for the high-throughput screening of drug activity on Plasmodium liver stages in vitro, based on an

  15. Design of a High-Throughput Biological Crystallography Beamline for Superconducting Wiggler

    International Nuclear Information System (INIS)

    Tseng, P.C.; Chang, C.H.; Fung, H.S.; Ma, C.I.; Huang, L.J.; Jean, Y.C.; Song, Y.F.; Huang, Y.S.; Tsang, K.L.; Chen, C.T.

    2004-01-01

    We are constructing a high-throughput biological crystallography beamline BL13B, which utilizes the radiation generated from a 3.2 Tesla, 32-pole superconducting multipole wiggler, for multi-wavelength anomalous diffraction (MAD), single-wavelength anomalous diffraction (SAD), and other related experiments. This beamline is a standard double crystal monochromator (DCM) x-ray beamline equipped with a collimating mirror (CM) and a focusing mirror (FM). Both the CM and FM are one meter long and made of Si substrate, and the CM is side-cooled by water. Based on detailed thermal analysis, liquid nitrogen (LN2) cooling for both crystals of the DCM has been adopted to optimize the energy resolution and photon beam throughput. This beamline will deliver, through a 100 μm diameter pinhole, photon flux of greater than 10¹¹ photons/sec in the energy range from 6.5 keV to 19 keV, which is comparable to existing protein crystallography beamlines from bending magnet source at high energy storage rings

  16. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchanne...

  17. The development of a high-throughput measurement method of octanol/water distribution coefficient based on hollow fiber membrane solvent microextraction technique.

    Science.gov (United States)

    Bao, James J; Liu, Xiaojing; Zhang, Yong; Li, Youxin

    2014-09-15

    This paper describes the development of a novel high-throughput hollow fiber membrane solvent microextraction technique for the simultaneous measurement of the octanol/water distribution coefficient (logD) of organic compounds such as drugs. The method is based on a purpose-built system consisting of a 96-well plate modified with 96 hollow fiber membrane tubes and a matching lid with 96 center holes and 96 side holes distributed over a 96-position grid. Each center hole is glued to a hollow fiber membrane tube sealed at one end, which separates the aqueous phase from the octanol phase. A needle, such as a microsyringe or automatic sampler, can be inserted directly into the membrane tube to deposit octanol as the acceptor phase or to withdraw the mixture of octanol and drug. Each side hole is filled with aqueous solution and can freely exchange solvent with the donor phase outside the hollow fiber membranes. The logD is calculated from the drug concentration measured in each phase after extraction equilibrium. After a comprehensive comparison, a polytetrafluoroethylene hollow fiber with a thickness of 210 μm, an extraction time of 300 min, a temperature of 25 °C, and atmospheric pressure without stirring were selected for the high-throughput measurement. The correlation coefficient of the linear fit between the logD values of five drugs determined by this system and reference values was 0.9954, indicating good accuracy. The -8.9% intra-day and -4.4% inter-day precision of logD for metronidazole indicates good precision. In addition, the logD values of eight drugs were measured simultaneously and successfully, indicating that this 96-well high-throughput method for measuring logD is accurate, precise, reliable, and useful for high-throughput screening. Copyright © 2014 Elsevier B.V. All rights reserved.
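
    The logD calculation itself is a one-line ratio of equilibrium concentrations. The sketch below shows it, plus a mass-balance variant for the case where only the aqueous (donor) phase is quantified; the latter is a common shortcut and not necessarily the exact workflow used in this paper, and all names are illustrative.

```python
import math

def log_d(c_octanol, c_aqueous):
    """logD = log10([drug]_octanol / [drug]_aqueous) at extraction equilibrium."""
    return math.log10(c_octanol / c_aqueous)

def log_d_by_mass_balance(c_aq_initial, c_aq_final, v_aqueous, v_octanol):
    """Variant when only the aqueous (donor) phase is measured: the amount lost
    from the donor phase is attributed to the octanol acceptor phase."""
    c_octanol = (c_aq_initial - c_aq_final) * v_aqueous / v_octanol
    return math.log10(c_octanol / c_aq_final)

print(round(log_d(2.4e-4, 3.0e-5), 2))                             # ~0.90
print(round(log_d_by_mass_balance(1.0e-4, 2.0e-5, 0.5, 0.01), 2))  # ~2.30
```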

  18. OptoDyCE: Automated system for high-throughput all-optical dynamic cardiac electrophysiology

    Science.gov (United States)

    Klimas, Aleksandra; Yu, Jinzhu; Ambrosi, Christina M.; Williams, John C.; Bien, Harold; Entcheva, Emilia

    2016-02-01

    In the last two decades, numerous drug withdrawals from the market were due to cardiac toxicity, where unintended interactions with ion channels disrupt the heart's normal electrical function. Consequently, all new drugs must undergo preclinical testing for cardiac liability, adding to an already expensive and lengthy process. Recognition that proarrhythmic effects often result from drug action on multiple ion channels demonstrates a need for integrative and comprehensive measurements. Additionally, patient-specific therapies relying on emerging technologies employing stem-cell derived cardiomyocytes (e.g. induced pluripotent stem-cell-derived cardiomyocytes, iPSC-CMs) require better screening methods to become practical. However, a high-throughput, cost-effective approach for cellular cardiac electrophysiology has not been feasible. Optical techniques for manipulation and recording provide a contactless means of dynamic, high-throughput testing of cells and tissues. Here, we consider the requirements for all-optical electrophysiology for drug testing, and we implement and validate OptoDyCE, a fully automated system for all-optical cardiac electrophysiology. We demonstrate the high-throughput capabilities using multicellular samples in 96-well format by combining optogenetic actuation with simultaneous fast high-resolution optical sensing of voltage or intracellular calcium. The system can also be implemented using iPSC-CMs and other cell-types by delivery of optogenetic drivers, or through the modular use of dedicated light-sensitive somatic cells in conjunction with non-modified cells. OptoDyCE provides a truly modular and dynamic screening system, capable of fully-automated acquisition of high-content information integral for improved discovery and development of new drugs and biologics, as well as providing a means of better understanding of electrical disturbances in the heart.

  19. A high throughput live transparent animal bioassay to identify non-toxic small molecules or genes that regulate vertebrate fat metabolism for obesity drug development

    Directory of Open Access Journals (Sweden)

    Woollett Laura A

    2008-08-01

    Full Text Available Abstract Background The alarming rise in the obesity epidemic and growing concern for the pathologic consequences of the metabolic syndrome warrant great need for development of obesity-related pharmacotherapeutics. The search for such therapeutics is severely limited by the slow throughput of animal models of obesity. Amenable to placement into a 96-well plate, zebrafish larvae have emerged as one of the highest throughput vertebrate model organisms for performing small molecule screens. A method for visually identifying non-toxic molecular effectors of fat metabolism using a live transparent vertebrate was developed. Given that increased levels of nicotinamide adenine dinucleotide (NAD) via deletion of CD38 have been shown to prevent high fat diet induced obesity in mice in a SIRT-1 dependent fashion, we explored the possibility of directly applying NAD to zebrafish. Methods Zebrafish larvae were incubated with daily refreshing of nile red containing media starting from a developmental stage of equivalent fat content among siblings (3 days post-fertilization, dpf) and continuing with daily refreshing until 7 dpf. Results PPAR activators, beta-adrenergic agonists, SIRT-1 activators, and nicotinic acid treatment all caused predicted changes in fat, cholesterol, and gene expression consistent with a high degree of evolutionary conservation of fat metabolism signal transduction extending from man to zebrafish larvae. All changes in fat content were visually quantifiable in a relative fashion using live zebrafish larvae nile red fluorescence microscopy. Resveratrol treatment caused the greatest and most consistent loss of fat content. The resveratrol tetramer Vaticanol B caused loss of fat equivalent in potency to resveratrol alone. Significantly, the direct administration of NAD decreased fat content in zebrafish. Results from knockdown of a zebrafish GPCR ortholog previously determined to decrease fat content in C. elegans support that future GPR

  20. The Belgian approach and status on the radiological surveillance of radioactive substances in metal scrap and non-radioactive waste and the financing of orphan sources

    International Nuclear Information System (INIS)

    Braeckeveldt, Marnix; Preter, Peter De; Michiels, Jan; Pepin, Stephane; Schrauben, Manfred; Wertelaers, An

    2007-01-01

    Numerous facilities in the non-nuclear sector in Belgium (e.g. in the non-radioactive waste processing and management sector and in the metal recycling sector) have been equipped with measuring ports for detecting radioactive substances. These measuring ports prevent radioactive sources or radioactive contamination from ending up in the material fluxes treated by the sectors concerned. They thus play an important part in the protection of the workers and the people living in the neighbourhood of the facilities, as well as in the protection of the population and the environment in general. In 2006, Belgium's federal nuclear control agency (FANC/AFCN) drew up guidelines for the operators of non-nuclear facilities with a measuring port for detecting radioactive substances. These guidelines describe the steps to be followed by the operators when the port's alarm goes off. Following the publication of the European guideline 2003/122/EURATOM of 22 December 2003 on the control of high-activity sealed radioactive sources and orphan sources, a procedure has been drawn up by FANC/AFCN and ONDRAF/NIRAS, the Belgian National Agency for Radioactive Waste and Enriched Fissile Materials, to identify the party responsible for covering the costs relating to the further management of detected sealed sources and, if no such party is found, to declare the sealed source an orphan source. In this latter case, from mid-2006 the insolvency fund managed by ONDRAF/NIRAS covers the cost of radioactive waste management. At the request of the Belgian government, a financing proposal for the management of unsealed orphan sources as radioactive waste was also established by FANC/AFCN and ONDRAF/NIRAS. This proposal applies the same approach as for sealed sources, and thus the financing of unsealed orphan sources will also be covered by the insolvency fund. (authors)

  1. A High-throughput Selection for Cellulase Catalysts Using Chemical Complementation

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T.; Lin, Hening; Tao, Haiyan; Cornish, Virginia W.

    2010-01-01

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases however is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Due to the large number of enzyme variants selections can test compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity. PMID:19053460

  2. EZH2 and CD79B mutational status over time in B-cell non-Hodgkin lymphomas detected by high-throughput sequencing using minimal samples

    Science.gov (United States)

    Saieg, Mauro Ajaj; Geddie, William R; Boerner, Scott L; Bailey, Denis; Crump, Michael; da Cunha Santos, Gilda

    2013-01-01

    BACKGROUND: Numerous genomic abnormalities in B-cell non-Hodgkin lymphomas (NHLs) have been revealed by novel high-throughput technologies, including recurrent mutations in EZH2 (enhancer of zeste homolog 2) and CD79B (B cell antigen receptor complex-associated protein beta chain) genes. This study sought to determine the evolution of the mutational status of EZH2 and CD79B over time in different samples from the same patient in a cohort of B-cell NHLs, through use of a customized multiplex mutation assay. METHODS: DNA that was extracted from cytological material stored on FTA cards as well as from additional specimens, including archived frozen and formalin-fixed histological specimens, archived stained smears, and cytospin preparations, were submitted to a multiplex mutation assay specifically designed for the detection of point mutations involving EZH2 and CD79B, using MassARRAY spectrometry followed by Sanger sequencing. RESULTS: All 121 samples from 80 B-cell NHL cases were successfully analyzed. Mutations in EZH2 (Y646) and CD79B (Y196) were detected in 13.2% and 8% of the samples, respectively, almost exclusively in follicular lymphomas and diffuse large B-cell lymphomas. In one-third of the positive cases, a wild type was detected in a different sample from the same patient during follow-up. CONCLUSIONS: Testing multiple minimal tissue samples using a high-throughput multiplex platform exponentially increases tissue availability for molecular analysis and might facilitate future studies of tumor progression and the related molecular events. Mutational status of EZH2 and CD79B may vary in B-cell NHL samples over time and support the concept that individualized therapy should be based on molecular findings at the time of treatment, rather than on results obtained from previous specimens. Cancer (Cancer Cytopathol) 2013;121:377–386. © 2013 American Cancer Society. PMID:23361872

  3. A Method of High Throughput Monitoring Crop Physiology Using Chlorophyll Fluorescence and Multispectral Imaging.

    Science.gov (United States)

    Wang, Heng; Qian, Xiangjie; Zhang, Lan; Xu, Sailong; Li, Haifeng; Xia, Xiaojian; Dai, Liankui; Xu, Liang; Yu, Jingquan; Liu, Xu

    2018-01-01

    We present a high-throughput crop physiology condition monitoring system and a corresponding monitoring method. The monitoring system can perform large-area chlorophyll fluorescence imaging and multispectral imaging. The monitoring method can determine the current condition of the crop continuously and non-destructively. We chose chlorophyll fluorescence parameters and relative multispectral reflectance as indicators of crop physiological status. Using tomato as the experimental subject, typical crop physiological stresses, such as drought, nutrient deficiency, and plant disease, can be distinguished by the monitoring method. Furthermore, we studied the correlation between the physiological indicators and the degree of stress. Besides realizing continuous monitoring of crop physiology, the monitoring system and method open the possibility of automated machine diagnosis of plant physiology. Highlights: A newly designed high-throughput crop physiology monitoring system and the corresponding monitoring method are described in this study. Different types of stress induce distinct fluorescence and spectral characteristics, which can be used to evaluate the physiological status of plants.
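
    The record does not name the specific fluorescence or spectral indicators used; purely for illustration, the sketch below computes two common ones (dark-adapted Fv/Fm from chlorophyll fluorescence and NDVI from two multispectral bands) that could serve as such indicators.

```python
def fv_fm(f0, fm):
    """Maximum quantum yield of PSII, a standard chlorophyll fluorescence
    parameter: Fv/Fm = (Fm - F0) / Fm. Values well below ~0.8 in dark-adapted
    leaves usually indicate stress."""
    return (fm - f0) / fm

def ndvi(nir_reflectance, red_reflectance):
    """Normalized difference vegetation index from two multispectral bands."""
    return (nir_reflectance - red_reflectance) / (nir_reflectance + red_reflectance)

print(round(fv_fm(450, 2100), 2))   # 0.79 -> likely unstressed
print(round(ndvi(0.45, 0.08), 2))   # 0.70 -> dense green canopy
```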

  4. High-throughput fabrication of anti-counterfeiting colloid-based photoluminescent microtags using electrical nanoimprint lithography

    International Nuclear Information System (INIS)

    Diaz, R; Palleau, E; Poirot, D; Sangeetha, N M; Ressier, L

    2014-01-01

    This work demonstrates the excellent capability of the recently developed electrical nanoimprint lithography (e-NIL) technique for quick, high-throughput production of well-defined colloid assemblies on surfaces. This is shown by fabricating micron-sized photoluminescent quick response (QR) codes based on the electrostatic directed trapping (so called nanoxerography process) of 28 nm colloidal lanthanide-doped upconverting NaYF₄ nanocrystals. Influencing experimental parameters have been optimized and the contribution of triboelectrification in e-NIL was evidenced. Under the chosen conditions, more than 300 000 nanocrystal-based QR codes were fabricated on a 4 inch silicon wafer, in less than 15 min. These microtags were then transferred to transparent flexible films, to be easily integrated onto desired products. Invisible to the naked eye, they can be decoded and authenticated using an optical microscopy image of their specific photoluminescence mapping. Beyond this very promising application for product tracking and the anti-counterfeiting strategies, e-NIL nanoxerography, potentially applicable to any types of charged and/or polarizable colloids and pattern geometries opens up tremendous opportunities for industrial scale production of various other kinds of colloid-based devices and sensors. (paper)

  5. Development of Microfluidic Systems Enabling High-Throughput Single-Cell Protein Characterization

    OpenAIRE

    Fan, Beiyuan; Li, Xiufeng; Chen, Deyong; Peng, Hongshang; Wang, Junbo; Chen, Jian

    2016-01-01

    This article reviews recent developments in microfluidic systems enabling high-throughput characterization of single-cell proteins. Four key perspectives of microfluidic platforms are included in this review: (1) microfluidic fluorescent flow cytometry; (2) droplet based microfluidic flow cytometry; (3) large-array micro wells (microengraving); and (4) large-array micro chambers (barcode microchips). We examine the advantages and limitations of each technique and discuss future research oppor...

  6. Cementification for radioactive waste including high-concentration sodium sulfate and high-concentration radioactive nuclide

    International Nuclear Information System (INIS)

    Miyamoto, Shinya; Sato, Tatsuaki; Sasoh, Michitaka; Sakurai, Jiro; Takada, Takao

    2005-01-01

    For the cementification of radioactive waste containing high concentrations of sodium sulfate and radioactive nuclides, a method was studied for fixing sulfate ions, controlling the pH of water in contact with the cement solid, and removing excess water from the cement matrix to prevent hydrogen gas generation by radiolysis. It was confirmed that the sulfate ion concentration in water in contact with the cement solid is decreased by the formation of ettringite or barium sulfate before solidification, that the pH of the pore water in the cement solid can be controlled to less than 12.5 by applying zeolite and a low-alkali cement such as alumina cement or fly-ash-mixed cement, and that removal of excess water from the cement matrix by heating is possible with aggregate addition. Consequently, radioactive waste containing high concentrations of sodium sulfate and radioactive nuclides can be solidified with cementitious materials. (author)

  7. High-throughput screening to identify inhibitors of lysine demethylases.

    Science.gov (United States)

    Gale, Molly; Yan, Qin

    2015-01-01

    Lysine demethylases (KDMs) are epigenetic regulators whose dysfunction is implicated in the pathology of many human diseases including various types of cancer, inflammation and X-linked intellectual disability. Particular demethylases have been identified as promising therapeutic targets, and tremendous efforts are being devoted toward developing suitable small-molecule inhibitors for clinical and research use. Several high-throughput screening strategies have been developed to screen for small-molecule inhibitors of KDMs, each with advantages and disadvantages in terms of time, cost, effort, reliability and sensitivity. In this Special Report, we review and evaluate the high-throughput screening methods utilized for discovery of novel small-molecule KDM inhibitors.

  8. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic sequences or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high throughput analysis of protein expression levels, interactions, and functional states.

  9. Determining the optimal size of small molecule mixtures for high throughput NMR screening

    International Nuclear Information System (INIS)

    Mercier, Kelly A.; Powers, Robert

    2005-01-01

    High-throughput screening (HTS) using NMR spectroscopy has become a common component of the drug discovery effort and is widely used throughout the pharmaceutical industry. NMR provides additional information about the nature of small molecule-protein interactions compared to traditional HTS methods. In order to achieve comparable efficiency, small molecules are often screened as mixtures in NMR-based assays. Nevertheless, an analysis of the efficiency of mixtures and a corresponding determination of the optimum mixture size (OMS) that minimizes the amount of material and instrumentation time required for an NMR screen has been lacking. A model for calculating OMS based on the application of the hypergeometric distribution function to determine the probability of a 'hit' for various mixture sizes and hit rates is presented. An alternative method for the deconvolution of large screening mixtures is also discussed. These methods have been applied in a high-throughput NMR screening assay using a small, directed library
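
    A minimal sketch of the hypergeometric reasoning described here: the probability that a mixture of a given size contains at least one true binder, and a rough expected-experiment count under one naive deconvolution strategy (re-testing every compound of a hit mixture individually). The cost model is an assumption for illustration only and is not the paper's OMS derivation.

```python
from scipy.stats import hypergeom

def p_mixture_contains_hit(library_size, hit_rate, mixture_size):
    """Probability that a mixture holds at least one true binder, modelling
    binders as draws without replacement (hypergeometric distribution)."""
    n_hits = round(hit_rate * library_size)
    return 1.0 - hypergeom(M=library_size, n=n_hits, N=mixture_size).pmf(0)

def expected_experiments(library_size, hit_rate, mixture_size):
    """First-pass mixtures plus naive one-compound-at-a-time deconvolution of
    every mixture that scores as a hit (an illustrative cost model only)."""
    n_mixtures = -(-library_size // mixture_size)   # ceiling division
    p_hit = p_mixture_contains_hit(library_size, hit_rate, mixture_size)
    return n_mixtures * (1 + p_hit * mixture_size)

# Sweep mixture sizes to see the trade-off between fewer first-pass samples
# and more frequent deconvolution as mixtures get larger:
for size in (1, 2, 4, 8, 16, 32):
    print(size, round(expected_experiments(10_000, 0.01, size)))
```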

  10. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification....... Monoclonal antibodies were raised to different targets in single batch runs of 6-10 wk using multiplexed immunisations, automated fusion and cell-culture, and a novel antigen-coated microarray-screening assay. In a large-scale experiment, where eight mice were immunized with ten antigens each, we generated...

  11. Handbook of high-level radioactive waste transportation

    International Nuclear Information System (INIS)

    Sattler, L.R.

    1992-10-01

    The High-Level Radioactive Waste Transportation Handbook serves as a reference to which state officials and members of the general public may turn for information on radioactive waste transportation and on the federal government's system for transporting this waste under the Civilian Radioactive Waste Management Program. The Handbook condenses and updates information contained in the Midwestern High-Level Radioactive Waste Transportation Primer. It is intended primarily to assist legislators who, in the future, may be called upon to enact legislation pertaining to the transportation of radioactive waste through their jurisdictions. The Handbook is divided into two sections. The first section places the federal government's program for transporting radioactive waste in context. It provides background information on nuclear waste production in the United States and traces the emergence of federal policy for disposing of radioactive waste. The second section covers the history of radioactive waste transportation; summarizes major pieces of legislation pertaining to the transportation of radioactive waste; and provides an overview of the radioactive waste transportation program developed by the US Department of Energy (DOE). To supplement this information, a summary of pertinent federal and state legislation and a glossary of terms are included as appendices, as is a list of publications produced by the Midwestern Office of The Council of State Governments (CSG-MW) as part of the Midwestern High-Level Radioactive Waste Transportation Project

  12. High throughput electrospinning of high-quality nanofibers via an aluminum disk spinneret

    Science.gov (United States)

    Zheng, Guokuo

    In this work, a simple and efficient needleless high-throughput electrospinning process using an aluminum disk spinneret with 24 holes is described. Electrospun mats produced by this setup consisted of high-quality, nano-sized fibers, while the productivity (yield) was many times that obtained from conventional single-needle electrospinning. The goal was to produce scaled-up amounts of nanofibers of the same or better quality than those produced with a single-needle laboratory setup, while varying solution concentration, voltage, and working distance. The fiber mats produced were either polymer or ceramic (such as molybdenum trioxide nanofibers). Through experimentation, the optimum process conditions were determined to be 24 kV and a distance to the collector of 15 cm. More dilute solutions resulted in smaller-diameter fibers. Comparing the morphologies of the MoO3 nanofibers produced by the traditional and high-throughput setups showed that they were very similar. Moreover, the nanofiber production rate is nearly 10 times that of traditional needle electrospinning. Thus, the high-throughput process has the potential to become an industrial nanomanufacturing process, and the materials produced by it may be used in filtration devices, in tissue engineering, and as sensors.

  13. High level radioactive wastes: Considerations on final disposal

    International Nuclear Information System (INIS)

    Ciallella, Norberto R.

    2000-01-01

    When, at the beginning of the 1980s, the National Commission on Atomic Energy (CNEA) in Argentina decided to study the fate of high-level radioactive wastes, it began many investigations, analyses, and multidisciplinary evaluations that gave rise to a study of a kind never before carried out in Argentina. For the first time, the country faced the study of a potential environmental problem several decades before that problem would arise. The technological aspects of the disposal of high-level radioactive wastes were addressed in advance, avoiding the transfer of the problem to future generations. The decision was based not only on technical evaluations but also on ethical premises, since it was considered that future generations should be able to enjoy the benefits of nuclear energy without having to solve this problem. In 1980, the CNEA decided to begin a feasibility study and preliminary engineering project for the construction of a final disposal facility for high-level radioactive wastes.

  14. Evaluation of radionuclide concentrations in high-level radioactive wastes

    International Nuclear Information System (INIS)

    Fehringer, D.J.

    1985-10-01

    This report describes a possible approach for development of a numerical definition of the term "high-level radioactive waste." Five wastes are identified which are recognized as being high-level wastes under current, non-numerical definitions. The constituents of these wastes are examined and the most hazardous component radionuclides are identified. This report suggests that other wastes with similar concentrations of these radionuclides could also be defined as high-level wastes. 15 refs., 9 figs., 4 tabs

  15. A high-throughput assay of NK cell activity in whole blood and its clinical application

    International Nuclear Information System (INIS)

    Lee, Saet-byul; Cha, Junhoe; Kim, Im-kyung; Yoon, Joo Chun; Lee, Hyo Joon; Park, Sang Woo; Cho, Sunjung; Youn, Dong-Ye; Lee, Heyja; Lee, Choong Hwan; Lee, Jae Myun; Lee, Kang Young; Kim, Jongsun

    2014-01-01

    Graphical abstract: - Highlights: • We demonstrated a simple assay of NK cell activity from whole blood. • The measurement of secreted IFN-γ from NK cells enables high-throughput screening. • The NKA assay was validated by clinical results of colorectal cancer patients. - Abstract: Natural killer (NK) cells are lymphocytes of the innate immune system and have the ability to kill tumor cells and virus-infected cells without prior sensitization. Malignant tumors and viruses have developed, however, strategies to suppress NK cells to escape from their responses. Thus, the evaluation of NK cell activity (NKA) could be invaluable to estimate the status and the outcome of cancers, viral infections, and immune-mediated diseases. Established methods that measure NKA, such as the ⁵¹Cr release assay and CD107a degranulation assay, may be used to determine NK cell function, but they are complicated and time-consuming because they require isolation of peripheral blood mononuclear cells (PBMC) or NK cells. In some cases these assays require hazardous material such as radioactive isotopes. To overcome these difficulties, we developed a simple assay that uses whole blood instead of PBMC or isolated NK cells. This novel assay is suitable for high-throughput screening and the monitoring of diseases, because it employs serum of ex vivo stimulated whole blood to detect interferon (IFN)-γ secreted from NK cells as an indicator of NKA. After the stimulation of NK cells, the determination of IFN-γ concentration in serum samples by enzyme-linked immunosorbent assay (ELISA) provided a swift, uncomplicated, and high-throughput assay of NKA ex vivo. The NKA results showed that microsatellite stable (MSS) colorectal cancer patients had significantly lower NKA, 263.6 ± 54.5 pg/mL, compared with healthy subjects, 867.5 ± 50.2 pg/mL (p value <0.0001). Therefore, the NKA could be utilized as a supportive diagnostic marker for microsatellite stable (MSS) colorectal cancer.

  16. A high-throughput assay of NK cell activity in whole blood and its clinical application

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Saet-byul [Department of Microbiology and Brain Korea 21 Project for Medical Sciences, Yonsei University College of Medicine, Seoul (Korea, Republic of); Cha, Junhoe [ATGen Co. Ltd., Sungnam (Korea, Republic of); Kim, Im-kyung [Department of Surgery, Gangnam Severance Hospital, Yonsei University College of Medicine, Seoul (Korea, Republic of); Yoon, Joo Chun [Department of Microbiology, Ewha Womans University School of Medicine, Seoul (Korea, Republic of); Lee, Hyo Joon [Department of Microbiology and Brain Korea 21 Project for Medical Sciences, Yonsei University College of Medicine, Seoul (Korea, Republic of); Park, Sang Woo; Cho, Sunjung; Youn, Dong-Ye; Lee, Heyja; Lee, Choong Hwan [ATGen Co. Ltd., Sungnam (Korea, Republic of); Lee, Jae Myun [Department of Microbiology and Brain Korea 21 Project for Medical Sciences, Yonsei University College of Medicine, Seoul (Korea, Republic of); Lee, Kang Young, E-mail: kylee117@yuhs.ac [Department of Surgery, Gangnam Severance Hospital, Yonsei University College of Medicine, Seoul (Korea, Republic of); Kim, Jongsun, E-mail: jkim63@yuhs.ac [Department of Microbiology and Brain Korea 21 Project for Medical Sciences, Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2014-03-14

    Graphical abstract: - Highlights: • We demonstrated a simple assay of NK cell activity from whole blood. • The measurement of secreted IFN-γ from NK cells enables high-throughput screening. • The NKA assay was validated by clinical results of colorectal cancer patients. - Abstract: Natural killer (NK) cells are lymphocytes of the innate immune system and have the ability to kill tumor cells and virus-infected cells without prior sensitization. Malignant tumors and viruses have developed, however, strategies to suppress NK cells to escape from their responses. Thus, the evaluation of NK cell activity (NKA) could be invaluable to estimate the status and the outcome of cancers, viral infections, and immune-mediated diseases. Established methods that measure NKA, such as the ⁵¹Cr release assay and CD107a degranulation assay, may be used to determine NK cell function, but they are complicated and time-consuming because they require isolation of peripheral blood mononuclear cells (PBMC) or NK cells. In some cases these assays require hazardous material such as radioactive isotopes. To overcome these difficulties, we developed a simple assay that uses whole blood instead of PBMC or isolated NK cells. This novel assay is suitable for high-throughput screening and the monitoring of diseases, because it employs serum of ex vivo stimulated whole blood to detect interferon (IFN)-γ secreted from NK cells as an indicator of NKA. After the stimulation of NK cells, the determination of IFN-γ concentration in serum samples by enzyme-linked immunosorbent assay (ELISA) provided a swift, uncomplicated, and high-throughput assay of NKA ex vivo. The NKA results showed that microsatellite stable (MSS) colorectal cancer patients had significantly lower NKA, 263.6 ± 54.5 pg/mL, compared with healthy subjects, 867.5 ± 50.2 pg/mL (p value <0.0001). Therefore, the NKA could be utilized as a supportive diagnostic marker for microsatellite stable (MSS) colorectal cancer.

  17. Macrocell Builder: IP-Block-Based Design Environment for High-Throughput VLSI Dedicated Digital Signal Processing Systems

    Directory of Open Access Journals (Sweden)

    Urard Pascal

    2006-01-01

    Full Text Available We propose an efficient IP-block-based design environment for high-throughput VLSI systems. The flow generates a SystemC register-transfer-level (RTL) architecture, starting from a Matlab functional model described as a netlist of functional IP. The refinement step automatically inserts control structures to manage delays induced by the use of RTL IPs. It also inserts a control structure to coordinate the execution of parallel clocked IP. The delays may be managed by registers or by counters included in the control structure. The flow has been used successfully in three real-world DSP systems. The experiments show that the approach can produce efficient RTL architectures and saves a considerable amount of time.

  18. Radioactive waste from non-licensed activities - identification of waste, compilation of principles and guidance, and proposed system for final management

    International Nuclear Information System (INIS)

    Jones, C.; Pers, K.

    2001-07-01

    Presently, national guidelines for the handling of radioactive waste from non-licensed activities are lacking in Sweden. Results and information presented in this report are intended to form a part of the basis for decisions on further work within the Swedish Radiation Protection Institute on regulations or other guidelines on final management and final disposal of this type of waste. An inventory of radioactive waste from non-licensed activities is presented in the report. In addition, existing rules and principles used in Sweden - and internationally - on the handling of radioactive and toxic waste and non-radioactive material are summarized. Based on these rules and principles, a system is suggested for the final management of radioactive material from non-licensed activities. A model is shown for the estimation of dose as a consequence of leaching of radionuclides from different deposits. The model is applied to different types of waste, e.g. peat ashes, light concrete, and low-level waste from a nuclear installation.

  19. Development of high-level radioactive waste treatment and conversion technologies 'Dry decontamination technology development for highly radioactive contaminants'

    International Nuclear Information System (INIS)

    Oh, Won Zin; Lee, K. W.; Won, H. J.; Jung, C. J.; Choi, W. K.; Kim, G. N.; Moon, J. K.

    2001-04-01

    The following topics were studied through the project entitled 'Dry Decontamination Technology Development for Highly Radioactive Contaminants':
    1. Contaminant characteristics analysis of domestic nuclear fuel cycle projects (NFCP) and applicability study of the unit dry-decontamination techniques: A. classification of contaminated equipment and characteristics analysis of contaminants; B. applicability study of the unit dry-decontamination techniques.
    2. Performance evaluation of unit dry decontamination techniques: A. PFC decontamination technique; B. CO2 decontamination technique; C. plasma decontamination technique.
    3. Development of a residual radiation assessment methodology for highly radioactive facility decontamination: A. development of a radionuclide diffusion model for highly radioactive facility structures; B. establishment of a procedure for assessment of residual radiation dose.
    4. Establishment of the design concept of dry decontamination process equipment applicable to highly radioactive contaminants.
    5. TRIGA soil unit decontamination technology development: A. development of soil washing and flushing technologies; B. development of electrokinetic soil decontamination technology.

  20. Solion ion source for high-efficiency, high-throughput solar cell manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Koo, John, E-mail: john-koo@amat.com; Binns, Brant; Miller, Timothy; Krause, Stephen; Skinner, Wesley; Mullin, James [Applied Materials, Inc., Varian Semiconductor Equipment Business Unit, 35 Dory Road, Gloucester, Massachusetts 01930 (United States)

    2014-02-15

    In this paper, we introduce the Solion ion source for high-throughput solar cell doping. As the source power is increased to enable higher throughput, negative effects degrade the lifetime of the plasma chamber and the extraction electrodes. In order to improve efficiency, we have explored a wide range of electron energies and determined the conditions which best suit production. To extend the lifetime of the source we have developed an in situ cleaning method using only existing hardware. With these combinations, source life-times of >200 h for phosphorous and >100 h for boron ion beams have been achieved while maintaining 1100 cell-per-hour production.

  1. High-Throughput Classification of Radiographs Using Deep Convolutional Neural Networks.

    Science.gov (United States)

    Rajkomar, Alvin; Lingam, Sneha; Taylor, Andrew G; Blum, Michael; Mongan, John

    2017-02-01

    The study aimed to determine if computer vision techniques rooted in deep learning can use a small set of radiographs to perform clinically relevant image classification with high fidelity. One thousand eight hundred eighty-five chest radiographs on 909 patients obtained between January 2013 and July 2015 at our institution were retrieved and anonymized. The source images were manually annotated as frontal or lateral and randomly divided into training, validation, and test sets. Training and validation sets were augmented to over 150,000 images using standard image manipulations. We then pre-trained a series of deep convolutional networks based on the open-source GoogLeNet with various transformations of the open-source ImageNet (non-radiology) images. These trained networks were then fine-tuned using the original and augmented radiology images. The model with the highest validation accuracy was applied to our institutional test set and a publicly available set. Accuracy was assessed by using the Youden Index to set a binary cutoff for frontal or lateral classification. This retrospective study was IRB approved prior to initiation. A network pre-trained on 1.2 million greyscale ImageNet images and fine-tuned on augmented radiographs was chosen. The binary classification method correctly classified 100 % (95 % CI 99.73-100 %) of both our test set and the publicly available images. Classification was rapid, at 38 images per second. A deep convolutional neural network created using non-radiological images and an augmented set of radiographs is effective in highly accurate classification of chest radiograph view type and is a feasible, rapid method for high-throughput annotation.
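
    A minimal PyTorch sketch of the general recipe described here (ImageNet-pretrained GoogLeNet, greyscale radiographs replicated to three channels, light augmentation, fine-tuning of a two-class frontal/lateral head). The preprocessing, hyper-parameters, and training loop are illustrative assumptions and do not reproduce the authors' exact pipeline.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# ImageNet-pretrained backbone with a new 2-class head (frontal vs. lateral).
model = models.googlenet(weights=models.GoogLeNet_Weights.IMAGENET1K_V1)
model.aux_logits = False                      # ignore auxiliary classifiers in this sketch
model.fc = nn.Linear(model.fc.in_features, 2)
model = model.to(device)

# Radiographs are greyscale; replicate to 3 channels and lightly augment.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.RandomAffine(degrees=5, translate=(0.05, 0.05)),
    transforms.ToTensor(),
])

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def fine_tune(loader, epochs=5):
    """loader yields (image_batch, label_batch) already passed through `preprocess`."""
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
```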

  2. Genecentric: a package to uncover graph-theoretic structure in high-throughput epistasis data.

    Science.gov (United States)

    Gallant, Andrew; Leiserson, Mark D M; Kachalov, Maxim; Cowen, Lenore J; Hescott, Benjamin J

    2013-01-18

    New technology has resulted in high-throughput screens for pairwise genetic interactions in yeast and other model organisms. For each pair in a collection of non-essential genes, an epistasis score is obtained, representing how much sicker (or healthier) the double-knockout organism will be compared to what would be expected from the sickness of the component single knockouts. Recent algorithmic work has identified graph-theoretic patterns in these data that can indicate functional modules, and even sets of genes that may occur in compensatory pathways, such as a BPM-type schema first introduced by Kelley and Ideker. However, to date, algorithms for finding such patterns in the data have been implemented only internally, with no software made publicly available. Genecentric is a new package that implements a parallelized version of the Leiserson et al. algorithm (J Comput Biol 18:1399-1409, 2011) for generating generalized BPMs from high-throughput genetic interaction data. Given a matrix of weighted epistasis values for a set of double knock-outs, Genecentric returns a list of generalized BPMs that may represent compensatory pathways. Genecentric also has an extension, GenecentricGO, to query FuncAssociate (Bioinformatics 25:3043-3044, 2009) to retrieve GO enrichment statistics on generated BPMs. Python is the only dependency, and our web site provides working examples and documentation. We find that Genecentric can be used to find coherent functional and perhaps compensatory gene sets from high throughput genetic interaction data. Genecentric is made freely available for download under the GPLv2 from http://bcb.cs.tufts.edu/genecentric.

  3. Comparison of Points of Departure for Health Risk Assessment Based on High-Throughput Screening Data

    Science.gov (United States)

    Sand, Salomon; Parham, Fred; Portier, Christopher J.; Tice, Raymond R.; Krewski, Daniel

    2016-01-01

    Background: The National Research Council’s vision for toxicity testing in the 21st century anticipates that points of departure (PODs) for establishing human exposure guidelines in future risk assessments will increasingly be based on in vitro high-throughput screening (HTS) data. Objectives: The aim of this study was to compare different PODs for HTS data. Specifically, benchmark doses (BMDs) were compared to the signal-to-noise crossover dose (SNCD), which has been suggested as the lowest dose applicable as a POD. Methods: Hill models were fit to > 10,000 in vitro concentration–response curves, obtained for > 1,400 chemicals tested as part of the U.S. Tox21 Phase I effort. BMDs and lower confidence limits on the BMDs (BMDLs) corresponding to extra effects (i.e., changes in response relative to the maximum response) of 5%, 10%, 20%, 30%, and 40% were estimated for > 8,000 curves, along with BMDs and BMDLs corresponding to additional effects (i.e., absolute changes in response) of 5%, 10%, 15%, 20%, and 25%. The SNCD, defined as the dose where the ratio between the additional effect and the difference between the upper and lower bounds of the two-sided 90% confidence interval on absolute effect was 1, 0.67, and 0.5, respectively, was also calculated and compared with the BMDLs. Results: The BMDL40, BMDL25, and BMDL18, defined in terms of extra effect, corresponded to the SNCD1.0, SNCD0.67, and SNCD0.5, respectively, at the median. Similarly, the BMDL25, BMDL17, and BMDL13, defined in terms of additional effect, corresponded to the SNCD1.0, SNCD0.67, and SNCD0.5, respectively, at the median. Conclusions: The SNCD may serve as a reference level that guides the determination of standardized BMDs for risk assessment based on HTS concentration–response data. The SNCD may also have application as a POD for low-dose extrapolation. Citation: Sand S, Parham F, Portier CJ, Tice RR, Krewski D. 2017. Comparison of points of departure for health risk assessment based on
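
    The BMD calculations in this record amount to inverting a fitted Hill model at a chosen effect size. The sketch below, with invented concentration-response data, shows one way this could be done; it is an illustration of the general idea, not the authors' implementation, and treats "extra effect" simply as a fraction of the fitted maximum response.

```python
# Illustrative sketch (not the study's code): fit a Hill model to one in vitro
# concentration-response curve and read off benchmark doses (BMDs) for chosen
# extra effects, expressed as fractions of the fitted maximum response.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, ac50, n):
    """Hill model with zero baseline: response as a function of concentration."""
    return top * conc**n / (ac50**n + conc**n)

# Hypothetical concentration-response data (µM, arbitrary response units)
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
resp = np.array([0.2, 0.8, 2.5, 7.0, 14.0, 19.0, 21.5, 22.0])

(top, ac50, n), _ = curve_fit(hill, conc, resp, p0=[22.0, 1.0, 1.0])

def bmd_extra_effect(q):
    """Dose at which the response reaches a fraction q of the fitted maximum (top)."""
    # Solve top*d**n/(ac50**n + d**n) = q*top  =>  d = ac50 * (q/(1-q))**(1/n)
    return ac50 * (q / (1.0 - q)) ** (1.0 / n)

for q in (0.05, 0.10, 0.20, 0.30, 0.40):
    print(f"BMD at {q:.0%} extra effect: {bmd_extra_effect(q):.3g} µM")
```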

  4. Bioprinting-Based High-Throughput Fabrication of Three-Dimensional MCF-7 Human Breast Cancer Cellular Spheroids

    Directory of Open Access Journals (Sweden)

    Kai Ling

    2015-06-01

    Full Text Available Cellular spheroids serving as three-dimensional (3D) in vitro tissue models have attracted increasing interest for pathological study and drug-screening applications. Various methods, including microwells in particular, have been developed for engineering cellular spheroids. However, these methods usually suffer from either destructive molding operations or cell loss and non-uniform cell distribution among the wells due to two-step molding and cell seeding. We have developed a facile method that utilizes cell-embedded hydrogel arrays as templates for concave well fabrication and in situ MCF-7 cellular spheroid formation on a chip. A custom-built bioprinting system was applied for the fabrication of sacrificial gelatin arrays and, subsequently, concave wells in a high-throughput, flexible, and controlled manner. The ability to achieve in situ cell seeding for cellular spheroid construction was demonstrated with the advantage of uniform cell seeding and the potential for programmed fabrication of tissue models on chips. The developed method holds great potential for applications in tissue engineering, regenerative medicine, and drug screening.

  5. Recovering method for high level radioactive material

    International Nuclear Information System (INIS)

    Fukui, Toshiki

    1998-01-01

    Offgas filters, such as those of nuclear fuel reprocessing facilities and waste control facilities, are burnt, the burnt ash is melted by heating, and the molten ash is then brought into contact with a molten metal having a low boiling point to transfer the high-level radioactive materials in the molten ash to the molten metal. Then, only the molten metal is evaporated and the residue is solidified by drying, and the residual high-level radioactive materials are recovered. According to this method, the high-level radioactive materials in the molten ash are transferred to the molten metal and separated owing to the difference in distribution ratio between the molten ash and the molten metal. Subsequently, the molten metal to which the high-level radioactive materials have been transferred is heated to a temperature above its boiling point so that only the molten metal is evaporated and removed by drying, and the residual high-level radioactive materials are recovered easily. On the other hand, the molten ash from which the high-level radioactive materials have been removed can be discarded as ordinary industrial waste. (T.M.)

  6. High-throughput spectrometer designs in a compact form-factor: principles and applications

    Science.gov (United States)

    Norton, S. M.

    2013-05-01

    Many compact, portable Raman spectrometers have entered the market in the past few years with applications in narcotics and hazardous material identification, as well as verification applications in pharmaceuticals and security screening. Often, the required compact form-factor has forced designers to sacrifice throughput and sensitivity for portability and low-cost. We will show that a volume phase holographic (VPH)-based spectrometer design can achieve superior throughput and thus sensitivity over conventional Czerny-Turner reflective designs. We will look in depth at the factors influencing throughput and sensitivity and illustrate specific VPH-based spectrometer examples that highlight these design principles.

  7. Filtering high-throughput protein-protein interaction data using a combination of genomic features

    Directory of Open Access Journals (Sweden)

    Patil Ashwini

    2005-04-01

    Full Text Available Abstract Background Protein-protein interaction data used in the creation or prediction of molecular networks is usually obtained from large scale or high-throughput experiments. This experimental data is liable to contain a large number of spurious interactions. Hence, there is a need to validate the interactions and filter out the incorrect data before using them in prediction studies. Results In this study, we use a combination of 3 genomic features – structurally known interacting Pfam domains, Gene Ontology annotations and sequence homology – as a means to assign reliability to the protein-protein interactions in Saccharomyces cerevisiae determined by high-throughput experiments. Using Bayesian network approaches, we show that protein-protein interactions from high-throughput data supported by one or more genomic features have a higher likelihood ratio and hence are more likely to be real interactions. Our method has a high sensitivity (90%) and good specificity (63%). We show that 56% of the interactions from high-throughput experiments in Saccharomyces cerevisiae have high reliability. We use the method to estimate the number of true interactions in the high-throughput protein-protein interaction data sets in Caenorhabditis elegans, Drosophila melanogaster and Homo sapiens to be 27%, 18% and 68% respectively. Our results are available for searching and downloading at http://helix.protein.osaka-u.ac.jp/htp/. Conclusion A combination of genomic features that include sequence, structure and annotation information is a good predictor of true interactions in large and noisy high-throughput data sets. The method has a very high sensitivity and good specificity and can be used to assign a likelihood ratio, corresponding to the reliability, to each interaction.
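
    The combination step described above can be pictured as a naive-Bayes product of per-feature likelihood ratios. The sketch below illustrates only that arithmetic; the feature probabilities are placeholders, not values estimated in the study, and the study's full Bayesian network is not reproduced.

```python
# Minimal naive-Bayes-style sketch: combine independent genomic features into a
# single likelihood ratio per candidate interaction. All probabilities are made up.
feature_likelihoods = {
    # feature: (P(feature present | true interaction), P(feature present | spurious))
    "interacting_pfam_domains": (0.60, 0.10),
    "shared_go_annotation":     (0.70, 0.20),
    "sequence_homolog_pair":    (0.30, 0.05),
}

def likelihood_ratio(observed_features):
    """Multiply per-feature likelihood ratios for the features supporting a pair."""
    lr = 1.0
    for name, (p_true, p_false) in feature_likelihoods.items():
        if name in observed_features:
            lr *= p_true / p_false
        else:
            lr *= (1.0 - p_true) / (1.0 - p_false)
    return lr

print(likelihood_ratio({"shared_go_annotation"}))
print(likelihood_ratio({"interacting_pfam_domains", "shared_go_annotation"}))
```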

  8. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are rather complex to design and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimal system need not be the one with the highest product titers, but rather the one resulting in overall superior process performance in up- and downstream. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  9. A ground-up approach to High Throughput Cloud Computing in High-Energy Physics

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00245123; Ganis, Gerardo; Bagnasco, Stefano

    The thesis explores various practical approaches to making existing High Throughput Computing applications, common in High Energy Physics, work on cloud-provided resources, as well as opening the possibility of running new applications. The work is divided into two parts: firstly, we describe the work done at the computing facility hosted by INFN Torino to entirely convert former Grid resources into cloud ones, eventually running Grid use cases on top along with many others in a more flexible way. Integration and conversion problems are duly described. The second part covers the development of solutions for automating the orchestration of cloud workers based on the load of a batch queue, and the development of HEP applications based on ROOT's PROOF that can adapt at runtime to a changing number of workers.

  10. Statistical removal of background signals from high-throughput 1H NMR line-broadening ligand-affinity screens

    International Nuclear Information System (INIS)

    Worley, Bradley; Sisco, Nicholas J.; Powers, Robert

    2015-01-01

    NMR ligand-affinity screens are vital to drug discovery, are routinely used to screen fragment-based libraries, and used to verify chemical leads from high-throughput assays and virtual screens. NMR ligand-affinity screens are also a highly informative first step towards identifying functional epitopes of unknown proteins, as well as elucidating the biochemical functions of protein–ligand interaction at their binding interfaces. While simple one-dimensional 1H NMR experiments are capable of indicating binding through a change in ligand line shape, they are plagued by broad, ill-defined background signals from protein 1H resonances. We present an uncomplicated method for subtraction of protein background in high-throughput ligand-based affinity screens, and show that its performance is maximized when phase-scatter correction is applied prior to subtraction.
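
    The core idea, scaling a protein-only reference spectrum to the observed mixture and subtracting it, can be illustrated with synthetic data as below. This is a generic sketch, not the published algorithm, and the phase-scatter correction step mentioned above is not shown.

```python
# Generic sketch of the background-subtraction idea (not the published method):
# scale a protein-only reference spectrum to the protein-plus-ligand spectrum by
# least squares over ligand-free regions, then subtract it to leave ligand signals.
import numpy as np

ppm = np.linspace(10, 0, 2048)
protein_ref = np.exp(-((ppm - 7.5) ** 2) / 2.0)               # broad protein envelope (synthetic)
ligand = 0.3 * np.exp(-((ppm - 2.1) ** 2) / 0.001)            # sharp ligand line (synthetic)
mixture = 0.9 * protein_ref + ligand + np.random.default_rng(1).normal(0, 0.01, ppm.size)

# Fit the scale factor only where the ligand does not resonate (here, above 5 ppm)
mask = ppm > 5.0
scale = np.dot(mixture[mask], protein_ref[mask]) / np.dot(protein_ref[mask], protein_ref[mask])
ligand_only = mixture - scale * protein_ref
print(f"fitted protein scale factor: {scale:.3f}")
```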

  11. A Primer on High-Throughput Computing for Genomic Selection

    Directory of Open Access Journals (Sweden)

    Xiao-Lin eWu

    2011-02-01

    Full Text Available High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit (GPU) provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin – Madison, which can be leveraged for genomic selection, in terms of central processing unit (CPU) capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of
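
    As a toy illustration of the batch-processing idea described above, the sketch below distributes independent per-trait genomic prediction jobs across worker processes; the trait names and the model function are placeholders, not part of the cited paper.

```python
# Toy HTC-style sketch: run independent genomic prediction jobs (one per trait)
# in parallel worker processes. Everything here is a stand-in for illustration.
from multiprocessing import Pool

def evaluate_trait(trait):
    """Stand-in for training a genomic prediction model for one trait."""
    # ... load marker data, fit the model, write predictions to disk ...
    return trait, f"predictions_{trait}.csv"

if __name__ == "__main__":
    traits = ["milk_yield", "fat_percent", "protein_percent", "fertility"]
    with Pool(processes=4) as pool:
        for trait, output in pool.map(evaluate_trait, traits):
            print(f"{trait}: wrote {output}")
```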

  12. Regulatory inspection practices for radioactive and non-radioactive waste management facilities

    International Nuclear Information System (INIS)

    Roy, Amitava

    2017-01-01

    Management of nuclear waste plays an important role in the nuclear energy programme of the country. India has adopted the Closed Fuel Cycle option, where the spent nuclear fuel is treated as a material of resource and the nuclear waste is wealth. The closed fuel cycle aims at recovery and recycling of valuable nuclear materials into reactors as fuel and also separation of useful radioisotopes for use in health care, agriculture and industry. India has taken a lead role in waste management activities and has reached a level of maturity over a period of more than four decades. Nuclear waste management primarily comprises waste characterization, segregation, conditioning, treatment, immobilization of radionuclides in stable and solid matrices and interim retrievable storage of conditioned solid waste under surveillance. The waste generated in a nuclear facility is in the form of liquid and solid, and its classification depends on the content of radioactivity. The liquid waste is characterized as Low Level (LLW), Intermediate Level (ILW) and High Level (HLW). The LLW is relatively large in volume and much less radioactive. The LLW is subjected to chemical precipitation using various chemicals based on the radionuclides present, followed by filtration, settling, ion exchange and cement fixation. The conditioning and treatment processes for ILW use ion exchange, alkali hydrolysis for spent solvent, phase separation and immobilization in a cement matrix. The High Level Waste (HLW), generated during spent fuel reprocessing and containing more than 99 percent of the total radioactivity, is first subjected to volume reduction/concentration by evaporation and then vitrified in a melter using borosilicate glass. Presently, a Joule Heated Ceramic Melter is used in India for the vitrification process. Vitrified waste products (VWP) are stored for an interim period in a multibarrier, air-cooled facility under surveillance.

  13. Label-free detection of cellular drug responses by high-throughput bright-field imaging and machine learning.

    Science.gov (United States)

    Kobayashi, Hirofumi; Lei, Cheng; Wu, Yi; Mao, Ailin; Jiang, Yiyue; Guo, Baoshan; Ozeki, Yasuyuki; Goda, Keisuke

    2017-09-29

    In the last decade, high-content screening based on multivariate single-cell imaging has been proven effective in drug discovery to evaluate drug-induced phenotypic variations. Unfortunately, this method inherently requires fluorescent labeling which has several drawbacks. Here we present a label-free method for evaluating cellular drug responses only by high-throughput bright-field imaging with the aid of machine learning algorithms. Specifically, we performed high-throughput bright-field imaging of numerous drug-treated and -untreated cells (N = ~240,000) by optofluidic time-stretch microscopy with high throughput up to 10,000 cells/s and applied machine learning to the cell images to identify their morphological variations which are too subtle for human eyes to detect. Consequently, we achieved a high accuracy of 92% in distinguishing drug-treated and -untreated cells without the need for labeling. Furthermore, we also demonstrated that dose-dependent, drug-induced morphological change from different experiments can be inferred from the classification accuracy of a single classification model. Our work lays the groundwork for label-free drug screening in pharmaceutical science and industry.
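
    Conceptually, the classification step reduces to training a standard classifier on per-cell morphological feature vectors. The sketch below uses synthetic feature vectors and logistic regression purely for illustration; the paper's feature extraction from bright-field images and its actual model are not reproduced here.

```python
# Conceptual sketch only: classify drug-treated vs. untreated cells from
# morphological feature vectors. The features below are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_cells, n_features = 2000, 32
X_untreated = rng.normal(0.0, 1.0, (n_cells, n_features))
X_treated = rng.normal(0.15, 1.0, (n_cells, n_features))     # subtle morphological shift
X = np.vstack([X_untreated, X_treated])
y = np.array([0] * n_cells + [1] * n_cells)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```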

  14. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing

  15. Throughput, latency and cost comparisons of microcontroller-based implementations of wireless sensor network (WSN) in high jump sports

    Science.gov (United States)

    Ahmad, Afandi; Roslan, Muhammad Faris; Amira, Abbes

    2017-09-01

    In high jump sports, the approach take-off speed and the force during the take-off are the two (2) most important parameters for achieving a maximum jump. To measure both parameters, a wireless sensor network (WSN) containing a microcontroller and sensors is needed to report the speed and force results for jumpers. Most microcontrollers exhibit transmission issues in terms of throughput, latency and cost. Thus, this study presents a comparison of wireless microcontrollers in terms of throughput, latency and cost, and the microcontroller with the best performance and cost is implemented in the high jump wearable device. In the experiments, three (3) parts were integrated - input, processing and output. A force sensor (at the ankle) and a global positioning system (GPS) sensor (at the body waist) act as inputs for data transmission. These data were then processed by both microcontrollers, the ESP8266 and the Arduino Yun Mini, which transmit the data from the sensors to the server (host PC) via the message queuing telemetry transport (MQTT) protocol. The server acts as the receiver, and the results were calculated from the MQTT log files. In the end, the ESP8266 microcontroller was chosen since it achieved higher throughput, lower latency and was 11 times cheaper than the Arduino Yun Mini microcontroller.
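
    A rough sketch of the measurement setup described above: a sensor node publishes timestamped readings over MQTT so the host PC can compute latency and throughput from its logs. The broker address, topic and payload are placeholders, and the snippet assumes the paho-mqtt 1.x client API.

```python
# Hypothetical sketch of the measurement idea: publish timestamped sensor readings
# over MQTT; the receiver derives latency and throughput from its log.
# Assumes the paho-mqtt 1.x client API; broker address and topic are placeholders.
import json
import time
import paho.mqtt.client as mqtt

BROKER, TOPIC = "192.168.1.10", "highjump/takeoff"

client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()

for i in range(100):
    reading = {"seq": i, "force_N": 850.0, "sent_at": time.time()}
    client.publish(TOPIC, json.dumps(reading), qos=1)
    time.sleep(0.01)          # roughly 100 messages per second from the wearable node

client.loop_stop()
client.disconnect()
# On the host PC: latency per message = receive_time - payload["sent_at"],
# and throughput = messages received / elapsed time.
```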

  16. Association Mapping of Total Carotenoids in Diverse Soybean Genotypes Based on Leaf Extracts and High-Throughput Canopy Spectral Reflectance Measurements.

    Directory of Open Access Journals (Sweden)

    Arun Prabhu Dhanapal

    Full Text Available Carotenoids are organic pigments that are produced predominantly by photosynthetic organisms and provide antioxidant activity to a wide variety of plants, animals, bacteria, and fungi. The carotenoid biosynthetic pathway is highly conserved in plants and occurs mostly in chromoplasts and chloroplasts. Leaf carotenoids play important photoprotective roles and targeted selection for leaf carotenoids may offer avenues to improve abiotic stress tolerance. A collection of 332 soybean [Glycine max (L.) Merr.] genotypes was grown in two years and total leaf carotenoid content was determined using three different methods. The first method was based on extraction and spectrophotometric determination of carotenoid content (eCaro) in leaf tissue, whereas the other two methods were derived from high-throughput canopy spectral reflectance measurements using wavelet transformed reflectance spectra (tCaro) and a spectral reflectance index (iCaro). An association mapping approach was employed using 31,253 single nucleotide polymorphisms (SNPs) to identify SNPs associated with total carotenoid content using a mixed linear model based on data from two growing seasons. A total of 28 SNPs showed a significant association with total carotenoid content in at least one of the three approaches. These 28 SNPs likely tagged 14 putative loci for carotenoid content. Six putative loci were identified using eCaro, five loci with tCaro, and nine loci with iCaro. Three of these putative loci were detected by all three carotenoid determination methods. All but four putative loci were located near a known carotenoid-related gene. These results showed that carotenoid markers can be identified in soybean using extract-based as well as by high-throughput canopy spectral reflectance-based approaches, demonstrating the utility of field-based canopy spectral reflectance phenotypes for association mapping.
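
    For illustration only, the sketch below runs a naive single-marker association scan by simple linear regression on synthetic genotypes and a synthetic carotenoid phenotype; the study itself used a mixed linear model that corrects for population structure, which is omitted here.

```python
# Simplified illustration only: a single-marker association scan by linear
# regression on invented data. Not the mixed linear model used in the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_lines, n_snps = 332, 1000
genotypes = rng.integers(0, 3, size=(n_lines, n_snps))        # 0/1/2 allele dosages
carotenoid = rng.normal(10.0, 2.0, n_lines)
carotenoid = carotenoid + 0.8 * genotypes[:, 42]               # planted signal at SNP 42

p_values = np.empty(n_snps)
for j in range(n_snps):
    slope, intercept, r, p, se = stats.linregress(genotypes[:, j], carotenoid)
    p_values[j] = p

print("most associated SNP:", int(np.argmin(p_values)), "p =", p_values.min())
```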

  17. High-level radioactive wastes. Supplement 1

    International Nuclear Information System (INIS)

    McLaren, L.H.

    1984-09-01

    This bibliography contains information on high-level radioactive wastes included in the Department of Energy's Energy Data Base from August 1982 through December 1983. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes, each preceded by a brief description, are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number. 1452 citations

  18. High-level radioactive waste disposal type and theoretical analyses

    International Nuclear Information System (INIS)

    Lu Yingfa; Wu Yanchun; Luo Xianqi; Cui Yujun

    2006-01-01

    Study of high-level radioactive waste disposal is necessary for the development of nuclear power; the determination of the nuclear waste repository type is an important safety issue. Based on the high-level radioactive waste disposal type, the relevant research subjects are proposed. The fundamental research topics of nuclear waste disposal, for instance the mechanical and hydraulic properties of the rock mass, saturated and unsaturated seepage, chemical behaviour, the behaviour of special soils, and gas behaviour, are then introduced; the relevant coupling equations are suggested; and a one-dimensional result is presented. (authors)

  19. Multiple and high-throughput droplet reactions via combination of microsampling technique and microfluidic chip

    KAUST Repository

    Wu, Jinbo

    2012-11-20

    Microdroplets offer unique compartments for accommodating a large number of chemical and biological reactions in tiny volume with precise control. A major concern in droplet-based microfluidics is the difficulty to address droplets individually and achieve high throughput at the same time. Here, we have combined an improved cartridge sampling technique with a microfluidic chip to perform droplet screenings and aggressive reaction with minimal (nanoliter-scale) reagent consumption. The droplet composition, distance, volume (nanoliter to subnanoliter scale), number, and sequence could be precisely and digitally programmed through the improved sampling technique, while sample evaporation and cross-contamination are effectively eliminated. Our combined device provides a simple model to utilize multiple droplets for various reactions with low reagent consumption and high throughput. © 2012 American Chemical Society.

  20. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    Science.gov (United States)

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

    Abstract Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNIncRBase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of lncRNA entries are partially or completely new compared to all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  1. Predicting Induced Radioactivity at High Energy Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Fasso, Alberto

    1999-08-27

    Radioactive nuclides are produced at high-energy electron accelerators by different kinds of particle interactions with accelerator components and shielding structures. Radioactivity can also be induced in air, cooling fluids, soil and groundwater. The physical reactions involved include spallations due to the hadronic component of electromagnetic showers, photonuclear reactions by intermediate energy photons and low-energy neutron capture. Although the amount of induced radioactivity is less important than that of proton accelerators by about two orders of magnitude, reliable methods to predict induced radioactivity distributions are essential in order to assess the environmental impact of a facility and to plan its decommissioning. Conventional techniques used so far are reviewed, and a new integrated approach is presented, based on an extension of methods used at proton accelerators and on the unique capability of the FLUKA Monte Carlo code to handle the whole joint electromagnetic and hadronic cascade, scoring residual nuclei produced by all relevant particles. The radiation aspects related to the operation of superconducting RF cavities are also addressed.

  2. High throughput materials research and development for lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Parker Liu

    2017-09-01

    Full Text Available Development of next-generation batteries requires a breakthrough in materials. The traditional one-by-one method, which is suited to synthesizing a large number of single-composition materials, is time-consuming and costly. High-throughput and combinatorial experimentation is an effective method to synthesize and characterize a huge number of materials over a broader compositional region in a short time, which makes it possible to greatly speed up the discovery and optimization of materials at lower cost. In this work, high-throughput and combinatorial materials synthesis technologies for lithium-ion battery research are discussed, and our efforts in developing such instrumentation are introduced.

  3. High-throughput microplate technique for enzymatic hydrolysis of lignocellulosic biomass.

    Science.gov (United States)

    Chundawat, Shishir P S; Balan, Venkatesh; Dale, Bruce E

    2008-04-15

    Several factors will influence the viability of a biochemical platform for manufacturing lignocellulosic based fuels and chemicals, for example, genetically engineering energy crops, reducing pre-treatment severity, and minimizing enzyme loading. Past research on biomass conversion has focused largely on acid based pre-treatment technologies that fractionate lignin and hemicellulose from cellulose. However, for alkaline based (e.g., AFEX) and other lower severity pre-treatments it becomes critical to co-hydrolyze cellulose and hemicellulose using an optimized enzyme cocktail. Lignocellulosics are appropriate substrates to assess hydrolytic activity of enzyme mixtures compared to conventional unrealistic substrates (e.g., filter paper, chromogenic, and fluorigenic compounds) for studying synergistic hydrolysis. However, there are few, if any, high-throughput lignocellulosic digestibility analytical platforms for optimizing biomass conversion. The 96-well Biomass Conversion Research Lab (BCRL) microplate method is a high-throughput assay to study digestibility of lignocellulosic biomass as a function of biomass composition, pre-treatment severity, and enzyme composition. The most suitable method for delivering milled biomass to the microplate was through multi-pipetting slurry suspensions. A rapid bio-enzymatic, spectrophotometric assay was used to determine fermentable sugars. The entire procedure was automated using a robotic pipetting workstation. Several parameters that affect hydrolysis in the microplate were studied and optimized (i.e., particle size reduction, slurry solids concentration, glucan loading, mass transfer issues, and time period for hydrolysis). The microplate method was optimized for crystalline cellulose (Avicel) and ammonia fiber expansion (AFEX) pre-treated corn stover. Copyright 2008 Wiley Periodicals, Inc.

  4. Evaluation of a pooled strategy for high-throughput sequencing of cosmid clones from metagenomic libraries.

    Science.gov (United States)

    Lam, Kathy N; Hall, Michael W; Engel, Katja; Vey, Gregory; Cheng, Jiujun; Neufeld, Josh D; Charles, Trevor C

    2014-01-01

    High-throughput sequencing methods have been instrumental in the growing field of metagenomics, with technological improvements enabling greater throughput at decreased costs. Nonetheless, the economy of high-throughput sequencing cannot be fully leveraged in the subdiscipline of functional metagenomics. In this area of research, environmental DNA is typically cloned to generate large-insert libraries from which individual clones are isolated, based on specific activities of interest. Sequence data are required for complete characterization of such clones, but the sequencing of a large set of clones requires individual barcode-based sample preparation; this can become costly, as the cost of clone barcoding scales linearly with the number of clones processed, and thus sequencing a large number of metagenomic clones often remains cost-prohibitive. We investigated a hybrid Sanger/Illumina pooled sequencing strategy that omits barcoding altogether, and we evaluated this strategy by comparing the pooled sequencing results to reference sequence data obtained from traditional barcode-based sequencing of the same set of clones. Using identity and coverage metrics in our evaluation, we show that pooled sequencing can generate high-quality sequence data, without producing problematic chimeras. Though caveats of a pooled strategy exist and further optimization of the method is required to improve recovery of complete clone sequences and to avoid circumstances that generate unrecoverable clone sequences, our results demonstrate that pooled sequencing represents an effective and low-cost alternative for sequencing large sets of metagenomic clones.

  5. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an

  6. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    Directory of Open Access Journals (Sweden)

    Marcin Słomka

    2017-11-01

    Full Text Available High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.

  7. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    Science.gov (United States)

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz

    2017-01-01

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup. PMID:29099791

  8. High Resolution Melting (HRM) for High-Throughput Genotyping-Limitations and Caveats in Practical Case Studies.

    Science.gov (United States)

    Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz; Strapagiel, Dominik

    2017-11-03

    High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.

  9. Identifying Inhibitors of Inflammation: A Novel High-Throughput MALDI-TOF Screening Assay for Salt-Inducible Kinases (SIKs).

    Science.gov (United States)

    Heap, Rachel E; Hope, Anthony G; Pearson, Lesley-Anne; Reyskens, Kathleen M S E; McElroy, Stuart P; Hastie, C James; Porter, David W; Arthur, J Simon C; Gray, David W; Trost, Matthias

    2017-12-01

    Matrix-assisted laser desorption/ionization time-of-flight (MALDI TOF) mass spectrometry has become a promising alternative for high-throughput drug discovery as new instruments offer high speed, flexibility and sensitivity, and the ability to measure physiological substrates label free. Here we developed and applied high-throughput MALDI TOF mass spectrometry to identify inhibitors of the salt-inducible kinase (SIK) family, which are interesting drug targets in the field of inflammatory disease as they control production of the anti-inflammatory cytokine interleukin-10 (IL-10) in macrophages. Using peptide substrates in in vitro kinase assays, we can show that hit identification of the MALDI TOF kinase assay correlates with indirect ADP-Hunter kinase assays. Moreover, we can show that both techniques generate comparable IC50 data for a number of hit compounds and known inhibitors of SIK kinases. We further take these inhibitors to a fluorescence-based cellular assay using the SIK activity-dependent translocation of CRTC3 into the nucleus, thereby providing a complete assay pipeline for the identification of SIK kinase inhibitors in vitro and in cells. Our data demonstrate that MALDI TOF mass spectrometry is fully applicable to high-throughput kinase screening, providing label-free data comparable to that of current high-throughput fluorescence assays.

  10. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies (small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN) provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  11. A high-throughput, multi-channel photon-counting detector with picosecond timing

    International Nuclear Information System (INIS)

    Lapington, J.S.; Fraser, G.W.; Miller, G.M.; Ashton, T.J.R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies (small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN) provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  12. Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.

    Science.gov (United States)

    Yang, Darren; Wong, Wesley P

    2018-01-01

    We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.

  13. The management of high-level radioactive wastes

    International Nuclear Information System (INIS)

    Lennemann, Wm.L.

    1979-01-01

    The definition of high-level radioactive wastes is given. The following aspects of high-level radioactive wastes' management are discussed: fuel reprocessing and high-level waste; storage of high-level liquid waste; solidification of high-level waste; interim storage of solidified high-level waste; disposal of high-level waste; disposal of irradiated fuel elements as a waste

  14. Screening of HIV-1 Protease Using a Combination of an Ultra-High-Throughput Fluorescent-Based Assay and RapidFire Mass Spectrometry.

    Science.gov (United States)

    Meng, Juncai; Lai, Ming-Tain; Munshi, Vandna; Grobler, Jay; McCauley, John; Zuck, Paul; Johnson, Eric N; Uebele, Victor N; Hermes, Jeffrey D; Adam, Gregory C

    2015-06-01

    HIV-1 protease (PR) represents one of the primary targets for developing antiviral agents for the treatment of HIV-infected patients. To identify novel PR inhibitors, a label-free, high-throughput mass spectrometry (HTMS) assay was developed using the RapidFire platform and applied as an orthogonal assay to confirm hits identified in a fluorescence resonance energy transfer (FRET)-based primary screen of > 1 million compounds. For substrate selection, a panel of peptide substrates derived from natural processing sites for PR was evaluated on the RapidFire platform. As a result, KVSLNFPIL, a new substrate measured to have a ~ 20- and 60-fold improvement in kcat/Km over the frequently used sequences SQNYPIVQ and SQNYPIV, respectively, was identified for the HTMS screen. About 17% of hits from the FRET-based primary screen were confirmed in the HTMS confirmatory assay including all 304 known PR inhibitors in the set, demonstrating that the HTMS assay is effective at triaging false-positives while capturing true hits. Hence, with a sampling rate of ~7 s per well, the RapidFire HTMS assay enables the high-throughput evaluation of peptide substrates and functions as an efficient tool for hits triage in the discovery of novel PR inhibitors. © 2015 Society for Laboratory Automation and Screening.
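
    The substrate comparison above is ultimately a comparison of catalytic efficiencies. As a back-of-the-envelope illustration, the sketch below fits Michaelis-Menten kinetics to invented rate data for two substrates and reports the fold change in kcat/Km; the kinetic data and enzyme concentration are assumptions, not values from the paper.

```python
# Illustrative sketch only: fit Michaelis-Menten kinetics for two peptide
# substrates and compare their catalytic efficiencies (kcat/Km). All numbers
# below are invented for the example.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)

s = np.array([1, 2, 5, 10, 20, 50, 100.0])                       # substrate, µM
v_new = np.array([0.9, 1.7, 3.4, 5.0, 6.4, 7.5, 7.9])            # rate, µM/s (new substrate)
v_old = np.array([0.05, 0.1, 0.24, 0.42, 0.62, 0.85, 0.95])      # rate, µM/s (reference substrate)

e_total = 0.01                                                   # enzyme concentration, µM (assumed)
(vmax_new, km_new), _ = curve_fit(michaelis_menten, s, v_new, p0=[8, 10])
(vmax_old, km_old), _ = curve_fit(michaelis_menten, s, v_old, p0=[1, 50])

eff_new = (vmax_new / e_total) / km_new                          # kcat/Km, new substrate
eff_old = (vmax_old / e_total) / km_old                          # kcat/Km, reference substrate
print(f"fold improvement in kcat/Km: {eff_new / eff_old:.1f}x")
```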

  15. High-level radioactive wastes. Supplement 1

    Energy Technology Data Exchange (ETDEWEB)

    McLaren, L.H. (ed.)

    1984-09-01

    This bibliography contains information on high-level radioactive wastes included in the Department of Energy's Energy Data Base from August 1982 through December 1983. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes, each preceded by a brief description, are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number. 1452 citations.

  16. Recent Advances in Nanobiotechnology and High-Throughput Molecular Techniques for Systems Biomedicine

    Science.gov (United States)

    Kim, Eung-Sam; Ahn, Eun Hyun; Chung, Euiheon; Kim, Deok-Ho

    2013-01-01

    Nanotechnology-based tools are beginning to emerge as promising platforms for quantitative high-throughput analysis of live cells and tissues. Despite unprecedented progress made over the last decade, a challenge still lies in integrating emerging nanotechnology-based tools into macroscopic biomedical apparatuses for practical purposes in biomedical sciences. In this review, we discuss the recent advances and limitations in the analysis and control of mechanical, biochemical, fluidic, and optical interactions in the interface areas of nanotechnology-based materials and living cells in both in vitro and in vivo settings. PMID:24258011

  17. Evaluation of Capacity on a High Throughput Vol-oxidizer for Operability

    International Nuclear Information System (INIS)

    Kim, Young Hwan; Park, Geun Il; Lee, Jung Won; Jung, Jae Hoo; Kim, Ki Ho; Lee, Yong Soon; Lee, Do Youn; Kim, Su Sung

    2010-01-01

    KAERI is developing a pyro-process. As a piece of process equipment, a high throughput vol-oxidizer which can handle several tens of kg HM per batch was developed to supply U3O8 powders to an electrolytic reduction (ER) reactor. To increase the reduction yield, UO2 pellets should be converted into uniform powders. In this paper, we evaluate the operability of the high throughput vol-oxidizer. The evaluation consisted of three tests: a mechanical motion test, a heating test and a hull separation test. Using a control system, mechanical motion tests of the vol-oxidizer were conducted and the heating rates were analyzed. Separation tests of hulls were also conducted to determine the recovery rate. The test results of the vol-oxidizer will be applied to its operation. A study on the characteristics of the volatile gas produced during the vol-oxidation process is not included in this work.

  18. Fun with High Throughput Toxicokinetics (CalEPA webinar)

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21. These chemicals are tested in part because there are limited or no data on hazard, exposure, or toxicokinetics (TK). TK models aid in predicting tissue concentrations ...

  19. iMir: an integrated pipeline for high-throughput analysis of small non-coding RNA data obtained by smallRNA-Seq.

    Science.gov (United States)

    Giurato, Giorgio; De Filippo, Maria Rosaria; Rinaldi, Antonio; Hashim, Adnan; Nassa, Giovanni; Ravo, Maria; Rizzo, Francesca; Tarallo, Roberta; Weisz, Alessandro

    2013-12-13

    Qualitative and quantitative analysis of small non-coding RNAs by next generation sequencing (smallRNA-Seq) represents a novel technology increasingly used to investigate, with high sensitivity and specificity, RNA populations comprising microRNAs and other regulatory small transcripts. Analysis of smallRNA-Seq data to gather biologically relevant information, i.e. detection and differential expression analysis of known and novel non-coding RNAs, target prediction, etc., requires implementation of multiple statistical and bioinformatics tools from different sources, each focusing on a specific step of the analysis pipeline. As a consequence, the analytical workflow is slowed down by the need for continuous interventions by the operator, a critical factor when large numbers of datasets need to be analyzed at once. We designed a novel modular pipeline (iMir) for comprehensive analysis of smallRNA-Seq data, comprising specific tools for adapter trimming, quality filtering, differential expression analysis, biological target prediction and other useful options by integrating multiple open source modules and resources in an automated workflow. As statistics is crucial in deep-sequencing data analysis, we devised and integrated in iMir tools based on different statistical approaches to allow the operator to analyze data rigorously. The pipeline created here proved to be more efficient and time-saving than currently available methods and, in addition, flexible enough to allow the user to select the preferred combination of analytical steps. We present here the results obtained by applying this pipeline to analyze simultaneously 6 smallRNA-Seq datasets from either exponentially growing or growth-arrested human breast cancer MCF-7 cells, which led to the rapid and accurate identification, quantitation and differential expression analysis of ~450 miRNAs, including several novel miRNAs and isomiRs, as well as identification of the putative mRNA targets of differentially expressed miRNAs.
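
    The first stages of such a pipeline (adapter trimming and quality filtering) are commonly delegated to cutadapt. The sketch below is not iMir itself; it simply wraps cutadapt's standard -a/-q/-m/-o options for a batch of libraries, with the adapter sequence and file names as placeholders.

```python
# Not the iMir pipeline itself: a minimal sketch of its first steps (adapter
# trimming and quality filtering) by wrapping the cutadapt command-line tool.
import subprocess

ADAPTER = "TGGAATTCTCGGGTGCCAAGG"   # example small-RNA 3' adapter (assumed, not from the paper)

def trim_sample(fastq_in, fastq_out):
    cmd = [
        "cutadapt",
        "-a", ADAPTER,        # 3' adapter sequence to remove
        "-q", "20",           # trim low-quality ends
        "-m", "16",           # discard reads shorter than a typical miRNA
        "-o", fastq_out,
        fastq_in,
    ]
    subprocess.run(cmd, check=True)

for i in range(1, 7):          # six smallRNA-Seq libraries, as in the study
    trim_sample(f"sample{i}.fastq.gz", f"sample{i}.trimmed.fastq.gz")
```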

  20. A comparison of the effect of 5-bromodeoxyuridine substitution on 33258 Hoechst- and DAPI-fluorescence of isolated chromosomes by bivariate flow karyotyping

    NARCIS (Netherlands)

    Buys, C. H.; Mesa, J.; van der Veen, A. Y.; Aten, J. A.

    1986-01-01

    Application of the fluorescent DNA-intercalator propidium iodide for stabilization of the mitotic chromosome structure during isolation of chromosomes from V79 Chinese hamster cells and subsequent staining with the fluorochromes 33258 Hoechst or DAPI allowed bivariate flow karyotyping of isolated

  1. Development and evaluation of a novel high-throughput image-based fluorescent neutralization test for detection of Zika virus infection.

    Science.gov (United States)

    Koishi, Andrea Cristine; Suzukawa, Andréia Akemi; Zanluca, Camila; Camacho, Daria Elena; Comach, Guillermo; Duarte Dos Santos, Claudia Nunes

    2018-03-01

    Zika virus (ZIKV) is an emerging arbovirus belonging to the genus flavivirus, which comprises other important public health viruses, such as dengue (DENV) and yellow fever (YFV). In general, ZIKV infection is a self-limiting disease; however, cases of Guillain-Barré syndrome and congenital brain abnormalities in newborn infants have been reported. Diagnosing ZIKV infection remains a challenge, as viral RNA detection is only applicable for a few days after the onset of symptoms. After that, serological tests must be applied, and, as expected, high cross-reactivity between ZIKV and other flavivirus serology is observed. The plaque reduction neutralization test (PRNT) is indicated to confirm positive samples because it is more specific; however, it is labor-intensive and time-consuming, representing a major bottleneck for patient diagnosis. To overcome this limitation, we developed a high-throughput image-based fluorescent neutralization test for serological detection of ZIKV infection. Using 226 human specimens, we showed that the new test presented higher throughput than traditional PRNT, maintaining the correlation between results. Furthermore, when tested with dengue virus samples, it showed 50.53% less cross-reactivity than MAC-ELISA. This fluorescent neutralization test could be used for clinical diagnostic confirmation of ZIKV infection, as well as for vaccine clinical trials and seroprevalence studies.

  2. High throughput generated micro-aggregates of chondrocytes stimulate cartilage formation in vitro and in vivo

    NARCIS (Netherlands)

    Moreira Teixeira, Liliana; Leijten, Jeroen Christianus Hermanus; Sobral, J.; Jin, R.; van Apeldoorn, Aart A.; Feijen, Jan; van Blitterswijk, Clemens; Dijkstra, Pieter J.; Karperien, Hermanus Bernardus Johannes

    2012-01-01

    Cell-based cartilage repair strategies such as matrix-induced autologous chondrocyte implantation (MACI) could be improved by enhancing cell performance. We hypothesised that micro-aggregates of chondrocytes generated in high-throughput prior to implantation in a defect could stimulate cartilaginous

  3. Searching for resistance genes to Bursaphelenchus xylophilus using high throughput screening

    Directory of Open Access Journals (Sweden)

    Santos Carla S

    2012-11-01

    Full Text Available Abstract Background Pine wilt disease (PWD), caused by the pinewood nematode (PWN; Bursaphelenchus xylophilus), damages and kills pine trees and is causing serious economic damage worldwide. Although the ecological mechanism of infestation is well described, the plant’s molecular response to the pathogen is not well known. This is due mainly to the lack of genomic information and the complexity of the disease. High throughput sequencing is now an efficient approach for detecting the expression of genes in non-model organisms, thus providing valuable information in spite of the lack of the genome sequence. In an attempt to unravel genes potentially involved in the pine defense against the pathogen, we hereby report the high throughput comparative sequence analysis of infested and non-infested stems of Pinus pinaster (very susceptible to PWN) and Pinus pinea (less susceptible to PWN). Results Four cDNA libraries from infested and non-infested stems of P. pinaster and P. pinea were sequenced in a full 454 GS FLX run, producing a total of 2,083,698 reads. The putative amino acid sequences encoded by the assembled transcripts were annotated according to Gene Ontology, to assign Pinus contigs into Biological Processes, Cellular Components and Molecular Functions categories. Most of the annotated transcripts corresponded to Picea genes (25.4-39.7%), whereas a smaller percentage matched Pinus genes (1.8-12.8%), probably a consequence of more public genomic information available for Picea than for Pinus. The comparative transcriptome analysis showed that when P. pinaster was infested with PWN, the genes malate dehydrogenase, ABA, water deficit stress related genes and PAR1 were highly expressed, while in PWN-infested P. pinea, the highly expressed genes were ricin B-related lectin, and genes belonging to the SNARE and high mobility group families. Quantitative PCR experiments confirmed the differential gene expression between the two pine species

  4. Searching for resistance genes to Bursaphelenchus xylophilus using high throughput screening

    Science.gov (United States)

    2012-01-01

    Background: Pine wilt disease (PWD), caused by the pinewood nematode (PWN; Bursaphelenchus xylophilus), damages and kills pine trees and is causing serious economic losses worldwide. Although the ecological mechanism of infestation is well described, the plant's molecular response to the pathogen is not well known. This is due mainly to the lack of genomic information and the complexity of the disease. High throughput sequencing is now an efficient approach for detecting the expression of genes in non-model organisms, thus providing valuable information in spite of the lack of the genome sequence. In an attempt to unravel genes potentially involved in the pine defense against the pathogen, we hereby report the high throughput comparative sequence analysis of infested and non-infested stems of Pinus pinaster (very susceptible to PWN) and Pinus pinea (less susceptible to PWN). Results: Four cDNA libraries from infested and non-infested stems of P. pinaster and P. pinea were sequenced in a full 454 GS FLX run, producing a total of 2,083,698 reads. The putative amino acid sequences encoded by the assembled transcripts were annotated according to Gene Ontology, to assign Pinus contigs into Biological Processes, Cellular Components and Molecular Functions categories. Most of the annotated transcripts corresponded to Picea genes (25.4-39.7%), whereas a smaller percentage matched Pinus genes (1.8-12.8%), probably a consequence of more public genomic information being available for Picea than for Pinus. The comparative transcriptome analysis showed that when P. pinaster was infested with PWN, the genes malate dehydrogenase, ABA and water deficit stress-related genes, and PAR1 were highly expressed, while in PWN-infested P. pinea the highly expressed genes were ricin B-related lectin and genes belonging to the SNARE and high mobility group families. Quantitative PCR experiments confirmed the differential gene expression between the two pine species. Conclusions: Defense-related genes

  5. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalyst screening and single-cell study. First, we introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this technique in the discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in the oxidation of naphthalene, we demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm² for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection.
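    As a rough illustration of how the time-resolved CCD images described above can be reduced to per-well catalytic activity, the following Python sketch fits a linear rate to the mean intensity trace of each well in an image stack. It is a generic illustration, not code from the thesis; the array shapes, well coordinates and window size are hypothetical.

```python
import numpy as np

def well_rates(image_stack, timestamps, well_centers, radius=5):
    """Estimate an apparent activity per catalyst well from a fluorescence image stack.

    image_stack : (n_frames, H, W) array of CCD images
    timestamps  : (n_frames,) acquisition times in seconds
    well_centers: list of (row, col) pixel coordinates, one per well
    Returns the least-squares slope (intensity per second) for each well.
    """
    rates = []
    for r, c in well_centers:
        # Mean intensity inside a small square window centred on the well
        window = image_stack[:, r - radius:r + radius, c - radius:c + radius]
        trace = window.mean(axis=(1, 2))
        slope, _ = np.polyfit(timestamps, trace, 1)
        rates.append(slope)
    return np.asarray(rates)

# Synthetic demonstration: 20 frames, three wells with increasing activity
t = np.linspace(0, 60, 20)
stack = np.random.rand(20, 100, 100)
for k, (r, c) in enumerate([(20, 20), (50, 50), (80, 80)]):
    stack[:, r - 5:r + 5, c - 5:c + 5] += (k + 1) * 0.5 * t[:, None, None]
print(well_rates(stack, t, [(20, 20), (50, 50), (80, 80)]))
```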

  6. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    International Nuclear Information System (INIS)

    Hui Su

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalyst screening and single-cell study. First, we introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this technique in the discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in the oxidation of naphthalene, we demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm² for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection.

  7. High throughput experimentation for the discovery of new catalysts

    International Nuclear Information System (INIS)

    Thomson, S.; Hoffmann, C.; Johann, T.; Wolf, A.; Schmidt, H.-W.; Farrusseng, D.; Schueth, F.

    2002-01-01

    The use of combinatorial chemistry to obtain new materials has been developed extensively by the pharmaceutical and biochemical industries, but such approaches have been slow to impact the field of heterogeneous catalysis. The reasons for this lie in the difficulties associated with the synthesis, characterisation and determination of the catalytic properties of such materials. In many synthetic and catalytic reactions, the conditions used are difficult to emulate using High Throughput Experimentation (HTE). Furthermore, the ability to screen these catalysts simultaneously in real time requires the development and/or modification of characterisation methods. Clearly, there is a need for both high throughput synthesis and screening of new and novel reactions, and we describe several new concepts that help to achieve these goals. Although such problems have impeded the development of combinatorial catalysis, the fact remains that many highly attractive processes still exist for which no suitable catalysts have been developed. The ability to decrease the time needed to evaluate catalysts is therefore essential, and this makes the use of high throughput techniques highly desirable. In this presentation we will describe the synthesis, catalytic testing, and novel screening methods developed at the Max Planck Institute. Automated synthesis procedures, performed by the use of a modified Gilson pipette robot, will be described, as will the development of two fixed-bed reactors (16 and 49 samples) and two three-phase reactors (25 and 29 samples) for catalytic testing. We will also present new techniques for the characterisation of catalysts and catalytic products using standard IR microscopy and infrared focal plane array detection, respectively.

  8. Validation of a high-throughput fermentation system based on online monitoring of biomass and fluorescence in continuously shaken microtiter plates

    Directory of Open Access Journals (Sweden)

    Kensy Frank

    2009-06-01

    Background: An advanced version of a recently reported high-throughput fermentation system with online measurement, called BioLector, and its validation are presented. The technology combines high-throughput screening and high information content by applying online monitoring of scattered light and fluorescence intensities in continuously shaken microtiter plates. Various examples in calibration of the optical measurements, clone and media screening, and promoter characterization are given. Results: Bacterial and yeast biomass concentrations of up to 50 g/L cell dry weight could be linearly correlated to scattered light intensities. In media screening, the BioLector could clearly demonstrate its potential for detecting different biomass and product yields and deducing specific growth rates for quantitatively evaluating media and nutrients. Growth inhibition due to inappropriate buffer conditions could be detected by reduced growth rates and a temporary increase in NADH fluorescence. GFP served very well as a reporter protein for investigating promoter regulation under different carbon sources in yeast strains. A clone screening of 90 different GFP-expressing Hansenula polymorpha clones showed a broad distribution of growth behavior and an even stronger distribution in GFP expression. The importance of mass transfer conditions could be demonstrated by varying the filling volumes of an E. coli culture in 96-well MTPs. The different filling volumes cause a deviation in culture growth and acidification, monitored via scattered light intensities and the fluorescence of a pH indicator, respectively. Conclusion: The BioLector technology is a very useful tool to perform quantitative microfermentations under engineered reaction conditions. With this technique, specific yields and rates can be directly deduced from online biomass and product concentrations, which is superior to existing technologies such as microplate readers or optode-based

  9. Use of Threshold of Toxicological Concern (TTC) with High Throughput Exposure Predictions as a Risk-Based Screening Approach to Prioritize More Than Seven Thousand Chemicals (ASCCT)

    Science.gov (United States)

    Here, we present results of an approach for risk-based prioritization using the Threshold of Toxicological Concern (TTC) combined with high-throughput exposure (HTE) modelling. We started with 7968 chemicals with calculated population median oral daily intakes characterized by an...

  10. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  11. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    Background: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis without the limitations of sample space and reagent use. However, non-commercial and user-friendly software for the management and analysis of these data is not available. Results: The recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. Conclusions: DAG Expression is freely available software that permits the automated analysis and visualization of high-throughput qPCR data. A detailed manual and a demo experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
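    The relative-quantification arithmetic the abstract refers to (Ct values converted to quantities through per-assay standard curves, then normalized to a reference gene) can be written in a few lines. The sketch below is a generic illustration of that calculation, not the DAG Expression implementation; the slope and intercept values are hypothetical.

```python
def quantity_from_ct(ct, slope, intercept):
    """Standard curve Ct = slope * log10(quantity) + intercept, solved for quantity."""
    return 10 ** ((ct - intercept) / slope)

def normalized_expression(ct_target, curve_target, ct_ref, curve_ref):
    """Target-gene quantity normalized by a single reference gene."""
    q_target = quantity_from_ct(ct_target, *curve_target)
    q_ref = quantity_from_ct(ct_ref, *curve_ref)
    return q_target / q_ref

# Hypothetical standard curves (slope, intercept) fitted from n-fold dilution series
curve_gene = (-3.32, 38.0)   # slope near -3.32 corresponds to ~100% PCR efficiency
curve_ref = (-3.45, 36.5)
print(normalized_expression(24.1, curve_gene, 21.3, curve_ref))
```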

  12. High-throughput screening of saliva for early detection of oral cancer: a pilot study.

    Science.gov (United States)

    Szanto, I; Mark, L; Bona, A; Maasz, G; Sandor, B; Gelencser, G; Turi, Z; Gallyas, F

    2012-04-01

    The success of tumour therapy depends considerably on early diagnosis. Therefore, we aimed to develop a widely available, cheap, non-invasive, high-throughput method suitable for screening high-risk populations, at least, for early signs of malignant transformation in the oral cavity. First, in order to identify suitable tumour marker candidates, we compared the protein patterns of five selected saliva samples obtained from healthy controls and tumour patients after electrophoretic separation, excised the bands that were consistently up-regulated in the tumour patients only, and performed matrix-assisted laser-desorption ionisation (MALDI)-time of flight (TOF) tandem mass spectrometry (MS/MS) analysis of the proteins in these bands after in-gel tryptic digestion. From the panel of proteins identified, we chose annexin 1 and peroxiredoxin 2 for further studies based on their presence in the saliva of all five oral cancer patients only. Then, we performed a homology search of protein databases using the primary sequence of each in silico tryptic fragment peptide of these two proteins as bait, and selected a unique peptide for each. Finally, we performed targeted MALDI-TOF MS peptide analysis in a blinded fashion on all samples obtained from 20 healthy controls and 22 tumour patients for the presence of these peptides. We found both peptides present in the saliva samples of all cancer patients only. Even though these tumour markers should be validated in a wider population, our results indicate that targeted MALDI-TOF MS analysis of unique peptides of putative saliva protein tumour biomarkers could be the method of choice for cost-efficient, high-throughput screening for the early detection of oral cancer.

  13. Towards low-delay and high-throughput cognitive radio vehicular networks

    Directory of Open Access Journals (Sweden)

    Nada Elgaml

    2017-12-01

    Cognitive Radio Vehicular Ad-hoc Networks (CR-VANETs) exploit cognitive radios to allow vehicles to access the unused channels in their radio environment. Thus, CR-VANETs not only suffer from the traditional CR problems, especially spectrum sensing, but also face new challenges due to the highly dynamic nature of VANETs. In this paper, we present a low-delay and high-throughput radio environment assessment scheme for CR-VANETs that can be easily incorporated with the IEEE 802.11p standard developed for VANETs. Simulation results show that the proposed scheme significantly reduces the time needed to obtain the radio environment map and increases the CR-VANET throughput.

  14. Survey on non-nuclear radioactive waste; Kartlaeggning av radioaktivt avfall fraan icke kaernteknisk verksamhet

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-11-01

    At the request of the Swedish Radiation Protection Authority, the Swedish government in May 2002 set up a non-standing committee on non-nuclear radioactive waste. The objective was to elaborate proposals for a national system for the management of all types of non-nuclear radioactive waste, with special consideration of, inter alia, the polluter-pays principle and the responsibility of the producers. The committee will deliver its proposals to the government by 1 December 2003. SSI has assisted the committee to the extent necessary to carry out the investigation. This report is a summary of SSI's background material concerning non-nuclear radioactive waste in Sweden.

  15. Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.

    Science.gov (United States)

    Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong

    2008-04-01

    The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.

  16. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S rRNA gene amplicon sequencing can be used to reveal factors of importance for the operation of full-scale nutrient removal plants related to settling problems and floc properties. Using optimized DNA extraction protocols, indexed primers and our in-house Illumina platform, we prepared multiple samples […] be correlated to the presence of the species that are regarded as "strong" and "weak" floc formers. In conclusion, 16S rRNA gene amplicon sequencing provides a high throughput approach for a rapid and cheap community profiling of activated sludge that, in combination with multivariate statistics, can be used […]

  17. 40 CFR 227.30 - High-level radioactive waste.

    Science.gov (United States)

    2010-07-01

    High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste from...

  18. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high content analysis of high-throughput screens in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  19. Quantitative in vitro-to-in vivo extrapolation in a high-throughput environment

    International Nuclear Information System (INIS)

    Wetmore, Barbara A.

    2015-01-01

    High-throughput in vitro toxicity screening provides an efficient way to identify potential biological targets for environmental and industrial chemicals while conserving limited testing resources. However, reliance on the nominal chemical concentrations in these in vitro assays as an indicator of bioactivity may misrepresent potential in vivo effects of these chemicals due to differences in clearance, protein binding, bioavailability, and other pharmacokinetic factors. Development of high-throughput in vitro hepatic clearance and protein binding assays and refinement of quantitative in vitro-to-in vivo extrapolation (QIVIVE) methods have provided key tools to predict xenobiotic steady state pharmacokinetics. Using a process known as reverse dosimetry, knowledge of the chemical steady state behavior can be incorporated with HTS data to determine the external in vivo oral exposure needed to achieve internal blood concentrations equivalent to those eliciting bioactivity in the assays. These daily oral doses, known as oral equivalents, can be compared to chronic human exposure estimates to assess whether in vitro bioactivity would be expected at the dose-equivalent level of human exposure. This review will describe the use of QIVIVE methods in a high-throughput environment and the promise they hold in shaping chemical testing priorities and, potentially, high-throughput risk assessment strategies
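    A minimal sketch of the reverse-dosimetry step described above: given an in vitro bioactive concentration and a predicted steady-state plasma concentration per unit oral dose, the oral equivalent dose follows from simple linear scaling. The numbers and parameter names below are illustrative assumptions; the actual workflow derives the steady-state value from measured hepatic clearance and plasma protein binding.

```python
def oral_equivalent_dose(ac50_uM, css_uM_per_mg_kg_day):
    """Oral dose (mg/kg/day) whose steady-state plasma concentration equals the
    in vitro bioactive concentration, assuming linear pharmacokinetics."""
    return ac50_uM / css_uM_per_mg_kg_day

def bioactivity_exposure_ratio(oral_equivalent, exposure_mg_kg_day):
    """How far the estimated human exposure sits below the bioactive dose."""
    return oral_equivalent / exposure_mg_kg_day

# Illustrative values: AC50 = 3 uM, Css = 1.5 uM per 1 mg/kg/day, exposure = 1e-4 mg/kg/day
oed = oral_equivalent_dose(3.0, 1.5)            # 2.0 mg/kg/day
print(oed, bioactivity_exposure_ratio(oed, 1e-4))
```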

  20. Remote detection of radioactive material using high-power pulsed electromagnetic radiation.

    Science.gov (United States)

    Kim, Dongsung; Yu, Dongho; Sawant, Ashwini; Choe, Mun Seok; Lee, Ingeun; Kim, Sung Gug; Choi, EunMi

    2017-05-09

    Remote detection of radioactive materials is impossible when the measurement location is so far from the radioactive source that the leakage of high-energy photons or electrons from the source cannot be measured. Current technologies are limited in this respect because they allow detection only at distances within which the high-energy photons or electrons can reach the detector. Here we demonstrate an experimental method for remote detection of radioactive materials by inducing plasma breakdown with high-power pulsed electromagnetic waves. Measurements of the plasma formation time and its dispersion lead to enhanced detection sensitivity compared with the theoretical prediction based only on the plasma on and off phenomena. We show that lower incident electromagnetic wave power is sufficient for plasma breakdown in atmospheric-pressure air, and that elimination of the statistical distribution is possible in the presence of radioactive material.

  1. Development of high-level radioactive waste treatment and conversion technologies 'Dry decontamination technology development for highly radioactive contaminants'

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Won Zin; Lee, K. W.; Won, H. J.; Jung, C. J.; Choi, W. K.; Kim, G. N.; Moon, J. K

    2001-04-01

    The following topics were studied through the project entitled 'Dry Decontamination Technology Development for Highly Radioactive Contaminants': (1) contaminant characteristics analysis of domestic nuclear fuel cycle projects (NFCP) and applicability study of unit dry-decontamination techniques, comprising (a) classification of contaminated equipment and characteristics analysis of contaminants and (b) an applicability study of the unit dry-decontamination techniques; (2) performance evaluation of unit dry-decontamination techniques, namely (a) PFC decontamination, (b) CO2 decontamination and (c) plasma decontamination; (3) development of a residual radiation assessment methodology for the decontamination of highly radioactive facilities, including (a) a radioactive nuclide diffusion model for highly radioactive facility structures and (b) a procedure for assessing residual radiation dose; (4) establishment of the design concept of dry-decontamination process equipment applicable to highly radioactive contaminants; and (5) TRIGA soil decontamination technology development, covering (a) soil washing and flushing technologies and (b) electrokinetic soil decontamination technology.

  2. Raman-Activated Droplet Sorting (RADS) for Label-Free High-Throughput Screening of Microalgal Single-Cells.

    Science.gov (United States)

    Wang, Xixian; Ren, Lihui; Su, Yetian; Ji, Yuetong; Liu, Yaoping; Li, Chunyu; Li, Xunrong; Zhang, Yi; Wang, Wei; Hu, Qiang; Han, Danxiang; Xu, Jian; Ma, Bo

    2017-11-21

    Raman-activated cell sorting (RACS) has attracted increasing interest, yet throughput remains one major factor limiting its broader application. Here we present an integrated Raman-activated droplet sorting (RADS) microfluidic system for functional screening of live cells in a label-free and high-throughput manner, employing the astaxanthin (AXT)-producing industrial microalga Haematococcus pluvialis (H. pluvialis) as a model. Raman microspectroscopy analysis of individual cells is carried out prior to their microdroplet encapsulation, which is then directly coupled to DEP-based droplet sorting. To validate the system, H. pluvialis cells containing different levels of AXT were mixed and underwent RADS. AXT-hyperproducing cells were sorted with an accuracy of 98.3%, an eight-fold enrichment ratio, and a throughput of ∼260 cells/min. Of the RADS-sorted cells, 92.7% remained alive and able to proliferate, equivalent to the unsorted cells. Thus, RADS achieves a much higher throughput than existing RACS systems, preserves the vitality of cells, and facilitates seamless coupling with downstream manipulations such as single-cell sequencing and cultivation.

  3. Quality control methodology for high-throughput protein-protein interaction screening.

    Science.gov (United States)

    Vazquez, Alexei; Rual, Jean-François; Venkatesan, Kavitha

    2011-01-01

    Protein-protein interactions are key to many aspects of the cell, including its cytoskeletal structure, the signaling processes in which it is involved, and its metabolism. Failure to form protein complexes or signaling cascades may sometimes translate into pathologic conditions such as cancer or neurodegenerative diseases. The set of all protein interactions between the proteins encoded by an organism constitutes its protein interaction network, representing a scaffold for biological function. Knowing the protein interaction network of an organism, combined with other sources of biological information, can unravel fundamental biological circuits and may help better understand the molecular basis of human diseases. The protein interaction network of an organism can be mapped by combining data obtained from both low-throughput screens, i.e., "one gene at a time" experiments, and high-throughput screens, i.e., screens designed to interrogate large sets of proteins at once. In either case, quality controls are required to deal with the inherently imperfect nature of experimental assays. In this chapter, we discuss experimental and statistical methodologies to quantify error rates in high-throughput protein-protein interaction screens.
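    As a schematic of the kind of quality control discussed in this chapter, the snippet below estimates an assay's sensitivity from a positive reference set (well-documented interactions) and its background detection rate from a random reference set (pairs presumed not to interact). The set sizes and counts are hypothetical placeholders, not values from the chapter.

```python
def screen_quality(pos_tested, pos_detected, rand_tested, rand_detected):
    """Sensitivity from a positive reference set and background (false-positive)
    rate from a random reference set, as used to benchmark interaction screens."""
    sensitivity = pos_detected / pos_tested
    background_rate = rand_detected / rand_tested
    return sensitivity, background_rate

# Hypothetical benchmarking: 500 known pairs and 1000 random pairs retested in the assay
sens, bg = screen_quality(500, 120, 1000, 4)
print(f"sensitivity ~ {sens:.1%}, background detection rate ~ {bg:.2%}")
```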

  4. A high throughput architecture for a low complexity soft-output demapping algorithm

    Science.gov (United States)

    Ali, I.; Wasenmüller, U.; Wehn, N.

    2015-11-01

    Iterative channel decoders such as Turbo-Code and LDPC decoders show exceptional performance and therefore they are a part of many wireless communication receivers nowadays. These decoders require a soft input, i.e., the logarithmic likelihood ratio (LLR) of the received bits, with a typical quantization of 4 to 6 bits. For computing the LLR values from a received complex symbol, a soft demapper is employed in the receiver. The implementation cost of traditional soft-output demapping methods is relatively large in high-order modulation systems, and therefore low-complexity demapping algorithms are indispensable in low-power receivers. In the presence of multiple wireless communication standards, where each standard defines multiple modulation schemes, there is a need for an efficient demapper architecture covering all the flexibility requirements of these standards. Another challenge associated with hardware implementation of the demapper is to achieve a very high throughput in doubly iterative systems, for instance MIMO and code-aided synchronization. In this paper, we present a comprehensive communication and hardware performance evaluation of low-complexity soft-output demapping algorithms to select the best algorithm for implementation. The main goal of this work is to design a high-throughput, flexible, and area-efficient architecture. We describe architectures to execute the investigated algorithms and implement them on an FPGA device to evaluate their hardware performance. The work has resulted in a hardware architecture, based on the best low-complexity algorithm identified, that delivers a high throughput of 166 Msymbols/second for Gray-mapped 16-QAM modulation on a Virtex-5. This efficient architecture occupies only 127 slice registers, 248 slice LUTs and 2 DSP48Es.
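    To make the demapping step concrete, here is a hedged sketch of the widely used max-log approximation for computing per-bit LLRs from a received complex symbol. It is a generic floating-point reference model, not the fixed-point hardware architecture of the paper; the constellation, bit labels and noise variance are illustrative.

```python
import numpy as np

def maxlog_llr(y, constellation, bit_labels, noise_var):
    """Max-log LLR per bit: LLR_i ~ (min_{s: b_i=1} |y-s|^2 - min_{s: b_i=0} |y-s|^2) / noise_var.
    Positive values favour bit 0 under this sign convention."""
    d2 = np.abs(y - constellation) ** 2          # squared distance to every constellation point
    llrs = []
    for i in range(bit_labels.shape[1]):
        d_bit0 = d2[bit_labels[:, i] == 0].min()
        d_bit1 = d2[bit_labels[:, i] == 1].min()
        llrs.append((d_bit1 - d_bit0) / noise_var)
    return np.array(llrs)

# Toy Gray-mapped QPSK example (4 symbols, 2 bits each); 16-QAM works the same way
const = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)
labels = np.array([[0, 0], [0, 1], [1, 1], [1, 0]])
print(maxlog_llr(0.9 + 0.8j, const, labels, noise_var=0.1))
```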

  5. Hydrological performance assessment on siting the high level radioactive waste repository

    International Nuclear Information System (INIS)

    Guo Yonghai; Liu Shufen; Wang Ju; Wang Zhiming; Su Rui; Lv Chuanhe; Zong Zihua

    2007-01-01

    Based on research experience in China and in some developed countries, the processes and methods of hydrological performance assessment for siting a high-level radioactive waste repository are discussed in this paper. The methods and contents of hydrological performance assessment are discussed for the regional, area and site hydrological investigation stages, respectively. At the same time, the hydrological performance assessment of the potential site for high-level radioactive waste in China is introduced. (authors)

  6. A primer on high-throughput computing for genomic selection.

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison, which can be leveraged for genomic selection in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
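    The batch-processing and pipelining idea sketched in this abstract can be illustrated with a small worker-pool example: each trait evaluation is an independent job, the same pattern a cluster scheduler applies across many machines. The trait names and the model function below are placeholders, not part of the original paper.

```python
from multiprocessing import Pool

def evaluate_trait(trait):
    """Placeholder for one genomic-prediction run (model training plus prediction)."""
    # ... train a marker-based model for `trait` and return its predictions ...
    return trait, f"predictions_for_{trait}"

if __name__ == "__main__":
    traits = ["milk_yield", "fertility", "longevity", "somatic_cell_score"]
    # High-throughput computing in miniature: independent trait evaluations run in parallel
    with Pool(processes=4) as pool:
        for trait, result in pool.imap_unordered(evaluate_trait, traits):
            print(trait, result)
```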

  7. High-throughput characterization of film thickness in thin film materials libraries by digital holographic microscopy

    International Nuclear Information System (INIS)

    Lai Yiuwai; Hofmann, Martin R; Ludwig, Alfred; Krause, Michael; Savan, Alan; Thienhaus, Sigurd; Koukourakis, Nektarios

    2011-01-01

    A high-throughput characterization technique based on digital holography for mapping film thickness in thin-film materials libraries was developed. Digital holographic microscopy is used for fully automatic measurements of the thickness of patterned films with nanometer resolution. The method has several significant advantages over conventional stylus profilometry: it is contactless and fast, substrate bending is compensated, and the experimental setup is simple. Patterned films prepared by different combinatorial thin-film approaches were characterized to investigate and demonstrate this method. The results show that this technique is valuable for the quick, reliable and high-throughput determination of the film thickness distribution in combinatorial materials research. Importantly, it can also be applied to thin films that have been structured by shadow masking.

  8. High-throughput characterization of film thickness in thin film materials libraries by digital holographic microscopy.

    Science.gov (United States)

    Lai, Yiu Wai; Krause, Michael; Savan, Alan; Thienhaus, Sigurd; Koukourakis, Nektarios; Hofmann, Martin R; Ludwig, Alfred

    2011-10-01

    A high-throughput characterization technique based on digital holography for mapping film thickness in thin-film materials libraries was developed. Digital holographic microscopy is used for fully automatic measurements of the thickness of patterned films with nanometer resolution. The method has several significant advantages over conventional stylus profilometry: it is contactless and fast, substrate bending is compensated, and the experimental setup is simple. Patterned films prepared by different combinatorial thin-film approaches were characterized to investigate and demonstrate this method. The results show that this technique is valuable for the quick, reliable and high-throughput determination of the film thickness distribution in combinatorial materials research. Importantly, it can also be applied to thin films that have been structured by shadow masking.
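    A hedged sketch of the core conversion behind thickness mapping in reflection digital holographic microscopy: an unwrapped phase step between the film and the bare substrate maps to a height via h = λ·Δφ/(4π). This is a textbook relation offered for orientation only (it assumes reflection geometry, an already unwrapped phase, and no substrate-bending correction); it is not the calibration used in the paper.

```python
import numpy as np

def thickness_from_phase(delta_phi_rad, wavelength_nm):
    """Step height in nm from an unwrapped phase difference, reflection geometry:
    h = wavelength * delta_phi / (4 * pi), since the beam traverses the step twice."""
    return wavelength_nm * delta_phi_rad / (4 * np.pi)

# Illustrative: a 2.4 rad phase step at 532 nm illumination corresponds to ~102 nm of film
print(thickness_from_phase(2.4, 532.0))
```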

  9. Gold-coated polydimethylsiloxane microwells for high-throughput electrochemiluminescence analysis of intracellular glucose at single cells.

    Science.gov (United States)

    Xia, Juan; Zhou, Junyu; Zhang, Ronggui; Jiang, Dechen; Jiang, Depeng

    2018-06-04

    In this communication, a gold-coated polydimethylsiloxane (PDMS) chip with cell-sized microwells was prepared through a stamping and spraying process and applied directly to high-throughput electrochemiluminescence (ECL) analysis of intracellular glucose at single cells. Compared with the previous multiple-step fabrication of photoresist-based microwells on the electrode, the preparation process is simple and offers a fresh electrode surface for higher luminescence intensity. Higher luminescence intensity, correlated with the content of intracellular glucose, was recorded from cell-retaining microwells than from the planar regions between the microwells. The successful monitoring of intracellular glucose at single cells using this PDMS chip provides an alternative strategy for high-throughput single-cell analysis.

  10. High throughput electrophysiology: new perspectives for ion channel drug discovery

    DEFF Research Database (Denmark)

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels. A cornerstone of current drug discovery is high-throughput screening assays, which allow examination of the activity of specific ion channels, though only to a limited extent. Conventional patch clamp remains the sole technique with the sufficiently high time resolution and sensitivity required for precise and direct characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of the high-throughput philosophy are emerging and are predicted to make a number of ion…

  11. Proposals of new basic concepts on safety and radioactive waste and of new High Temperature Gas-cooled Reactor based on these basic concepts

    Energy Technology Data Exchange (ETDEWEB)

    Ogawa, Masuro, E-mail: ogawa.masuro@jaea.go.jp

    2016-11-15

    Highlights: • The author proposed new basic concepts on safety and radioactive waste. • A principle of 'continue confining' to realize the basic concept on safety is also proposed. • It is indicated that only an HTGR can attain the conditions required by the principle. • Technologies to realize the basic concept on radioactive waste are also discussed. • A new HTGR system based on the new basic concepts is proposed. - Abstract: A new basic concept on safety, 'Not causing any serious catastrophe by any means', and a new basic concept on radioactive waste, 'Not returning any waste that possibly affects the environment', are proposed in the present study, aiming at nuclear power plants that everybody can accept, in consideration of the serious catastrophe that happened at Fukushima, Japan in 2011. These new basic concepts can be found to be valid in comparison with the basic concepts on safety and waste in other industries. The principle for realizing the new basic concept on safety is, as is well known from inherent safety, to use physical phenomena such as the Doppler effect which never fail to work even if all equipment and facilities for safety lose their functions. In the present study, physical phenomena are used to 'continue confining', rather than to 'confine', because the consequences of emission of radioactive substances to the environment cannot be mitigated. To 'continue confining' means to apply natural correction to fulfill the inherent safety function. Fission products must be detoxified to realize the new basic concept on radioactive waste, aiming at final processing and disposal of radioactive wastes in the same way as for other wastes such as PCB, together with efforts not to produce radioactive wastes and to reduce their volume if they are nevertheless generated. Technology development on detoxification is one of the most important subjects. A new High Temperature Gas-cooled Reactor, namely the New HTGR

  12. Proposals of new basic concepts on safety and radioactive waste and of new High Temperature Gas-cooled Reactor based on these basic concepts

    International Nuclear Information System (INIS)

    Ogawa, Masuro

    2016-01-01

    Highlights: • The author proposed new basic concepts on safety and radioactive waste. • A principle of 'continue confining' to realize the basic concept on safety is also proposed. • It is indicated that only an HTGR can attain the conditions required by the principle. • Technologies to realize the basic concept on radioactive waste are also discussed. • A new HTGR system based on the new basic concepts is proposed. - Abstract: A new basic concept on safety, 'Not causing any serious catastrophe by any means', and a new basic concept on radioactive waste, 'Not returning any waste that possibly affects the environment', are proposed in the present study, aiming at nuclear power plants that everybody can accept, in consideration of the serious catastrophe that happened at Fukushima, Japan in 2011. These new basic concepts can be found to be valid in comparison with the basic concepts on safety and waste in other industries. The principle for realizing the new basic concept on safety is, as is well known from inherent safety, to use physical phenomena such as the Doppler effect which never fail to work even if all equipment and facilities for safety lose their functions. In the present study, physical phenomena are used to 'continue confining', rather than to 'confine', because the consequences of emission of radioactive substances to the environment cannot be mitigated. To 'continue confining' means to apply natural correction to fulfill the inherent safety function. Fission products must be detoxified to realize the new basic concept on radioactive waste, aiming at final processing and disposal of radioactive wastes in the same way as for other wastes such as PCB, together with efforts not to produce radioactive wastes and to reduce their volume if they are nevertheless generated. Technology development on detoxification is one of the most important subjects. A new High Temperature Gas-cooled Reactor, namely the New HTGR

  13. A High Throughput Ambient Mass Spectrometric Approach to Species Identification and Classification from Chemical Fingerprint Signatures

    OpenAIRE

    Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.; Lesiak, Ashton D.; Christensen, Earl D.; Moore, Hannah E.; Maleknia, Simin; Drijfhout, Falko P.

    2015-01-01

    A high throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering processing. A range of both polar and non-polar chemotypes a...

  14. A recombinant fusion protein-based, fluorescent protease assay for high throughput-compatible substrate screening.

    Science.gov (United States)

    Bozóki, Beáta; Gazda, Lívia; Tóth, Ferenc; Miczi, Márió; Mótyán, János András; Tőzsér, József

    2018-01-01

    In connection with the intensive investigation of proteases, several methods have been developed for analysis of their substrate specificity. Due to the great number of proteases and of target molecules expected to be analyzed, time- and cost-efficient high-throughput screening (HTS) methods are preferred. Here we describe the development and application of a separation-based, HTS-compatible fluorescent protease assay, which is based on the use of recombinant fusion proteins as protease substrates. The protein substrates used in this assay consist of N-terminal fusion tags (hexahistidine and maltose binding protein), cleavage sequences of the tobacco etch virus (TEV) and HIV-1 proteases, and a C-terminal fluorescent protein (mApple or mTurquoise2). The assay is based on fluorimetric detection of the fluorescent proteins, which are released from the magnetic bead-attached substrates by proteolytic cleavage. The protease assay has been applied to activity measurements of the TEV and HIV-1 proteases to test the suitability of the system for enzyme kinetic measurements, inhibition studies, and determination of pH optimum. We also found that the denatured fluorescent proteins can be renatured after SDS-PAGE under denaturing conditions, but they showed differences in their renaturation abilities. After in-gel renaturation, both substrates and cleavage products can be identified by in-gel UV detection. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. High throughput screening of starch structures using carbohydrate microarrays

    DEFF Research Database (Denmark)

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated...

  16. Achieving high data throughput in research networks

    International Nuclear Information System (INIS)

    Matthews, W.; Cottrell, L.

    2001-01-01

    After less than a year of operation, the BaBar experiment at SLAC has collected almost 100 million particle collision events in a database approaching 165 TB. Around 20 TB of data has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, and around 40 TB of simulated data has been imported from the Lawrence Livermore National Laboratory (LLNL). BaBar collaborators plan to double data collection each year and export a third of the data to IN2P3. So within a few years the SLAC OC3 (155 Mbps) connection will be fully utilized by file transfer to France alone. Upgrades to the infrastructure are essential, and a detailed understanding of performance issues and of the requirements for reliable high-throughput transfers is critical. In this talk, results from active and passive monitoring and direct measurements of throughput will be reviewed. Methods for achieving the ambitious requirements will be discussed.
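    For a sense of scale of the requirement described above, a few lines of arithmetic show how long it takes to move tens of terabytes over an OC3 link. The calculation is illustrative only: it uses decimal terabytes, ignores protocol overhead, and assumes the stated fraction of the link is available.

```python
def transfer_days(data_tb, link_mbps, utilisation=1.0):
    """Days needed to move data_tb terabytes over a link_mbps megabit/s link."""
    bits = data_tb * 1e12 * 8                      # decimal terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * utilisation)
    return seconds / 86400

# 20 TB over a fully dedicated 155 Mbps OC3 link is roughly 12 days; at 50% utilisation, ~24 days
print(transfer_days(20, 155), transfer_days(20, 155, utilisation=0.5))
```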

  17. Achieving High Data Throughput in Research Networks

    International Nuclear Information System (INIS)

    Matthews, W

    2004-01-01

    After less than a year of operation, the BaBar experiment at SLAC has collected almost 100 million particle collision events in a database approaching 165 TB. Around 20 TB of data has been exported via the Internet to the BaBar regional center at IN2P3 in Lyon, France, and around 40 TB of simulated data has been imported from the Lawrence Livermore National Laboratory (LLNL). BaBar collaborators plan to double data collection each year and export a third of the data to IN2P3. So within a few years the SLAC OC3 (155 Mbps) connection will be fully utilized by file transfer to France alone. Upgrades to the infrastructure are essential, and a detailed understanding of performance issues and of the requirements for reliable high-throughput transfers is critical. In this talk, results from active and passive monitoring and direct measurements of throughput will be reviewed. Methods for achieving the ambitious requirements will be discussed.

  18. High-Throughput Investigation of a Lead-Free AlN-Based Piezoelectric Material, (Mg,Hf)xAl1-xN.

    Science.gov (United States)

    Nguyen, Hung H; Oguchi, Hiroyuki; Van Minh, Le; Kuwano, Hiroki

    2017-06-12

    We conducted a high-throughput investigation of the fundamental properties of (Mg,Hf)xAl1-xN thin films (0 ≤ x ≤ 0.24) as lead-free piezoelectric materials. For the high-throughput investigation, we prepared composition-gradient (Mg,Hf)xAl1-xN films grown on a Si(100) substrate at 600 °C by cosputtering AlN and MgHf targets. To measure the properties of the various compositions at different positions within a single sample, we used characterization techniques with spatial resolution. X-ray diffraction (XRD) with a beam spot diameter of 1.0 mm verified that Mg and Hf had substituted into the Al sites and caused an elongation of the c-axis of AlN from 5.00 Å for x = 0 to 5.11 Å for x = 0.24. In addition, the uniaxial crystal orientation and high crystallinity required for piezoelectric materials used in application devices were confirmed. Piezoelectric response microscopy indicated that this c-axis elongation increased the piezoelectric coefficient almost linearly from 1.48 pm/V for x = 0 to 5.19 pm/V for x = 0.24. The dielectric constants of (Mg,Hf)xAl1-xN were investigated using parallel-plate capacitor structures with ∼0.07 mm² electrodes and showed a slight increase with substitution. These results verify that (Mg,Hf)xAl1-xN is a promising material for piezoelectric-based application devices, especially for vibrational energy harvesters.

  19. High-Throughput Study of Diffusion and Phase Transformation Kinetics of Magnesium-Based Systems for Automotive Cast Magnesium Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Alan A [The Ohio State Univ., Columbus, OH (United States); Zhao, Ji-Cheng [The Ohio State Univ., Columbus, OH (United States); Riggi, Adrienne [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Joost, William [US Dept. of Energy, Washington, DC (United States)

    2017-10-02

    The objective of the proposed study is to establish a scientific foundation on kinetic modeling of diffusion, phase precipitation, and casting/solidification, in order to accelerate the design and optimization of cast magnesium (Mg) alloys for weight reduction of U.S. automotive fleet. The team has performed the following tasks: 1) study diffusion kinetics of various Mg-containing binary systems using high-throughput diffusion multiples to establish reliable diffusivity and mobility databases for the Mg-aluminum (Al)-zinc (Zn)-tin (Sn)-calcium (Ca)-strontium (Sr)-manganese (Mn) systems; 2) study the precipitation kinetics (nucleation, growth and coarsening) using both innovative dual-anneal diffusion multiples and cast model alloys to provide large amounts of kinetic data (including interfacial energy) and microstructure atlases to enable implementation of the Kampmann-Wagner numerical model to simulate phase transformation kinetics of non-spherical/non-cuboidal precipitates in Mg alloys; 3) implement a micromodel to take into account back diffusion in the solid phase in order to predict microstructure and microsegregation in multicomponent Mg alloys during dendritic solidification especially under high pressure die-casting (HPDC) conditions; and, 4) widely disseminate the data, knowledge and information using the Materials Genome Initiative infrastructure (http://www.mgidata.org) as well as publications and digital data sharing to enable researchers to identify new pathways/routes to better cast Mg alloys.

  20. Incorporation of Savannah River Plant radioactive waste into concrete

    International Nuclear Information System (INIS)

    Stone, J.A.

    1975-01-01

    Results are reported of a laboratory-scale experimental program at the Savannah River Laboratory to gain information on the fixation of high-level radioactive wastes in concrete. Two concrete formulations, a high-alumina cement and a portland pozzolanic cement, had been selected on the basis of leachability and compressive strength for the fixation of non-radioactive simulated wastes; these two cements were therefore selected for the current studies on fixation of actual Savannah River Plant high-level wastes. (U.S.)

  1. Estimating Margin of Exposure to Thyroid Peroxidase Inhibitors Using High-Throughput in vitro Data, High-Throughput Exposure Modeling, and Physiologically Based Pharmacokinetic/Pharmacodynamic Modeling

    Science.gov (United States)

    Leonard, Jeremy A.; Tan, Yu-Mei; Gilbert, Mary; Isaacs, Kristin; El-Masri, Hisham

    2016-01-01

    Some pharmaceuticals and environmental chemicals bind the thyroid peroxidase (TPO) enzyme and disrupt thyroid hormone production. The potential for TPO inhibition is a function of both the binding affinity and concentration of the chemical within the thyroid gland. The former can be determined through in vitro assays, and the latter is influenced by pharmacokinetic properties, along with environmental exposure levels. In this study, a physiologically based pharmacokinetic (PBPK) model was integrated with a pharmacodynamic (PD) model to establish internal doses capable of inhibiting TPO in relation to external exposure levels predicted through exposure modeling. The PBPK/PD model was evaluated using published serum or thyroid gland chemical concentrations or circulating thyroxine (T4) and triiodothyronine (T3) hormone levels measured in rats and humans. After evaluation, the model was used to estimate human equivalent intake doses resulting in reduction of T4 and T3 levels by 10% (ED10) for 6 chemicals of varying TPO-inhibiting potencies. These chemicals were methimazole, 6-propylthiouracil, resorcinol, benzophenone-2, 2-mercaptobenzothiazole, and triclosan. Margin of exposure values were estimated for these chemicals using the ED10 and predicted population exposure levels for females of child-bearing age. The modeling approach presented here revealed that examining hazard or exposure alone when prioritizing chemicals for risk assessment may be insufficient, and that consideration of pharmacokinetic properties is warranted. This approach also provides a mechanism for integrating in vitro data, pharmacokinetic properties, and exposure levels predicted through high-throughput means when interpreting adverse outcome pathways based on biological responses. PMID:26865668
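    The margin-of-exposure arithmetic in this abstract is a simple ratio of the dose producing the effect of interest to the estimated exposure; the sketch below shows it with hypothetical numbers (the ED10 and exposure values are placeholders, not results from the paper).

```python
def margin_of_exposure(ed10_mg_kg_day, exposure_mg_kg_day):
    """MOE = dose producing the effect of interest (here a 10% hormone reduction)
    divided by the estimated population exposure; larger values imply lower concern."""
    return ed10_mg_kg_day / exposure_mg_kg_day

# Hypothetical: ED10 of 0.5 mg/kg/day versus a predicted median exposure of 2e-6 mg/kg/day
print(margin_of_exposure(0.5, 2e-6))   # MOE = 250,000
```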

  2. High-throughput screening in niche-based assay identifies compounds to target preleukemic stem cells

    Science.gov (United States)

    Gerby, Bastien; Veiga, Diogo F.T.; Krosl, Jana; Nourreddine, Sami; Ouellette, Julianne; Haman, André; Lavoie, Geneviève; Fares, Iman; Tremblay, Mathieu; Litalien, Véronique; Ottoni, Elizabeth; Geoffrion, Dominique; Maddox, Paul S.; Chagraoui, Jalila; Hébert, Josée; Sauvageau, Guy; Kwok, Benjamin H.; Roux, Philippe P.

    2016-01-01

    Current chemotherapies for T cell acute lymphoblastic leukemia (T-ALL) efficiently reduce tumor mass. Nonetheless, disease relapse attributed to survival of preleukemic stem cells (pre-LSCs) is associated with poor prognosis. Herein, we provide direct evidence that pre-LSCs are much less chemosensitive to existing chemotherapy drugs than leukemic blasts because of a distinctively lower proliferative state. Improving therapies for T-ALL requires the development of strategies to target pre-LSCs, which are absolutely dependent on their microenvironment. Therefore, we designed a robust protocol for high-throughput screening of compounds that target primary pre-LSCs maintained in a niche-like environment, on stromal cells that were engineered for optimal NOTCH1 activation. The multiparametric readout takes into account the intrinsic complexity of primary cells in order to specifically monitor pre-LSCs, which were induced here by the SCL/TAL1 and LMO1 oncogenes. We screened a targeted library of compounds and determined that the estrogen derivative 2-methoxyestradiol (2-ME2) disrupted both cell-autonomous and non–cell-autonomous pathways. Specifically, 2-ME2 abrogated pre-LSC viability and self-renewal activity in vivo by inhibiting translation of MYC, a downstream effector of NOTCH1, and preventing SCL/TAL1 activity. In contrast, normal hematopoietic stem/progenitor cells remained functional. These results illustrate how recapitulating tissue-like properties of primary cells in high-throughput screening is a promising avenue for innovation in cancer chemotherapy. PMID:27797342

  3. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.; Folkers, G.E.

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning

  4. The Stanford Automated Mounter: Enabling High-Throughput Protein Crystal Screening at SSRL

    International Nuclear Information System (INIS)

    Smith, C.A.; Cohen, A.E.

    2009-01-01

    The macromolecular crystallography experiment lends itself perfectly to high-throughput technologies. The initial steps including the expression, purification, and crystallization of protein crystals, along with some of the later steps involving data processing and structure determination have all been automated to the point where some of the last remaining bottlenecks in the process have been crystal mounting, crystal screening, and data collection. At the Stanford Synchrotron Radiation Laboratory, a National User Facility that provides extremely brilliant X-ray photon beams for use in materials science, environmental science, and structural biology research, the incorporation of advanced robotics has enabled crystals to be screened in a true high-throughput fashion, thus dramatically accelerating the final steps. Up to 288 frozen crystals can be mounted by the beamline robot (the Stanford Auto-Mounting System) and screened for diffraction quality in a matter of hours without intervention. The best quality crystals can then be remounted for the collection of complete X-ray diffraction data sets. Furthermore, the entire screening and data collection experiment can be controlled from the experimenter's home laboratory by means of advanced software tools that enable network-based control of the highly automated beamlines.

  5. Solid-Phase Extraction Strategies to Surmount Body Fluid Sample Complexity in High-Throughput Mass Spectrometry-Based Proteomics

    Science.gov (United States)

    Bladergroen, Marco R.; van der Burgt, Yuri E. M.

    2015-01-01

    For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS-acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply for both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE) including affinity enrichment strategies have been automated. Obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI) without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071

  6. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques have been applied to many research fields. Robotic microarray printing and automated microtiter plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of immobilized enzyme on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries. In order to get finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so that the product does not need to have optical properties different from those of the substrate. UV absorption detection allows almost universal detection of organic molecules. Thus, no modifications of either the substrate or the product molecules are necessary. This technique has the potential to be used in screening of local distribution variations of specific bio-molecules in a tissue or in screening of multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of the enzyme microarray is focused onto a scientific CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD. Analyzing the light intensity change over time on an enzyme spot can give information on the reaction rate. The same microarray can be used many times. Thus, high-throughput kinetic studies of hundreds of catalytic reactions are made possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
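
    A minimal sketch of the kinetic read-out just described, assuming a background-corrected intensity trace is available for one enzyme spot; the time points and intensities below are synthetic, not data from the dissertation.

```python
# Minimal sketch: estimate an initial reaction rate from the slope of a
# CCD intensity-vs-time trace for one enzyme spot (synthetic data).
import numpy as np

time_s = np.array([0, 30, 60, 90, 120, 150], dtype=float)          # acquisition times (s)
intensity = np.array([100, 118, 135, 154, 171, 186], dtype=float)  # spot intensity (a.u.)

# Linear fit over the early, approximately linear portion of the trace
slope, intercept = np.polyfit(time_s, intensity, 1)
print(f"initial rate ~ {slope:.2f} intensity units per second")
```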

  7. A high-throughput colorimetric screening assay for terpene synthase activity based on substrate consumption.

    Directory of Open Access Journals (Sweden)

    Maiko Furubayashi

    Full Text Available Terpene synthases catalyze the formation of a variety of terpene chemical structures. Systematic mutagenesis studies have been effective in providing insights into the characteristic and complex mechanisms of C-C bond formations and in exploring the enzymatic potential for inventing new chemical structures. In addition, there is growing demand to increase terpene synthase activity in heterologous hosts, given the maturation of metabolic engineering and host breeding for terpenoid synthesis. We have developed a simple screening method for the cellular activities of terpene synthases by scoring their substrate consumption based on the color loss of the cell harboring carotenoid pathways. We demonstrate that this method can be used to detect activities of various terpene synthase or prenyltransferase genes in a high-throughput manner, irrespective of the product type, enabling the mutation analysis and directed evolution of terpene synthases. We also report the possibility for substrate-specific screening system of terpene synthases by taking advantage of the substrate-size specificity of C30 and C40 carotenoid pathways.

  8. A rapid enzymatic assay for high-throughput screening of adenosine-producing strains

    Science.gov (United States)

    Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei

    2015-01-01

    Adenosine is a major local regulator of tissue function and is industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay of adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutrient components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842

  9. High-throughput combinatorial chemical bath deposition: The case of doping Cu (In, Ga) Se film with antimony

    Science.gov (United States)

    Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong

    2018-01-01

    Conventional wet-process methods for designing and preparing thin films remain time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin film preparation based on chemical bath deposition (CBD). The method is well suited to preparing combinatorial material libraries of compounds with low decomposition temperatures and high water or oxygen sensitivity at relatively high temperatures. To validate the system, a Cu(In, Ga)Se (CIGS) thin-film library doped with 0-19.04 at.% antimony (Sb) was prepared as an example to systematically evaluate the effect of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films. Combined with Energy Dispersive Spectroscopy (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can systematically identify the composition with the optimal grain orientation growth, microstructure and electrical properties by accurately monitoring the doping content and material composition. Based on the characterization results, an Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model is put forward. In addition to the CIGS thin films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin film material systems.

  10. In-Field High-Throughput Phenotyping of Cotton Plant Height Using LiDAR

    OpenAIRE

    Shangpeng Sun; Changying Li; Andrew H. Paterson

    2017-01-01

    A LiDAR-based high-throughput phenotyping (HTP) system was developed for cotton plant phenotyping in the field. The HTP system consists of a 2D LiDAR and an RTK-GPS mounted on a high clearance tractor. The LiDAR scanned three rows of cotton plots simultaneously from the top and the RTK-GPS was used to provide the spatial coordinates of the point cloud during data collection. Configuration parameters of the system were optimized to ensure the best data quality. A height profile for each plot w...

  11. A FRET-based high throughput screening assay to identify inhibitors of anthrax protective antigen binding to capillary morphogenesis gene 2 protein.

    Directory of Open Access Journals (Sweden)

    Michael S Rogers

    Full Text Available Anti-angiogenic therapies are effective for the treatment of cancer and a variety of ocular diseases, and have potential benefits in cardiovascular disease, arthritis, and psoriasis. We have previously shown that anthrax protective antigen (PA), a non-pathogenic component of anthrax toxin, is an inhibitor of angiogenesis, apparently as a result of interaction with the cell surface receptors capillary morphogenesis gene 2 (CMG2) protein and tumor endothelial marker 8 (TEM8). Hence, molecules that bind the anthrax toxin receptors may be effective in slowing or halting pathological vascular growth. Here we describe development and testing of an effective homogeneous steady-state fluorescence resonance energy transfer (FRET) high throughput screening assay designed to identify molecules that inhibit binding of PA to CMG2. Molecules identified in the screen can serve as potential lead compounds for the development of anti-angiogenic and anti-anthrax therapies. The assay to screen for inhibitors of this protein-protein interaction is sensitive and robust, with observed Z' values as high as 0.92. Preliminary screens conducted with a library of known bioactive compounds identified tannic acid and cisplatin as inhibitors of the PA-CMG2 interaction. We have confirmed that tannic acid both binds CMG2 and has anti-endothelial properties. In contrast, cisplatin appears to inhibit the PA-CMG2 interaction by binding both PA and CMG2, and the observed cisplatin anti-angiogenic effects are not mediated by interaction with CMG2. This work represents the first reported high throughput screening assay targeting CMG2 to identify possible inhibitors of both angiogenesis and anthrax intoxication.
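
    For reference, the Z' value quoted above is the standard screening-window statistic; a minimal sketch of the calculation is given below, with placeholder positive- and negative-control signal values rather than data from this assay.

```python
# Sketch of the standard Z' factor used to judge HTS assay quality;
# the control signal values are placeholders, not data from this study.
import numpy as np

positive = np.array([0.95, 0.97, 0.93, 0.96, 0.94])  # e.g., uninhibited binding signal
negative = np.array([0.10, 0.12, 0.09, 0.11, 0.10])  # e.g., fully blocked binding signal

z_prime = 1 - 3 * (positive.std(ddof=1) + negative.std(ddof=1)) / abs(positive.mean() - negative.mean())
print(f"Z' = {z_prime:.2f}")  # values above ~0.5 are generally considered excellent
```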

  12. Micropillar arrays as a high-throughput screening platform for therapeutics in multiple sclerosis.

    Science.gov (United States)

    Mei, Feng; Fancy, Stephen P J; Shen, Yun-An A; Niu, Jianqin; Zhao, Chao; Presley, Bryan; Miao, Edna; Lee, Seonok; Mayoral, Sonia R; Redmond, Stephanie A; Etxeberria, Ainhoa; Xiao, Lan; Franklin, Robin J M; Green, Ari; Hauser, Stephen L; Chan, Jonah R

    2014-08-01

    Functional screening for compounds that promote remyelination represents a major hurdle in the development of rational therapeutics for multiple sclerosis. Screening for remyelination is problematic, as myelination requires the presence of axons. Standard methods do not resolve cell-autonomous effects and are not suited for high-throughput formats. Here we describe a binary indicant for myelination using micropillar arrays (BIMA). Engineered with conical dimensions, micropillars permit resolution of the extent and length of membrane wrapping from a single two-dimensional image. Confocal imaging acquired from the base to the tip of the pillars allows for detection of concentric wrapping observed as 'rings' of myelin. The platform is formatted in 96-well plates, amenable to semiautomated random acquisition and automated detection and quantification. Upon screening 1,000 bioactive molecules, we identified a cluster of antimuscarinic compounds that enhance oligodendrocyte differentiation and remyelination. Our findings demonstrate a new high-throughput screening platform for potential regenerative therapeutics in multiple sclerosis.

  13. Design of ACM system based on non-greedy punctured LDPC codes

    Science.gov (United States)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes was designed. The RC-LDPC codes were constructed by a non-greedy puncturing method which showed good performance in the high code rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel was proposed. With this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is reduced. Simulations show that the proposed ACM system achieves increasingly significant coding gains while maintaining higher throughput.
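
    The adaptive-rate idea can be illustrated with a small sketch: pick the highest punctured code rate whose channel-quality threshold is satisfied. The SNR thresholds below are invented for illustration and are not taken from the paper.

```python
# Illustrative sketch of adaptive code-rate selection in an ACM system:
# choose the highest rate-compatible punctured rate whose SNR threshold is met.
# Thresholds are placeholder values, not results from the paper.

RATE_THRESHOLDS_DB = [  # (code rate, minimum SNR in dB assumed for the target error rate)
    (5/6, 6.5),
    (4/5, 5.5),
    (3/4, 4.5),
    (2/3, 3.5),
]

def select_code_rate(snr_db: float) -> float:
    """Return the highest supported code rate for the estimated channel SNR."""
    for rate, threshold in RATE_THRESHOLDS_DB:
        if snr_db >= threshold:
            return rate
    return 2/3  # fall back to the mother (lowest) rate

for snr in (3.0, 4.8, 7.2):
    print(f"SNR {snr:.1f} dB -> code rate {select_code_rate(snr):.3f}")
```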

  14. A hybrid MAC protocol design for energy-efficient very-high-throughput millimeter wave, wireless sensor communication networks

    Science.gov (United States)

    Jian, Wei; Estevez, Claudio; Chowdhury, Arshad; Jia, Zhensheng; Wang, Jianxin; Yu, Jianguo; Chang, Gee-Kung

    2010-12-01

    This paper presents an energy-efficient Medium Access Control (MAC) protocol for very-high-throughput millimeter-wave (mm-wave) wireless sensor communication networks (VHT-MSCNs) based on hybrid multiple access techniques of frequency division multiple access (FDMA) and time division multiple access (TDMA). An energy-efficient Superframe for wireless sensor communication networks employing directional mm-wave wireless access technologies is proposed for systems that require very high throughput, such as high definition video signals, for sensing, processing, transmitting, and actuating functions. Energy consumption modeling for each network element and comparisons among various multi-access technologies in terms of power and MAC-layer operations are investigated to evaluate the energy-efficiency improvement of the proposed MAC protocol.
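
    A toy per-node energy model of the kind referred to above is sketched below; the power levels and slot durations are invented placeholders, not the paper's mm-wave hardware parameters, and the model simply sums transmit, receive, and idle contributions over one superframe.

```python
# Toy per-superframe energy model for one sensor node, assuming
# E = P_tx*t_tx + P_rx*t_rx + P_idle*t_idle. All values are placeholders.

def superframe_energy(p_tx_w, t_tx_s, p_rx_w, t_rx_s, p_idle_w, t_idle_s):
    return p_tx_w * t_tx_s + p_rx_w * t_rx_s + p_idle_w * t_idle_s

superframe_s = 0.010   # 10 ms superframe
slot_s = 0.001         # 1 ms dedicated TDMA slot
idle_s = superframe_s - 2 * slot_s

# Hypothetical comparison: a node that sleeps outside its slot (low idle power)
# versus one that stays awake listening for the whole superframe.
e_sleeping = superframe_energy(0.5, slot_s, 0.3, slot_s, 0.001, idle_s)
e_listening = superframe_energy(0.5, slot_s, 0.3, slot_s, 0.100, idle_s)
print(f"sleeping node: {e_sleeping*1e3:.3f} mJ/superframe, "
      f"always-listening node: {e_listening*1e3:.3f} mJ/superframe")
```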

  15. A high-throughput method for GMO multi-detection using a microfluidic dynamic array.

    Science.gov (United States)

    Brod, Fábio Cristiano Angonesi; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Dinon, Andréia Zilio; Guimarães, Luis Henrique S; Scholtens, Ingrid M J; Arisi, Ana Carolina Maisonnave; Kok, Esther J

    2014-02-01

    The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organisms (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly with the growth of the number of GMOs that is potentially present in an individual sample. The present work presents the results of an innovative approach in genetically modified crops analysis by DNA based methods, which is the use of a microfluidic dynamic array as a high throughput multi-detection system. In order to evaluate the system, six test samples with an increasing degree of complexity were prepared, preamplified and subsequently analysed in the Fluidigm system. Twenty-eight assays targeting different DNA elements, GM events and species-specific reference genes were used in the experiment. The large majority of the assays tested presented expected results. The power of low level detection was assessed and elements present at concentrations as low as 0.06 % were successfully detected. The approach proposed in this work presents the Fluidigm system as a suitable and promising platform for GMO multi-detection.

  16. Management of radioactive wastes from non-power applications. The Cuban experience

    International Nuclear Information System (INIS)

    Benitez, J.C.; Salgado, M.; Jova, L.

    2001-01-01

    Full text: Origin of Radioactive Wastes. The wastes arising from the applications of radioisotopes in medicine are mainly liquids and solid materials contaminated with short-lived radionuclides and sealed sources used in radiotherapy and for sterilization of medical materials. Radioactive wastes from industrial applications are generally disused sealed sources used in level detection, quality control, smoke detection and non-destructive testing. The principal forms of wastes generated by research institutes are miscellaneous liquids, trash, biological wastes, and scintillation vials, sealed sources and targets. Solid radioactive wastes are mainly produced during research work, cleaning and decontamination activities and they consist of rags, paper, cellulose, plastics, gloves, clothing, overshoes, etc. Laboratory materials such as cans, polyethylene bags and glass bottles also contribute to the solid waste inventory. Small quantities of non-compactable wastes are also collected and received for treatment. They include wood pieces, metal scrap, defective components and tools. Radioactive Waste Management Policy and Infrastructure. Since 1994, the Cuban integral policy of nuclear development has been entrusted to the Nuclear Energy Agency of the Ministry of Science, Technology and Environment (CITMA). The National Center for Nuclear Safety (CNSN) is responsible for the licensing and supervision of radioactive and nuclear installations. The CPHR is in charge of waste management policy and therefore is responsible for centralized collection, transportation, treatment, conditioning, long term storage, and disposal of radioactive waste, as well as for developing new waste conditioning and containment methods. Radioactive Waste Management Facilities. Waste Treatment and Conditioning Plant (WTCP). The present facility is a building that includes a technological area of 100 m² and a laboratory area with a surface of around 30 m². Other areas to be distinguished inside the

  17. AELAS: Automatic ELAStic property derivations via high-throughput first-principles computation

    Science.gov (United States)

    Zhang, S. H.; Zhang, R. F.

    2017-11-01

    The elastic properties are fundamental and important for crystalline materials as they relate to other mechanical properties, various thermodynamic qualities as well as some critical physical properties. However, a complete set of experimentally determined elastic properties is only available for a small subset of known materials, and an automatic scheme for the derivation of elastic properties that is adapted to high-throughput computation is in high demand. In this paper, we present the AELAS code, an automated program for calculating second-order elastic constants of both two-dimensional and three-dimensional single crystal materials with any symmetry, which is designed mainly for high-throughput first-principles computation. Other derivations of general elastic properties such as Young's, bulk and shear moduli as well as Poisson's ratio of polycrystal materials, Pugh ratio, Cauchy pressure, elastic anisotropy and elastic stability criterion, are also implemented in this code. The implementation of the code has been critically validated by extensive evaluations and tests on a broad class of materials including two-dimensional and three-dimensional materials, demonstrating its efficiency and capability for high-throughput screening of specific materials with targeted mechanical properties. Program Files doi:http://dx.doi.org/10.17632/f8fwg4j9tw.1 Licensing provisions: BSD 3-Clause Programming language: Fortran Nature of problem: To automate the calculations of second-order elastic constants and the derivations of other elastic properties for two-dimensional and three-dimensional materials with any symmetry via high-throughput first-principles computation. Solution method: The space-group number is first determined by the SPGLIB code [1] and the structure is then redefined to a unit cell in the IEEE format [2]. Secondly, based on the determined space group number, a set of distortion modes is automatically specified and the distorted structure files are generated
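
    Independent of the AELAS code itself, the kind of polycrystal-property derivations listed above can be sketched for the simplest case of a cubic crystal using Voigt averages; the elastic constants below are illustrative numbers, not computed values.

```python
# Sketch of polycrystal elastic-property derivations from single-crystal
# constants (cubic symmetry, Voigt averages). Constants are illustrative.

C11, C12, C44 = 250.0, 150.0, 100.0   # GPa, hypothetical cubic crystal

K_V = (C11 + 2 * C12) / 3.0                       # Voigt bulk modulus
G_V = (C11 - C12 + 3 * C44) / 5.0                 # Voigt shear modulus
E_V = 9 * K_V * G_V / (3 * K_V + G_V)             # Young's modulus
nu = (3 * K_V - 2 * G_V) / (2 * (3 * K_V + G_V))  # Poisson's ratio
pugh_ratio = K_V / G_V                            # Pugh ratio (B/G form), ductility indicator
cauchy_pressure = C12 - C44                       # Cauchy pressure, bonding-character indicator

print(f"K={K_V:.1f} GPa, G={G_V:.1f} GPa, E={E_V:.1f} GPa, nu={nu:.3f}, "
      f"B/G={pugh_ratio:.2f}, C12-C44={cauchy_pressure:.1f} GPa")
```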

  18. HIV-1 entry inhibition by small-molecule CCR5 antagonists: A combined molecular modeling and mutant study using a high-throughput assay

    International Nuclear Information System (INIS)

    Labrecque, Jean; Metz, Markus; Lau, Gloria; Darkes, Marilyn C.; Wong, Rebecca S.Y.; Bogucki, David; Carpenter, Bryon; Chen Gang; Li Tongshuang; Nan, Susan; Schols, Dominique; Bridger, Gary J.; Fricker, Simon P.; Skerlj, Renato T.

    2011-01-01

    Based on the attrition rate of CCR5 small molecule antagonists in the clinic the discovery and development of next generation antagonists with an improved pharmacology and safety profile is necessary. Herein, we describe a combined molecular modeling, CCR5-mediated cell fusion, and receptor site-directed mutagenesis approach to study the molecular interactions of six structurally diverse compounds (aplaviroc, maraviroc, vicriviroc, TAK-779, SCH-C and a benzyloxycarbonyl-aminopiperidin-1-yl-butane derivative) with CCR5, a coreceptor for CCR5-tropic HIV-1 strains. This is the first study using an antifusogenic assay, a model of the interaction of the gp120 envelope protein with CCR5. This assay avoids the use of radioactivity and HIV infection assays, and can be used in a high throughput mode. The assay was validated by comparison with other established CCR5 assays. Given the hydrophobic nature of the binding pocket several binding models are suggested which could prove useful in the rational drug design of new lead compounds.

  19. An in silico high-throughput screen identifies potential selective inhibitors for the non-receptor tyrosine kinase Pyk2

    Directory of Open Access Journals (Sweden)

    Meirson T

    2017-05-01

    Full Text Available Tomer Meirson, Abraham O Samson, Hava Gil-Henn; Faculty of Medicine in the Galilee, Bar-Ilan University, Safed, Israel. Abstract: The non-receptor tyrosine kinase proline-rich tyrosine kinase 2 (Pyk2) is a critical mediator of signaling from cell surface growth factor and adhesion receptors to cell migration, proliferation, and survival. Emerging evidence indicates that signaling by Pyk2 regulates hematopoietic cell response, bone density, neuronal degeneration, angiogenesis, and cancer. These physiological and pathological roles of Pyk2 warrant it as a valuable therapeutic target for invasive cancers, osteoporosis, Alzheimer’s disease, and inflammatory cellular response. Despite its potential as a therapeutic target, no potent and selective inhibitor of Pyk2 is available at present. As a first step toward discovering specific potential inhibitors of Pyk2, we used an in silico high-throughput screening approach. A virtual library of six million lead-like compounds was docked against four different high-resolution Pyk2 kinase domain crystal structures and further selected for predicted potency and ligand efficiency. Ligand selectivity for Pyk2 over focal adhesion kinase (FAK) was evaluated by comparative docking of ligands and measurement of binding free energy so as to obtain 40 potential candidates. Finally, the structural flexibility of a subset of the docking complexes was evaluated by molecular dynamics simulation, followed by intermolecular interaction analysis. These compounds may be considered as promising leads for further development of highly selective Pyk2 inhibitors. Keywords: virtual screen, efficiency metrics, MM-GBSA, molecular dynamics

  20. Use of flow cytometry for high-throughput cell population estimates in fixed brain tissue

    Directory of Open Access Journals (Sweden)

    Nicole A Young

    2012-07-01

    Full Text Available The numbers and types of cells in an area of cortex define its function. Therefore it is essential to characterize the numbers and distributions of total cells in areas of the cortex, as well as to identify numbers of subclasses of neurons and glial cells. To date, the large size of the primate brain and the lack of innovation in cell counting methods have been a roadblock to obtaining high-resolution maps of cell and neuron density across the cortex in humans and non-human primates. Stereological counting methods and the isotropic fractionator are valuable tools for estimating cell numbers, but are better suited to smaller, well-defined brain structures or to cortex as a whole. In the present study, we have extended our flow-cytometry based counting method, the flow fractionator (Collins et al., 2010a), to include high-throughput total cell population estimates in homogenized cortical samples. We demonstrate that our method produces consistent, accurate and repeatable cell estimates quickly. The estimates we report are in excellent agreement with estimates for the same samples obtained using a Neubauer chamber and a fluorescence microscope. We show that our flow cytometry-based method for total cell estimation in homogenized brain tissue is more efficient and more precise than manual counting methods. The addition of automated nuclei counting to our flow fractionator method allows for a fully automated, rapid characterization of total cells and neuronal and non-neuronal populations in human and non-human primate brains, providing valuable data to further our understanding of the functional organization of normal, aging and diseased brains.

  1. High-throughput search for caloric materials: the CaloriCool approach

    Science.gov (United States)

    Zarkevich, N. A.; Johnson, D. D.; Pecharsky, V. K.

    2018-01-01

    The high-throughput search paradigm adopted by the newly established caloric materials consortium—CaloriCool®—with the goal of substantially accelerating the discovery and design of novel caloric materials is briefly discussed. We begin by describing material selection criteria based on known properties, which are then followed by fast heuristic estimates and ab initio calculations, all of which have been implemented in a set of automated computational tools and measurements. We also demonstrate how theoretical and computational methods serve as a guide for experimental efforts by considering a representative example from the field of magnetocaloric materials.

  2. Statistical Methods for Comparative Phenomics Using High-Throughput Phenotype Microarrays

    KAUST Repository

    Sturino, Joseph

    2010-01-24

    We propose statistical methods for comparing phenomics data generated by the Biolog Phenotype Microarray (PM) platform for high-throughput phenotyping. Instead of the routinely used visual inspection of data with no sound inferential basis, we develop two approaches. The first approach is based on quantifying the distance between mean or median curves from two treatments and then applying a permutation test; we also consider a permutation test applied to areas under mean curves. The second approach employs functional principal component analysis. Properties of the proposed methods are investigated on both simulated data and data sets from the PM platform.
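
    A small sketch of the first approach is given below, assuming two treatments with replicate growth curves recorded on a common time grid: the test statistic is the area between the two mean curves, and its null distribution is built by permuting curve labels. The data are synthetic, not Biolog PM measurements.

```python
# Sketch of a permutation test on the distance between mean curves
# from two treatments (synthetic replicate curves on a shared time grid).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 48, 25)  # hours
group_a = np.array([np.tanh(t / 12) + rng.normal(0, 0.05, t.size) for _ in range(8)])
group_b = np.array([np.tanh(t / 10) + rng.normal(0, 0.05, t.size) for _ in range(8)])

def statistic(a, b):
    # area between the two mean curves, via the trapezoidal rule
    return np.trapz(np.abs(a.mean(axis=0) - b.mean(axis=0)), t)

observed = statistic(group_a, group_b)
pooled = np.vstack([group_a, group_b])
n_a = group_a.shape[0]

perm_stats = []
for _ in range(2000):
    idx = rng.permutation(pooled.shape[0])
    perm_stats.append(statistic(pooled[idx[:n_a]], pooled[idx[n_a:]]))

p_value = (np.sum(np.array(perm_stats) >= observed) + 1) / (len(perm_stats) + 1)
print(f"observed distance = {observed:.3f}, permutation p = {p_value:.4f}")
```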

  3. Identification of novel KCNQ4 openers by a high-throughput fluorescence-based thallium flux assay.

    Science.gov (United States)

    Li, Qunyi; Rottländer, Mario; Xu, Mingkai; Christoffersen, Claus Tornby; Frederiksen, Kristen; Wang, Ming-Wei; Jensen, Henrik Sindal

    2011-11-01

    To develop a real-time thallium flux assay for high-throughput screening (HTS) of human KCNQ4 (Kv7.4) potassium channel openers, we used CHO-K1 cells stably expressing human KCNQ4 channel protein and a thallium-sensitive dye based on the permeability of thallium through potassium channels. The electrophysiological and pharmacological properties of the cell line expressing the KCNQ4 protein were found to be in agreement with that reported elsewhere. The EC(50) values of the positive control compound (retigabine) determined by the thallium and (86)rubidium flux assays were comparable to and consistent with those documented in the literature. Signal-to-background (S/B) ratio and Z factor of the thallium influx assay system were assessed to be 8.82 and 0.63, respectively. In a large-scale screening of 98,960 synthetic and natural compounds using the thallium influx assay, 76 compounds displayed consistent KCNQ4 activation, and of these 6 compounds demonstrated EC(50) values of less than 20 μmol/L and 2 demonstrated EC(50) values of less than 1 μmol/L. Taken together, the fluorescence-based thallium flux assay is a highly efficient, automatable, and robust tool to screen potential KCNQ4 openers. This approach may also be expanded to identify and evaluate potential modulators of other potassium channels. Copyright © 2011 Elsevier Inc. All rights reserved.
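
    As a generic illustration of how EC(50) values such as those quoted above are typically extracted from concentration-response data, a four-parameter logistic fit is sketched below; the responses are synthetic, not measurements from the thallium flux assay.

```python
# Sketch of an EC50 estimate via a standard four-parameter logistic (4PL) fit;
# the concentration-response data are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30])              # µmol/L
resp = np.array([0.05, 0.08, 0.18, 0.40, 0.65, 0.85, 0.95, 0.98])  # normalized signal

popt, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 1.0, 1.0, 1.0])
print(f"EC50 ~ {popt[2]:.2f} µmol/L, Hill slope ~ {popt[3]:.2f}")
```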

  4. Cytotoxicity Test Based on Human Cells Labeled with Fluorescent Proteins: Fluorimetry, Photography, and Scanning for High-Throughput Assay.

    Science.gov (United States)

    Kalinina, Marina A; Skvortsov, Dmitry A; Rubtsova, Maria P; Komarova, Ekaterina S; Dontsova, Olga A

    2018-06-01

    High- and medium-throughput assays are now routine methods for drug screening and toxicology investigations on mammalian cells. However, a simple and cost-effective analysis of cytotoxicity that can be carried out with commonly used laboratory equipment is still required. The developed cytotoxicity assays are based on human cell lines stably expressing eGFP, tdTomato, mCherry, or Katushka2S fluorescent proteins. Red fluorescent proteins exhibit a higher signal-to-noise ratio, due to less interference by medium autofluorescence, in comparison to green fluorescent protein. Measurements have been performed on a fluorescence scanner, a plate fluorimeter, and a camera photodocumentation system. For a 96-well plate assay, the sensitivity per well and the measurement duration were 250 cells and 15 min for the scanner, 500 cells and 2 min for the plate fluorimeter, and 1000 cells and less than 1 min for the camera detection. These sensitivities are similar to those of commonly used MTT (tetrazolium dye) assays. The scanner and the camera had not previously been applied to cytotoxicity evaluation. An image processing scheme for the high-resolution scanner is proposed that significantly diminishes the number of control wells, even for a library containing fluorescent substances. The suggested cytotoxicity assay has been verified by measurements of the cytotoxicity of several well-known cytotoxic drugs and further applied to test a set of novel bacteriotoxic compounds in a medium-throughput format. The fluorescent signal of living cells is detected without disturbing them or adding any reagents, thus allowing investigation of time-dependent cytotoxicity effects on the same sample of cells. A fast, simple and cost-effective assay is suggested for cytotoxicity evaluation based on mammalian cells expressing fluorescent proteins and commonly used laboratory equipment.
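
    A minimal sketch of how raw well fluorescence can be converted into a relative viability value is shown below, assuming untreated-control and cell-free blank wells on the same plate; the readings are invented for illustration.

```python
# Sketch: relative viability from well fluorescence, using blank and
# untreated-control wells on the same plate (invented readings).
import numpy as np

blank = np.array([120.0, 118.0, 122.0])        # medium-only wells (autofluorescence)
control = np.array([5200.0, 5050.0, 5300.0])   # untreated cells
treated = np.array([2100.0, 1980.0, 2240.0])   # drug-treated cells

viability = (treated.mean() - blank.mean()) / (control.mean() - blank.mean()) * 100
print(f"relative viability ~ {viability:.1f} % of untreated control")
```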

  5. High-Throughput Screening of the Asymmetric Decarboxylative Alkylation Reaction of Enolate-Stabilized Enol Carbonates

    KAUST Repository

    Stoltz, Brian

    2010-06-14

    The use of high-throughput screening allowed for the optimization of reaction conditions for the palladium-catalyzed asymmetric decarboxylative alkylation reaction of enolate-stabilized enol carbonates. Changing to a non-polar reaction solvent and to an electron-deficient PHOX derivative as ligand from our standard reaction conditions improved the enantioselectivity for the alkylation of a ketal-protected,1,3-diketone-derived enol carbonate from 28% ee to 84% ee. Similar improvements in enantioselectivity were seen for a β-keto-ester derived- and an α-phenyl cyclohexanone-derived enol carbonate.

  6. High-Throughput Screening of the Asymmetric Decarboxylative Alkylation Reaction of Enolate-Stabilized Enol Carbonates

    KAUST Repository

    Stoltz, Brian; McDougal, Nolan; Virgil, Scott

    2010-01-01

    The use of high-throughput screening allowed for the optimization of reaction conditions for the palladium-catalyzed asymmetric decarboxylative alkylation reaction of enolate-stabilized enol carbonates. Changing to a non-polar reaction solvent and to an electron-deficient PHOX derivative as ligand from our standard reaction conditions improved the enantioselectivity for the alkylation of a ketal-protected,1,3-diketone-derived enol carbonate from 28% ee to 84% ee. Similar improvements in enantioselectivity were seen for a β-keto-ester derived- and an α-phenyl cyclohexanone-derived enol carbonate.

  7. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  8. A cell-based high-throughput protocol to screen entry inhibitors of highly pathogenic viruses with Traditional Chinese Medicines.

    Science.gov (United States)

    Yang, Yong; Cheng, Han; Yan, Hui; Wang, Peng-Zhan; Rong, Rong; Zhang, Ying-Ying; Zhang, Cheng-Bo; Du, Rui-Kun; Rong, Li-Jun

    2017-05-01

    Emerging viruses such as Ebola virus (EBOV), Lassa virus (LASV), and avian influenza virus H5N1 (AIV) are global health concerns. Since there are very limited options (either vaccine or specific therapy) approved for humans against these viruses, there is an urgent need to develop prophylactic and therapeutic treatments. Previously we reported a high-throughput screening (HTS) protocol to identify entry inhibitors for three highly pathogenic viruses (EBOV, LASV, and AIV) using a human immunodeficiency virus-based pseudotyping platform which allows us to perform the screening in a BSL-2 facility. In this report, we have adopted this screening protocol to evaluate traditional Chinese Medicines (TCMs) in an effort to discover entry inhibitors against these viruses. Here we show that extracts of the following Chinese medicinal herbs exhibit potent anti-Ebola viral activities: Gardenia jasminoides Ellis, Citrus aurantium L., Viola yedoensis Makino, Prunella vulgaris L., Coix lacryma-jobi L. var. mayuen (Roman.) Stapf, Pinellia ternata (Thunb.) Breit., and Morus alba L. This study represents a proof-of-principle investigation supporting the suitability of this assay for rapidly screening TCMs and identifying putative entry inhibitors for these viruses. J. Med. Virol. 89:908-916, 2017. © 2016 Wiley Periodicals, Inc.

  9. Throughput of a MIMO OFDM based WLAN system

    NARCIS (Netherlands)

    Schenk, T.C.W.; Dolmans, G.; Modonesi, I.

    2004-01-01

    In this paper, the system throughput of a wireless local-area-network (WLAN) based on multiple-input multiple-output orthogonal frequency division multiplexing (MIMO OFDM) is studied. A broadband channel model is derived from indoor channel measurements. This model is used in simulations to evaluate

  10. Automation in Cytomics: A Modern RDBMS Based Platform for Image Analysis and Management in High-Throughput Screening Experiments

    NARCIS (Netherlands)

    E. Larios (Enrique); Y. Zhang (Ying); K. Yan (Kuan); Z. Di; S. LeDévédec (Sylvia); F.E. Groffen (Fabian); F.J. Verbeek

    2012-01-01

    In cytomics, bookkeeping of the data generated during lab experiments is crucial. The current approach in cytomics is to conduct High-Throughput Screening (HTS) experiments so that cells can be tested under many different experimental conditions. Given the large amount of different

  11. Cell-Based Reporter System for High-Throughput Screening of MicroRNA Pathway Inhibitors and Its Limitations

    Czech Academy of Sciences Publication Activity Database

    Bruštíková, Kateřina; Sedlák, David; Kubíková, Jana; Škuta, Ctibor; Šolcová, Kateřina; Malík, Radek; Bartůněk, Petr; Svoboda, Petr

    2018-01-01

    Vol. 9 (2018), Article No. 45. ISSN 1664-8021 R&D Projects: GA ČR GA13-29531S; GA MŠk LO1220; GA MŠk LM2015063; GA MŠk LM2011022 Institutional support: RVO:68378050 Keywords: miRNA * high-throughput screening * miR-30 * let-7 * Argonaute Subject RIV: EB - Genetics; Molecular Biology Impact factor: 3.789, year: 2016

  12. Fibonacci-based hardware post-processing for non-autonomous signum hyperchaotic system

    KAUST Repository

    Mansingka, Abhinav S.; Barakat, Mohamed L.; Zidan, Mohammed A.; Radwan, Ahmed Gomaa; Salama, Khaled N.

    2013-01-01

    This paper presents a hardware implementation of a robust non-autonomous hyperchaotic-based PRNG driven by a 256-bit LFSR. The original chaotic output is post-processed using a novel technique based on the Fibonacci series, bitwise XOR, rotation, and feedback. The proposed post-processing technique preserves the throughput of the system and enhances the randomness in the output which is verified by successfully passing all NIST SP. 800-22 tests. The system is realized on a Xilinx Virtex 4 FPGA achieving throughput up to 13.165 Gbits/s for 16-bit bus-width surpassing previously reported CB-PRNGs. © 2013 IEEE.
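
    An illustrative software sketch of the kind of post-processing described (mixing with Fibonacci-series terms, bitwise XOR, rotation, and feedback) is given below. It is not the authors' hardware design; the word width, constants, and mixing order are arbitrary choices for illustration only.

```python
# Illustrative post-processing of raw PRNG output words using Fibonacci-series
# mixing, XOR, data-dependent rotation, and feedback (not the paper's circuit).

WIDTH = 16
MASK = (1 << WIDTH) - 1

def rotl(x, r):
    r %= WIDTH
    return ((x << r) | (x >> (WIDTH - r))) & MASK

def postprocess(raw_words):
    """Mix raw chaotic output words with Fibonacci terms, XOR, rotation, feedback."""
    fib_a, fib_b = 1, 1
    feedback = 0
    out = []
    for w in raw_words:
        mixed = (w ^ feedback ^ fib_b) & MASK          # XOR with feedback and Fibonacci term
        mixed = rotl(mixed, fib_b & 0xF)               # data-dependent rotation
        feedback = mixed                               # feed the result back into the next step
        fib_a, fib_b = fib_b, (fib_a + fib_b) & MASK   # advance the Fibonacci sequence
        out.append(mixed)
    return out

print([hex(w) for w in postprocess([0x1234, 0xABCD, 0x0F0F, 0xFFFF])])
```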

  13. Fibonacci-based hardware post-processing for non-autonomous signum hyperchaotic system

    KAUST Repository

    Mansingka, Abhinav S.

    2013-12-01

    This paper presents a hardware implementation of a robust non-autonomous hyperchaotic-based PRNG driven by a 256-bit LFSR. The original chaotic output is post-processed using a novel technique based on the Fibonacci series, bitwise XOR, rotation, and feedback. The proposed post-processing technique preserves the throughput of the system and enhances the randomness in the output which is verified by successfully passing all NIST SP. 800-22 tests. The system is realized on a Xilinx Virtex 4 FPGA achieving throughput up to 13.165 Gbits/s for 16-bit bus-width surpassing previously reported CB-PRNGs. © 2013 IEEE.

  14. A high content, high throughput cellular thermal stability assay for measuring drug-target engagement in living cells.

    Science.gov (United States)

    Massey, Andrew J

    2018-01-01

    Determining and understanding drug target engagement is critical for drug discovery. This can be challenging within living cells as selective readouts are often unavailable. Here we describe a novel method for measuring target engagement in living cells based on the principle of altered protein thermal stabilization / destabilization in response to ligand binding. This assay (HCIF-CETSA) utilizes high content, high throughput single cell immunofluorescent detection to determine target protein levels following heating of adherent cells in a 96 well plate format. We have used target engagement of Chk1 by potent small molecule inhibitors to validate the assay. Target engagement measured by this method was subsequently compared to target engagement measured by two alternative methods (autophosphorylation and CETSA). The HCIF-CETSA method appeared robust, and a good correlation was observed between target engagement measured by this method and by CETSA for the selective Chk1 inhibitor V158411. However, these EC50 values were 23- and 12-fold greater than the autophosphorylation IC50. The described method is therefore a valuable advance on the CETSA method, allowing the high throughput determination of target engagement in adherent cells.

  15. Radioactive waste from non-power applications in Sweden

    International Nuclear Information System (INIS)

    Haegg, Ann-Christin; Lindbom, Gunilla; Persson, Monica

    2001-01-01

    regulations enable the free release of small amounts of radioactive waste either to the municipal sewage system or by delivery to a municipal dumpsite. Identified issues. It is not possible for the SSI to conduct more than a limited number of inspections. SSI relies on the licensee to inform the SSI when the source is no longer in use. An incentive for this is the annual fee mentioned above. Sources with activity below 500 MBq from facilities with a summary licence are not accounted for separately and can therefore be difficult to control. The only radioactive waste facility (recognised waste facility) with the capacity and the authorisation for taking care of disused radioactive sources and other forms of radioactive waste from Non-Power applications is Studsvik AB. The future costs of final disposal of this waste are unclear because of the lack of a final repository. Studsvik has to make sure that future costs are covered by the fee it charges for taking care of radioactive waste. As the only recognised waste facility, Studsvik can freely set the fee for taking care of radioactive waste. If the fee is set too high, there is a risk that waste from less scrupulous licence-holders will be 'lost' or kept in storage. Studsvik has no formal responsibility for taking care of used radioactive sources. It is not unrealistic that Studsvik may in the future decide not to accept a specific waste form. Commercial products: There are approximately 10 million fireguards containing about 40 kBq of Am-241 in Sweden. The average lifetime of a fireguard is 10 years, which implies that about one million fireguards are disposed of each year. SSI has issued regulations stating that private persons are allowed to occasionally dispose of a fireguard at municipal dumpsites. Companies are allowed to dispose of up to five fireguards each month. Identified issues: An assumption for the regulations was that the fireguards would not be disposed of at the same time or in the same place. A dilution was anticipated

  16. Ontology-based meta-analysis of global collections of high-throughput public data.

    Directory of Open Access Journals (Sweden)

    Ilya Kupershmidt

    2010-09-01

    Full Text Available The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.

  17. Ontology-based meta-analysis of global collections of high-throughput public data.

    Science.gov (United States)

    Kupershmidt, Ilya; Su, Qiaojuan Jane; Grewal, Anoop; Sundaresh, Suman; Halperin, Inbal; Flynn, James; Shekar, Mamatha; Wang, Helen; Park, Jenny; Cui, Wenwu; Wall, Gregory D; Wisotzkey, Robert; Alag, Satnam; Akhtari, Saeid; Ronaghi, Mostafa

    2010-09-29

    The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.
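
    As a generic illustration of a rank-based enrichment statistic of the kind mentioned above (not the framework's actual method), one can rank genes within a dataset and test whether a query gene set sits unusually high in that ranking via a normal approximation to the rank-sum statistic; the gene list below is synthetic.

```python
# Generic rank-based enrichment sketch: test whether a query gene set has
# unusually high ranks in one dataset's ranked gene list (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n_genes = 10_000
ranked_genes = [f"gene{i}" for i in rng.permutation(n_genes)]   # rank 1 = most up-regulated
query_set = {f"gene{i}" for i in range(50)}                     # hypothetical signature

positions = np.array([r for r, g in enumerate(ranked_genes, start=1) if g in query_set])
n, N = positions.size, n_genes

rank_sum = positions.sum()
mean = n * (N + 1) / 2.0
var = n * (N - n) * (N + 1) / 12.0
z = (mean - rank_sum) / np.sqrt(var)   # positive z => query genes sit near the top
print(f"enrichment z-score = {z:.2f}")
```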

  18. A high throughput array microscope for the mechanical characterization of biomaterials

    Science.gov (United States)

    Cribb, Jeremy; Osborne, Lukas D.; Hsiao, Joe Ping-Lin; Vicci, Leandra; Meshram, Alok; O'Brien, E. Tim; Spero, Richard Chasen; Taylor, Russell; Superfine, Richard

    2015-02-01

    In the last decade, the emergence of high throughput screening has enabled the development of novel drug therapies and elucidated many complex cellular processes. Concurrently, the mechanobiology community has developed tools and methods to show that the dysregulation of biophysical properties and the biochemical mechanisms controlling those properties contribute significantly to many human diseases. Despite these advances, a complete understanding of the connection between biomechanics and disease will require advances in instrumentation that enable parallelized, high throughput assays capable of probing complex signaling pathways, studying biology in physiologically relevant conditions, and capturing specimen and mechanical heterogeneity. Traditional biophysical instruments are unable to meet this need. To address the challenge of large-scale, parallelized biophysical measurements, we have developed an automated array high-throughput microscope system that utilizes passive microbead diffusion to characterize mechanical properties of biomaterials. The instrument is capable of acquiring data on twelve-channels simultaneously, where each channel in the system can independently drive two-channel fluorescence imaging at up to 50 frames per second. We employ this system to measure the concentration-dependent apparent viscosity of hyaluronan, an essential polymer found in connective tissue and whose expression has been implicated in cancer progression.
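
    The passive-microbead readout referred to above can be sketched as follows: estimate a diffusion coefficient from the mean-squared displacement of a tracked bead and convert it to an apparent viscosity with the Stokes-Einstein relation. The trajectory here is simulated and all parameter values are illustrative, not taken from the instrument.

```python
# Sketch of passive microrheology: MSD of a tracked bead -> diffusion
# coefficient -> apparent viscosity via Stokes-Einstein (simulated trajectory).
import numpy as np

kB = 1.380649e-23       # Boltzmann constant, J/K
T = 298.0               # temperature, K
radius = 0.5e-6         # bead radius, m (1 um diameter bead)
dt = 1.0 / 50.0         # frame interval for 50 fps imaging, s

rng = np.random.default_rng(2)
true_D = 4.0e-13        # m^2/s, used only to simulate a 2D Brownian trajectory
steps = rng.normal(0.0, np.sqrt(2 * true_D * dt), size=(5000, 2))
traj = np.cumsum(steps, axis=0)

# One-frame-lag MSD; for 2D Brownian motion MSD(dt) = 4*D*dt
msd = np.mean(np.sum(np.diff(traj, axis=0) ** 2, axis=1))
D_est = msd / (4 * dt)
eta = kB * T / (6 * np.pi * D_est * radius)   # Stokes-Einstein
print(f"D ~ {D_est:.2e} m^2/s, apparent viscosity ~ {eta*1e3:.2f} mPa*s")
```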

  19. High-throughput bioinformatics with the Cyrille2 pipeline system.

    NARCIS (Netherlands)

    Fiers, M.W.E.J.; Burgt, van der A.; Datema, E.; Groot, de J.C.W.; Ham, van R.C.H.J.

    2008-01-01

    Background - Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses

  20. GROMACS 4.5: A high-throughput and highly parallel open source molecular simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Pronk, Sander [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Pall, Szilard [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Schulz, Roland [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Larsson, Per [Univ. of Virginia, Charlottesville, VA (United States); Bjelkmar, Par [Science for Life Lab., Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden); Apostolov, Rossen [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Shirts, Michael R. [Univ. of Virginia, Charlottesville, VA (United States); Smith, Jeremy C. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kasson, Peter M. [Univ. of Virginia, Charlottesville, VA (United States); van der Spoel, David [Science for Life Lab., Stockholm (Sweden); Uppsala Univ., Uppsala (Sweden); Hess, Berk [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Lindahl, Erik [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden)

    2013-02-13

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on a massive scale in clusters, web servers, distributed computing or cloud resources. Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations.

  1. High-throughput screening for industrial enzyme production hosts by droplet microfluidics

    DEFF Research Database (Denmark)

    Sjostrom, Staffan L.; Bai, Yunpeng; Huang, Mingtao

    2014-01-01

    A high-throughput method for single cell screening by microfluidic droplet sorting is applied to a whole-genome mutated yeast cell library yielding improved production hosts of secreted industrial enzymes. The sorting method is validated by enriching a yeast strain 14 times based on its α-amylase production, close to the theoretical maximum enrichment. Furthermore, a 10⁵-member yeast cell library is screened yielding a clone with a more than 2-fold increase in α-amylase production. The increase in enzyme production results from an improvement of the cellular functions of the production host...

  2. High-throughput analysis of ammonia oxidiser community composition via a novel, amoA-based functional gene array.

    Directory of Open Access Journals (Sweden)

    Guy C J Abell

    Full Text Available Advances in microbial ecology research are more often than not limited by the capabilities of available methodologies. Aerobic autotrophic nitrification is one of the most important and well studied microbiological processes in terrestrial and aquatic ecosystems. We have developed and validated a microbial diagnostic microarray based on the ammonia-monooxygenase subunit A (amoA) gene, enabling the in-depth analysis of the community structure of bacterial and archaeal ammonia oxidisers. The amoA microarray has been successfully applied to analyse nitrifier diversity in marine, estuarine, soil and wastewater treatment plant environments. The microarray has moderate costs for labour and consumables and enables the analysis of hundreds of environmental DNA or RNA samples per week per person. The array has been thoroughly validated with a range of individual and complex targets (amoA clones and environmental samples, respectively), combined with parallel analysis using traditional sequencing methods. The moderate cost and high throughput of the microarray make it possible to adequately address broader questions of the ecology of microbial ammonia oxidation requiring high sample numbers and high resolution of the community composition.

  3. High-throughput automated parallel evaluation of zinc-based catalysts for the copolymerization of CHO and CO2 to polycarbonates

    NARCIS (Netherlands)

    Meerendonk, van W.J.; Duchateau, R.; Koning, C.E.; Gruter, G.J.M.

    2004-01-01

    Copolymerization of CO2 and oxiranes using a high-pressure autoclave typically allows one experiment per reactor per day. A high-throughput parallel setup was developed and validated for the copolymerization of CO2 and cyclohexene oxide (CHO) with two β-diiminato zinc complexes. The catalyst activity is affected by

  4. High-throughput preparation and testing of ion-exchanged zeolites

    International Nuclear Information System (INIS)

    Janssen, K.P.F.; Paul, J.S.; Sels, B.F.; Jacobs, P.A.

    2007-01-01

    A high-throughput research platform was developed for the preparation and subsequent catalytic liquid-phase screening of ion-exchanged zeolites, for instance with regard to their use as heterogeneous catalysts. In this system, aqueous solutions and other liquid as well as solid reagents are employed as starting materials and 24 samples are prepared on a library plate with a 4 x 6 layout. Volumetric dispensing of metal precursor solutions, weighing of zeolite and subsequent mixing/washing cycles of the starting materials and distributing reaction mixtures to the library plate are automatically performed by liquid and solid handlers controlled by a single common and easy-to-use programming software interface. The thus prepared materials are automatically contacted with reagent solutions, heated, stirred and sampled continuously using modified liquid handling. The high-throughput platform is highly promising for enhancing both the synthesis and the screening of catalysts. In this paper the preparation of lanthanum-exchanged NaY zeolites (LaNaY) on the platform is reported, along with their use as catalysts for the conversion of renewables

  5. A Self-Reporting Photocatalyst for Online Fluorescence Monitoring of High Throughput RAFT Polymerization.

    Science.gov (United States)

    Yeow, Jonathan; Joshi, Sanket; Chapman, Robert; Boyer, Cyrille Andre Jean Marie

    2018-04-25

    Translating controlled/living radical polymerization (CLRP) from batch to the high throughput production of polymer libraries presents several challenges in terms of both polymer synthesis and characterization. Although recently there have been significant advances in the field of low volume, high throughput CLRP, techniques able to simultaneously monitor multiple polymerizations in an "online" manner have not yet been developed. Here, we report our discovery that 5,10,15,20-tetraphenyl-21H,23H-porphine zinc (ZnTPP) is a self-reporting photocatalyst that can mediate PET-RAFT polymerization as well as report on monomer conversion via changes in its fluorescence properties. This enables the use of a microplate reader to conduct high throughput "online" monitoring of PET-RAFT polymerizations performed directly in 384-well, low volume microtiter plates. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Cantilever-type electrode array-based high-throughput microparticle sorting platform driven by gravitation and negative dielectrophoretic force

    International Nuclear Information System (INIS)

    Kim, Youngho; Kim, Byungkyu; Lee, Junghun; Kim, Younggeun; Shin, Sang-Mo

    2011-01-01

    In this paper, we describe a cantilever-type electrode (CE) array-based high-throughput sorting platform, which is a tool used to separate microparticles using gravitation and negative dielectrophoretic (n-DEP) force. This platform consists of meso-size channels and a CE array, which is designed to separate a large number of target particles by differences in their dielectric material properties (DMP) and the weight of the particles. We employ a two-step separation process, with sedimentation as the first step and n-DEP as the second step. In order to differentiate the weight and the DMP of each particle, we employ the sedimentation phenomena in a vertical channel and the CE-based n-DEP in an inclined channel. By using three kinds of polystyrene beads with diameters of 10, 25 and 50 µm, the optimal population (10 7 beads ml −1 ) of particles and the appropriate length (25 mm) of the vertical channel for high performance were determined experimentally. Conclusively, by combining sedimentation and n-DEP schemes, we achieve 74.5, 94.7 and 100% separation efficiency for sorting microparticles with a diameter of 10, 25 and 50 µm, respectively.

  7. Production of high intensity radioactive beams

    International Nuclear Information System (INIS)

    Nitschke, J.M.

    1990-04-01

    The production of radioactive nuclear beams world-wide is reviewed. The projectile fragmentation and the ISOL approaches are discussed in detail, and the luminosity parameter is used throughout to compare different production methods. In the ISOL approach a thin and a thick target option are distinguished. The role of storage rings in radioactive beam research is evaluated. It is concluded that radioactive beams produced by the projectile fragmentation and the ISOL methods have complementary characteristics and can serve to answer different scientific questions. The decision which kind of facility to build has to depend on the significance and breadth of these questions. Finally a facility for producing a high intensity radioactive beams near the Coulomb barrier is proposed, with an expected luminosity of ∼10 39 cm -2 s -1 , which would yield radioactive beams in excess of 10 11 s -1 . 9 refs., 3 figs., 7 tabs

  8. In-field High Throughput Phenotyping and Cotton Plant Growth Analysis Using LiDAR.

    Science.gov (United States)

    Sun, Shangpeng; Li, Changying; Paterson, Andrew H; Jiang, Yu; Xu, Rui; Robertson, Jon S; Snider, John L; Chee, Peng W

    2018-01-01

    Plant breeding programs and a wide range of plant science applications would greatly benefit from the development of in-field high throughput phenotyping technologies. In this study, a terrestrial LiDAR-based high throughput phenotyping system was developed. A 2D LiDAR was applied to scan plants from overhead in the field, and an RTK-GPS was used to provide spatial coordinates. Precise 3D models of scanned plants were reconstructed based on the LiDAR and RTK-GPS data. The ground plane of the 3D model was separated by RANSAC algorithm and a Euclidean clustering algorithm was applied to remove noise generated by weeds. After that, clean 3D surface models of cotton plants were obtained, from which three plot-level morphologic traits including canopy height, projected canopy area, and plant volume were derived. Canopy height ranging from 85th percentile to the maximum height were computed based on the histogram of the z coordinate for all measured points; projected canopy area was derived by projecting all points on a ground plane; and a Trapezoidal rule based algorithm was proposed to estimate plant volume. Results of validation experiments showed good agreement between LiDAR measurements and manual measurements for maximum canopy height, projected canopy area, and plant volume, with R 2 -values of 0.97, 0.97, and 0.98, respectively. The developed system was used to scan the whole field repeatedly over the period from 43 to 109 days after planting. Growth trends and growth rate curves for all three derived morphologic traits were established over the monitoring period for each cultivar. Overall, four different cultivars showed similar growth trends and growth rate patterns. Each cultivar continued to grow until ~88 days after planting, and from then on varied little. However, the actual values were cultivar specific. Correlation analysis between morphologic traits and final yield was conducted over the monitoring period. When considering each cultivar individually

  9. Theory and implementation of a very high throughput true random number generator in field programmable gate array

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong; Xu, Chao [Department of Modern Physics, University of Science and Technology of China, Hefei 230026 (China)

    2016-04-15

    The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.

  10. web cellHTS2: A web-application for the analysis of high-throughput screening data

    Directory of Open Access Journals (Sweden)

    Boutros Michael

    2010-04-01

    Full Text Available Abstract Background The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. Results The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. Conclusions The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.

  11. A quantification method for peroxyacetyl nitrate (PAN) using gas chromatography (GC) with a non-radioactive pulsed discharge detector (PDD)

    Science.gov (United States)

    Zhang, Lei; Jaffe, Daniel A.; Gao, Xin; McClure, Crystal D.

    2018-04-01

    In this study, we developed a method for continuous PAN measurements by gas chromatography (GC) with a non-radioactive pulsed discharge detector (PDD). Operational parameters were optimized based on the ratio of peak height over baseline noise (P/N ratio). The GC/PDD system was compared with a traditional radioactive electron-capture detector (ECD). In the lab, the method detection limit (MDL) of the new GC/PDD method (9 pptv) was lower than the radioactive GC/ECD method (15 pptv), demonstrating its excellent potential. The MDL of GC/PDD in the field campaign at the Mt. Bachelor Observatory (MBO) was 23 pptv, higher than in the lab. This was caused in part by the decreased slope of the calibration curve resulting from the low air pressure level at MBO. However, the MDL level of GC/PDD at MBO is still low enough for accurate PAN measurements, although special attention should be paid to its application at high-elevation sites. Observations of PAN were conducted at MBO in the summer of 2016 with the GC/PDD system, and provided more evidence of the performance of the system. PAN was found to be highly correlated with CO. The promising performance of GC/PDD which does not require a radioactive source makes it a useful approach for accurate PAN measurements in the field.

  12. A gas trapping method for high-throughput metabolic experiments.

    Science.gov (United States)

    Krycer, James R; Diskin, Ciana; Nelson, Marin E; Zeng, Xiao-Yi; Fazakerley, Daniel J; James, David E

    2018-01-01

    Research into cellular metabolism has become more high-throughput, with typical cell-culture experiments being performed in multiwell plates (microplates). This format presents a challenge when trying to collect gaseous products, such as carbon dioxide (CO2), which requires a sealed environment and a vessel separate from the biological sample. To address this limitation, we developed a gas trapping protocol using perforated plastic lids in sealed cell-culture multiwell plates. We used this trap design to measure CO2 production from glucose and fatty acid metabolism, as well as hydrogen sulfide production from cysteine-treated cells. Our data clearly show that this gas trap can be applied to liquid and solid gas-collection media and can be used to study gaseous product generation by both adherent cells and cells in suspension. Since our gas traps can be adapted to multiwell plates of various sizes, they present a convenient, cost-effective solution that can accommodate the trend toward high-throughput measurements in metabolic research.

  13. High-throughput technology for novel SO2 oxidation catalysts

    International Nuclear Information System (INIS)

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO 2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO 2 to SO 3 . High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO 2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method of monitoring the SO 2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO 3 from the product gas before the UV-Vis analysis. We also overview some elements used for prescreening and those remaining after the screening of the first catalyst generations. (topical review)

  14. Prospects for high-power radioactive beam facilities worldwide

    CERN Document Server

    Nolen, Jerry A

    2003-01-01

    Advances in accelerators, targets, ion sources, and experimental instrumentation are making possible ever more powerful facilities for basic and applied research with short-lived radioactive isotopes. There are several current generation facilities, based on a variety of technologies, operating worldwide. These include, for example, those based on the in-flight method such as the recently upgraded National Superconducting Cyclotron Laboratory at Michigan State University, the facility at RIKEN in Japan, GANIL in Caen, France, and GSI in Darmstadt, Germany. Present facilities based on the Isotope-Separator On-Line method include, for example, the ISOLDE laboratory at CERN, HRIBF at Oak Ridge, and the new high-power facility ISAC at TRIUMF in Vancouver. Next-generation facilities include the Radioactive-Ion Factory upgrade of RIKEN to higher energy and intensity and the upgrade of ISAC to a higher energy secondary beam; both of these projects are in progress. A new project, LINAG, to upgrade the capabilities at...

  15. Upscaling and automation of electrophysiology: toward high throughput screening in ion channel drug discovery

    DEFF Research Database (Denmark)

    Asmild, Margit; Oswald, Nicholas; Krzywkowski, Karen M

    2003-01-01

    by developing two lines of automated patch clamp products, a traditional pipette-based system called Apatchi-1, and a silicon chip-based system QPatch. The degree of automation spans from semi-automation (Apatchi-1) where a trained technician interacts with the system in a limited way, to a complete automation...... (QPatch 96) where the system works continuously and unattended until screening of a full compound library is completed. The performance of the systems range from medium to high throughputs....

  16. Final repositories for high-level radioactive waste; Endlagerung hochradioaktiver Abfaelle

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-10-15

    The brochure on final repositories for high-level radioactive waste covers the following issues: What is the origin of radioactive wastes? How large are the waste amounts? What is going to happen with the wastes? What is the solution for the Waste disposal? A new site search is started. Safety requirements for the final disposal of high-level radioactive wastes. Comparison of host rocks. Who is responsible and who will pay? Final disposal of high-level radioactive wastes worldwide. Short summary: History of the search for a final repository for high-level radioactive wastes in Germany.

  17. High Throughput In Situ XAFS Screening of Catalysts

    International Nuclear Information System (INIS)

    Tsapatsaris, Nikolaos; Beesley, Angela M.; Weiher, Norbert; Tatton, Helen; Schroeder, Sven L. M.; Dent, Andy J.; Mosselmans, Frederick J. W.; Tromp, Moniek; Russu, Sergio; Evans, John; Harvey, Ian; Hayama, Shu

    2007-01-01

    We outline and demonstrate the feasibility of high-throughput (HT) in situ XAFS for synchrotron radiation studies. An XAS data acquisition and control system for the analysis of dynamic materials libraries under control of temperature and gaseous environments has been developed. The system is compatible with the 96-well industry standard and coupled to multi-stream quadrupole mass spectrometry (QMS) analysis of reactor effluents. An automated analytical workflow generates data quickly compared to traditional individual spectrum acquisition and analyses them in quasi-real time using an HT data analysis tool based on IFFEFIT. The system was used for the automated characterization of a library of 91 catalyst precursors containing ternary combinations of Cu, Pt, and Au on γ-Al2O3, and for the in situ characterization of Au catalysts supported on Al2O3 and TiO2

  18. Proposed high throughput electrorefining treatment for spent N- Reactor fuel

    International Nuclear Information System (INIS)

    Gay, E.C.; Miller, W.E.; Laidler, J.J.

    1996-01-01

    A high-throughput electrorefining process is being adapted to treat spent N-Reactor fuel for ultimate disposal in a geologic repository. Anodic dissolution tests were made with unirradiated N-Reactor fuel to determine the type of fragmentation necessary to provide fuel segments suitable for this process. Based on these tests, a conceptual design was produced of a plant-scale electrorefiner. In this design, the diameter of an electrode assembly is about 1.07 m (42 in.). Three of these assemblies in an electrorefiner would accommodate a 3-metric-ton batch of N-Reactor fuel that would be processed at a rate of 42 kg of uranium per hour

  19. Laboratory Information Management Software for genotyping workflows: applications in high throughput crop genotyping

    Directory of Open Access Journals (Sweden)

    Prasanth VP

    2006-08-01

    Full Text Available Abstract Background With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and help with quality control, there is a strong need for a software system that can help with the tracking of samples and capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols, recording and annotating data from every step of the workflow. Results A laboratory information management system (LIMS has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT that meets the requirements of a moderately high throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. The application leads the user through each step of the process from starting an experiment to the storing of output data from the genotype detection step with auto-binning of alleles; thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is being presently used for the capture of high throughput SSR (simple-sequence repeat genotyping data from the legume (chickpea, groundnut and pigeonpea and cereal (sorghum and millets crops of importance in the semi-arid tropics. Conclusion A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping

  20. A proposed classification system for high-level and other radioactive wastes

    International Nuclear Information System (INIS)

    Kocher, D.C.; Croff, A.G.

    1989-01-01

    On the basis of the definition of high-level wastes (HLW) in the Nuclear Waste Policy Act of 1982 and previous descriptions of reprocessing wastes, a definition is proposed based on the concept that HLW is any waste which is highly radioactive and requires permanent isolation. This conceptual definition of HLW leads to a two-dimensional waste classification system in which one axis, related to 'highly radioactive', is associated with shorter-term risks from waste management and disposal due to high levels of decay heat and external radiation, and the other axis, related to 'requires permanent isolation', is associated with longer-term risks from waste disposal. Wastes that are highly radioactive are defined quantitatively as wastes with a decay heat (power density) greater than 50 W/m 3 or an external dose-equivalent rate greater than 100 rem/h (1 Sv/h) at a distance of 1 m from the waste, whichever is more restrictive. Wastes that require permanent isolation are defined quantitatively as wastes with concentrations of radionuclides greater than the Class-C limits that are generally acceptable for near-surface land disposal, as obtained from the Nuclear Regulatory Commission's 10 CFR Part 61 and its associated methodology. This proposal leads to similar definitions of two other waste classes: transuranic (TRU) waste and equivalent is any waste that requires permanent isolation but is not highly radioactive; and low-level waste (LLW) is any waste that does not require permanent isolation, without regard to whether or not it is highly radioactive. 31 refs.; 3 figs.; 4 tabs

  1. Alignment of time-resolved data from high throughput experiments.

    Science.gov (United States)

    Abidi, Nada; Franke, Raimo; Findeisen, Peter; Klawonn, Frank

    2016-12-01

    To better understand the dynamics of the underlying processes in cells, it is necessary to take measurements over a time course. Modern high-throughput technologies are often used for this purpose to measure the behavior of cell products like metabolites, peptides, proteins, [Formula: see text]RNA or mRNA at different points in time. Compared to classical time series, the number of time points is usually very limited and the measurements are taken at irregular time intervals. The main reasons for this are the costs of the experiments and the fact that the dynamic behavior usually shows a strong reaction and fast changes shortly after a stimulus and then slowly converges to a certain stable state. Another reason might simply be missing values. It is common to repeat the experiments and to have replicates in order to carry out a more reliable analysis. The ideal assumptions that the initial stimulus really started exactly at the same time for all replicates and that the replicates are perfectly synchronized are seldom satisfied. Therefore, there is a need to first adjust or align the time-resolved data before further analysis is carried out. Dynamic time warping (DTW) is considered as one of the common alignment techniques for time series data with equidistant time points. In this paper, we modified the DTW algorithm so that it can align sequences with measurements at different, non-equidistant time points with large gaps in between. This type of data is usually known as time-resolved data characterized by irregular time intervals between measurements as well as non-identical time points for different replicates. This new algorithm can be easily used to align time-resolved data from high-throughput experiments and to come across existing problems such as time scarcity and existing noise in the measurements. We propose a modified method of DTW to adapt requirements imposed by time-resolved data by use of monotone cubic interpolation splines. Our presented approach

  2. Portable non-destructive assay methods for screening and segregation of radioactive waste

    International Nuclear Information System (INIS)

    Simpson, Alan; Jones, Stephanie; Clapham, Martin; Lucero, Randy

    2011-01-01

    Significant cost-savings and operational efficiency may be realised by performing rapid non-destructive classification of radioactive waste at or near its point of retrieval or generation. There is often a need to quickly categorize and segregate bulk containers (drums, crates etc.) into waste streams defined at various boundary levels (based on its radioactive hazard) in order to meet disposal regulations and consignor waste acceptance criteria. Recent improvements in gamma spectroscopy technologies have provided the capability to perform rapid in-situ analysis using portable and hand-held devices such as battery-operated medium and high resolution detectors including lanthanum halide and high purity germanium (HPGe). Instruments and technologies that were previously the domain of complex lab systems are now widely available as touch-screen 'off-the-shelf' units. Despite such advances, the task of waste stream screening and segregation remains a complex exercise requiring a detailed understanding of programmatic requirements and, in particular, the capability to ensure data quality when operating in the field. This is particularly so when surveying historical waste drums and crates containing heterogeneous debris of unknown composition. The most widely used portable assay method is based upon far-field High Resolution Gamma Spectroscopy (HRGS) assay using HPGe detectors together with a well engineered deployment cart (such as the PSC TechniCART TM technology). Hand-held Sodium Iodide (NaI) detectors are often also deployed and may also be used to supplement the HPGe measurements in locating hot spots. Portable neutron slab monitors may also be utilised in cases where gamma measurements alone are not suitable. Several case histories are discussed at various sites where this equipment has been used for in-situ characterization of debris waste, sludge, soil, high activity waste, depleted and enriched uranium, heat source and weapons grade plutonium, fission products

  3. A Fast General-Purpose Clustering Algorithm Based on FPGAs for High-Throughput Data Processing

    CERN Document Server

    Annovi, A; The ATLAS collaboration; Castegnaro, A; Gatta, M

    2012-01-01

    We present a fast general-purpose algorithm for high-throughput clustering of data ”with a two dimensional organization”. The algorithm is designed to be implemented with FPGAs or custom electronics. The key feature is a processing time that scales linearly with the amount of data to be processed. This means that clustering can be performed in pipeline with the readout, without suffering from combinatorial delays due to looping multiple times through all the data. This feature makes this algorithm especially well suited for problems where the data has high density, e.g. in the case of tracking devices working under high-luminosity condition such as those of LHC or Super-LHC. The algorithm is organized in two steps: the first step (core) clusters the data; the second step analyzes each cluster of data to extract the desired information. The current algorithm is developed as a clustering device for modern high-energy physics pixel detectors. However, the algorithm has much broader field of applications. In ...

  4. Combinatorial electrochemical cell array for high throughput screening of micro-fuel-cells and metal/air batteries.

    Science.gov (United States)

    Jiang, Rongzhong

    2007-07-01

    An electrochemical cell array was designed that contains a common air electrode and 16 microanodes for high throughput screening of both fuel cells (based on polymer electrolyte membrane) and metal/air batteries (based on liquid electrolyte). Electrode materials can easily be coated on the anodes of the electrochemical cell array and screened by switching a graphite probe from one cell to the others. The electrochemical cell array was used to study direct methanol fuel cells (DMFCs), including high throughput screening of electrode catalysts and determination of optimum operating conditions. For screening of DMFCs, there is about 6% relative standard deviation (percentage of standard deviation versus mean value) for discharge current from 10 to 20 mAcm(2). The electrochemical cell array was also used to study tin/air batteries. The effect of Cu content in the anode electrode on the discharge performance of the tin/air battery was investigated. The relative standard deviations for screening of metal/air battery (based on zinc/air) are 2.4%, 3.6%, and 5.1% for discharge current at 50, 100, and 150 mAcm(2), respectively.

  5. High-throughput screening of carbohydrate-degrading enzymes using novel insoluble chromogenic substrate assay kits

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Willats, William George Tycho

    2016-01-01

    for this is that advances in genome and transcriptome sequencing, together with associated bioinformatics tools allow for rapid identification of candidate CAZymes, but technology for determining an enzyme's biochemical characteristics has advanced more slowly. To address this technology gap, a novel high-throughput assay...... CPH and ICB substrates are provided in a 96-well high-throughput assay system. The CPH substrates can be made in four different colors, enabling them to be mixed together and thus increasing assay throughput. The protocol describes a 96-well plate assay and illustrates how this assay can be used...... for screening the activities of enzymes, enzyme cocktails, and broths....

  6. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high......-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can...

  7. The application of the high throughput sequencing technology in the transposable elements.

    Science.gov (United States)

    Liu, Zhen; Xu, Jian-hong

    2015-09-01

    High throughput sequencing technology has dramatically improved the efficiency of DNA sequencing, and decreased the costs to a great extent. Meanwhile, this technology usually has advantages of better specificity, higher sensitivity and accuracy. Therefore, it has been applied to the research on genetic variations, transcriptomics and epigenomics. Recently, this technology has been widely employed in the studies of transposable elements and has achieved fruitful results. In this review, we summarize the application of high throughput sequencing technology in the fields of transposable elements, including the estimation of transposon content, preference of target sites and distribution, insertion polymorphism and population frequency, identification of rare copies, transposon horizontal transfers as well as transposon tagging. We also briefly introduce the major common sequencing strategies and algorithms, their advantages and disadvantages, and the corresponding solutions. Finally, we envision the developing trends of high throughput sequencing technology, especially the third generation sequencing technology, and its application in transposon studies in the future, hopefully providing a comprehensive understanding and reference for related scientific researchers.

  8. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines also known...... as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent...... advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress...

  9. The Principals and Practice of Distributed High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The potential of Distributed Processing Systems to deliver computing capabilities with qualities ranging from high availability and reliability to easy expansion in functionality and capacity were recognized and formalized in the 1970’s. For more three decade these principals Distributed Computing guided the development of the HTCondor resource and job management system. The widely adopted suite of software tools offered by HTCondor are based on novel distributed computing technologies and are driven by the evolving needs of High Throughput scientific applications. We will review the principals that underpin our work, the distributed computing frameworks and technologies we developed and the lessons we learned from delivering effective and dependable software tools in an ever changing landscape computing technologies and needs that range today from a desktop computer to tens of thousands of cores offered by commercial clouds. About the speaker Miron Livny received a B.Sc. degree in Physics and Mat...

  10. High-Throughput Sequencing, a VersatileWeapon to Support Genome-Based Diagnosis in Infectious Diseases: Applications to Clinical Bacteriology

    Directory of Open Access Journals (Sweden)

    Ségolène Caboche

    2014-04-01

    Full Text Available The recent progresses of high-throughput sequencing (HTS technologies enable easy and cost-reduced access to whole genome sequencing (WGS or re-sequencing. HTS associated with adapted, automatic and fast bioinformatics solutions for sequencing applications promises an accurate and timely identification and characterization of pathogenic agents. Many studies have demonstrated that data obtained from HTS analysis have allowed genome-based diagnosis, which has been consistent with phenotypic observations. These proofs of concept are probably the first steps toward the future of clinical microbiology. From concept to routine use, many parameters need to be considered to promote HTS as a powerful tool to help physicians and clinicians in microbiological investigations. This review highlights the milestones to be completed toward this purpose.

  11. Quantitative high throughput analytics to support polysaccharide production process development.

    Science.gov (United States)

    Noyes, Aaron; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Mukhopadhyay, Tarit

    2014-05-19

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiment (DOE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays are capable of measuring the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume over current practices. The

  12. High-Throughput Cancer Cell Sphere Formation for 3D Cell Culture.

    Science.gov (United States)

    Chen, Yu-Chih; Yoon, Euisik

    2017-01-01

    Three-dimensional (3D) cell culture is critical in studying cancer pathology and drug response. Though 3D cancer sphere culture can be performed in low-adherent dishes or well plates, the unregulated cell aggregation may skew the results. On contrary, microfluidic 3D culture can allow precise control of cell microenvironments, and provide higher throughput by orders of magnitude. In this chapter, we will look into engineering innovations in a microfluidic platform for high-throughput cancer cell sphere formation and review the implementation methods in detail.

  13. Vitrification of high-level radioactive and hazardous wastes

    International Nuclear Information System (INIS)

    Lutze, W.

    1993-12-01

    The main objective is to summarize work conducted on glasses as waste forms for high-level radioactive fission product solutions up to the late 1980's (section I and II). Section III addresses the question, whether waste forms designed for the immobilization of radioactive residues can be used for the same purpose for hazardous wastes. Of particular interest are those types of hazardous wastes, e.g., fly ashes from municipal combustion plants, easy to convert into glasses or ceramic materials. A large number of base glass compositions has been studied to vitrify waste from reprocessing but only borosilicate glasses with melting temperatures between 1100 C and 1200 C and very good hydrolytic stability is used today. (orig./HP) [de

  14. Association Study of Gut Flora in Coronary Heart Disease through High-Throughput Sequencing

    OpenAIRE

    Cui, Li; Zhao, Tingting; Hu, Haibing; Zhang, Wen; Hua, Xiuguo

    2017-01-01

    Objectives. We aimed to explore the impact of gut microbiota in coronary heart disease (CHD) patients through high-throughput sequencing. Methods. A total of 29 CHD in-hospital patients and 35 healthy volunteers as controls were included. Nucleic acids were extracted from fecal samples, followed by ? diversity and principal coordinate analysis (PCoA). Based on unweighted UniFrac distance matrices, unweighted-pair group method with arithmetic mean (UPGMA) trees were created. Results. After dat...

  15. Targeted DNA Methylation Analysis by High Throughput Sequencing in Porcine Peri-attachment Embryos

    OpenAIRE

    MORRILL, Benson H.; COX, Lindsay; WARD, Anika; HEYWOOD, Sierra; PRATHER, Randall S.; ISOM, S. Clay

    2013-01-01

    Abstract The purpose of this experiment was to implement and evaluate the effectiveness of a next-generation sequencing-based method for DNA methylation analysis in porcine embryonic samples. Fourteen discrete genomic regions were amplified by PCR using bisulfite-converted genomic DNA derived from day 14 in vivo-derived (IVV) and parthenogenetic (PA) porcine embryos as template DNA. Resulting PCR products were subjected to high-throughput sequencing using the Illumina Genome Analyzer IIx plat...

  16. Ultraspecific probes for high throughput HLA typing

    Directory of Open Access Journals (Sweden)

    Eggers Rick

    2009-02-01

    Full Text Available Abstract Background The variations within an individual's HLA (Human Leukocyte Antigen genes have been linked to many immunological events, e.g. susceptibility to disease, response to vaccines, and the success of blood, tissue, and organ transplants. Although the microarray format has the potential to achieve high-resolution typing, this has yet to be attained due to inefficiencies of current probe design strategies. Results We present a novel three-step approach for the design of high-throughput microarray assays for HLA typing. This approach first selects sequences containing the SNPs present in all alleles of the locus of interest and next calculates the number of base changes necessary to convert a candidate probe sequences to the closest subsequence within the set of sequences that are likely to be present in the sample including the remainder of the human genome in order to identify those candidate probes which are "ultraspecific" for the allele of interest. Due to the high specificity of these sequences, it is possible that preliminary steps such as PCR amplification are no longer necessary. Lastly, the minimum number of these ultraspecific probes is selected such that the highest resolution typing can be achieved for the minimal cost of production. As an example, an array was designed and in silico results were obtained for typing of the HLA-B locus. Conclusion The assay presented here provides a higher resolution than has previously been developed and includes more alleles than previously considered. Based upon the in silico and preliminary experimental results, we believe that the proposed approach can be readily applied to any highly polymorphic gene system.

  17. Bidirectional User Throughput Maximization Based on Feedback Reduction in LiFi Networks

    OpenAIRE

    Soltani, Mohammad Dehghani; Wu, Xiping; Safari, Majid; Haas, Harald

    2017-01-01

    Channel adaptive signalling, which is based on feedback, can result in almost any performance metric enhancement. Unlike the radio frequency (RF) channel, the optical wireless communications (OWCs) channel is fairly static. This feature enables a potential improvement of the bidirectional user throughput by reducing the amount of feedback. Light-Fidelity (LiFi) is a subset of OWCs, and it is a bidirectional, high-speed and fully networked wireless communication technology where visible light ...

  18. Complete inactivation of HIV-1 using photo-labeled non-nucleoside reverse transcriptase inhibitors.

    Science.gov (United States)

    Rios, Adan; Quesada, Jorge; Anderson, Dallas; Goldstein, Allan; Fossum, Theresa; Colby-Germinario, Susan; Wainberg, Mark A

    2011-01-01

    We demonstrate that a photo-labeled derivative of the non-nucleoside reverse transcriptase inhibitor (NNRTI) dapivirine termed DAPY, when used together with exposure to ultraviolet light, was able to completely and irreversibly inactivate both HIV-1 RT activity as well as infectiousness in each of a T cell line and peripheral blood mononuclear cells. Control experiments using various concentrations of DAPY revealed that a combination of exposure to ultraviolet light together with use of the specific, high affinity photo-labeled compound was necessary for complete inactivation to occur. This method of HIV RT inactivation may have applicability toward preservation of an intact viral structure and warrants further investigation in regard to the potential of this approach to elicit a durable, broad protective immune response. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. Probabilistic Methods for Processing High-Throughput Sequencing Signals

    DEFF Research Database (Denmark)

    Sørensen, Lasse Maretty

    High-throughput sequencing has the potential to answer many of the big questions in biology and medicine. It can be used to determine the ancestry of species, to chart complex ecosystems and to understand and diagnose disease. However, going from raw sequencing data to biological or medical insig....... By estimating the genotypes on a set of candidate variants obtained from both a standard mapping-based approach as well as de novo assemblies, we are able to find considerably more structural variation than previous studies...... for reconstructing transcript sequences from RNA sequencing data. The method is based on a novel sparse prior distribution over transcript abundances and is markedly more accurate than existing approaches. The second chapter describes a new method for calling genotypes from a fixed set of candidate variants....... The method queries the reads using a graph representation of the variants and hereby mitigates the reference-bias that characterise standard genotyping methods. In the last chapter, we apply this method to call the genotypes of 50 deeply sequencing parent-offspring trios from the GenomeDenmark project...

  20. SINA: accurate high-throughput multiple sequence alignment of ribosomal RNA genes.

    Science.gov (United States)

    Pruesse, Elmar; Peplies, Jörg; Glöckner, Frank Oliver

    2012-07-15

    In the analysis of homologous sequences, computation of multiple sequence alignments (MSAs) has become a bottleneck. This is especially troublesome for marker genes like the ribosomal RNA (rRNA) where already millions of sequences are publicly available and individual studies can easily produce hundreds of thousands of new sequences. Methods have been developed to cope with such numbers, but further improvements are needed to meet accuracy requirements. In this study, we present the SILVA Incremental Aligner (SINA) used to align the rRNA gene databases provided by the SILVA ribosomal RNA project. SINA uses a combination of k-mer searching and partial order alignment (POA) to maintain very high alignment accuracy while satisfying high throughput performance demands. SINA was evaluated in comparison with the commonly used high throughput MSA programs PyNAST and mothur. The three BRAliBase III benchmark MSAs could be reproduced with 99.3, 97.6 and 96.1 accuracy. A larger benchmark MSA comprising 38 772 sequences could be reproduced with 98.9 and 99.3% accuracy using reference MSAs comprising 1000 and 5000 sequences. SINA was able to achieve higher accuracy than PyNAST and mothur in all performed benchmarks. Alignment of up to 500 sequences using the latest SILVA SSU/LSU Ref datasets as reference MSA is offered at http://www.arb-silva.de/aligner. This page also links to Linux binaries, user manual and tutorial. SINA is made available under a personal use license.

  1. Morphology control in polymer blend fibers—a high throughput computing approach

    Science.gov (United States)

    Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar

    2016-08-01

    Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat and anisotropic wicking of moisture, coaxial morphologies for optical applications as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent based fabrication. We focus on a PS: PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for specific morphology class. We finally demonstrate the potential for time dependent processing conditions to get desired features of the morphology. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high throughput computing.

  2. Modular high-throughput test stand for versatile screening of thin-film materials libraries

    International Nuclear Information System (INIS)

    Thienhaus, Sigurd; Hamann, Sven; Ludwig, Alfred

    2011-01-01

    Versatile high-throughput characterization tools are required for the development of new materials using combinatorial techniques. Here, we describe a modular, high-throughput test stand for the screening of thin-film materials libraries, which can carry out automated electrical, magnetic and magnetoresistance measurements in the temperature range of −40 to 300 °C. As a proof of concept, we measured the temperature-dependent resistance of Fe–Pd–Mn ferromagnetic shape-memory alloy materials libraries, revealing reversible martensitic transformations and the associated transformation temperatures. Magneto-optical screening measurements of a materials library identify ferromagnetic samples, whereas resistivity maps support the discovery of new phases. A distance sensor in the same setup allows stress measurements in materials libraries deposited on cantilever arrays. A combination of these methods offers a fast and reliable high-throughput characterization technology for searching for new materials. Using this approach, a composition region has been identified in the Fe–Pd–Mn system that combines ferromagnetism and martensitic transformation.

  3. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit.

    Science.gov (United States)

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R; Smith, Jeremy C; Kasson, Peter M; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-04-01

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. GROMACS is an open source and free software available from http://www.gromacs.org. Supplementary data are available at Bioinformatics online.

  4. Radioactive waste and transport. Chapter 6

    International Nuclear Information System (INIS)

    1978-01-01

    A brief definition of the nature of radioactive waste is followed by a more detailed discussion of high level waste, its composition the amounts involved, storage in liquid and in solid form and the storage of non-reprocessed spent fuel. The final disposal of high level waste in deep geological structures is then described, based on the Swedish KBS study. The effectiveness of the artificial and natural barriers in preventing the radioactive substances from reaching the biosphere is discussed. American and Swedish risk analyses are briefly discussed, and practical experience presented. Low and medium level wastes are thereafter treated in a similar, though briefer manner. Transport of radioactive materials, fresh fuel, spent fuel and waste is then dealt with. Regulations for the containers and their tests are briefly presented and the risk of accidents, theft and sabotage during transport are discussed. (JIW)

  5. Radioactive air emissions from non-uranium mining operations

    International Nuclear Information System (INIS)

    Silhanek, J.S.; Andrews, V.E.

    1981-01-01

    Section 122 of the Clean Air Act Amendments of 1977, Public Law 9595, directed the Administrator of the Environmental Protection Agency to review all relevant information and determine whether emissions of radioactive pollutants into ambient air will cause or contribute to air pollution which may reasonably be anticipated to endanger public health. A section of this document presented a theoretical analysis of the radioactive airborne emissions from several non-uranium mines including iron, copper, zinc, clay, limestone, fluorspar, and phosphate. Since 1978 EPA's Las Vegas Laboratory has been gathering field data on actual radionuclide emissions from these mines to support the earlier theoretical analysis. The purpose of this paper is to present the results of those field measurements in comparison with the assumed values for the theoretical analysis

  6. High-throughput differentiation of heparin from other glycosaminoglycans by pyrolysis mass spectrometry.

    Science.gov (United States)

    Nemes, Peter; Hoover, William J; Keire, David A

    2013-08-06

    Sensors with high chemical specificity and enhanced sample throughput are vital to screening food products and medical devices for chemical or biochemical contaminants that may pose a threat to public health. For example, the rapid detection of oversulfated chondroitin sulfate (OSCS) in heparin could prevent reoccurrence of heparin adulteration that caused hundreds of severe adverse events including deaths worldwide in 2007-2008. Here, rapid pyrolysis is integrated with direct analysis in real time (DART) mass spectrometry to rapidly screen major glycosaminoglycans, including heparin, chondroitin sulfate A, dermatan sulfate, and OSCS. The results demonstrate that, compared to traditional liquid chromatography-based analyses, pyrolysis mass spectrometry achieved at least 250-fold higher sample throughput and was compatible with samples volume-limited to about 300 nL. Pyrolysis yielded an abundance of fragment ions (e.g., 150 different m/z species), many of which were specific to the parent compound. Using multivariate and statistical data analysis models, these data enabled facile differentiation of the glycosaminoglycans with high throughput. After method development was completed, authentically contaminated samples obtained during the heparin crisis by the FDA were analyzed in a blinded manner for OSCS contamination. The lower limit of differentiation and detection were 0.1% (w/w) OSCS in heparin and 100 ng/μL (20 ng) OSCS in water, respectively. For quantitative purposes the linear dynamic range spanned approximately 3 orders of magnitude. Moreover, this chemical readout was successfully employed to find clues in the manufacturing history of the heparin samples that can be used for surveillance purposes. The presented technology and data analysis protocols are anticipated to be readily adaptable to other chemical and biochemical agents and volume-limited samples.

  7. Micro-patterned agarose gel devices for single-cell high-throughput microscopy of E. coli cells.

    Science.gov (United States)

    Priest, David G; Tanaka, Nobuyuki; Tanaka, Yo; Taniguchi, Yuichi

    2017-12-21

    High-throughput microscopy of bacterial cells has elucidated fundamental cellular processes, including cellular heterogeneity and cell division homeostasis. Polydimethylsiloxane (PDMS)-based microfluidic devices offer advantages such as precise positioning of cells and high throughput; however, device fabrication is time-consuming and requires specialised skills. Agarose pads are a popular alternative, but cells often clump together, which hinders single-cell quantitation. Here, we imprint agarose pads with micro-patterned 'capsules' to trap individual cells and 'lines' to direct cellular growth outwards in a straight line. We implement this micro-patterning in multi-pad devices called CapsuleHotel and LineHotel for high-throughput imaging. CapsuleHotel provides ~65,000 capsule structures per mm² that isolate individual Escherichia coli cells. In contrast, LineHotel provides ~300 line structures per mm that direct the growth of micro-colonies. With CapsuleHotel, a quantitative single-cell dataset of ~10,000 cells across 24 samples can be acquired and analysed in under 1 hour. LineHotel allows the growth of >10 micro-colonies across 24 samples to be tracked simultaneously for up to 4 generations. These easy-to-use devices can be provided in kit format, and will accelerate discoveries in diverse fields ranging from microbiology to systems and synthetic biology.
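
    As an illustration of the single-cell quantitation such capsule arrays enable, the sketch below segments a fluorescence image and reports per-object mean intensities with scikit-image. It is a generic analysis step under assumed settings (Otsu thresholding, a minimum-area filter, a synthetic image), not the authors' published analysis.

        # Hedged sketch: per-cell fluorescence quantitation from a single-channel image
        # of trapped cells. The image is synthetic and the threshold/size filter are
        # assumptions for illustration only.
        import numpy as np
        from skimage import filters, measure

        rng = np.random.default_rng(1)
        image = rng.poisson(5, size=(512, 512)).astype(float)       # placeholder image

        mask = image > filters.threshold_otsu(image)                # segment bright pixels
        labels = measure.label(mask)                                # connected components
        props = measure.regionprops(labels, intensity_image=image)
        cells = [p.mean_intensity for p in props if p.area >= 20]   # drop tiny specks
        print(f"{len(cells)} objects quantified")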

  8. NMR-Based Identification of Metabolites in Polar and Non-Polar Extracts of Avian Liver.

    Science.gov (United States)

    Fathi, Fariba; Brun, Antonio; Rott, Katherine H; Falco Cobra, Paulo; Tonelli, Marco; Eghbalnia, Hamid R; Caviedes-Vidal, Enrique; Karasov, William H; Markley, John L

    2017-11-16

    Metabolites present in liver provide important clues regarding the physiological state of an organism. The aim of this work was to evaluate a protocol for high-throughput NMR-based analysis of polar and non-polar metabolites from a small quantity of liver tissue. We extracted the tissue with a methanol/chloroform/water mixture and isolated the polar metabolites from the methanol/water layer and the non-polar metabolites from the chloroform layer. Following drying, we re-solubilized the fractions for analysis with a 600 MHz NMR spectrometer equipped with a 1.7 mm cryogenic probe. In order to evaluate the feasibility of this protocol for metabolomics studies, we analyzed the metabolic profile of livers from house sparrow (Passer domesticus) nestlings raised on two different diets: livers from 10 nestlings raised on a high protein diet (HP) for 4 d and livers from 12 nestlings raised on the HP diet for 3 d and then switched to a high carbohydrate diet (HC) for 1 d. The protocol enabled the detection of 52 polar and 9 non-polar metabolites in ¹H NMR spectra of the extracts. We analyzed the lipophilic metabolites by one-way ANOVA to assess statistically significant concentration differences between the two groups. The results of our studies demonstrate that the protocol described here can be exploited for high-throughput screening of small quantities of liver tissue (approx. 100 mg wet mass) obtainable from small animals.
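
    The group comparison described in the abstract, one-way ANOVA on each lipophilic metabolite's concentration in the HP (n = 10) versus HC-switched (n = 12) nestlings, reduces to a few lines; with only two groups it is equivalent to an unpaired t-test. The concentration values below are invented for illustration.

        # Hedged sketch: one-way ANOVA for a single metabolite between the two diet
        # groups. The numbers are hypothetical, not the study's measurements.
        from scipy import stats

        hp_group = [3.1, 2.8, 3.4, 2.9, 3.0, 3.2, 2.7, 3.3, 3.1, 2.9]           # n = 10
        hc_group = [2.2, 2.5, 2.1, 2.4, 2.6, 2.3, 2.0, 2.5, 2.2, 2.4, 2.3, 2.1]  # n = 12

        f_stat, p_value = stats.f_oneway(hp_group, hc_group)
        print(f"F = {f_stat:.2f}, p = {p_value:.3g}")

    In practice the test is repeated for each detected lipophilic metabolite.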

  9. Ion sources development at GANIL for radioactive beams and high charge state ions

    International Nuclear Information System (INIS)

    Leroy, R.; Barue, C.; Canet, C.; Dupuis, M.; Flambard, J.L.; Gaubert, G.; Gibouin, S.; Huguet, Y.; Jardin, P.; Lecesne, N.; Leherissier, P.; Lemagnen, F.; Pacquet, J.Y.; Pellemoine-Landre, F.; Rataud, J.P.; Saint-Laurent, M.G.; Villari, A.C.C.; Maunoury, L.

    2001-01-01

    The GANIL laboratory is in charge of producing ion beams for nuclear and non-nuclear physics. This article reviews the latest developments underway in the fields of radioactive ion beam production, the increase of metallic ion intensities and the production of highly charged ion beams. (authors)

  10. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  11. Non-fuel cycle radioactive waste policy in Turkey

    International Nuclear Information System (INIS)

    Demirel, H.

    2003-01-01

    Radioactive waste generated in Turkey is mostly low level waste arising from the operation of one research reactor, from research centres and universities, from hospitals, and from radiological applications in various industries. The disused sealed sources that potentially represent medium and high radiological risks in Turkey are mainly Am-241, Ra-226, Kr-85, Co-60, Ir-192 and Cs-137. All radioactive waste produced in Turkey is collected, segregated, conditioned and stored at the CWPSF. The main components of the facility are as follows: liquid waste is treated in a chemical processing unit where precipitation is applied; compactable solids are compressed in a compaction cell; spent sources are embedded in cement mortar with their original shielding. If the source activities amount to several millicuries, dismantling is sometimes applied and the segregated sources are conditioned in shielded drums. Owing to the increasing number of radiation- and nuclear-related activities, the waste facility of CNAEM is becoming insufficient to meet the storage demand of the country. TAEA is now in a position to establish a new radioactive waste management facility, and studies are being carried out on the selection of the best site for the final storage of processed radioactive waste. Research and development studies in TAEA should continue in radioactive waste management with the aim of improving data, models, and concepts related to the long-term safety of disposal of long-lived waste.

  12. Tiered High-Throughput Screening Approach to Identify ...

    Science.gov (United States)

    High-throughput screening (HTS) for potential thyroid-disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limited in the US EPA ToxCast screening assay portfolio. To fill one critical screening gap, the Amplex UltraRed-thyroperoxidase (AUR-TPO) assay was developed to identify chemicals that inhibit TPO, as decreased TPO activity reduces TH synthesis. The ToxCast Phase I and II chemical libraries, comprising 1,074 unique chemicals, were initially screened at a single high concentration to identify potential TPO inhibitors. Chemicals positive in the single-concentration screen were retested in concentration-response. Because of the high false-positive rates typically observed with loss-of-signal assays such as AUR-TPO, we also employed two additional assays in parallel to identify possible sources of nonspecific assay signal loss, enabling stratification of roughly 300 putative TPO inhibitors based upon selective AUR-TPO activity. A cell-free luciferase inhibition assay was used to identify nonspecific enzyme inhibition among the putative TPO inhibitors, and a cytotoxicity assay using a human cell line was used to estimate the cellular tolerance limit. Additionally, the TPO inhibition activities of 150 chemicals were compared between the AUR-TPO and an orthogonal peroxidase oxidation assay using
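
    The stratification step described above, keeping only chemicals whose AUR-TPO activity is not explained by nonspecific signal loss, can be sketched as a simple selectivity filter against the two counter-assays. The column names, potency values and 3-fold margin below are illustrative assumptions, not the published ToxCast criteria.

        # Hedged sketch of a selectivity filter for a loss-of-signal assay: a chemical
        # is retained as a putative TPO inhibitor only if it is clearly more potent in
        # AUR-TPO than in the luciferase and cytotoxicity counter-assays.
        import pandas as pd

        hits = pd.DataFrame({
            "chemical":        ["chem_A", "chem_B", "chem_C"],   # hypothetical names
            "tpo_ac50_uM":     [1.2, 15.0, 0.8],                 # AUR-TPO potency
            "luc_ac50_uM":     [50.0, 18.0, 40.0],               # nonspecific enzyme inhibition
            "cytotox_ac50_uM": [80.0, 20.0, 2.0],                # cellular tolerance limit
        })

        margin = 3.0   # assumed 3-fold separation requirement
        selective = hits[(hits["luc_ac50_uM"] / hits["tpo_ac50_uM"] >= margin)
                         & (hits["cytotox_ac50_uM"] / hits["tpo_ac50_uM"] >= margin)]
        print(selective["chemical"].tolist())   # -> ['chem_A']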

  13. A comparison of high-throughput techniques for assaying circadian rhythms in plants.

    Science.gov (United States)

    Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony

    2015-01-01

    Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.

  14. High-throughput volumetric reconstruction for 3D wheat plant architecture studies

    Directory of Open Access Journals (Sweden)

    Wei Fang

    2016-09-01

    For many tiller crops, the plant architecture (PA), including the plant fresh weight, plant height, number of tillers, tiller angle and stem diameter, significantly affects the grain yield. In this study, we propose a method based on volumetric reconstruction for high-throughput three-dimensional (3D) wheat PA studies. The proposed methodology involves plant volumetric reconstruction from multiple images, plant model processing, and phenotypic parameter estimation and analysis. This study was performed on 80 Triticum aestivum plants, and the results were analyzed. Comparing the automated measurements with manual measurements, the mean absolute percentage error (MAPE) was 2.71% (1.08 cm, with an average plant height of 40.07 cm) for the plant height and 10.06% (1.41 g, with an average plant fresh weight of 14.06 g) for the plant fresh weight. The root mean square error (RMSE) was 1.37 cm and 1.79 g for the plant height and plant fresh weight, respectively. The correlation coefficients were 0.95 and 0.96 for the plant height and plant fresh weight, respectively. Additionally, the proposed methodology, including plant reconstruction, model processing and trait extraction, required only approximately 20 s on average per plant using parallel computing on a graphics processing unit (GPU), demonstrating that the methodology would be valuable for a high-throughput phenotyping platform.
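
    The agreement statistics quoted above (MAPE, RMSE and the correlation coefficient between automated and manual measurements) are computed as in the short sketch below; the measurement arrays are placeholders, not the study's data.

        # Hedged sketch: agreement metrics between automated and manual plant-height
        # measurements. Values are hypothetical.
        import numpy as np

        manual    = np.array([38.5, 41.2, 39.8, 42.0, 40.1])   # cm, hypothetical
        automated = np.array([39.0, 40.5, 40.6, 41.1, 39.5])   # cm, hypothetical

        mape = np.mean(np.abs(automated - manual) / manual) * 100
        rmse = np.sqrt(np.mean((automated - manual) ** 2))
        r    = np.corrcoef(manual, automated)[0, 1]
        print(f"MAPE = {mape:.2f} %, RMSE = {rmse:.2f} cm, r = {r:.2f}")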

  15. High-throughput measurement of recombination rates and genetic interference in Saccharomyces cerevisiae.

    Science.gov (United States)

    Raffoux, Xavier; Bourge, Mickael; Dumas, Fabrice; Martin, Olivier C; Falque, Matthieu

    2018-06-01

    Allelic recombination owing to meiotic crossovers is a major driver of genome evolution, as well as a key player in the selection of high-performing genotypes in economically important species. Therefore, we developed a high-throughput and low-cost method to measure recombination rates and crossover patterning (including interference) in large populations of the budding yeast Saccharomyces cerevisiae. Recombination and interference were analysed by flow cytometry, which allows time-consuming steps such as tetrad microdissection or spore growth to be avoided. Moreover, our method can also be used to compare recombination in wild-type vs. mutant individuals or under different environmental conditions, even if the changes in recombination rates are small. Furthermore, meiotic mutants often present recombination and/or pairing defects affecting spore viability, but our method does not involve growth steps and thus avoids filtering out non-viable spores. Copyright © 2018 John Wiley & Sons, Ltd.
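
    Downstream of the flow-cytometry counts, the recombination and interference arithmetic is straightforward: a recombinant fraction per marker interval, and interference expressed as one minus the coefficient of coincidence. The spore counts below are hypothetical and stand in for the marker-class frequencies the method measures.

        # Hedged sketch: recombination fractions and crossover interference from spore
        # counts in a three-marker cross. Counts are hypothetical.
        total_spores = 100_000
        rec_interval_1 = 8_200    # spores recombinant in interval 1
        rec_interval_2 = 9_500    # spores recombinant in interval 2
        double_rec     = 450      # spores recombinant in both intervals

        rf1 = rec_interval_1 / total_spores
        rf2 = rec_interval_2 / total_spores
        expected_doubles = rf1 * rf2                          # if crossovers were independent
        coc = (double_rec / total_spores) / expected_doubles  # coefficient of coincidence
        interference = 1 - coc                                # > 0: positive interference
        print(f"RF1 = {rf1:.3f}, RF2 = {rf2:.3f}, CoC = {coc:.2f}, I = {interference:.2f}")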

  16. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate a considerable amount of data that often requires bioinformatic expertise to analyze. Here we present the High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting data produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets through a graphical user interface (GUI); therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks, including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction and filtering with external criteria files. HTDP was developed to provide functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility in input file handling provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as open source software (https://github.com/pmadanecki/htdp).
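
    As an illustration of the kind of merge-and-filter task HTDP targets, the sketch below performs a comparable operation in script form with pandas: two delimited result tables are merged on a shared column and then filtered with an external criteria list. The tables, column names and threshold are hypothetical and are not part of HTDP itself.

        # Hedged sketch of a merge-and-filter task analogous to the HTDP use case.
        # The inline tables stand in for tab-delimited files that would normally be
        # loaded with pd.read_csv(..., sep="\t"); all names are hypothetical.
        import pandas as pd

        variants   = pd.DataFrame({"gene": ["NF1", "TP53", "BRCA1"], "depth": [35, 12, 60]})
        expression = pd.DataFrame({"gene": ["NF1", "TP53", "BRCA1"], "log2_fc": [1.8, -0.2, 0.9]})
        gene_panel = pd.DataFrame({"gene": ["NF1", "BRCA1"]})        # external criteria list

        merged = variants.merge(expression, on="gene")               # merge on shared column
        kept = merged[merged["gene"].isin(gene_panel["gene"]) & (merged["depth"] >= 20)]
        print(kept)   # NF1 and BRCA1 rows remain; TP53 is dropped by both criteria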

  17. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants is becoming more and more obvious in studies of the relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for genome analysis. Nowadays, any genetic experiment involves data on thousands or tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which provides much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and the exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integrating genotypic and phenotypic data with parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant, storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  18. High-throughput assessment of context-dependent effects of chromatin proteins

    NARCIS (Netherlands)

    Brueckner, L. (Laura); Van Arensbergen, J. (Joris); Akhtar, W. (Waseem); L. Pagie (Ludo); B. van Steensel (Bas)

    2016-01-01

    Background: Chromatin proteins control gene activity in a concerted manner. We developed a high-throughput assay to study the effects of the local chromatin environment on the regulatory activity of a protein of interest. The assay combines a previously reported multiplexing strategy

  19. High-throughput open source computational methods for genetics and genomics

    NARCIS (Netherlands)

    Prins, J.C.P.

    2015-01-01

    Biology is increasingly data driven by virtue of the development of high-throughput technologies, such as DNA and RNA sequencing. Computational biology and bioinformatics are scientific disciplines that cross over between biology, informatics and statistics, which is clearly

  20. tcpl: The ToxCast Pipeline for High-Throughput Screening Data

    Science.gov (United States)

    Motivation: The large and diverse high-throughput chemical screening efforts carried out by the US EPA ToxCast program require an efficient, transparent, and reproducible data pipeline. Summary: The tcpl R package and its associated MySQL database provide a generalized platform fo...