WorldWideScience

Sample records for cost-effective high-throughput analysis

  1. High-throughput determination of vancomycin in human plasma by a cost-effective system of two-dimensional liquid chromatography.

    Science.gov (United States)

    Sheng, Yanghao; Zhou, Boting

    2017-05-26

    Therapeutic drug monitoring (TDM) is one of the most important services of clinical laboratories. Two main techniques are commonly used: immunoassays and chromatographic methods. We have developed a cost-effective system of two-dimensional liquid chromatography with ultraviolet detection (2D-LC-UV) for high-throughput determination of vancomycin in human plasma. It combines the automation and low start-up costs of the immunoassay with the high selectivity and sensitivity of liquid chromatography coupled with mass spectrometric detection, without incurring their disadvantages, thus achieving high cost-effectiveness. This 2D-LC system uses large-volume injection to provide sufficient sensitivity and simulated gradient peak compression technology to control peak broadening and improve peak shape. A middle column was added to shorten the analysis cycle and make the method suitable for high-throughput routine clinical assays. The analysis cycle time was 4 min and the peak width was 0.8 min, both significantly shorter than in other chromatographic methods developed to date. The lower limit of quantification was 0.20 μg/mL for vancomycin, matching recently developed and validated LC-MS/MS methods. The method is rapid, automated, and low-cost and has high selectivity and sensitivity for the quantification of vancomycin in human plasma, making it well suited for hospital clinical laboratories. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, based on lensfree on-chip microscopy, for rapid ion track analysis using allyl diglycol carbonate (CR-39) plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the total imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform provides a much higher-throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high-energy particle detectors.
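
    The source-shifting pixel super-resolution step mentioned above can be pictured as registering several coarsely sampled frames, each shifted by a known sub-pixel amount, onto a common finer grid. Below is a minimal shift-and-add sketch in Python of that registration idea only, not the authors' reconstruction pipeline (real pixel super-resolution additionally solves a regularized inverse problem, and holographic reconstruction is a separate step); all names and the in-range shift assumption are ours:

    ```python
    import numpy as np

    def shift_and_add(frames, shifts, factor=4):
        """Fuse low-resolution frames with known sub-pixel shifts onto a
        finer grid (illustrative shift-and-add only).

        frames : list of 2-D arrays of identical shape
        shifts : list of (dy, dx) shifts in low-res pixels, assumed in [0, 1)
        factor : up-sampling factor of the high-resolution grid
        """
        h, w = frames[0].shape
        acc = np.zeros((h * factor, w * factor))
        hits = np.zeros_like(acc)
        for frame, (dy, dx) in zip(frames, shifts):
            # Map each low-res pixel center onto the high-res grid.
            yi = np.clip(np.round((np.arange(h) + dy) * factor).astype(int),
                         0, h * factor - 1)
            xi = np.clip(np.round((np.arange(w) + dx) * factor).astype(int),
                         0, w * factor - 1)
            acc[np.ix_(yi, xi)] += frame
            hits[np.ix_(yi, xi)] += 1
        hits[hits == 0] = 1  # leave unsampled high-res pixels at zero
        return acc / hits
    ```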

  3. Next-generation phage display: integrating and comparing available molecular tools to enable cost-effective high-throughput analysis.

    Directory of Open Access Journals (Sweden)

    Emmanuel Dias-Neto

    2009-12-01

    Combinatorial phage display has been used over the last 20 years to identify protein ligands and protein-protein interactions, uncovering relevant molecular recognition events. The rate-limiting steps of combinatorial phage display library selection are (i) the counting of transducing units and (ii) the sequencing of the encoded displayed ligands. Here, we adapted emerging genomic technologies to minimize these challenges. We gained efficiency by applying real-time PCR for rapid quantification, enabling bacteria-free phage display library screening, and added next-generation sequencing of phage DNA for large-scale ligand analysis, reporting a fully integrated set of high-throughput quantitative and analytical tools. The approach is far less labor-intensive and allows rigorous quantification; for medical applications, including selections in patients, it also represents an advance for quantitative distribution analysis and ligand identification of hundreds of thousands of targeted particles from patient-derived biopsy or autopsy material over a longer timeframe after library administration. Additional advantages over current methods include increased sensitivity, less variability, enhanced linearity, scalability, and accuracy at much lower cost. Sequences obtained by qPhage plus pyrosequencing were similar to a dataset produced from conventional Sanger-sequenced transducing units (TU), with no biases due to GC content, codon usage, or amino acid or peptide frequency. These tools allow phage display selection and ligand analysis at a >1,000-fold faster rate and reduce costs approximately 250-fold for generating 10^6 ligand sequences. Our analyses demonstrate that this approach not only correlates with traditional colony counting but is also capable of much larger sampling, allowing a faster, less expensive, more accurate, and more consistent analysis of phage enrichment. Overall, qPhage plus pyrosequencing is superior to TU counting plus Sanger sequencing.
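
    The bacteria-free quantification step above rests on standard real-time PCR arithmetic: cycle-threshold (Ct) values measured for serial dilutions of a known standard define a line from which unknown titers are interpolated. A minimal sketch of that calculation (the Ct values below are invented; this is not the qPhage protocol itself):

    ```python
    import numpy as np

    # Hypothetical standard curve: Ct values for known template amounts
    # (copies per reaction) from a 10-fold dilution series.
    log10_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
    ct_standards = np.array([30.1, 26.8, 23.4, 20.0, 16.7, 13.3])

    # Linear fit: Ct = slope * log10(copies) + intercept.
    slope, intercept = np.polyfit(log10_copies, ct_standards, 1)

    # Amplification efficiency; perfect doubling gives slope ~ -3.32.
    efficiency = 10 ** (-1.0 / slope) - 1.0

    def copies_from_ct(ct):
        """Interpolate an unknown sample's copy number from its Ct."""
        return 10 ** ((ct - intercept) / slope)

    print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
    print(f"Ct 21.5 -> {copies_from_ct(21.5):.2e} copies/reaction")
    ```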

  4. Low Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI

    Science.gov (United States)

    2017-10-01

    [Report record; only fragments of the abstract are available:] "...greater gas polarizations and production amounts/throughputs, benefiting in particular from the advent of compact, high-power, relatively low-cost ..." Award Number: W81XWH-15-1-0271. Title: Low-Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI. Distribution Statement: Approved for Public Release; Distribution Unlimited.

  5. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, many manual steps are involved and few samples can be treated in one experiment. High-throughput modifications have been developed in recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are the cultivation and manipulation of cells or tissues before and after exposure, and the freezing of treated samples until comet analysis and scoring. High-throughput methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, or human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high-throughput modifications now available vary greatly in their versatility, capacity, complexity, and cost. The bottleneck for further increases in throughput appears to be the scoring.

  6. Low-Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI

    Science.gov (United States)

    2017-10-01

    [Report record; only fragments of the abstract are available:] "...low-cost and high-throughput was a key element proposed for this project, which we believe will be of significant benefit to the patients suffering..." Award Number: W81XWH-15-1-0272. Title: Low-Cost, High-Throughput 3-D Pulmonary Imager Using Hyperpolarized Contrast Agents and Low-Field MRI. Distribution Statement: Approved for Public Release; Distribution Unlimited.

  7. (Super Variable Costing-Throughput Costing)

    OpenAIRE

    Çakıcı, Cemal

    2006-01-01

    (Super Variable Costing-Throughput Costing) The aim of this study is to explain the super-variable costing method, a new topic in cost and management accounting, and to show how it works in practice. In short, super-variable costing can be defined as a costing method that uses only direct material costs to calculate product costs and treats all other costs (direct labor and overhead) as period costs or operating costs. By using the super-variable costing method, product costs ar...

  8. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest was developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
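
    As a concrete picture of what "digital traits" extracted from plant images can look like, here is a minimal, self-contained Python sketch that segments plant pixels with an excess-green index and derives three simple architecture traits. It is a toy illustration under our own assumptions (fixed threshold, clean background), not Image Harvest code:

    ```python
    import numpy as np

    def plant_traits(rgb):
        """Toy digital-trait extraction from an RGB plant image.

        rgb : float array of shape (H, W, 3), values scaled to [0, 1]
        """
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        exg = 2 * g - r - b          # excess-green vegetation index
        mask = exg > 0.1             # hypothetical global threshold
        ys, xs = np.nonzero(mask)
        if ys.size == 0:
            return {"area_px": 0, "height_px": 0, "width_px": 0}
        return {
            "area_px": int(mask.sum()),             # projected shoot area
            "height_px": int(ys.max() - ys.min()),  # bounding-box height
            "width_px": int(xs.max() - xs.min()),   # bounding-box width
        }
    ```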

  9. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest was developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  10. Multiplex High-Throughput Targeted Proteomic Assay To Identify Induced Pluripotent Stem Cells.

    Science.gov (United States)

    Baud, Anna; Wessely, Frank; Mazzacuva, Francesca; McCormick, James; Camuzeaux, Stephane; Heywood, Wendy E; Little, Daniel; Vowles, Jane; Tuefferd, Marianne; Mosaku, Olukunbi; Lako, Majlinda; Armstrong, Lyle; Webber, Caleb; Cader, M Zameel; Peeters, Pieter; Gissen, Paul; Cowley, Sally A; Mills, Kevin

    2017-02-21

    Induced pluripotent stem cells have great potential as a human model system in regenerative medicine, disease modeling, and drug screening. However, their use in medical research is hampered by laborious reprogramming procedures that yield low numbers of induced pluripotent stem cells. For further applications in research, only the best, competent clones should be used. The standard assays for pluripotency are based on genomic approaches, which take up to 1 week to perform and incur significant cost. Therefore, there is a need for a rapid and cost-effective assay able to distinguish between pluripotent and nonpluripotent cells. Here, we describe a novel multiplexed, high-throughput, and sensitive peptide-based multiple reaction monitoring mass spectrometry assay, allowing for the identification and absolute quantitation of multiple core transcription factors and pluripotency markers. This assay provides simple, high-throughput classification of cells as pluripotent or nonpluripotent in a 7 min analysis, while being more cost-effective than conventional genomic tests.

  11. A high throughput DNA extraction method with high yield and quality

    Directory of Open Access Journals (Sweden)

    Xin Zhanguo

    2012-07-01

    Background: Preparation of large quantities of high-quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as genetic mapping, TILLING (Targeting Induced Local Lesions IN Genomes), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available; however, they are either low throughput, low yield, or costly. Here, we describe a method for high-throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. Results: We developed a high-throughput DNA isolation method by combining a high-yield CTAB extraction with an improved cleanup procedure based on the MagAttract kit. The method yielded large quantities of high-quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30-fold with 4 times less consumption of MagAttract beads. The method can also be used in other plant species, including cotton leaves and pine needles. Conclusion: A high-throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample for magnetic beads plus the other consumables that alternative methods also require.

  12. High throughput materials research and development for lithium ion batteries

    Directory of Open Access Journals (Sweden)

    Parker Liu

    2017-09-01

    Development of next-generation batteries requires breakthroughs in materials. The traditional one-by-one method, in which single-composition materials are synthesized and tested individually, is time-consuming and costly. High-throughput and combinatorial experimentation is an effective method to synthesize and characterize large numbers of materials over a broad compositional region in a short time, greatly speeding up the discovery and optimization of materials at lower cost. In this work, high-throughput and combinatorial materials synthesis technologies for lithium ion battery research are discussed, and our efforts on developing such instrumentation are introduced.

  13. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high-throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While the usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high-throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude; cell populations can have variances that depend on their mean fluorescence intensities and may exhibit heavily skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and in gating of cell populations across the range of the data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high-throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high-throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization and reduce variation in the locations of discovered cell populations across samples.
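
    The likelihood-based parameter tuning described above can be made concrete with a small sketch. The following is a simplified, single-population illustration of the idea for the arcsinh cofactor only (the paper covers several transformation families and richer models); the data are simulated and the normality assumption is ours:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def asinh_nll(log_c, x):
        """Negative log-likelihood of a Normal model for asinh(x / c),
        including the Jacobian of the transformation."""
        c = np.exp(log_c)  # cofactor, kept positive via log parameterization
        y = np.arcsinh(x / c)
        mu, sigma = y.mean(), y.std()
        nll = x.size * np.log(sigma) + ((y - mu) ** 2).sum() / (2 * sigma ** 2)
        nll += 0.5 * np.log(x ** 2 + c ** 2).sum()  # -sum(log |dy/dx|)
        return nll

    # Simulated fluorescence-like data: lognormal signal plus additive noise.
    rng = np.random.default_rng(0)
    x = rng.lognormal(5.0, 1.0, 5000) + rng.normal(0.0, 30.0, 5000)

    res = minimize_scalar(asinh_nll, bounds=(0.0, 10.0), args=(x,),
                          method="bounded")
    print(f"optimized cofactor: {np.exp(res.x):.1f}")
    ```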

  14. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Background: In a high-throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While the usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high-throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude; cell populations can have variances that depend on their mean fluorescence intensities and may exhibit heavily skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and in gating of cell populations across the range of the data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high-throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high-throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results: We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization and reduce variation in the locations of discovered cell populations across samples.

  15. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    Directory of Open Access Journals (Sweden)

    Guangbo Liu

    Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., from plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long-established method. However, a reliable and optimized high-throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid-handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1,200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier, and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high-throughput molecular and/or genetic analysis of yeast.

  16. A low-cost, portable, high-throughput wireless sensor system for phonocardiography applications.

    Science.gov (United States)

    Sa-Ngasoongsong, Akkarapol; Kunthong, Jakkrit; Sarangan, Venkatesh; Cai, Xinwei; Bukkapatnam, Satish T S

    2012-01-01

    This paper presents the design and testing of a wireless sensor system developed using a Microchip PICDEM developer kit to acquire and monitor human heart sounds for phonocardiography applications. This system can serve as a cost-effective alternative to recent developments in wireless phonocardiography sensors, which have primarily focused on Bluetooth technology. The wireless sensor system has been designed and developed in-house using off-the-shelf components and open-source software for remote and mobile applications. The small form factor (3.75 cm × 5 cm × 1 cm), high throughput (6,000 Hz data streaming rate), and low cost ($13 per unit for a 1,000-unit batch) of this wireless sensor system make it particularly attractive for phonocardiography and other sensing applications. The experimental results of sensor signal analysis using several signal characterization techniques suggest that this wireless sensor system can capture both fundamental heart sounds (S1 and S2), and is also capable of capturing abnormal heart sounds (S3 and S4) and heart murmurs without aliasing. The results of a denoising application using the wavelet transform show that undesirable noise in the sensor signals from the surrounding environment can be reduced dramatically. The exercise experiment results also show that the proposed wireless PCG system can capture heart sounds across different heart conditions, simulated by varying the heart rates of six subjects over a range of 60-180 beats per minute through exercise testing.
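
    The wavelet denoising reported above is a standard operation that is easy to sketch. Below is a generic soft-thresholding example with PyWavelets, using the Donoho-Johnstone universal threshold; the wavelet choice and decomposition level are our assumptions, not the paper's settings:

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def wavelet_denoise(signal, wavelet="db4", level=5):
        """Soft-threshold wavelet denoising of a 1-D heart-sound trace
        (signal is a 1-D numpy array)."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        # Noise scale estimated from the finest detail coefficients.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        # Universal threshold.
        thresh = sigma * np.sqrt(2.0 * np.log(signal.size))
        denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                  for c in coeffs[1:]]
        return pywt.waverec(denoised, wavelet)
    ```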

  17. A Low-Cost, Portable, High-Throughput Wireless Sensor System for Phonocardiography Applications

    Directory of Open Access Journals (Sweden)

    Akkarapol Sa-ngasoongsong

    2012-08-01

    This paper presents the design and testing of a wireless sensor system developed using a Microchip PICDEM developer kit to acquire and monitor human heart sounds for phonocardiography applications. This system can serve as a cost-effective alternative to recent developments in wireless phonocardiography sensors, which have primarily focused on Bluetooth technology. The wireless sensor system has been designed and developed in-house using off-the-shelf components and open-source software for remote and mobile applications. The small form factor (3.75 cm × 5 cm × 1 cm), high throughput (6,000 Hz data streaming rate), and low cost ($13 per unit for a 1,000-unit batch) of this wireless sensor system make it particularly attractive for phonocardiography and other sensing applications. The experimental results of sensor signal analysis using several signal characterization techniques suggest that this wireless sensor system can capture both fundamental heart sounds (S1 and S2), and is also capable of capturing abnormal heart sounds (S3 and S4) and heart murmurs without aliasing. The results of a denoising application using the wavelet transform show that undesirable noise in the sensor signals from the surrounding environment can be reduced dramatically. The exercise experiment results also show that the proposed wireless PCG system can capture heart sounds across different heart conditions, simulated by varying the heart rates of six subjects over a range of 60-180 beats per minute through exercise testing.

  18. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high-content analysis of high-throughput screens in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis methods.

  19. High-throughput single nucleotide polymorphism genotyping using nanofluidic Dynamic Arrays

    Directory of Open Access Journals (Sweden)

    Crenshaw Andrew

    2009-01-01

    Background: Single nucleotide polymorphisms (SNPs) have emerged as the genetic marker of choice for mapping disease loci and candidate gene association studies because of their high density and relatively even distribution in the human genome. There is a need for systems allowing medium multiplexing (tens to hundreds of SNPs) with high throughput, which can efficiently and cost-effectively generate genotypes for very large sample sets (thousands of individuals). Methods that are flexible, fast, accurate, and cost-effective are urgently needed. This is also important for those who work on high-throughput genotyping in non-model systems, where off-the-shelf assays are not available and a flexible platform is needed. Results: We demonstrate the use of a nanofluidic Integrated Fluidic Circuit (IFC)-based genotyping system for medium-throughput multiplexing, known as the Dynamic Array, by genotyping 994 individual human DNA samples on 47 different SNP assays using nanoliter volumes of reagents. Call rates of greater than 99.5% and call accuracies of greater than 99.8% were achieved in our study, demonstrating that this is a formidable genotyping platform. The experimental setup is very simple, with a time-to-result for each sample of about 3 hours. Conclusion: Our results demonstrate that the Dynamic Array is an excellent genotyping system for medium-throughput multiplexing (30-300 SNPs); it is simple to use and combines rapid throughput with excellent call rates, high concordance, and low cost. The exceptional call rates and call accuracy obtained may be of particular interest to those working on validation and replication of genome-wide association (GWA) studies.

  20. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Background: The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results: This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high-end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion: MUMmerGPU is a low-cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
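
    MUMmerGPU's core idea is to index the reference once (as a suffix tree) and stream many queries against it in parallel. As a CPU-side, toy-scale illustration of the indexing idea only, here is a naive suffix-array sketch in Python; real aligners build their index in linear time and space, and nothing here reflects the CUDA kernel itself:

    ```python
    import bisect

    def build_suffix_index(ref):
        """Naive suffix array: all suffixes of `ref`, sorted.
        O(n^2 log n) construction; fine only for toy references."""
        return sorted((ref[i:], i) for i in range(len(ref)))

    def find_exact(suffixes, query):
        """All start positions of `query` in the reference, via binary
        search for the block of suffixes that begin with `query`."""
        lo = bisect.bisect_left(suffixes, (query,))
        hits = []
        for suffix, pos in suffixes[lo:]:
            if not suffix.startswith(query):
                break
            hits.append(pos)
        return sorted(hits)

    ref = "ACGTACGTGACGT"
    index = build_suffix_index(ref)
    print(find_exact(index, "ACGT"))  # -> [0, 4, 9]
    ```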

  1. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Science.gov (United States)

    Inagaki, Soichi; Henry, Isabelle M; Lieberman, Meric C; Comai, Luca

    2015-01-01

    Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.
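
    The record mentions a simple bioinformatic tool for finding read pairs that span the genome/T-DNA junction. The logic can be sketched as follows, assuming reads have already been aligned and labeled as matching the plant genome or the T-DNA (the input format and all names here are hypothetical; a real pipeline would parse BAM records, and this is not the authors' tool):

    ```python
    from collections import Counter

    def junction_bins(read_assignments, bin_size=1000, min_pairs=2):
        """Find genomic bins supported by read pairs in which one mate
        maps to the genome and the other to the T-DNA.

        read_assignments : iterable of (read_id, mate, source, position)
            where mate is 1 or 2 and source is 'genome' or 'tdna'.
        """
        pairs = {}
        for read_id, mate, source, pos in read_assignments:
            pairs.setdefault(read_id, {})[mate] = (source, pos)

        support = Counter()
        for mates in pairs.values():
            if set(mates) != {1, 2}:
                continue  # need both mates of the pair
            (s1, p1), (s2, p2) = mates[1], mates[2]
            if {s1, s2} == {"genome", "tdna"}:
                genome_pos = p1 if s1 == "genome" else p2
                support[genome_pos // bin_size] += 1  # pile up nearby pairs

        # Candidate insertion sites: bins with enough independent pairs.
        return {b: n for b, n in support.items() if n >= min_pairs}
    ```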

  2. Cost-Effectiveness Analysis in Practice: Interventions to Improve High School Completion

    Science.gov (United States)

    Hollands, Fiona; Bowden, A. Brooks; Belfield, Clive; Levin, Henry M.; Cheng, Henan; Shand, Robert; Pan, Yilin; Hanisch-Cerda, Barbara

    2014-01-01

    In this article, we perform cost-effectiveness analysis on interventions that improve the rate of high school completion. Using the What Works Clearinghouse to select effective interventions, we calculate cost-effectiveness ratios for five youth interventions. We document wide variation in cost-effectiveness ratios between programs and between…
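
    As context for the record above, a cost-effectiveness ratio is simply the incremental cost per additional unit of effectiveness; here, dollars per additional high school completer. A worked toy example (the figures are invented, not the article's):

    ```python
    def cost_per_extra_completer(total_cost, participants, completion_gain):
        """Program cost per participant divided by the gain in the
        probability of completion."""
        return (total_cost / participants) / completion_gain

    # Hypothetical program: $2M spent on 1,000 students, raising the
    # completion rate from 70% to 78% (a gain of 0.08).
    print(cost_per_extra_completer(2_000_000, 1000, 0.08))  # 25000.0 $/graduate
    ```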

  3. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    Science.gov (United States)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass, and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry-mass density distributions of single cells at high spatial resolution. However, limited camera frame rates, and thus imaging throughput, make QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of >10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes from both amplitude and phase images. Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy: >90% in the G1 and G2 phases, and >80% in the S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes, including disease pathogenesis.
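
    The dimensionality-reduction step described above (embedding ~24 biophysical phenotypes per cell with t-SNE) can be sketched with scikit-learn on simulated data; the feature values are invented stand-ins, and this is not the authors' analysis code:

    ```python
    import numpy as np
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(1)
    # Simulated single-cell matrix: three "phases", 24 phenotypes per cell.
    features = np.vstack([rng.normal(loc=m, scale=1.0, size=(300, 24))
                          for m in (0.0, 2.0, 4.0)])

    # Standardize each phenotype, then embed to 2-D for visualization.
    features = (features - features.mean(axis=0)) / features.std(axis=0)
    embedding = TSNE(n_components=2, perplexity=30.0,
                     init="pca", random_state=0).fit_transform(features)
    print(embedding.shape)  # (900, 2)
    ```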

  4. Throughput, latency and cost comparisons of microcontroller-based implementations of wireless sensor network (WSN) in high jump sports

    Science.gov (United States)

    Ahmad, Afandi; Roslan, Muhammad Faris; Amira, Abbes

    2017-09-01

    In the high jump, approach speed at take-off and force during take-off are the two main parameters needed to characterize a maximal jump. To measure both parameters, a wireless sensor network (WSN) containing a microcontroller and sensors is needed to report the speed and force for jumpers. Most microcontrollers exhibit transmission issues in terms of throughput, latency, and cost. This study therefore compares wireless microcontrollers in terms of throughput, latency, and cost; the microcontroller with the best performance and cost is then implemented in a high jump wearable device. The experiments integrated three parts: input, processing, and output. A force sensor (at the ankle) and a global positioning system (GPS) sensor (at the waist) act as inputs for data transmission. The data were then processed by two microcontrollers, the ESP8266 and the Arduino Yun Mini, which transmit the sensor data to a server (host PC) via the message queuing telemetry transport (MQTT) protocol. The server acts as receiver, and the results were calculated from the MQTT log files. In the end, the ESP8266 microcontroller was chosen, since it achieved higher throughput, lower latency, and an approximately 11-fold lower price than the Arduino Yun Mini microcontroller.
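
    The throughput/latency measurement described above can be reproduced in miniature with the paho-mqtt client: timestamp each published message and compare against the receive time on the same host. This is our own sketch (paho-mqtt 1.x style API, broker address assumed), not the study's test harness:

    ```python
    import json
    import time

    import paho.mqtt.client as mqtt  # paho-mqtt 1.x style API

    BROKER, TOPIC = "localhost", "sensor/latency"  # assumed local broker
    latencies = []

    def on_message(client, userdata, msg):
        sent = json.loads(msg.payload)["t"]
        latencies.append(time.time() - sent)  # loopback publish->receive time

    client = mqtt.Client()
    client.on_message = on_message
    client.connect(BROKER, 1883)
    client.subscribe(TOPIC)
    client.loop_start()

    for _ in range(100):  # publish 100 timestamped messages
        client.publish(TOPIC, json.dumps({"t": time.time()}))
        time.sleep(0.01)

    time.sleep(1.0)  # let the last messages arrive
    client.loop_stop()
    print(f"n = {len(latencies)}, "
          f"mean latency = {1e3 * sum(latencies) / len(latencies):.2f} ms")
    ```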

  5. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Directory of Open Access Journals (Sweden)

    Soichi Inagaki

    Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.

  6. High-Throughput Scoring of Seed Germination.

    Science.gov (United States)

    Ligterink, Wilco; Hilhorst, Henk W M

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a given period of time is not very informative, as it lacks information about the start, rate, and uniformity of germination, which are highly indicative of traits such as dormancy, stress tolerance, and seed longevity. The calculation of cumulative germination curves requires information about germination percentage at various time points. We developed the GERMINATOR package: a simple, highly cost-efficient, and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The GERMINATOR package contains three modules: (I) design of the experimental setup, with various options to replicate and randomize samples; (II) automatic scoring of germination based on the color contrast between the protruding radicle and the seed coat on a single image; and (III) curve fitting of cumulative germination data and the extraction, recap, and visualization of the various germination parameters. GERMINATOR is a freely available package that allows a single person to monitor and analyze several thousand germination tests several times a day.
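
    The curve-fitting module described above (start, rate, and uniformity of germination from cumulative counts) can be illustrated with a Hill-type curve fitted by scipy; the data points are invented and the parameterization is our choice, not necessarily GERMINATOR's:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(t, gmax, t50, h):
        """Cumulative germination (%) as a Hill-type function of time."""
        return gmax * t ** h / (t50 ** h + t ** h)

    # Invented scoring times (hours) and cumulative germination (%).
    t = np.array([12, 24, 36, 48, 60, 72, 96, 120], dtype=float)
    g = np.array([0, 5, 30, 62, 78, 84, 88, 89], dtype=float)

    (gmax, t50, h), _ = curve_fit(hill, t, g, p0=[90.0, 48.0, 4.0])

    def time_at(frac):
        """Time at which germination reaches `frac` of its maximum."""
        return t50 * (frac / (1.0 - frac)) ** (1.0 / h)

    u7525 = time_at(0.75) - time_at(0.25)  # uniformity: t75 - t25
    print(f"Gmax = {gmax:.1f}%, t50 = {t50:.1f} h, U7525 = {u7525:.1f} h")
    ```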

  7. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxon-specific endogenous reference genes, GMO constructs, screening targets, construct-specific and event-specific targets, and, finally, donor organisms. The assay avoids certain shortcomings of the multiplex PCR-based methods already in widespread use for GMO detection, and demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  8. High-throughput fractionation of human plasma for fast enrichment of low- and high-abundance proteins.

    Science.gov (United States)

    Breen, Lucas; Cao, Lulu; Eom, Kirsten; Srajer Gajdosik, Martina; Camara, Lila; Giacometti, Jasminka; Dupuy, Damian E; Josic, Djuro

    2012-05-01

    Fast, cost-effective and reproducible isolation of IgM from plasma is invaluable to the study of IgM and subsequent understanding of the human immune system. Additionally, vast amounts of information regarding human physiology and disease can be derived from analysis of the low abundance proteome of the plasma. In this study, methods were optimized for both the high-throughput isolation of IgM from human plasma, and the high-throughput isolation and fractionation of low abundance plasma proteins. To optimize the chromatographic isolation of IgM from human plasma, many variables were examined including chromatography resin, mobile phases, and order of chromatographic separations. Purification of IgM was achieved most successfully through isolation of immunoglobulin from human plasma using Protein A chromatography with a specific resin followed by subsequent fractionation using QA strong anion exchange chromatography. Through these optimization experiments, an additional method was established to prepare plasma for analysis of low abundance proteins. This method involved chromatographic depletion of high-abundance plasma proteins and reduction of plasma proteome complexity through further chromatographic fractionation. Purification of IgM was achieved with high purity as confirmed by SDS-PAGE and IgM-specific immunoblot. Isolation and fractionation of low abundance protein was also performed successfully, as confirmed by SDS-PAGE and mass spectrometry analysis followed by label-free quantitative spectral analysis. The level of purity of the isolated IgM allows for further IgM-specific analysis of plasma samples. The developed fractionation scheme can be used for high throughput screening of human plasma in order to identify low and high abundance proteins as potential prognostic and diagnostic disease biomarkers.

  9. Effectiveness of a high-throughput genetic analysis in the identification of responders/non-responders to CYP2D6-metabolized drugs.

    Science.gov (United States)

    Savino, Maria; Seripa, Davide; Gallo, Antonietta P; Garrubba, Maria; D'Onofrio, Grazia; Bizzarro, Alessandra; Paroni, Giulia; Paris, Francesco; Mecocci, Patrizia; Masullo, Carlo; Pilotto, Alberto; Santini, Stefano A

    2011-01-01

    Recent studies investigating the single cytochrome P450 (CYP) 2D6 allele *2A reported an association with the response to drug treatments. More genetic data can be obtained, however, with high-throughput technologies. The aim of this study was the high-throughput analysis of CYP2D6 polymorphisms to evaluate its effectiveness in identifying patients who respond or do not respond to CYP2D6-metabolized drugs. We also compared our results with those previously obtained with the standard analysis of the CYP2D6 allele *2A. Sixty blood samples from patients treated with CYP2D6-metabolized drugs, previously genotyped for the allele CYP2D6*2A, were analyzed for CYP2D6 polymorphisms with the AutoGenomics INFINITI CYP450 2D6-I assay on the AutoGenomics INFINITI analyzer. A higher frequency of mutated alleles was observed in responder than in non-responder patients (75.38% vs 43.48%; p = 0.015). Thus, the presence of a mutated CYP2D6 allele was associated with a response to CYP2D6-metabolized drugs (OR = 4.044 (1.348-12.154)). No difference was observed in the distribution of allele *2A (p = 0.320). The high-throughput genetic analysis of CYP2D6 polymorphisms discriminated responders from non-responders better than the standard analysis of the CYP2D6 allele *2A. A high-throughput genetic assay of CYP2D6 may therefore be useful to identify patients with different clinical responses to CYP2D6-metabolized drugs.
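
    The reported odds ratio follows directly from the two carrier frequencies quoted in the abstract; a one-line check (the small difference from the published 4.044 comes from rounding of the reported percentages):

    ```python
    def odds_ratio(p1, p2):
        """Odds ratio from two proportions."""
        return (p1 / (1.0 - p1)) / (p2 / (1.0 - p2))

    # Mutated-allele frequency: responders (75.38%) vs non-responders (43.48%).
    print(round(odds_ratio(0.7538, 0.4348), 2))  # -> 3.98, ~ the reported 4.044
    ```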

  10. Development of a high-throughput real time PCR based on a hot-start alternative for Pfu mediated by quantum dots

    Science.gov (United States)

    Sang, Fuming; Yang, Yang; Yuan, Lin; Ren, Jicun; Zhang, Zhizhou

    2015-09-01

    Hot start (HS) PCR is an excellent alternative for high-throughput real-time PCR due to its ability to prevent nonspecific amplification at low temperature. Development of a cost-effective and simple HS PCR technique that guarantees high-throughput PCR specificity and consistency remains a great challenge. In this study, we systematically investigated the HS characteristics of QD-triggered real-time PCR with EvaGreen and SYBR Green I dyes through analysis of amplification curves, standard curves, and melting curves. Two different DNA polymerases, Pfu and Taq, were employed. We show that high specificity and efficiency of real-time PCR were obtained in a plasmid DNA assay and an error-prone two-round PCR assay using QD-based HS PCR, even after an hour of preincubation at 50 °C before real-time PCR. Moreover, the results obtained by QD-based HS PCR were comparable to those obtained with a commercial antibody-mediated hot-start Taq DNA polymerase. However, no obvious HS effect of QDs was found in real-time PCR using Taq DNA polymerase. The findings of this study demonstrate that a cost-effective, high-throughput real-time PCR based on QD-triggered HS PCR can be established with high consistency, sensitivity, and accuracy.

  11. Towards sensitive, high-throughput, biomolecular assays based on fluorescence lifetime

    Science.gov (United States)

    Ioanna Skilitsi, Anastasia; Turko, Timothé; Cianfarani, Damien; Barre, Sophie; Uhring, Wilfried; Hassiepen, Ulrich; Léonard, Jérémie

    2017-09-01

    Time-resolved fluorescence detection for robust sensing of biomolecular interactions is developed by implementing time-correlated single photon counting (TCSPC) under high-throughput conditions. Droplet microfluidics is used as a promising platform for very fast handling of low-volume samples. We illustrate the potential of this very sensitive and cost-effective technology in the context of an enzymatic activity assay based on fluorescently labeled biomolecules. Fluorescence lifetime detection by TCSPC is shown to enable reliable discrimination between positive and negative control samples at a throughput as high as several hundred samples per second.
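
    In time-correlated single photon counting, each detected photon contributes an arrival time to a histogram, and the fluorescence lifetime is recovered by fitting a decay model to that histogram. A minimal sketch with simulated arrivals (a mono-exponential fit that ignores the instrument response function, which a real analysis must account for):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    # Simulated photon arrival times (ns) for a 2.5 ns lifetime.
    arrivals = rng.exponential(scale=2.5, size=20_000)

    # Histogram the arrivals, as TCSPC electronics do.
    counts, edges = np.histogram(arrivals, bins=200, range=(0.0, 20.0))
    centers = 0.5 * (edges[:-1] + edges[1:])

    def decay(t, amplitude, tau):
        return amplitude * np.exp(-t / tau)

    mask = counts > 0  # fit only populated bins
    (amplitude, tau), _ = curve_fit(decay, centers[mask], counts[mask],
                                    p0=[counts.max(), 1.0])
    print(f"fitted lifetime = {tau:.2f} ns")  # ~2.5 ns
    ```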

  12. High-throughput STR analysis for DNA database using direct PCR.

    Science.gov (United States)

    Sim, Jeong Eun; Park, Su Jeong; Lee, Han Chul; Kim, Se-Yong; Kim, Jong Yeol; Lee, Seung Hwan

    2013-07-01

    Since the Korean criminal DNA database was launched in 2010, we have focused on establishing an automated DNA database profiling system that analyzes short tandem repeat loci in a high-throughput and cost-effective manner. We established a DNA database profiling system that requires no DNA purification, using a direct PCR buffer system. The quality of the direct PCR procedure was compared with that of the conventional PCR system under their respective optimized conditions. The results revealed not only perfect concordance but also an excellent PCR success rate, good electropherogram quality, and an optimal intra/inter-locus peak height ratio. In particular, the proportion of samples requiring DNA extraction due to direct PCR failure could be minimized to <3%. In conclusion, the newly developed direct PCR system can be adopted in automated DNA database profiling systems to replace or supplement the conventional PCR system in a time- and cost-saving manner. © 2013 American Academy of Forensic Sciences Published 2013. This article is a U.S. Government work and is in the public domain in the U.S.A.

  13. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High-throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high-throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry applications. Screening multiple genomic reagents targeting any given gene creates additional challenges, and so methods that prioritize individual gene targets have been developed. The article reviews many of the open-source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality-control monitoring and interpretation of the results, will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations, and the best features in massive data sets will be required. The ease of use of these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.

  14. A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting

    Science.gov (United States)

    Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.

    2016-01-01

    Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945

  15. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies, or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
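
    The generic measurement that tools like FociQuant automate (integrate fluorescence in a window around a focus and subtract local background estimated from a surrounding ring) can be sketched in a few lines of numpy. This is an illustration of the operation, not FociQuant itself, and it assumes the focus sits away from the image border:

    ```python
    import numpy as np

    def focus_intensity(img, half=5, pad=3):
        """Background-corrected integrated intensity of the brightest
        focus in a 2-D fluorescence image.

        half : half-width of the integration window around the focus
        pad  : width of the surrounding ring used for local background
        """
        y, x = np.unravel_index(np.argmax(img), img.shape)
        y0, y1 = y - half, y + half + 1
        x0, x1 = x - half, x + half + 1
        inner = img[y0:y1, x0:x1]
        outer = img[y0 - pad:y1 + pad, x0 - pad:x1 + pad]
        # Mask out the inner window; the remaining ring estimates background.
        ring = np.ones(outer.shape, dtype=bool)
        ring[pad:pad + inner.shape[0], pad:pad + inner.shape[1]] = False
        background = np.median(outer[ring])
        return float(inner.sum() - background * inner.size)
    ```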

  16. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have been developed as well. The focus of this mini review is on recent advancements in the analysis of metabolomics data, especially by utilizing Gaussian graphical models and independent component analysis.
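
    Of the two methods highlighted above, Gaussian graphical models are the easier to sketch: partial correlations between metabolites come from the inverse of the covariance matrix. A minimal, unregularized illustration (real metabolomics data, with many metabolites and few samples, typically needs shrinkage or graphical-lasso estimators instead):

    ```python
    import numpy as np

    def partial_correlations(data):
        """Partial correlation matrix from a samples x metabolites array."""
        precision = np.linalg.inv(np.cov(data, rowvar=False))
        d = np.sqrt(np.diag(precision))
        pcor = -precision / np.outer(d, d)
        np.fill_diagonal(pcor, 1.0)
        return pcor

    # Toy chain A -> B -> C: A and C correlate, but their *partial*
    # correlation (conditioning on B) should be near zero.
    rng = np.random.default_rng(0)
    a = rng.normal(size=1000)
    b = a + 0.5 * rng.normal(size=1000)
    c = b + 0.5 * rng.normal(size=1000)
    print(np.round(partial_correlations(np.column_stack([a, b, c])), 2))
    ```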

  17. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Proteomics for biomarker validation needs high-throughput instrumentation to analyze huge sets of clinical samples quantitatively and reproducibly, in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be more accurate and faster. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples, quantitatively and qualitatively, in a reproducible manner. We compare online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validate the system on a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The study shows that the online system, which can handle high-throughput samples in a 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase in peptide quantities with increasing concentration, with greater linearity than the off-line method. Hence, we suggest that including this online system in the proteomic pipeline will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  18. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S rRNA gene amplicon sequencing can be used to reveal factors of importance for the operation of full-scale nutrient removal plants related to settling problems and floc properties. Using optimized DNA extraction protocols, indexed primers and our in-house Illumina platform, we prepared multiple samples … be correlated to the presence of the species that are regarded as “strong” and “weak” floc formers. In conclusion, 16S rRNA gene amplicon sequencing provides a high throughput approach for a rapid and cheap community profiling of activated sludge that in combination with multivariate statistics can be used …

  19. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; finer images can be obtained by forming the imaging probe from capillaries with smaller outer diameters. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not need optical properties different from those of the substrate. UV absorption detection is almost universal for organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue, or in screening multiple immobilized catalysts. A second high-throughput screening technique directly monitors the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD): the surface of the enzyme microarray is focused onto the CCD using an objective lens, and by carefully choosing the detection wavelength, generation of product on an enzyme spot can be observed. Analyzing the light intensity change over time on an enzyme spot gives the reaction rate, and the same microarray can be reused many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different

  20. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequence analysis.

  1. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into high-dimensional neuroinformatic representations containing O(10^3–10^4) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  2. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
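
    The core operation CrossCheck automates is cross-referencing a user's hit list against many published gene sets. A toy Python illustration of that idea (dataset names and contents are made up; this is not CrossCheck's code):

        # User-entered gene symbols from a screen
        user_hits = {"TP53", "ATM", "CHEK2", "RIPK1"}

        published_datasets = {           # dataset name -> reported gene symbols
            "RNAi_screen_A": {"ATM", "ATR", "CHEK1"},
            "CRISPR_screen_B": {"TP53", "RIPK1", "MLKL"},
            "phospho_screen_C": {"CHEK2", "RIPK1"},
        }

        for name, genes in published_datasets.items():
            overlap = sorted(user_hits & genes)
            print(f"{name}: {len(overlap)} shared hits -> {overlap}")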

  3. High-throughput analysis of ammonia oxidiser community composition via a novel, amoA-based functional gene array.

    Directory of Open Access Journals (Sweden)

    Guy C J Abell

    Full Text Available Advances in microbial ecology research are more often than not limited by the capabilities of available methodologies. Aerobic autotrophic nitrification is one of the most important and well-studied microbiological processes in terrestrial and aquatic ecosystems. We have developed and validated a microbial diagnostic microarray based on the ammonia-monooxygenase subunit A (amoA) gene, enabling the in-depth analysis of the community structure of bacterial and archaeal ammonia oxidisers. The amoA microarray has been successfully applied to analyse nitrifier diversity in marine, estuarine, soil and wastewater treatment plant environments. The microarray has moderate costs for labour and consumables and enables the analysis of hundreds of environmental DNA or RNA samples per week per person. The array has been thoroughly validated with a range of individual and complex targets (amoA clones and environmental samples, respectively), combined with parallel analysis using traditional sequencing methods. The moderate cost and high throughput of the microarray make it possible to adequately address broader questions of the ecology of microbial ammonia oxidation requiring high sample numbers and high resolution of the community composition.

  4. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  6. Automated high-throughput measurement of body movements and cardiac activity of Xenopus tropicalis tadpoles

    Directory of Open Access Journals (Sweden)

    Kay Eckelt

    2014-07-01

    Full Text Available Xenopus tadpoles are an emerging model for developmental, genetic and behavioral studies. Their small size, the optical accessibility of most of their organs, and a close genetic and structural relationship to humans make them a convenient experimental model. However, only a limited toolset is available to measure behavior and organ function of these animals at medium or high throughput. Herein, we describe an imaging-based platform to quantify body and autonomic movements of Xenopus tropicalis tadpoles of advanced developmental stages. Animals alternate periods of quiescence and locomotor movements and display buccal pumping for oxygen uptake from water and rhythmic cardiac movements. We imaged up to 24 animals in parallel and automatically tracked and quantified their movements using image analysis software. Animal trajectories, distances moved, activity time, buccal pumping rates and heart beat rates were calculated and used to characterize the effects of test compounds. We evaluated the effects of propranolol and atropine, observing a dose-dependent bradycardia and tachycardia, respectively. This imaging and analysis platform is a simple, cost-effective, high-throughput in vivo assay system for genetic, toxicological or pharmacological characterizations.
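
    Heart-beat quantification of this kind usually reduces to peak counting on an intensity trace extracted from the heart region. A hedged Python sketch with a simulated trace (the frame rate, peak thresholds and 2 Hz beat are assumptions, not values from the paper):

        import numpy as np
        from scipy.signal import find_peaks

        fs = 30.0                        # frames per second (assumed)
        t = np.arange(0, 30, 1 / fs)     # 30 s recording
        rng = np.random.default_rng(1)
        trace = np.sin(2 * np.pi * 2.0 * t) + 0.2 * rng.normal(size=t.size)

        # One peak per beat; enforce a ~0.3 s refractory period between peaks
        peaks, _ = find_peaks(trace, distance=int(0.3 * fs), prominence=0.5)
        bpm = len(peaks) / (t[-1] - t[0]) * 60
        print(f"estimated heart rate: {bpm:.0f} beats/min")   # ~120 here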

  7. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    Science.gov (United States)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the high-efficiency video coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for the rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of the in-loop filters for the 8K × 4K video format at 132 fps.
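
    The mode decision being simplified here is the usual rate-distortion choice, J = D + λR: each candidate offset is scored by its distortion plus a λ-weighted bit cost. A schematic Python sketch (the offset set, rate model and λ are illustrative; the paper's hardware-friendly estimator is not reproduced):

        import numpy as np

        def sao_rd_choice(orig, recon, offsets, lam):
            """Pick the offset minimizing J = D + lambda * R (toy model)."""
            best = (None, np.inf)
            for o in offsets:
                dist = np.sum((orig - np.clip(recon + o, 0, 255)) ** 2)  # SSD distortion
                rate = 1 + abs(o)   # toy unary-style rate: larger offsets cost more bits
                cost = dist + lam * rate
                if cost < best[1]:
                    best = (o, cost)
            return best

        rng = np.random.default_rng(0)
        orig = rng.integers(0, 256, size=(8, 8)).astype(float)
        recon = np.clip(orig - 3 + rng.normal(0, 1, size=(8, 8)), 0, 255)
        print(sao_rd_choice(orig, recon, offsets=range(-7, 8), lam=50.0))  # offset near +3 wins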

  8. Cost-effectiveness analysis of online hemodiafiltration versus high-flux hemodialysis

    Directory of Open Access Journals (Sweden)

    Ramponi F

    2016-09-01

    Full Text Available Francesco Ramponi,1,2 Claudio Ronco,1,3 Giacomo Mason,1 Enrico Rettore,4 Daniele Marcelli,5,6 Francesca Martino,1,3 Mauro Neri,1,7 Alejandro Martin-Malo,8 Bernard Canaud,5,9 Francesco Locatelli10 1International Renal Research Institute (IRRIV), San Bortolo Hospital, Vicenza, 2Department of Economics and Management, University of Padova, Padova, 3Department of Nephrology, San Bortolo Hospital, Vicenza, 4Department of Sociology and Social Research, University of Trento, FBK-IRVAPP & IZA, Trento, Italy; 5Europe, Middle East, Africa and Latin America Medical Board, Fresenius Medical Care, Bad Homburg, Germany; 6Danube University, Krems, Austria; 7Department of Management and Engineering, University of Padova, Vicenza, Italy; 8Nephrology Unit, Reina Sofia University Hospital, Córdoba, Spain; 9School of Medicine, Montpellier University, Montpellier, France; 10Department of Nephrology, Manzoni Hospital, Lecco, Italy Background: Clinical studies suggest that hemodiafiltration (HDF) may lead to better clinical outcomes than high-flux hemodialysis (HF-HD), but concerns have been raised about the cost-effectiveness of HDF versus HF-HD. The aim of this study was to investigate whether the clinical benefits, in terms of longer survival and better health-related quality of life, are worth the possibly higher costs of HDF compared to HF-HD. Methods: The analysis comprised a simulation based on the combined results of previously published studies, with the following steps: 1) estimation of the survival function of HF-HD patients from a clinical trial and of HDF patients using the risk reduction estimated in a meta-analysis; 2) simulation of the survival of the same sample of patients as if allocated to HF-HD or HDF using three-state Markov models; and 3) application of state-specific health-related quality-of-life coefficients and differential costs derived from the literature. Several Monte Carlo simulations were performed, including simulations for patients with different
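
    The three-state Markov simulation named in the Methods can be sketched generically: a cohort vector is propagated through a transition matrix each cycle while discounted costs and quality-adjusted life-years accumulate. All numbers below are hypothetical placeholders, not the study's estimates:

        import numpy as np

        # States: 0 = on dialysis, 1 = transplanted, 2 = dead (assumed labels)
        P = np.array([[0.80, 0.05, 0.15],       # yearly transition probabilities
                      [0.00, 0.90, 0.10],
                      [0.00, 0.00, 1.00]])
        utility = np.array([0.60, 0.80, 0.0])   # HRQoL weight per state (assumed)
        cost = np.array([40000.0, 15000.0, 0.0])  # annual cost per state (assumed)

        state = np.array([1.0, 0.0, 0.0])       # cohort starts on dialysis
        qalys = total_cost = 0.0
        for year in range(20):                  # 20 cycles, 3% discount rate
            disc = 1.03 ** -year
            qalys += disc * state @ utility
            total_cost += disc * state @ cost
            state = state @ P
        print(f"discounted QALYs {qalys:.2f}, discounted cost {total_cost:,.0f}")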

  9. Targeted DNA Methylation Analysis by High Throughput Sequencing in Porcine Peri-attachment Embryos

    OpenAIRE

    MORRILL, Benson H.; COX, Lindsay; WARD, Anika; HEYWOOD, Sierra; PRATHER, Randall S.; ISOM, S. Clay

    2013-01-01

    The purpose of this experiment was to implement and evaluate the effectiveness of a next-generation sequencing-based method for DNA methylation analysis in porcine embryonic samples. Fourteen discrete genomic regions were amplified by PCR using bisulfite-converted genomic DNA derived from day 14 in vivo-derived (IVV) and parthenogenetic (PA) porcine embryos as template DNA. Resulting PCR products were subjected to high-throughput sequencing using the Illumina Genome Analyzer IIx plat...

  10. High-throughput analysis of amino acids in plant materials by single quadrupole mass spectrometry

    DEFF Research Database (Denmark)

    Dahl-Lassen, Rasmus; van Hecke, Jan Julien Josef; Jørgensen, Henning

    2018-01-01

    …that it is very time consuming, with typical chromatographic run times of 70 min or more. Results: We have here developed a high-throughput method for analysis of amino acid profiles in plant materials. The method combines classical protein hydrolysis and derivatization with fast separation by UHPLC and detection by a single quadrupole (QDa) mass spectrometer. The chromatographic run time is reduced to 10 min and the precision, accuracy and sensitivity of the method are in line with other recent methods utilizing advanced and more expensive mass spectrometers. The sensitivity of the method is at least a factor 10 … reducing the overall analytical costs compared to methods based on more advanced mass spectrometers.

  11. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, helping to avoid the data glut associated with rapid, high-sample-throughput analysis.

  13. High-throughput continuous cryopump

    International Nuclear Information System (INIS)

    Foster, C.A.

    1986-01-01

    A cryopump with a unique method of regeneration which allows continuous operation at high throughput has been constructed and tested. Deuterium was pumped continuously at a throughput of 30 Torr·L/s at a speed of 2000 L/s and a compression ratio of 200. Argon was pumped at a throughput of 60 Torr·L/s at a speed of 1275 L/s. To produce continuous operation of the pump, a method of regeneration that does not thermally cycle the pump is employed. A small chamber (the “snail”) passes over the pumping surface and removes the frost from it, either by mechanical action with a scraper or by local heating. The material removed is topologically in a secondary vacuum system with low conductance into the primary vacuum; thus, the exhaust can be pumped at pressures up to an effective compression ratio determined by the ratio of the pumping speed to the leakage conductance of the snail. The pump, which is all-metal-sealed and dry and which regenerates every 60 s, would be an ideal system for pumping tritium. Potential fusion applications are for pumped limiters, for repeating pneumatic pellet injection lines, and for the centrifuge pellet injector spin tank, all of which will require pumping tritium at high throughput. Industrial applications requiring ultraclean pumping of corrosive gases at high throughput, such as reactive ion etching in semiconductor processing, may also be feasible.
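
    The figures quoted above are tied together by the steady-state relation Q = S·p, with the compression ratio set by the pumping speed over the snail's leakage conductance. A quick check in Python using only the numbers from the abstract:

        Q = 30.0    # deuterium throughput, Torr*L/s
        S = 2000.0  # pumping speed, L/s
        K = 200.0   # compression ratio ~ pumping speed / leakage conductance

        p_inlet = Q / S           # Q = S * p  ->  0.015 Torr at the inlet
        p_exhaust = K * p_inlet   # exhaust held ~200x higher, ~3 Torr
        c_leak = S / K            # implied snail leakage conductance, ~10 L/s
        print(p_inlet, p_exhaust, c_leak)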

  14. Bevacizumab in Treatment of High-Risk Ovarian Cancer—A Cost-Effectiveness Analysis

    Science.gov (United States)

    Herzog, Thomas J.; Hu, Lilian; Monk, Bradley J.; Kiet, Tuyen; Blansit, Kevin; Kapp, Daniel S.; Yu, Xinhua

    2014-01-01

    Objective. The objective of this study was to evaluate the cost-effectiveness of bevacizumab in a subset of high-risk advanced ovarian cancer patients with a survival benefit. Methods. A subset analysis of the International Collaboration on Ovarian Neoplasms 7 trial showed that additions of bevacizumab (B) and maintenance bevacizumab (mB) to paclitaxel (P) and carboplatin (C) improved the overall survival (OS) of high-risk advanced cancer patients. Actual and estimated costs of treatment were determined from Medicare payment. Incremental cost-effectiveness ratio per life-year saved was established. Results. The estimated cost of PC is $535 per cycle; PCB + mB (7.5 mg/kg) is $3,760 per cycle for the first 6 cycles and then $3,225 per cycle for 12 mB cycles. Of 465 high-risk stage IIIC (>1 cm residual) or stage IV patients, the previously reported OS after PC was 28.8 months versus 36.6 months in those who underwent PCB + mB. With an estimated 8-month improvement in OS, the incremental cost-effectiveness ratio of B was $167,771 per life-year saved. Conclusion. In this clinically relevant subset of women with high-risk advanced ovarian cancer with overall survival benefit after bevacizumab, our economic model suggests that the incremental cost of bevacizumab was approximately $170,000 per life-year saved. PMID:24721817
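
    The headline number follows the standard incremental cost-effectiveness ratio, ICER = ΔCost/ΔEffect. The sketch below uses the survival figures from the abstract but placeholder total costs, since the published $167,771/LY figure rests on full per-patient treatment costs not itemized here:

        def icer(cost_new, cost_old, ly_new, ly_old):
            """Incremental cost-effectiveness ratio, $ per life-year saved."""
            return (cost_new - cost_old) / (ly_new - ly_old)

        ly_pc = 28.8 / 12    # overall survival on PC, in life-years
        ly_pcb = 36.6 / 12   # overall survival on PCB + mB
        # Placeholder total costs per patient (illustrative only)
        print(f"${icer(150_000, 38_000, ly_pcb, ly_pc):,.0f} per life-year saved")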

  15. High-throughput assessment of context-dependent effects of chromatin proteins

    NARCIS (Netherlands)

    Brueckner, L. (Laura); Van Arensbergen, J. (Joris); Akhtar, W. (Waseem); L. Pagie (Ludo); B. van Steensel (Bas)

    2016-01-01

    textabstractBackground: Chromatin proteins control gene activity in a concerted manner. We developed a high-throughput assay to study the effects of the local chromatin environment on the regulatory activity of a protein of interest. The assay combines a previously reported multiplexing strategy

  16. Microengineering methods for cell-based microarrays and high-throughput drug-screening applications

    International Nuclear Information System (INIS)

    Xu Feng; Wu Jinhui; Wang Shuqi; Gurkan, Umut Atakan; Demirci, Utkan; Durmus, Naside Gozde

    2011-01-01

    Screening for effective therapeutic agents from millions of drug candidates is costly, time consuming, and often faces concerns due to the extensive use of animals. To improve cost effectiveness, and to minimize animal testing in pharmaceutical research, in vitro monolayer cell microarrays with multiwell plate assays have been developed. Integration of cell microarrays with microfluidic systems has facilitated automated and controlled component loading, significantly reducing the consumption of the candidate compounds and the target cells. Even though these methods significantly increased the throughput compared to conventional in vitro testing systems and in vivo animal models, the cost associated with these platforms remains prohibitively high. Besides, there is a need for three-dimensional (3D) cell-based drug-screening models which can mimic the in vivo microenvironment and the functionality of the native tissues. Here, we present the state-of-the-art microengineering approaches that can be used to develop 3D cell-based drug-screening assays. We highlight the 3D in vitro cell culture systems with live cell-based arrays, microfluidic cell culture systems, and their application to high-throughput drug screening. We conclude that among the emerging microengineering approaches, bioprinting holds great potential to provide repeatable 3D cell-based constructs with high temporal, spatial control and versatility.

  18. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have been in rapid progress. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  19. Meta-Analysis of High-Throughput Datasets Reveals Cellular Responses Following Hemorrhagic Fever Virus Infection

    Directory of Open Access Journals (Sweden)

    Gavin C. Bowick

    2011-05-01

    Full Text Available The continuing use of high-throughput assays to investigate cellular responses to infection is providing a large repository of information. Due to the large number of differentially expressed transcripts, often running into the thousands, the majority of these data have not been thoroughly investigated. Advances in techniques for the downstream analysis of high-throughput datasets are providing new methods for the generation of additional hypotheses for further investigation. The large number of experimental observations, combined with databases that correlate particular genes and proteins with canonical pathways, functions and diseases, allows for the bioinformatic exploration of functional networks that may be implicated in replication or pathogenesis. Herein, we provide an example of how analysis of published high-throughput datasets of cellular responses to hemorrhagic fever virus infection can generate additional functional data. We describe enrichment of genes involved in metabolism, post-translational modification and cardiac damage; potential roles for specific transcription factors and a conserved involvement of a pathway based around cyclooxygenase-2. We believe that these types of analyses can provide virologists with additional hypotheses for continued investigation.

  20. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Sobral Romao, J.I.; Baiao Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for
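
    Microplate decomposition data of this kind are typically reduced to a pseudo-first-order rate constant via ln(A0/A) = kt. A small Python sketch with invented absorbance readings (the values are illustrative, not the paper's data):

        import numpy as np

        t = np.array([0.0, 10, 20, 30, 45, 60])             # minutes
        A = np.array([0.82, 0.60, 0.44, 0.33, 0.21, 0.13])  # MO absorbance (toy data)

        k, intercept = np.polyfit(t, np.log(A[0] / A), 1)   # linear fit of ln(A0/A) vs t
        print(f"apparent rate constant k = {k:.3f} 1/min")
        # In a 96-well plate the same fit runs per well: one read-out series
        # yields 96 rate constants at once.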

  1. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    Science.gov (United States)

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease.A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences.In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
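
    The differential expression step of such a pipeline can be illustrated compactly: per-feature tests followed by multiple-testing correction. A self-contained Python sketch (simulated data; real workflows would use the dedicated open-source packages the chapter describes):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        ctrl = rng.normal(0, 1, size=(1000, 6))   # 1000 features, 6 control samples
        case = rng.normal(0, 1, size=(1000, 6))
        case[:50] += 2.0                          # first 50 features truly up-regulated

        t, p = stats.ttest_ind(case, ctrl, axis=1)   # per-feature two-sample t-test

        # Benjamini-Hochberg FDR correction
        order = np.argsort(p)
        ranked = p[order] * len(p) / (np.arange(len(p)) + 1)
        qvals = np.minimum.accumulate(ranked[::-1])[::-1]
        q = np.empty_like(qvals)
        q[order] = qvals
        print("features at FDR < 0.05:", int((q < 0.05).sum()))   # ~50 expected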

  2. Cost-effectiveness Analysis for Technology Acquisition.

    Science.gov (United States)

    Chakravarty, A; Naware, S S

    2008-01-01

    In a developing country with limited resources, it is important to utilize the total cost visibility approach over the entire life-cycle of the technology and then analyse alternative options for acquiring technology. The present study analysed the cost-effectiveness of an "In-house" magnetic resonance imaging (MRI) scan facility of a large service hospital against outsourcing possibilities. Cost per scan was calculated by the operating costing method and the break-even volume was determined. Life-cycle cost analysis was then performed to enable total cost visibility of the MRI scan in both the "In-house" and "outsourcing of facility" configurations. Finally, cost-effectiveness analysis was performed to identify the more acceptable decision option. The total cost of performing a single MRI scan was found to be Rs 3,875 without contrast and Rs 4,129 with contrast. On life-cycle cost analysis, the net present value (NPV) of the "In-house" configuration was found to be Rs −4,09,06,265, while that of the "outsourcing of facility" configuration was Rs −5,70,23,315. Subsequently, cost-effectiveness analysis across eight figures of merit showed the "In-house" facility to be the more acceptable option for the system. Every decision to acquire high-end technology must be subjected to life-cycle cost analysis.
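
    The life-cycle comparison rests on standard net present value arithmetic over each configuration's cashflows. A generic Python sketch with invented figures (the study's actual inputs are not given in the abstract):

        def npv(cashflows, rate=0.08):
            """Net present value of yearly cashflows, year 0 first."""
            return sum(cf / (1 + rate) ** y for y, cf in enumerate(cashflows))

        # Illustrative 10-year life cycle in Rs (placeholder values)
        inhouse = [-30_000_000] + [-1_500_000] * 10   # capital outlay, then running costs
        outsourced = [0] + [-4_500_000] * 10          # fee-for-service payments only
        print(f"NPV in-house:   {npv(inhouse):,.0f}")
        print(f"NPV outsourced: {npv(outsourced):,.0f}")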

  3. High-throughput genotyping for species identification and diversity assessment in germplasm collections.

    Science.gov (United States)

    Mason, Annaliese S; Zhang, Jing; Tollenaere, Reece; Vasquez Teuber, Paula; Dalton-Morgan, Jessica; Hu, Liyong; Yan, Guijun; Edwards, David; Redden, Robert; Batley, Jacqueline

    2015-09-01

    Germplasm collections provide an extremely valuable resource for breeders and researchers. However, misclassification of accessions by species often hinders the effective use of these collections. We propose that use of high-throughput genotyping tools can provide a fast, efficient and cost-effective way of confirming species in germplasm collections, as well as providing valuable genetic diversity data. We genotyped 180 Brassicaceae samples sourced from the Australian Grains Genebank using the recently released Illumina Infinium Brassica 60K SNP array. Of these, 76 were provided on the basis of suspected misclassification and another 104 were sourced independently from the germplasm collection. Presence of the A- and C-genomes combined with principal component analysis clearly separated Brassica rapa, B. oleracea, B. napus, B. carinata and B. juncea samples into distinct species groups. Several lines were further validated using chromosome counts. Overall, 18% of samples (32/180) were misclassified on the basis of species. Within these 180 samples, 23/76 (30%) supplied on the basis of suspected misclassification were misclassified, and 9/104 (9%) of the samples randomly sourced from the Australian Grains Genebank were misclassified. Surprisingly, several individuals were also found to be the product of interspecific hybridization events. The SNP (single nucleotide polymorphism) array proved effective at confirming species, and provided useful information related to genetic diversity. As similar genomic resources become available for different crops, high-throughput molecular genotyping will offer an efficient and cost-effective method to screen germplasm collections worldwide, facilitating more effective use of these valuable resources by breeders and researchers. © 2015 John Wiley & Sons Ltd.
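
    The species separation described relies on ordinary principal component analysis of the SNP genotype matrix. A numpy sketch with a simulated two-population panel (sizes and allele frequencies are invented):

        import numpy as np

        rng = np.random.default_rng(0)
        # 60 accessions x 500 SNPs coded 0/1/2; two populations, distinct allele frequencies
        freqs = np.vstack([rng.uniform(0.1, 0.4, 500), rng.uniform(0.6, 0.9, 500)])
        pop = np.repeat([0, 1], 30)
        G = rng.binomial(2, freqs[pop])

        Gc = G - G.mean(axis=0)                      # centre genotypes
        U, s, Vt = np.linalg.svd(Gc, full_matrices=False)
        pcs = U * s                                  # principal component scores
        print(np.round(pcs[:3, 0], 1), np.round(pcs[-3:, 0], 1))  # PC1 splits the groups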

  4. Correction of Microplate Data from High-Throughput Screening.

    Science.gov (United States)

    Wang, Yuhong; Huang, Ruili

    2016-01-01

    High-throughput screening (HTS) makes it possible to collect cellular response data from a large number of cell lines and small molecules in a timely and cost-effective manner. The errors and noise in microplate-formatted data from HTS have unique characteristics and can generally be grouped into three categories: run-wise (temporal, multiple plates), plate-wise (background pattern, single plate), and well-wise (single well). In this chapter, we describe a systematic solution for identifying and correcting such errors and noise, based mainly on pattern recognition and digital signal processing technologies.
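
    The plate-wise (background pattern) category is the one most often handled algorithmically, for example by the B-score's median polish, which strips row and column biases from each plate. A sketch of that widely used correction (the chapter's own procedures are not reproduced here):

        import numpy as np

        def median_polish(plate, n_iter=10):
            """Iteratively remove row and column medians from one plate."""
            resid = plate.astype(float).copy()
            for _ in range(n_iter):
                resid -= np.median(resid, axis=1, keepdims=True)   # row effects
                resid -= np.median(resid, axis=0, keepdims=True)   # column effects
            return resid

        rng = np.random.default_rng(0)
        plate = rng.normal(100, 5, size=(8, 12))
        plate[:, 0] += 30                        # simulated edge/column artifact
        resid = median_polish(plate)
        bscore = resid / (1.4826 * np.median(np.abs(resid)))   # B-score scaling by MAD
        print(round(bscore[:, 0].mean(), 2))     # column bias removed, near 0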

  5. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays, writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992
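
    "Partitioning a spatial index" is the load-distribution trick here; one standard way to realize it (not necessarily this project's exact scheme) is a Z-order/Morton key that interleaves cuboid coordinates so nearby image cuboids land on nearby keys:

        def morton3(x, y, z, bits=10):
            """Interleave the bits of (x, y, z) into one Z-order (Morton) key."""
            key = 0
            for i in range(bits):
                key |= ((x >> i) & 1) << (3 * i)
                key |= ((y >> i) & 1) << (3 * i + 1)
                key |= ((z >> i) & 1) << (3 * i + 2)
            return key

        cube = 128   # cuboid edge length in voxels (assumed)
        for x, y, z in [(0, 0, 0), (128, 0, 0), (0, 128, 0), (4096, 4096, 256)]:
            print((x, y, z), "->", morton3(x // cube, y // cube, z // cube))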

  7. Genetic high throughput screening in Retinitis Pigmentosa based on high resolution melting (HRM) analysis.

    Science.gov (United States)

    Anasagasti, Ander; Barandika, Olatz; Irigoyen, Cristina; Benitez, Bruno A; Cooper, Breanna; Cruchaga, Carlos; López de Munain, Adolfo; Ruiz-Ederra, Javier

    2013-11-01

    Retinitis Pigmentosa (RP) involves a group of genetically determined retinal diseases caused by a large number of mutations that result in rod photoreceptor cell death followed by gradual death of cone cells. Most cases of RP are monogenic, with more than 80 associated genes identified so far. The high number of genes and variants involved in RP, among other factors, makes the molecular characterization of RP a real challenge for many patients. Although HRM has been used for the analysis of isolated variants or single RP genes, to our knowledge this is the first study that uses HRM analysis for high-throughput screening of several RP genes. Our main goal was to test the suitability of HRM analysis as a genetic screening technique in RP, and to compare its performance with two of the most widely used NGS platforms, Illumina and PGM-Ion Torrent technologies. RP patients (n = 96) were clinically diagnosed at the Ophthalmology Department of Donostia University Hospital, Spain. We analyzed a total of 16 RP genes that meet the following inclusion criteria: 1) size: genes with transcripts of less than 4 kb; 2) number of exons: genes with up to 22 exons; and 3) prevalence: genes reported to account for, at least, 0.4% of total RP cases worldwide. For comparison purposes, the RHO gene was also sequenced with Illumina (GAII; Illumina), Ion semiconductor technologies (PGM; Life Technologies) and Sanger sequencing (ABI 3130xl platform; Applied Biosystems). Detected variants were confirmed in all cases by Sanger sequencing and tested for co-segregation in the family of affected probands. We identified a total of 65 genetic variants, 15 of which (23%) were novel, in 49 out of 96 patients. Among them, 14 (4 novel) are probable disease-causing genetic variants in 7 RP genes, affecting 15 patients. Our HRM analysis-based study proved to be a cost-effective and rapid method that provides accurate identification of genetic RP variants. This approach is effective for

  8. Integrative Analysis of High-throughput Cancer Studies with Contrasted Penalization

    Science.gov (United States)

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Shia, BenChang; Ma, Shuangge

    2015-01-01

    In cancer studies with high-throughput genetic and genomic measurements, integrative analysis provides a way to effectively pool and analyze heterogeneous raw data from multiple independent studies and outperforms “classic” meta-analysis and single-dataset analysis. When marker selection is of interest, the genetic basis of multiple datasets can be described using the homogeneity model or the heterogeneity model. In this study, we consider marker selection under the heterogeneity model, which includes the homogeneity model as a special case and can be more flexible. Penalization methods have been developed in the literature for marker selection. This study advances beyond the published ones by introducing contrast penalties, which can accommodate the within- and across-dataset structures of covariates/regression coefficients and, by doing so, further improve marker selection performance. Specifically, we develop a penalization method that accommodates the across-dataset structures by smoothing over regression coefficients. An effective iterative algorithm, which calls an inner coordinate descent iteration, is developed. Simulation shows that the proposed method outperforms the benchmark with more accurate marker identification. The analysis of breast cancer and lung cancer prognosis studies with gene expression measurements shows that the proposed method identifies genes different from those found using the benchmark and has better prediction performance. PMID:24395534

  9. A gas trapping method for high-throughput metabolic experiments.

    Science.gov (United States)

    Krycer, James R; Diskin, Ciana; Nelson, Marin E; Zeng, Xiao-Yi; Fazakerley, Daniel J; James, David E

    2018-01-01

    Research into cellular metabolism has become more high-throughput, with typical cell-culture experiments being performed in multiwell plates (microplates). This format presents a challenge when trying to collect gaseous products, such as carbon dioxide (CO2), which requires a sealed environment and a vessel separate from the biological sample. To address this limitation, we developed a gas trapping protocol using perforated plastic lids in sealed cell-culture multiwell plates. We used this trap design to measure CO2 production from glucose and fatty acid metabolism, as well as hydrogen sulfide production from cysteine-treated cells. Our data clearly show that this gas trap can be applied to liquid and solid gas-collection media and can be used to study gaseous product generation by both adherent cells and cells in suspension. Since our gas traps can be adapted to multiwell plates of various sizes, they present a convenient, cost-effective solution that can accommodate the trend toward high-throughput measurements in metabolic research.

  10. High Resolution Melting (HRM) for High-Throughput Genotyping—Limitations and Caveats in Practical Case Studies

    Directory of Open Access Journals (Sweden)

    Marcin Słomka

    2017-11-01

    Full Text Available High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.
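
    At the numerical core of HRM is the melting curve itself: the melting temperature appears as the peak of −dF/dT, and genotypes are then compared through difference plots against reference curves. A simulated Python sketch (curve shape and Tm are invented):

        import numpy as np

        T = np.linspace(65, 95, 301)                 # temperature ramp, deg C
        rng = np.random.default_rng(0)
        F = 1.0 / (1.0 + np.exp((T - 80.0) / 0.8))   # fluorescence melts around 80 C
        F += 0.005 * rng.normal(size=T.size)

        dF = -np.gradient(F, T)                      # negative derivative curve
        tm = T[np.argmax(dF)]
        print(f"estimated Tm = {tm:.1f} C")
        # Reference-curve-based genotyping then subtracts a reference melt curve
        # and compares the residual against known-genotype samples.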

  13. Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules.

    Directory of Open Access Journals (Sweden)

    Antonino Ingargiola

    Full Text Available We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions.
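
    Per-burst FRET values in such measurements come from a simple photon-count ratio, E ≈ nA/(nA + nD) before corrections. A simulated sketch resolving two FRET subpopulations (counts and populations are invented; real analysis applies the corrections mentioned above):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500                                   # single-molecule bursts
        don = rng.poisson(60, n)                  # donor photon counts
        acc = np.where(rng.random(n) < 0.5,       # two subpopulations
                       rng.poisson(40, n), rng.poisson(120, n))

        E = acc / (acc + don)                     # uncorrected proximity ratio
        hist, edges = np.histogram(E, bins=20, range=(0, 1))
        for h, lo in zip(hist, edges):            # crude text histogram, two peaks
            print(f"{lo:4.2f} {'#' * (h // 4)}")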

  14. Enzyme free cloning for high throughput gene cloning and expression

    NARCIS (Netherlands)

    de Jong, R.N.; Daniëls, M.; Kaptein, R.; Folkers, G.E.

    2006-01-01

    Structural and functional genomics initiatives significantly improved cloning methods over the past few years. Although recombinational cloning is highly efficient, its costs urged us to search for an alternative high throughput (HTP) cloning method. We implemented a modified Enzyme Free Cloning

  15. GUItars: a GUI tool for analysis of high-throughput RNA interference screening data.

    Directory of Open Access Journals (Sweden)

    Asli N Goktug

    Full Text Available High-throughput RNA interference (RNAi) screening has become a widely used approach to elucidating gene functions. However, analysis and annotation of large data sets generated from these screens has been a challenge for researchers without a programming background. Over the years, numerous data analysis methods were produced for plate quality control and hit selection and implemented by a few open-access software packages. Recently, strictly standardized mean difference (SSMD) has become a widely used method for RNAi screening analysis, mainly due to its better control of false negative and false positive rates and its ability to quantify RNAi effects with a statistical basis. We have developed GUItars to enable researchers without a programming background to use SSMD as both a plate quality and a hit selection metric to analyze large data sets. The software is accompanied by an intuitive graphical user interface for an easy and rapid analysis workflow. SSMD analysis methods have been provided to the users along with traditionally-used z-score, normalized percent activity, and t-test methods for hit selection. GUItars is capable of analyzing large-scale data sets from screens with or without replicates. The software is designed to automatically generate and save numerous graphical outputs known to be among the most informative high-throughput data visualization tools capturing plate-wise and screen-wise performances. Graphical outputs are also written in HTML format for easy access, and a comprehensive summary of screening results is written into tab-delimited output files. With GUItars, we demonstrated a robust SSMD-based analysis workflow on a 3840-gene small interfering RNA (siRNA) library and identified 200 siRNAs that increased and 150 siRNAs that decreased the assay activities with moderate to stronger effects. GUItars enables rapid analysis and illustration of data from large- or small-scale RNAi screens using SSMD and other traditional analysis
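
    SSMD itself is compact enough to state inline: β = (μ1 − μ2)/√(σ1² + σ2²). A short sketch showing its two roles, plate QC from controls and hit selection for sample wells (all well values simulated):

        import numpy as np

        def ssmd(a, b):
            """Strictly standardized mean difference between two groups of wells."""
            return (np.mean(a) - np.mean(b)) / np.sqrt(
                np.var(a, ddof=1) + np.var(b, ddof=1))

        rng = np.random.default_rng(0)
        neg = rng.normal(100, 10, 32)    # negative-control wells
        pos = rng.normal(40, 10, 32)     # positive-control wells
        print("plate QC (control-based SSMD):", round(ssmd(pos, neg), 1))

        sirna = rng.normal(75, 10, 3)    # one siRNA, triplicate wells
        print("hit-selection SSMD vs controls:", round(ssmd(sirna, neg), 1))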

  16. Evaluation of a pooled strategy for high-throughput sequencing of cosmid clones from metagenomic libraries.

    Science.gov (United States)

    Lam, Kathy N; Hall, Michael W; Engel, Katja; Vey, Gregory; Cheng, Jiujun; Neufeld, Josh D; Charles, Trevor C

    2014-01-01

    High-throughput sequencing methods have been instrumental in the growing field of metagenomics, with technological improvements enabling greater throughput at decreased costs. Nonetheless, the economy of high-throughput sequencing cannot be fully leveraged in the subdiscipline of functional metagenomics. In this area of research, environmental DNA is typically cloned to generate large-insert libraries from which individual clones are isolated, based on specific activities of interest. Sequence data are required for complete characterization of such clones, but the sequencing of a large set of clones requires individual barcode-based sample preparation; this can become costly, as the cost of clone barcoding scales linearly with the number of clones processed, and thus sequencing a large number of metagenomic clones often remains cost-prohibitive. We investigated a hybrid Sanger/Illumina pooled sequencing strategy that omits barcoding altogether, and we evaluated this strategy by comparing the pooled sequencing results to reference sequence data obtained from traditional barcode-based sequencing of the same set of clones. Using identity and coverage metrics in our evaluation, we show that pooled sequencing can generate high-quality sequence data, without producing problematic chimeras. Though caveats of a pooled strategy exist and further optimization of the method is required to improve recovery of complete clone sequences and to avoid circumstances that generate unrecoverable clone sequences, our results demonstrate that pooled sequencing represents an effective and low-cost alternative for sequencing large sets of metagenomic clones.

  17. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  18. Improvements and impacts of GRCh38 human reference on high throughput sequencing data analysis.

    Science.gov (United States)

    Guo, Yan; Dai, Yulin; Yu, Hui; Zhao, Shilin; Samuels, David C; Shyr, Yu

    2017-03-01

    Analysis of high-throughput sequencing data starts with alignment against a reference genome, which is the foundation for all re-sequencing data analyses. Each new release of the human reference genome has been augmented with improved accuracy and completeness. It is presumed that the latest release, GRCh38, will contribute to more accurate high-throughput sequencing data analysis, but the amount of improvement had not yet been quantified. We conducted a study to compare the genomic analysis results between the GRCh38 reference and its predecessor GRCh37. Through analyses of alignment, single nucleotide polymorphisms, small insertions/deletions, copy number and structural variants, we show that GRCh38 offers overall more accurate analysis of human sequencing data. More importantly, GRCh38 produced fewer false positive structural variants. In conclusion, GRCh38 is an improvement over GRCh37 not only in genome assembly, but also in yielding more reliable genomic analysis results. Copyright © 2017. Published by Elsevier Inc.
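
    Comparisons like this typically reduce variant calls to (chromosome, position, ref, alt) keys and measure set concordance between the two references after coordinate liftover. A toy Python sketch (file contents invented):

        def load_calls(lines):
            """Reduce simplified variant lines to (chrom, pos, ref, alt) keys."""
            return {tuple(l.split()[:4]) for l in lines if l and not l.startswith("#")}

        calls37 = load_calls(["chr1 12345 A G", "chr2 500 C T", "chr3 42 G A"])
        calls38 = load_calls(["chr1 12345 A G", "chr2 500 C T", "chr4 7 T C"])

        shared = calls37 & calls38
        jaccard = len(shared) / len(calls37 | calls38)
        print(f"shared {len(shared)}, Jaccard concordance {jaccard:.2f}")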

  19. Next generation platforms for high-throughput bio-dosimetry

    International Nuclear Information System (INIS)

    Repin, Mikhail; Turner, Helen C.; Garty, Guy; Brenner, David J.

    2014-01-01

    Here the general concept of the combined use of plates and tubes in racks compatible with the American National Standards Institute/the Society for Laboratory Automation and Screening microplate formats as the next generation platforms for increasing the throughput of bio-dosimetry assays was described. These platforms can be used at different stages of bio-dosimetry assays starting from blood collection into micro-tubes organised in standardised racks and ending with the cytogenetic analysis of samples in standardised multi-well and multichannel plates. Robotically friendly platforms can be used for different bio-dosimetry assays in minimally equipped laboratories and on cost-effective automated universal biotech systems. (authors)

  20. Cost-benefit analysis and cost-effectiveness analysis

    International Nuclear Information System (INIS)

    Lombard, J.

    1986-09-01

    The purpose of the ALARA procedure is to compare various protection options and determine the best compromise between the cost of protection and the residual risk. Decision-aiding techniques are valuable as an aid to such selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant to the large share of ALARA decisions that call for a quantitative technique. The study is based on a hypothetical case of 10 protection options, to which four methods are applied.
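
    As a worked illustration of the difference between the two techniques, the sketch below scores hypothetical protection options both ways: cost-effectiveness ranks options by cost per man-sievert averted, while cost-benefit monetizes the averted dose with an assumed alpha value and selects the option with the highest net benefit. All figures are invented for illustration.

    ```python
    # Hedged illustration of the two decision-aiding techniques named above,
    # applied to hypothetical protection options (costs in arbitrary monetary
    # units, averted collective dose in man-sieverts).

    options = {  # name: (protection cost, collective dose averted in man-Sv)
        "A": (100.0, 0.8),
        "B": (250.0, 1.5),
        "C": (600.0, 1.9),
    }
    alpha = 300.0  # assumed monetary value of one man-sievert averted

    for name, (cost, averted) in options.items():
        cost_effectiveness = cost / averted       # cost per man-Sv averted
        net_benefit = alpha * averted - cost      # cost-benefit criterion
        print(f"{name}: {cost_effectiveness:7.1f} per man-Sv,"
              f" net benefit {net_benefit:7.1f}")

    # Cost-effectiveness ranks options by cost per unit of risk reduction;
    # cost-benefit selects the option with the highest (positive) net benefit.
    ```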

  1. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  2. Reducing the cost of semi-automated in-gel tryptic digestion and GeLC sample preparation for high-throughput proteomics.

    Science.gov (United States)

    Ruelcke, Jayde E; Loo, Dorothy; Hill, Michelle M

    2016-10-21

    Peptide generation by trypsin digestion is typically the first step in mass spectrometry-based proteomics experiments, including 'bottom-up' discovery and targeted proteomics using multiple reaction monitoring. Manual tryptic digestion and the subsequent clean-up steps can add variability even before the sample reaches the analytical platform. While specialized filter plates and tips have been designed for automated sample processing, the specialty reagents required may not be accessible or feasible due to their high cost. Here, we report a lower-cost semi-automated protocol for in-gel digestion and GeLC using standard 96-well microplates. Further cost savings were realized by re-using reagent tips with optimized sample ordering. To evaluate the methodology, we compared a simple mixture of 7 proteins and a complex cell-lysate sample. The results across three replicates showed that our semi-automated protocol performed equal to or better than manual in-gel digestion with respect to replicate variability and level of contamination. In this paper, we also provide the Agilent Bravo method file, which can be adapted to other liquid handlers. The simplicity, reproducibility, and cost-effectiveness of our semi-automated protocol make it ideal for routine in-gel and GeLC sample preparations, as well as high-throughput processing of large clinical sample cohorts. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific; it should identify the mechanism/mode-of-action (with dose-dependence); it should identify responses relevant to the species of interest; and responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  4. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance.

  5. High-throughput screening to identify inhibitors of lysine demethylases.

    Science.gov (United States)

    Gale, Molly; Yan, Qin

    2015-01-01

    Lysine demethylases (KDMs) are epigenetic regulators whose dysfunction is implicated in the pathology of many human diseases, including various types of cancer, inflammation and X-linked intellectual disability. Particular demethylases have been identified as promising therapeutic targets, and tremendous efforts are being devoted toward developing suitable small-molecule inhibitors for clinical and research use. Several high-throughput screening strategies have been developed to screen for small-molecule inhibitors of KDMs, each with advantages and disadvantages in terms of time, cost, effort, reliability and sensitivity. In this Special Report, we review and evaluate the high-throughput screening methods utilized for discovery of novel small-molecule KDM inhibitors.

  6. Gene Expression Analysis of Escherichia coli Grown in Miniaturized Bioreactor Platforms for High-Throughput Analysis of Growth and Genomic Data

    DEFF Research Database (Denmark)

    Boccazzi, P.; Zanzotto, A.; Szita, Nicolas

    2005-01-01

    Combining high-throughput growth physiology and global gene expression data analysis is of significant value for integrating metabolism and genomics. We compared global gene expression using 500 ng of total RNA from Escherichia coli cultures grown in rich or defined minimal media in a miniaturized bioreactor platform. Results from these cultures using just 500 ng of total RNA indicate that high-throughput integration of growth physiology and genomics will be possible with novel biochemical platforms and improved detection technologies.

  7. Repurposing a Benchtop Centrifuge for High-Throughput Single-Molecule Force Spectroscopy.

    Science.gov (United States)

    Yang, Darren; Wong, Wesley P

    2018-01-01

    We present high-throughput single-molecule manipulation using a benchtop centrifuge, overcoming limitations common in other single-molecule approaches such as high cost, low throughput, technical difficulty, and strict infrastructure requirements. An inexpensive and compact Centrifuge Force Microscope (CFM) adapted to a commercial centrifuge enables use by nonspecialists, and integration with DNA nanoswitches facilitates both reliable measurements and repeated molecular interrogation. Here, we provide detailed protocols for constructing the CFM, creating DNA nanoswitch samples, and carrying out single-molecule force measurements.

  8. web cellHTS2: A web-application for the analysis of high-throughput screening data

    Directory of Open Access Journals (Sweden)

    Boutros Michael

    2010-04-01

    Background: The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. Results: The software guides the user through the configuration steps that are required for the analysis of single- or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. Conclusions: The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
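
    cellHTS2 itself is an R/Bioconductor package; as a language-neutral illustration of the normalization and hit-ranking steps such pipelines perform, the Python sketch below applies plate-median scaling followed by robust z-scores and ranks the most strongly inhibited wells. The data and thresholds are hypothetical.

    ```python
    # Sketch of per-plate normalization and hit ranking for RNAi screening data
    # (illustrative only; not the cellHTS2 implementation).

    import statistics

    def median_normalize(plate):            # plate: list of raw well intensities
        med = statistics.median(plate)
        return [x / med for x in plate]     # plate-median scaling

    def robust_z(values):                   # robust z-score via median and MAD
        med = statistics.median(values)
        mad = statistics.median(abs(x - med) for x in values) * 1.4826
        return [(x - med) / mad for x in values]

    plates = [[980, 1020, 410, 1005, 990, 1010],
              [1500, 1480, 1520, 640, 1490, 1510]]
    scores = robust_z([w for p in plates for w in median_normalize(p)])
    ranked = sorted(enumerate(scores), key=lambda iz: iz[1])  # most negative first
    print(ranked[:2])  # candidate knockdown hits (well index, z-score)
    ```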

  9. elegantRingAnalysis An Interface for High-Throughput Analysis of Storage Ring Lattices Using elegant

    CERN Document Server

    Borland, Michael

    2005-01-01

    The code elegant is widely used for simulation of linacs as drivers for free-electron lasers. Less well known is that elegant is also a very capable code for simulation of storage rings. In this paper, we show a newly developed graphical user interface that allows the user to easily take advantage of these capabilities. The interface is designed for use on a Linux cluster, providing very high throughput, and can also be used on a single computer. Among the features it gives access to are basic calculations (Twiss parameters, radiation integrals), phase-space tracking, nonlinear dispersion, dynamic aperture (on- and off-momentum), frequency map analysis, and collective effects (IBS, bunch lengthening). Using a cluster, it is easy to get highly detailed dynamic aperture and frequency map results in a surprisingly short time.

  10. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    Science.gov (United States)

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, hence the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrate superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.
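
    The shrinkage idea at the heart of this record can be made concrete with a small sketch: a limma-style moderated-variance calculation in which the prior is taken to come from historical data, the strategy the abstract advocates. This illustrates the general technique, not the adaptiveHM implementation.

    ```python
    # Minimal sketch of empirical-Bayes variance shrinkage with a historical prior.

    def shrink_variance(s2, d, s0sq, d0):
        """Moderated variance for one gene.
        s2: sample variance from the current data with d degrees of freedom;
        s0sq, d0: prior variance and prior degrees of freedom, here assumed
        to be estimated from historical datasets."""
        return (d0 * s0sq + d * s2) / (d0 + d)

    # With few replicates (d small), the historical prior dominates;
    # with many replicates, the observed variance does.
    print(shrink_variance(s2=4.0, d=2, s0sq=1.0, d0=10))   # 1.5 -> strong shrinkage
    print(shrink_variance(s2=4.0, d=50, s0sq=1.0, d0=10))  # 3.5 -> mild shrinkage
    ```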

  11. Gold-coated polydimethylsiloxane microwells for high-throughput electrochemiluminescence analysis of intracellular glucose at single cells.

    Science.gov (United States)

    Xia, Juan; Zhou, Junyu; Zhang, Ronggui; Jiang, Dechen; Jiang, Depeng

    2018-06-04

    In this communication, a gold-coated polydimethylsiloxane (PDMS) chip with cell-sized microwells was prepared through a stamping and spraying process and applied directly to high-throughput electrochemiluminescence (ECL) analysis of intracellular glucose at single cells. Compared with the previous multiple-step fabrication of photoresist-based microwells on the electrode, the preparation process is simple and offers a fresh electrode surface for higher luminescence intensity. More luminescence intensity was recorded from cell-retained microwells than from the planar region among the microwells, and this signal was correlated with the content of intracellular glucose. The successful monitoring of intracellular glucose at single cells using this PDMS chip provides an alternative strategy for high-throughput single-cell analysis.

  12. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  13. Genome-wide LORE1 retrotransposon mutagenesis and high-throughput insertion detection in Lotus japonicus

    DEFF Research Database (Denmark)

    Urbanski, Dorian Fabian; Malolepszy, Anna; Stougaard, Jens

    2012-01-01

    Insertion mutants facilitate functional analysis of genes, but for most plant species it has been difficult to identify a suitable mutagen and to establish large populations for reverse genetics. The main challenge is developing efficient high-throughput procedures for both mutagenesis and insertion detection in plants. The identified insertions showed that the endogenous LORE1 retrotransposon is well suited for insertion mutagenesis due to its homogenous gene targeting and exonic insertion preference. Since LORE1 transposition occurs in the germline, harvesting seeds from a single founder line and cultivating progeny generates a complete mutant population. This ease of LORE1 mutagenesis combined with the efficient FSTpoolit protocol, which exploits 2D pooling, Illumina sequencing, and automated data analysis, allows highly cost-efficient development of a comprehensive reverse genetic resource.
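
    The 2D-pooling step can be illustrated with a short sketch: if each plant contributes DNA to exactly one row pool and one column pool, an insertion site detected in a single row pool and a single column pool identifies one plant. This is a simplified illustration of the idea behind FSTpoolit, not its actual code.

    ```python
    # Sketch of 2D pool deconvolution for insertion-site detection.

    def deconvolute(row_hits, col_hits):
        """row_hits/col_hits: dict mapping insertion site -> set of pool indices.
        Returns unambiguous site -> (row, column) assignments."""
        assignments = {}
        for site in set(row_hits) & set(col_hits):
            rows, cols = row_hits[site], col_hits[site]
            if len(rows) == 1 and len(cols) == 1:   # exactly one pool each way
                assignments[site] = (next(iter(rows)), next(iter(cols)))
        return assignments

    # Hypothetical insertion sites observed in sequencing of pooled DNA.
    row_hits = {"chr2:1047112": {3}, "chr5:220981": {3, 7}}   # second is ambiguous
    col_hits = {"chr2:1047112": {11}, "chr5:220981": {2}}
    print(deconvolute(row_hits, col_hits))  # {'chr2:1047112': (3, 11)}
    ```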

  14. Controlling high-throughput manufacturing at the nano-scale

    Science.gov (United States)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications: materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high-quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes, guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper discusses these ideas in the context of controlling high-throughput manufacturing at the nano-scale.

  15. High-throughput evaluation of interactions between biomaterials, proteins and cells using patterned superhydrophobic substrates

    OpenAIRE

    Neto, Ana I.; Custódio, Catarina A.; Wenlong Song; Mano, J. F.

    2011-01-01

    We propose a new low cost platform for high-throughput analysis that permits screening the biological performance of independent combinations of biomaterials, cells and culture media. Patterned superhydrophobic flat substrates with controlled wettable spots are used to produce microarray chips for accelerated multiplexing evaluation. This work was partially supported by Fundação para a Ciência e Tecnologia (FCT) under project PTDC/FIS/68517/2006.

  16. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    Background: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis, without the limitations of sample space and reagent use. However, non-commercial and user-friendly software for the management and analysis of these data is not available. Results: The recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. Conclusions: DAG Expression is a freely available software tool that permits the automated analysis and visualization of high-throughput qPCR data. A detailed manual and a demo experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
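
    The core calculation the abstract describes, reading quantities off a fitted standard curve and normalizing by the geometric mean of reference genes, can be sketched as follows. The slopes, intercepts and Ct values are hypothetical, and DAG Expression itself is a packaged tool rather than this code.

    ```python
    # Sketch of standard-curve relative quantification with reference-gene
    # normalization. A standard curve Ct = m*log10(q) + b is assumed to have
    # been fit from n-fold dilutions beforehand.

    import math

    def quantity(ct, slope, intercept):
        """Read a relative quantity off the standard curve for one assay."""
        return 10 ** ((ct - intercept) / slope)

    def normalized_expression(ct_target, ct_refs, curves):
        q_target = quantity(ct_target, *curves["target"])
        q_refs = [quantity(ct, *curves["ref"]) for ct in ct_refs]
        geo_mean = math.exp(sum(math.log(q) for q in q_refs) / len(q_refs))
        return q_target / geo_mean

    # Hypothetical (slope, intercept) pairs for the target and reference assays.
    curves = {"target": (-3.32, 21.0), "ref": (-3.40, 19.5)}
    print(normalized_expression(ct_target=24.3, ct_refs=[22.1, 22.4], curves=curves))
    ```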

  17. Cost-effectiveness Analysis of Nutritional Support for the Prevention of Pressure Ulcers in High-Risk Hospitalized Patients.

    Science.gov (United States)

    Tuffaha, Haitham W; Roberts, Shelley; Chaboyer, Wendy; Gordon, Louisa G; Scuffham, Paul A

    2016-06-01

    To evaluate the cost-effectiveness of nutritional support compared with standard care in preventing pressure ulcers (PrUs) in high-risk hospitalized patients. An economic model using data from a systematic literature review. A meta-analysis of randomized controlled trials on the efficacy of nutritional support in reducing the incidence of PrUs was conducted. Modeled cohort of hospitalized patients at high risk of developing PrUs and malnutrition simulated during their hospital stay and up to 1 year. Standard care included PrU prevention strategies, such as redistribution surfaces, repositioning, and skin protection strategies, along with standard hospital diet. In addition to the standard care, the intervention group received nutritional support comprising patient education, nutrition goal setting, and the consumption of high-protein supplements. The analysis was from a healthcare payer perspective. Key outcomes of the model included the average costs and quality-adjusted life years. Model results were tested in univariate sensitivity analyses, and decision uncertainty was characterized using a probabilistic sensitivity analysis. Compared with standard care, nutritional support was cost saving at AU $425 per patient and marginally more effective with an average 0.005 quality-adjusted life years gained. The probability of nutritional support being cost-effective was 87%. Nutritional support to prevent PrUs in high-risk hospitalized patients is cost-effective with substantial cost savings predicted. Hospitals should implement the recommendations from the current PrU practice guidelines and offer nutritional support to high-risk patients.
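
    A minimal sketch of the underlying health-economic arithmetic, using the abstract's point estimates: an intervention that saves money while gaining QALYs is dominant, so no incremental cost-effectiveness ratio (ICER) needs to be reported.

    ```python
    # Worked sketch of the incremental comparison reported above.

    def compare(delta_cost, delta_qaly):
        if delta_cost <= 0 and delta_qaly >= 0:
            return "dominant (cheaper and at least as effective)"
        if delta_cost >= 0 and delta_qaly <= 0:
            return "dominated (costlier and no more effective)"
        return f"ICER = {delta_cost / delta_qaly:,.0f} per QALY gained"

    # Abstract's point estimates: AU$425 saved, 0.005 QALYs gained per patient.
    print(compare(delta_cost=-425.0, delta_qaly=0.005))  # dominant
    print(compare(delta_cost=410.0, delta_qaly=0.010))   # hypothetical: ICER = 41,000
    ```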

  18. Data reduction for a high-throughput neutron activation analysis system

    International Nuclear Information System (INIS)

    Bowman, W.W.

    1979-01-01

    To analyze samples collected as part of a geochemical survey for the National Uranium Resource Evaluation program, Savannah River Laboratory has installed a high-throughput neutron activation analysis system. As part of that system, computer programs have been developed to reduce raw data to elemental concentrations in two steps. Program RAGS reduces gamma-ray spectra to lists of photopeak energies, peak areas, and statistical errors. Program RICHES determines the elemental concentrations from photopeak and delayed-neutron data, detector efficiencies, analysis parameters (neutron flux and activation, decay, and counting times), and spectrometric and cross-section data from libraries. Both programs have been streamlined for on-line operation with a minicomputer, each requiring approximately 64 kbytes of core.

  19. WormScan: a technique for high-throughput phenotypic analysis of Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mark D Mathew

    Background: There are four main phenotypes that are assessed in whole-organism studies of Caenorhabditis elegans: mortality, movement, fecundity and size. Procedures have been developed that focus on the digital analysis of some, but not all, of these phenotypes and may be limited by expense and limited throughput. We have developed WormScan, an automated image acquisition system that allows quantitative analysis of each of these four phenotypes on standard NGM plates seeded with E. coli. This system is very easy to implement and has the capacity to be used in high-throughput analysis. Methodology/Principal Findings: Our system employs a readily available consumer-grade flatbed scanner. The method uses light stimulus from the scanner rather than physical stimulus to induce movement. With two sequential scans it is possible to quantify the induced phototactic response. To demonstrate the utility of the method, we measured the phenotypic response of C. elegans to phosphine gas exposure. We found that stimulation of movement by the light of the scanner was equivalent to physical stimulation for the determination of mortality. WormScan also provided a quantitative assessment of health for the survivors. Habituation from light stimulation of continuous scans was similar to habituation caused by physical stimulus. Conclusions/Significance: There are existing systems for the automated phenotypic data collection of C. elegans. The specific advantages of our method over existing systems are high-throughput assessment of a greater range of phenotypic endpoints, including determination of mortality and quantification of the mobility of survivors. Our system is also inexpensive and very easy to implement. Even though we have focused on demonstrating the usefulness of WormScan in toxicology, it can be used in a wide range of additional C. elegans studies, including lifespan determination, development, pathology and behavior. Moreover, we have even adapted the

  20. The French press: a repeatable and high-throughput approach to exercising zebrafish (Danio rerio).

    Science.gov (United States)

    Usui, Takuji; Noble, Daniel W A; O'Dea, Rose E; Fangmeier, Melissa L; Lagisz, Malgorzata; Hesselson, Daniel; Nakagawa, Shinichi

    2018-01-01

    Zebrafish are increasingly used as a vertebrate model organism for various traits including swimming performance, obesity and metabolism, necessitating high-throughput protocols to generate standardized phenotypic information. Here, we propose a novel and cost-effective method for exercising zebrafish, using a coffee plunger and magnetic stirrer. To demonstrate the use of this method, we conducted a pilot experiment to show that this simple system provides repeatable estimates of maximal swim performance (intra-class correlation [ICC] = 0.34-0.41) and observe that exercise training of zebrafish on this system significantly increases their maximum swimming speed. We propose this high-throughput and reproducible system as an alternative to traditional linear chamber systems for exercising zebrafish and similarly sized fishes.

  1. High-throughput gender identification of penguin species using melting curve analysis.

    Science.gov (United States)

    Tseng, Chao-Neng; Chang, Yung-Ting; Chiu, Hui-Tzu; Chou, Yii-Cheng; Huang, Hurng-Wern; Cheng, Chien-Chung; Liao, Ming-Hui; Chang, Hsueh-Wei

    2014-04-03

    Most species of penguins are sexually monomorphic, and therefore it is difficult to visually identify their genders for monitoring population stability in terms of sex-ratio analysis. In this study, we evaluated the suitability of melting curve analysis (MCA) for high-throughput gender identification of penguins. Preliminary tests indicated that the Griffiths's P2/P8 primers were not suitable for MCA analysis. Based on sequence alignment of Chromo-Helicase-DNA binding protein (CHD)-W and CHD-Z genes from four species of penguins (Pygoscelis papua, Aptenodytes patagonicus, Spheniscus magellanicus, and Eudyptes chrysocome), we redesigned forward primers for the CHD-W/CHD-Z-common region (PGU-ZW2) and the CHD-W-specific region (PGU-W2) to be used in combination with the reverse Griffiths's P2 primer. When tested with P. papua samples, PCR using the P2/PGU-ZW2 and P2/PGU-W2 primer sets generated two amplicons of 148 and 356 bp, respectively, which were easily resolved in 1.5% agarose gels. MCA indicated that the melting temperature (Tm) values for P2/PGU-ZW2 and P2/PGU-W2 amplicons of P. papua samples were 79.75°C-80.5°C and 81.0°C-81.5°C, respectively. Females displayed both ZW-common and W-specific Tm peaks, whereas males were positive only for the ZW-common peak. Taken together, our redesigned primers coupled with MCA allow precise high-throughput gender identification for P. papua, and potentially for other penguin species such as A. patagonicus, S. magellanicus, and E. chrysocome as well.
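
    The reported Tm windows translate directly into a simple calling rule, sketched below for P. papua; the window tolerances and the no-call behavior are assumptions rather than part of the published assay.

    ```python
    # Sketch of sex calling from melt-peak Tm values (degrees C), using the
    # windows reported in the abstract for P. papua.

    ZW_COMMON = (79.75, 80.5)   # peak present in both sexes
    W_SPECIFIC = (81.0, 81.5)   # peak present only in females

    def in_window(tm, window):
        lo, hi = window
        return lo <= tm <= hi

    def call_sex(tm_peaks):
        zw = any(in_window(t, ZW_COMMON) for t in tm_peaks)
        w = any(in_window(t, W_SPECIFIC) for t in tm_peaks)
        if zw and w:
            return "female"
        if zw:
            return "male"
        return "no call"  # assay failure or out-of-window peaks (assumption)

    print(call_sex([80.1, 81.2]))  # female
    print(call_sex([80.0]))        # male
    ```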

  2. Multiplex enrichment quantitative PCR (ME-qPCR): a high-throughput, highly sensitive detection method for GMO identification.

    Science.gov (United States)

    Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang

    2017-04-01

    Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible compared with next-generation sequencing or ChIP-based methods. However, PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by multiplex primer interactions, and this throughput cannot meet the demands of high-throughput detection applications such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, which shows that ME-qPCR sensitivity is higher than that of the original qPCR. The absolute limit of detection for ME-qPCR could achieve levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract: For the first-step amplification, four primers (A, B, C, and D) are added to the reaction volume. In this manner, four kinds of amplicons are generated, and each of these four amplicons can be regarded as a target of the second-step PCR. For the second-step amplification, three parallels have been taken for

  3. High-throughput screening of filamentous fungi using nanoliter-range droplet-based microfluidics

    Science.gov (United States)

    Beneyton, Thomas; Wijaya, I. Putu Mahendra; Postros, Prexilia; Najah, Majdi; Leblond, Pascal; Couvent, Angélique; Mayot, Estelle; Griffiths, Andrew D.; Drevelle, Antoine

    2016-06-01

    Filamentous fungi are an extremely important source of industrial enzymes because of their capacity to secrete large quantities of proteins. Currently, functional screening of fungi is associated with low throughput and high costs, which severely limits the discovery of novel enzymatic activities and better production strains. Here, we describe a nanoliter-range droplet-based microfluidic system specially adapted for the high-throughput screening (HTS) of large filamentous fungi libraries for secreted enzyme activities. The platform allowed (i) compartmentalization of single spores in ~10 nl droplets, (ii) germination and mycelium growth and (iii) high-throughput sorting of fungi based on enzymatic activity. A 10^4-clone UV-mutated library of Aspergillus niger was screened based on α-amylase activity in just 90 minutes. Active clones were enriched 196-fold after a single round of microfluidic HTS. The platform is a powerful tool for the development of new production strains with a low cost, space and time footprint and should bring enormous benefit for improving the viability of biotechnological processes.

  4. High-Throughput Analysis and Automation for Glycomics Studies.

    Science.gov (United States)

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also an expanding interest in the use of glycomics (for example, in Genome-Wide Association Studies) to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges with each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.

  5. Quantitative digital image analysis of chromogenic assays for high throughput screening of alpha-amylase mutant libraries.

    Science.gov (United States)

    Shankar, Manoharan; Priyadharshini, Ramachandran; Gunasekaran, Paramasamy

    2009-08-01

    An image analysis-based method for high-throughput screening of an alpha-amylase mutant library using chromogenic assays was developed. Assays were performed in microplates, and high-resolution images of the assay plates were read using the Virtual Microplate Reader (VMR) script to quantify the concentration of the chromogen. This method is fast and sensitive, quantifying 0.025-0.3 mg starch/ml as well as 0.05-0.75 mg glucose/ml. It was also an effective screening method for improved alpha-amylase activity, with a coefficient of variation of 18%.
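
    The VMR script is not reproduced here, but the underlying idea, converting a well's mean pixel intensity to chromogen concentration through a linear standard curve, can be sketched as follows with hypothetical calibration values.

    ```python
    # Sketch of image-based chromogen quantification via a linear standard curve.

    def fit_line(xs, ys):                  # ordinary least squares, y = m*x + b
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
        return m, my - m * mx

    # Hypothetical calibration: mean well intensities for known starch standards.
    standards_conc = [0.025, 0.1, 0.2, 0.3]            # mg starch/ml
    standards_intensity = [212.0, 180.5, 139.2, 98.0]  # mean pixel intensity
    m, b = fit_line(standards_intensity, standards_conc)

    unknown_intensity = 150.0   # mean intensity measured for an unknown well
    print(f"estimated concentration: {m * unknown_intensity + b:.3f} mg/ml")
    ```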

  6. Cost-effectiveness analysis and innovation.

    Science.gov (United States)

    Jena, Anupam B; Philipson, Tomas J

    2008-09-01

    While cost-effectiveness (CE) analysis has provided a guide to allocating often scarce resources spent on medical technologies, less emphasis has been placed on the effect of such criteria on the behavior of the innovators who make health care technologies available in the first place. A better understanding of the link between innovation and cost-effectiveness analysis is particularly important given the large role of technological change in the growth of health care spending and the growing interest in the explicit use of CE thresholds to guide technology adoption in several Westernized countries. We analyze CE analysis in a standard market context and stress that a technology's cost-effectiveness is closely related to the consumer surplus it generates. Improved CE therefore often clashes with interventions to stimulate producer surplus, such as patents. We derive the inconsistency between technology adoption based on CE analysis and economic efficiency. Indeed, static efficiency, dynamic efficiency, and improved patient health may all be induced by the cost-effectiveness of the technology being at its worst level. As producer appropriation of the social surplus of an innovation is central to the dynamic efficiency that should guide CE adoption criteria, we exemplify how appropriation can be inferred from existing CE estimates. For an illustrative sample of technologies considered, we find that the median technology has an appropriation of about 15%. To the extent that such incentives are deemed either too low or too high compared to dynamically efficient levels, CE thresholds may be appropriately raised or lowered to improve dynamic efficiency.

  7. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening.

    Science.gov (United States)

    Lawton, Zachary E; Traub, Angelica; Fatigante, William L; Mancias, Jose; O'Leary, Adam E; Hall, Seth E; Wieland, Jamie R; Oberacher, Herbert; Gizzi, Michael C; Mulligan, Christopher C

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and such studies are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  9. A high throughput mechanical screening device for cartilage tissue engineering.

    Science.gov (United States)

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both the compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single-sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.

  10. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot-detection tool. The spot-detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot-detection data are made available to facilitate implementation of the method.
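
    A minimal sketch of the spot-detection step, under the assumption that colonies appear as bright connected regions in a grayscale image: scipy.ndimage.label counts and sizes the objects, with placeholder threshold and minimum-size parameters standing in for real calibration.

    ```python
    # Sketch of connected-component spot detection for colony counting.

    import numpy as np
    from scipy import ndimage

    def count_colonies(image, threshold=0.5, min_pixels=5):
        mask = image > threshold                  # segment bright objects
        labels, n = ndimage.label(mask)           # connected components
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        keep = sizes >= min_pixels                # discard tiny noise spots
        return int(keep.sum()), sizes[keep]       # count, size distribution

    # Synthetic test image: dark background with one bright 4x4 "colony".
    rng = np.random.default_rng(0)
    img = rng.random((64, 64)) * 0.3
    img[10:14, 20:24] = 0.9
    n, sizes = count_colonies(img)
    print(n, sizes)  # 1 [16.]
    ```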

  11. High-throughput metagenomic technologies for complex microbial community analysis: open and closed formats.

    Science.gov (United States)

    Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa

    2015-01-27

    Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions. Copyright © 2015 Zhou et al.

  12. High-throughput purification of recombinant proteins using self-cleaving intein tags.

    Science.gov (United States)

    Coolbaugh, M J; Shakalli Tang, M J; Wood, D W

    2017-01-01

    High throughput methods for recombinant protein production using E. coli typically involve the use of affinity tags for simple purification of the protein of interest. One drawback of these techniques is the occasional need for tag removal before study, which can be hard to predict. In this work, we demonstrate two high throughput purification methods for untagged protein targets based on simple and cost-effective self-cleaving intein tags. Two model proteins, E. coli beta-galactosidase (βGal) and superfolder green fluorescent protein (sfGFP), were purified using self-cleaving versions of the conventional chitin-binding domain (CBD) affinity tag and the nonchromatographic elastin-like-polypeptide (ELP) precipitation tag in a 96-well filter plate format. Initial tests with shake flask cultures confirmed that the intein purification scheme could be scaled down, with >90% pure product generated in a single step using both methods. The scheme was then validated in a high throughput expression platform using 24-well plate cultures followed by purification in 96-well plates. For both tags and with both target proteins, the purified product was consistently obtained in a single-step, with low well-to-well and plate-to-plate variability. This simple method thus allows the reproducible production of highly pure untagged recombinant proteins in a convenient microtiter plate format. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. High-throughput selection for cellulase catalysts using chemical complementation.

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T; Lin, Hening; Tao, Haiyan; Cornish, Virginia W

    2008-12-24

    Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Because of the large number of enzyme variants that selections can now test as compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity.

  14. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes.

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach, called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants.

  15. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    Directory of Open Access Journals (Sweden)

    Karolina Chwialkowska

    2017-11-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach, called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants.

  16. HDAT: web-based high-throughput screening data analysis tools

    International Nuclear Information System (INIS)

    Liu, Rong; Hassan, Taimur; Rallo, Robert; Cohen, Yoram

    2013-01-01

    The increasing utilization of high-throughput screening (HTS) in toxicity studies of engineered nanomaterials (ENMs) requires tools for rapid and reliable processing and analysis of large HTS datasets. In order to meet this need, a web-based platform of HTS data analysis tools (HDAT) was developed that provides statistical methods suitable for ENM toxicity data. As a publicly available computational nanoinformatics infrastructure, HDAT provides different plate normalization methods, various HTS summarization statistics, self-organizing map (SOM)-based clustering analysis, and visualization of raw and processed data using both heat maps and SOMs. HDAT has been successfully used in a number of HTS studies of ENM toxicity, thereby enabling analysis of toxicity mechanisms and development of structure-activity relationships for ENM toxicity. The online approach afforded by HDAT should encourage standardization of, and future advances in, HTS as well as facilitate convenient inter-laboratory comparisons of HTS datasets. (paper)

  17. A simple, high throughput method to locate single copy sequences from Bacterial Artificial Chromosome (BAC libraries using High Resolution Melt analysis

    Directory of Open Access Journals (Sweden)

    Caligari Peter DS

    2010-05-01

    Background: The high-throughput anchoring of genetic markers into contigs is required for many ongoing physical mapping projects. Multidimensional BAC pooling strategies for PCR-based screening of large-insert libraries are a widely used alternative to high-density filter hybridisation of bacterial colonies. To date, concerns over reliability have led most if not all groups engaged in high-throughput physical mapping projects to favour BAC DNA isolation prior to amplification by conventional PCR. Results: Here, we report the first combined use of Multiplex Tandem PCR (MT-PCR) and High Resolution Melt (HRM) analysis on bacterial stocks of BAC library superpools as a means of rapidly anchoring markers to BAC colonies and thereby integrating genetic and physical maps. We exemplify the approach using a BAC library of the model plant Arabidopsis thaliana. Superpools of twenty-five 384-well plates and two-dimensional matrix pools of the BAC library were prepared for marker screening. The entire procedure requires only around 3 h to anchor one marker. Conclusions: A pre-amplification step during MT-PCR allows high multiplexing and increases the sensitivity and reliability of subsequent HRM discrimination. This simple gel-free protocol is more reliable, faster and far less costly than conventional PCR screening. The option to screen three genetic markers in parallel in one MT-PCR-HRM reaction, using templates from directly pooled bacterial stocks of BAC-containing bacteria, further reduces the time needed for anchoring markers in physical maps of species with large genomes.

  18. High-throughput, temperature-controlled microchannel acoustophoresis device made with rapid prototyping

    DEFF Research Database (Denmark)

    Adams, Jonathan D; Ebbesen, Christian L.; Barnkob, Rune

    2012-01-01

    The device is fabricated in a microscope-slide format using low-cost, rapid-prototyping techniques. This high-throughput acoustophoresis chip (HTAC) utilizes a temperature-stabilized, standing ultrasonic wave, which imposes differential acoustic radiation forces that can separate particles according to size, density and compressibility. The device...

  19. High-Throughput Image Analysis of Fibrillar Materials: A Case Study on Polymer Nanofiber Packing, Alignment, and Defects in Organic Field Effect Transistors.

    Science.gov (United States)

    Persson, Nils E; Rafshoon, Joshua; Naghshpour, Kaylie; Fast, Tony; Chu, Ping-Hsun; McBride, Michael; Risteen, Bailey; Grover, Martha; Reichmanis, Elsa

    2017-10-18

    High-throughput discovery of process-structure-property relationships in materials through an informatics-enabled empirical approach is an increasingly utilized technique in materials research due to the rapidly expanding availability of data. Here, process-structure-property relationships are extracted for the nucleation, growth, and deposition of semiconducting poly(3-hexylthiophene) (P3HT) nanofibers used in organic field effect transistors, via high-throughput image analysis. This study is performed using an automated image analysis pipeline combining existing open-source software and new algorithms, enabling the rapid evaluation of structural metrics for images of fibrillar materials, including local orientational order, fiber length density, and fiber length distributions. We observe that microfluidic processing leads to fibers that pack with unusually high density, while sonication yields fibers that pack sparsely with low alignment. This is attributed to differences in their crystallization mechanisms. P3HT nanofiber packing during thin film deposition exhibits behavior suggesting that fibers are confined to packing in two-dimensional layers. We find that fiber alignment, a feature correlated with charge carrier mobility, is driven by increasing fiber length, and that shorter fibers tend to segregate to the buried dielectric interface during deposition, creating potentially performance-limiting defects in alignment. Another barrier to perfect alignment is the curvature of P3HT fibers; we propose a mechanistic simulation of fiber growth that reconciles both this curvature and the log-normal distribution of fiber lengths inherent to the fiber populations under consideration.
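
    One of the structural metrics named above, local orientational order, is commonly computed as the 2-D nematic order parameter over fiber-segment angles; the sketch below illustrates that generic calculation and is not the authors' pipeline.

    ```python
    # Sketch of the 2-D nematic order parameter
    #   S = sqrt(<cos 2t>^2 + <sin 2t>^2)
    # over fiber-segment angles t (radians). S -> 1 for perfectly aligned
    # fibers, S -> 0 for isotropic orientations.

    import math

    def order_parameter(angles):
        c = sum(math.cos(2 * t) for t in angles) / len(angles)
        s = sum(math.sin(2 * t) for t in angles) / len(angles)
        return math.hypot(c, s)

    aligned = [0.02, -0.03, 0.05, 0.0, 0.01]          # nearly parallel segments
    isotropic = [i * math.pi / 8 for i in range(8)]   # evenly spread orientations
    print(order_parameter(aligned))    # ~1.0
    print(order_parameter(isotropic))  # ~0.0
    ```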

  20. High-throughput analysis for preparation, processing and analysis of TiO2 coatings on steel by chemical solution deposition

    International Nuclear Information System (INIS)

    Cuadrado Gil, Marcos; Van Driessche, Isabel; Van Gils, Sake; Lommens, Petra; Castelein, Pieter; De Buysser, Klaartje

    2012-01-01

Highlights: ► High-throughput preparation of TiO2 aqueous precursors. ► Analysis of stability and surface tension. ► Deposition of TiO2 coatings. - Abstract: A high-throughput preparation, processing and analysis of titania coatings prepared by chemical solution deposition from water-based precursors at low temperature (≈250 °C) on two different types of steel substrates (Aluzinc® and bright annealed) is presented. The use of high-throughput equipment allows fast preparation of multiple samples, saving time, energy and material, and helps to test the scalability of the process. The process itself includes the use of IR curing for aqueous ceramic precursors and the possibility of applying UV irradiation before the final sintering step. The IR curing method permits a much faster curing step than normal high-temperature treatments in traditional convection devices (i.e., tube furnaces). The formulations, also prepared with high-throughput equipment, are found to be stable in the operational pH range of the substrates (6.5–8.5). Titanium alkoxides themselves lack stability in purely water-based environments, but the presence of the different organic complexing agents protects them from hydrolysis and precipitation reactions. The wetting interaction between the substrates and the various formulations is studied by determining the surface free energy of the substrates and the polar and dispersive components of the surface tension of the solutions. However, the mild temperature program used for preparation of the coatings does not lead to the formation of pure crystalline material, which is necessary for the desired photocatalytic and super-hydrophilic behavior of these coatings. Nevertheless, some activity can be reported for these amorphous coatings by monitoring the discoloration of methylene blue in water under UV irradiation.

  1. Dedicated Perioperative Hip Fracture Comanagement Programs are Cost-effective in High-volume Centers: An Economic Analysis.

    Science.gov (United States)

    Swart, Eric; Vasudeva, Eshan; Makhni, Eric C; Macaulay, William; Bozic, Kevin J

    2016-01-01

    a full-time basis. Additionally, we evaluated the scenario where the necessary staff was already employed at the hospital and could be dedicated to a comanagement service on a part-time basis, and explored the effect of triaging only patients considered high risk to a comanagement service versus comanaging all geriatric patients. Finally, probabilistic sensitivity analysis was conducted on all critical variables, with broad ranges used for values around which there was higher uncertainty. For the base case, universal comanagement was more cost effective than traditional care and risk-stratified comanagement (incremental cost effectiveness ratios of USD 41,100 per quality-adjusted life-year and USD 81,900 per quality-adjusted life-year, respectively). Comanagement was more cost effective than traditional management as long as the case volume was more than 54 patients annually (range, 41-68 patients based on sensitivity analysis) and resulted in cost savings when there were more than 318 patients annually (range, 238-397 patients). In a scenario where staff could be partially dedicated to a comanagement service, universal comanagement was more cost effective than risk-stratified comanagement (incremental cost effectiveness of USD 2300 per quality-adjusted life-year), and both comanagement programs had lower costs and better outcomes compared with traditional management. Sensitivity analysis was conducted and showed that the level of uncertainty in key variables was not high enough to change the core conclusions of the model. Implementation of a systems-based comanagement strategy using a dedicated team to improve perioperative medical care and expedite preoperative evaluation is cost effective in hospitals with moderate volume and can result in cost savings at higher-volume centers. The optimum patient population for a comanagement strategy is still being defined. Level 1, Economic and Decision Analysis.

  2. High throughput protein production screening

    Science.gov (United States)

Beernink, Peter T. [Walnut Creek, CA]; Coleman, Matthew A. [Oakland, CA]; Segelke, Brent W. [San Ramon, CA]

    2009-09-08

Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods, and detection of the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high-throughput analysis of protein expression levels, interactions, and functional states.

  3. Diagnostic staging laparoscopy in gastric cancer treatment: A cost-effectiveness analysis.

    Science.gov (United States)

    Li, Kevin; Cannon, John G D; Jiang, Sam Y; Sambare, Tanmaya D; Owens, Douglas K; Bendavid, Eran; Poultsides, George A

    2018-05-01

Accurate preoperative staging helps avert morbidity, mortality, and cost associated with non-therapeutic laparotomy in gastric cancer (GC) patients. Diagnostic staging laparoscopy (DSL) can detect metastases with high sensitivity, but its cost-effectiveness has not been previously studied. We developed a decision analysis model to assess the cost-effectiveness of preoperative DSL in GC workup. Analysis was based on a hypothetical cohort of GC patients in the U.S. for whom initial imaging shows no metastases. The cost-effectiveness of DSL was measured as cost per quality-adjusted life-year (QALY) gained. Drivers of cost-effectiveness were assessed in sensitivity analysis. Preoperative DSL required an investment of $107,012 per QALY. In sensitivity analysis, DSL became cost-effective at a threshold of $100,000/QALY when the probability of occult metastases exceeded 31.5% or when test sensitivity for metastases exceeded 86.3%. The likelihood of cost-effectiveness increased from 46% to 93% when both parameters were set at maximum reported values. The cost-effectiveness of DSL for GC patients is highly dependent on patient and test characteristics, and is more likely when DSL is used selectively where procedure yield is high, such as for locally advanced disease or in detecting peritoneal and superficial versus deep liver lesions. © 2017 Wiley Periodicals, Inc.
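The dollars-per-QALY figures quoted in this record and in the comanagement study above are incremental cost-effectiveness ratios (ICERs): the extra cost of a strategy divided by the extra QALYs it buys. A minimal sketch with hypothetical numbers (not the study's data):

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra dollars per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical inputs: new strategy vs. comparator.
ratio = icer(cost_new=52_000, qaly_new=10.25, cost_old=45_000, qaly_old=10.10)
print(f"ICER = ${ratio:,.0f} per QALY")

willingness_to_pay = 100_000  # a commonly used U.S. threshold, $/QALY
print("cost-effective" if ratio <= willingness_to_pay else "not cost-effective")
```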

  4. High-throughput mouse genotyping using robotics automation.

    Science.gov (United States)

    Linask, Kaari L; Lo, Cecilia W

    2005-02-01

    The use of mouse models is rapidly expanding in biomedical research. This has dictated the need for the rapid genotyping of mutant mouse colonies for more efficient utilization of animal holding space. We have established a high-throughput protocol for mouse genotyping using two robotics workstations: a liquid-handling robot to assemble PCR and a microfluidics electrophoresis robot for PCR product analysis. This dual-robotics setup incurs lower start-up costs than a fully automated system while still minimizing human intervention. Essential to this automation scheme is the construction of a database containing customized scripts for programming the robotics workstations. Using these scripts and the robotics systems, multiple combinations of genotyping reactions can be assembled simultaneously, allowing even complex genotyping data to be generated rapidly with consistency and accuracy. A detailed protocol, database, scripts, and additional background information are available at http://dir.nhlbi.nih.gov/labs/ldb-chd/autogene/.
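The database scripts described here essentially map (sample, assay) pairs onto robot-readable plate positions. A hypothetical sketch of that step follows; the CSV column layout and volumes are invented for illustration and are not the authors' format:

```python
import csv
from itertools import product

ROWS, COLS = "ABCDEFGH", range(1, 13)  # 96-well plate

def build_worklist(samples, assays, out_path="worklist.csv"):
    """Assign each (sample, assay) PCR to the next free well and emit a
    CSV worklist that a liquid-handling robot could consume (hypothetical
    format; volumes in microliters)."""
    wells = (f"{r}{c}" for r, c in product(ROWS, COLS))
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["well", "sample_id", "assay", "template_ul", "mastermix_ul"])
        for sample, assay in product(samples, assays):
            writer.writerow([next(wells), sample, assay, 2.0, 18.0])

# 24 mice genotyped against wild-type and mutant allele assays (48 wells).
build_worklist(samples=[f"mouse_{i:03d}" for i in range(1, 25)],
               assays=["wt_allele", "mut_allele"])
```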

  5. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)—A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop

  6. High-throughput characterization methods for lithium batteries

    Directory of Open Access Journals (Sweden)

    Yingchun Lyu

    2017-09-01

Full Text Available The development of high-performance lithium ion batteries requires the discovery of new materials and the optimization of key components. In contrast to the traditional one-by-one approach, high-throughput methods can synthesize and characterize a large number of compositionally varying samples, accelerating the discovery, development and optimization of materials. Because of rapid progress in thin-film and automatic control technologies, thousands of compounds with different compositions can now be synthesized rapidly, even in a single experiment. However, the lack of rapid or combinatorial characterization technologies to match high-throughput synthesis methods limits the application of high-throughput technology. Here, we review a series of representative high-throughput characterization methods used in lithium batteries, including high-throughput structural and electrochemical characterization methods and rapid measuring technologies based on synchrotron light sources.

  7. High-throughput microfluidics automated cytogenetic processing for effectively lowering biological process time and aid triage during radiation accidents

    International Nuclear Information System (INIS)

    Ramakumar, Adarsh

    2016-01-01

Nuclear or radiation mass casualties require individual, rapid, and accurate dose-based triage of exposed subjects for cytokine therapy and supportive care, to save lives. Radiation mass casualties will demand high-throughput individual diagnostic dose assessment for the medical management of exposed subjects. Cytogenetic techniques are widely used for triage and definitive radiation biodosimetry. A prototype platform has been developed to demonstrate high-throughput microfluidic micro-incubation, supporting the logistics of moving samples in miniaturized incubators from the accident site to analytical laboratories. Efforts have been made, both in concept development and in advanced systems, to achieve higher throughput in sample processing and to implement more efficient logistics, leading to the performance of lab-on-chip analyses. An automated high-throughput platform with automated feature extraction, storage, cross-platform data linkage, cross-platform validation and multi-parametric biomarker approaches will provide the first generation of high-throughput platform systems for effective medical management, particularly during radiation mass casualty events

  8. A high-throughput and quantitative method to assess the mutagenic potential of translesion DNA synthesis

    Science.gov (United States)

    Taggart, David J.; Camerlengo, Terry L.; Harrison, Jason K.; Sherrer, Shanen M.; Kshetry, Ajay K.; Taylor, John-Stephen; Huang, Kun; Suo, Zucai

    2013-01-01

    Cellular genomes are constantly damaged by endogenous and exogenous agents that covalently and structurally modify DNA to produce DNA lesions. Although most lesions are mended by various DNA repair pathways in vivo, a significant number of damage sites persist during genomic replication. Our understanding of the mutagenic outcomes derived from these unrepaired DNA lesions has been hindered by the low throughput of existing sequencing methods. Therefore, we have developed a cost-effective high-throughput short oligonucleotide sequencing assay that uses next-generation DNA sequencing technology for the assessment of the mutagenic profiles of translesion DNA synthesis catalyzed by any error-prone DNA polymerase. The vast amount of sequencing data produced were aligned and quantified by using our novel software. As an example, the high-throughput short oligonucleotide sequencing assay was used to analyze the types and frequencies of mutations upstream, downstream and at a site-specifically placed cis–syn thymidine–thymidine dimer generated individually by three lesion-bypass human Y-family DNA polymerases. PMID:23470999
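At its core, this kind of assay tallies the base calls opposite a known lesion position across many aligned reads. A simplified sketch, assuming the reads are already aligned to the reference and the lesion coordinate is known (toy data, not the authors' software):

```python
from collections import Counter

def mutation_profile(reads, ref_base, lesion_pos):
    """Tally base calls opposite a site-specific lesion across aligned reads.

    `reads` are equal-length strings aligned to the reference; the call at
    `lesion_pos` may be the correct base, a substitution, or a deletion '-'.
    Returns per-outcome frequencies plus an overall error rate.
    """
    counts = Counter(read[lesion_pos] for read in reads)
    total = sum(counts.values())
    freqs = {base: n / total for base, n in counts.items()}
    freqs["error_rate"] = 1.0 - freqs.get(ref_base, 0.0)
    return freqs

# Toy data: six reads, reference base 'A' opposite one thymine of a dimer.
reads = ["GATC", "GATC", "GGTC", "GTTC", "G-TC", "GATC"]
print(mutation_profile(reads, ref_base="A", lesion_pos=1))
# -> {'A': 0.5, 'G': 0.167, 'T': 0.167, '-': 0.167, 'error_rate': 0.5}
```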

  9. A High-throughput Selection for Cellulase Catalysts Using Chemical Complementation

    Science.gov (United States)

    Peralta-Yahya, Pamela; Carter, Brian T.; Lin, Hening; Tao, Haiyan; Cornish, Virginia W.

    2010-01-01

Efficient enzymatic hydrolysis of lignocellulosic material remains one of the major bottlenecks to cost-effective conversion of biomass to ethanol. Improvement of glycosylhydrolases, however, is limited by existing medium-throughput screening technologies. Here, we report the first high-throughput selection for cellulase catalysts. This selection was developed by adapting chemical complementation to provide a growth assay for bond cleavage reactions. First, a URA3 counter selection was adapted to link chemical dimerizer activated gene transcription to cell death. Next, the URA3 counter selection was shown to detect cellulase activity based on cleavage of a tetrasaccharide chemical dimerizer substrate and decrease in expression of the toxic URA3 reporter. Finally, the utility of the cellulase selection was assessed by isolating cellulases with improved activity from a cellulase library created by family DNA shuffling. This application provides further evidence that chemical complementation can be readily adapted to detect different enzymatic activities for important chemical transformations for which no natural selection exists. Due to the large number of enzyme variants that selections can test compared to existing medium-throughput screens for cellulases, this assay has the potential to impact the discovery of improved cellulases and other glycosylhydrolases for biomass conversion from libraries of cellulases created by mutagenesis or obtained from natural biodiversity. PMID:19053460

  10. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce large amounts of data, typically presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information. The filtering can be made more informative by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool allows researchers with little programming experience to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
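The GO-based selection described above boils down to an intersection test between each protein's annotation set and the user's query terms. A minimal sketch of that operation (the data structures are invented for illustration and are not msBiodat's API):

```python
def filter_by_go(proteins, annotations, wanted_terms):
    """Keep proteins whose GO annotation intersects the wanted terms.

    `proteins`   : iterable of accessions from an MS result list
    `annotations`: dict accession -> set of GO term IDs (e.g. parsed from
                   a downloaded annotation file; loading is out of scope)
    """
    wanted = set(wanted_terms)
    return [p for p in proteins if annotations.get(p, set()) & wanted]

ms_hits = ["P12345", "Q67890", "P99999"]
go_map = {
    "P12345": {"GO:0005739", "GO:0006096"},  # mitochondrion, glycolysis
    "Q67890": {"GO:0005634"},                # nucleus
}
print(filter_by_go(ms_hits, go_map, ["GO:0005739"]))  # ['P12345']
```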

  11. High-Throughput Particle Manipulation Based on Hydrodynamic Effects in Microchannels

    Directory of Open Access Journals (Sweden)

    Chao Liu

    2017-03-01

Full Text Available Microfluidic techniques are effective tools for the precise manipulation of particles and cells, whose enrichment and separation are crucial for a wide range of applications in biology, medicine, and chemistry. Recently, lateral particle migration induced by intrinsic hydrodynamic effects in microchannels, such as inertia and elasticity, has shown promise for high-throughput and label-free particle manipulation. The particle migration can be engineered to realize controllable focusing and separation of particles based on a difference in size. The widespread use of inertial and viscoelastic microfluidics depends on an understanding of hydrodynamic effects on particle motion. This review summarizes progress in the fundamental mechanisms and key applications of inertial and viscoelastic particle manipulation.

  12. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity, improved protocols for PTM enrichment and recently established pipelines for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification and statistical analysis...

  13. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas; Castro, David; Foulds, Ian G.

    2013-01-01

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  15. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data.

    Science.gov (United States)

    Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo

    2011-12-15

High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, with either punctuated or broad signals, CLIP-Seq and RNA-Seq. We prove the effectiveness of Pyicos in selecting for significant signals and show that its accuracy is comparable and sometimes superior to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS data types through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Availability: open-source software, with tutorials and protocol files, at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy. Contact: eduardo.eyras@upf.edu. Supplementary data are available at Bioinformatics online.

  16. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    Science.gov (United States)

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state of the art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales primarily in multiwell plate-based assays with a wide range of readout possibilities with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.

  17. A high-throughput splinkerette-PCR method for the isolation and sequencing of retroviral insertion sites

    DEFF Research Database (Denmark)

    Uren, Anthony G; Mikkers, Harald; Kool, Jaap

    2009-01-01

Insertional mutagens such as viruses and transposons are a useful tool for performing forward genetic screens in mice to discover cancer genes. These screens are most effective when performed using hundreds of mice; however, until recently, the cost-effective isolation and sequencing of insertion sites has been a major limitation to performing screens on this scale. Here we present a method for the high-throughput isolation of insertion sites using a highly efficient splinkerette-PCR method coupled with capillary or 454 sequencing. This protocol includes a description of the procedure for DNA...

  18. Improvement of IBAD-MgO texturing for high throughput of buffered substrate

    International Nuclear Information System (INIS)

    Ito, T.; Takahashi, Y.; Matsuse, K.; Kuriki, R.; Tokumaru, M.; Yoshizumi, M.; Izumi, T.

    2011-01-01

The requirements from the market on the two important factors of performance and cost need to be satisfied for commercialization of coated conductors. From the perspective of buffer-layer processing, highly biaxial grain texturing must be realized at a high production rate. The IBAD-MgO process is one of the major techniques capable of satisfying those requirements. The structure of our buffered substrate is IBS-GZO/IBAD-MgO/RF-sputter-LaMnO3/PLD-CeO2. The PLD-CeO2 process is the rate-limiting and cost-dominant step in this architecture. It is proposed that the self-texturing CeO2 layer thickness could be reduced by optimization of the MgO processing, owing to higher MgO texturing and/or more effective growth of the self-texturing CeO2. The influence of the IBAD beam conditions and deposition time has been studied to optimize the IBAD conditions. Optimized IBAD conditions were chosen from the viewpoints of in-plane grain texturing and the stability needed to obtain high texturing in fabrication. The Δφ value of the CeO2 layer was improved from 4–5° to 3–3.5° by the optimization. This buffered substrate gave high and uniform Ic values of 524–565 A/cm-width for a 50 m long GdBCO (1.5 μm) tape, indicating a uniform distribution of Δφ(CeO2). This improvement of Δφ(CeO2) makes it possible to reduce the CeO2 thickness down to 300 nm without making Δφ(CeO2) > 5°, which improves CeO2 throughput from 10 m/h to 30 m/h. A 50 m long patch sample showed a more uniform Δφ distribution of around 4° even at the high CeO2 throughput of 30 m/h. A highly and uniformly textured CeO2 buffered substrate 100 m in length was obtained cost-effectively by optimization of the IBAD-MgO processing.

  19. High-throughput screening of chemical effects on ...

    Science.gov (United States)

    Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2,060 chemical samples on steroidogenesis via HPLC-MS/MS quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a three stage screening strategy. The first stage established the maximum tolerated concentration (MTC; >70% viability) per sample. The second stage quantified changes in hormone levels at the MTC while the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were pre-stimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2,060 chemical samples evaluated, 524 samples were selected for six-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into five distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A d
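Concentration-response screening of this kind is typically summarized by fitting a four-parameter logistic (Hill) model to each chemical-hormone pair. A generic sketch on synthetic data, assuming scipy is available (not the study's actual analysis code):

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, slope):
    """Four-parameter logistic (Hill) model for a decreasing response."""
    return bottom + (top - bottom) / (1 + (conc / ec50) ** slope)

# Synthetic six-point concentration-response: hormone level vs. dose.
conc = np.array([0.01, 0.1, 1, 10, 100, 1000])         # micromolar
resp = np.array([1.02, 0.98, 0.85, 0.45, 0.15, 0.08])  # fold of control

params, _ = curve_fit(hill, conc, resp, p0=[0.1, 1.0, 5.0, 1.0], maxfev=10_000)
bottom, top, ec50, slope = params
print(f"EC50 ~ {ec50:.2f} uM, Hill slope ~ {slope:.2f}")
```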

  20. A high-throughput screening system for barley/powdery mildew interactions based on automated analysis of light micrographs.

    Science.gov (United States)

    Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo

    2008-01-23

    To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.

  1. FIM imaging and FIMtrack: two new tools allowing high-throughput and cost effective locomotion analysis.

    Science.gov (United States)

    Risse, Benjamin; Otto, Nils; Berh, Dimitri; Jiang, Xiaoyi; Klämbt, Christian

    2014-12-24

The analysis of neuronal network function requires a reliable measurement of behavioral traits. Since the behavior of freely moving animals is variable to a certain degree, many animals have to be analyzed to obtain statistically significant data. This in turn requires computer-assisted, automated quantification of locomotion patterns. To obtain high-contrast images of small, almost translucent moving objects, a novel imaging technique based on frustrated total internal reflection, called FIM, was developed. In this setup, animals are illuminated with infrared light only at the specific positions where they contact the underlying crawling surface. This methodology results in very high-contrast images. These high-contrast images are then processed using established contour-tracking algorithms. Based on this, we developed the FIMTrack software, which extracts a number of features needed to quantitatively describe a large variety of locomotion characteristics. During the development of this software package, we focused our efforts on an open-source architecture allowing the easy addition of further modules. The program operates platform-independently and is accompanied by an intuitive GUI guiding the user through data analysis. All locomotion parameter values are given in the form of CSV files, allowing further data analyses. In addition, a Results Viewer integrated into the tracking software provides the opportunity to interactively review and adjust the output, as might be needed during stimulus integration. The power of FIM and FIMTrack is demonstrated by studying the locomotion of Drosophila larvae.

  2. High throughput integrated thermal characterization with non-contact optical calorimetry

    Science.gov (United States)

    Hou, Sichao; Huo, Ruiqing; Su, Ming

    2017-10-01

Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments and are limited by low throughput, with only one sample examined at a time. This work reports an infrared-based optical calorimetry, with its theoretical foundation, which provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). The method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
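One ingredient of such time-domain analysis is fitting each sample's cooling transient to a lumped-capacitance model, whose time constant ties the observed decay to heat capacity. A simplified sketch on synthetic data (the paper's full model is more elaborate):

```python
import numpy as np
from scipy.optimize import curve_fit

def cooling(t, t_env, dT0, tau):
    """Lumped-capacitance (Newtonian) cooling: T(t) = T_env + dT0*exp(-t/tau).

    tau = m * c_p / (h * A), so with known mass m, surface area A and heat
    transfer coefficient h, the fitted tau constrains the heat capacity c_p.
    """
    return t_env + dT0 * np.exp(-t / tau)

# Synthetic temperature trace for one sample, read off an IR camera pixel.
t = np.linspace(0, 120, 60)  # seconds
temp = 25 + 40 * np.exp(-t / 35) + np.random.default_rng(1).normal(0, 0.2, t.size)

(t_env, dT0, tau), _ = curve_fit(cooling, t, temp, p0=[20, 30, 20])
print(f"tau ~ {tau:.1f} s -> c_p = h*A*tau/m for known h, A, m")
```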

  3. Ultra-low-cost 3D gaze estimation: an intuitive high information throughput compliment to direct brain-machine interfaces

    Science.gov (United States)

    Abbott, W. W.; Faisal, A. A.

    2012-08-01

Eye movements are highly correlated with motor intentions and are often retained by patients with serious motor deficiencies. Despite this, eye tracking is not widely used as a control interface for movement in impaired patients due to poor signal interpretation and lack of control flexibility. We propose that tracking the gaze position in 3D rather than 2D provides a considerably richer signal for human-machine interfaces by allowing direct interaction with the environment rather than via computer displays. We demonstrate here that by using mass-produced video-game hardware, it is possible to produce an ultra-low-cost binocular eye-tracker with performance comparable to commercial systems, yet 800 times cheaper. Our head-mounted system has US$30 material costs and operates at over 120 Hz sampling rate with a 0.5-1 degree of visual angle resolution. We perform 2D and 3D gaze estimation, controlling a real-time volumetric cursor essential for driving complex user interfaces. Our approach yields an information throughput of 43 bits/s, more than ten times that of invasive and semi-invasive brain-machine interfaces (BMIs) that are vastly more expensive. Unlike many BMIs, our system yields effective real-time closed-loop control of devices (10 ms latency) after just ten minutes of training, which we demonstrate through a novel BMI benchmark: the control of the video arcade game 'Pong'.
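Information throughput figures like the 43 bits/s quoted here are commonly computed in the Fitts'-law style, as an index of difficulty in bits divided by selection time; whether the authors used exactly this formulation is an assumption. A minimal sketch with hypothetical trial data:

```python
import numpy as np

def fitts_throughput(distance, width, movement_time):
    """Fitts'-law throughput: index of difficulty (bits) per selection time (s).

    Uses the Shannon formulation ID = log2(D / W + 1).
    """
    index_of_difficulty = np.log2(distance / width + 1)
    return index_of_difficulty / movement_time

# Hypothetical pointing trials: target distance and width in degrees of
# visual angle, selection time in seconds.
D = np.array([20.0, 15.0, 30.0])
W = np.array([1.0, 0.5, 2.0])
T = np.array([0.45, 0.60, 0.50])
print(f"mean throughput ~ {fitts_throughput(D, W, T).mean():.1f} bits/s")
```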

  4. A comparison of high-throughput techniques for assaying circadian rhythms in plants.

    Science.gov (United States)

    Tindall, Andrew J; Waller, Jade; Greenwood, Mark; Gould, Peter D; Hartwell, James; Hall, Anthony

    2015-01-01

    Over the last two decades, the development of high-throughput techniques has enabled us to probe the plant circadian clock, a key coordinator of vital biological processes, in ways previously impossible. With the circadian clock increasingly implicated in key fitness and signalling pathways, this has opened up new avenues for understanding plant development and signalling. Our tool-kit has been constantly improving through continual development and novel techniques that increase throughput, reduce costs and allow higher resolution on the cellular and subcellular levels. With circadian assays becoming more accessible and relevant than ever to researchers, in this paper we offer a review of the techniques currently available before considering the horizons in circadian investigation at ever higher throughputs and resolutions.

  5. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    International Nuclear Information System (INIS)

    Chen, Jinyang; Ji, Xinghu; He, Zhike

    2015-01-01

In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy was realized by connecting the positive-pressure input device, the sample container and the microfluidic chip through Tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so that the sample was delivered into the microchip from the sample container under positive pressure. This sample-introduction technique is so robust and compatible that it can be integrated with T-junction, flow-focusing or valve-assisted droplet microchips. By choosing a PDMS adaptor with the proper dimensions, the microchip can be flexibly equipped with various types of familiar sample containers, making the sampling more straightforward without trivial sample transfer or loading. Convenient sample changing was easily achieved by moving the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied to quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through investigation of the quenching efficiency of a ruthenium complex on the fluorescence of QDs. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, which shows its practicability and potential in high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy of concentration gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • A multiplex DNA assay was successfully carried out in the droplet platform

  7. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

The human genome is dynamic in structure, complicating researchers' attempts at fully understanding it. Time-series fluorescence in situ hybridization (FISH) imaging has increased our ability to observe genome structure, but due to cell-type and experimental variability these data are often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined, with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Accurate CpG and non-CpG cytosine methylation analysis by high-throughput locus-specific pyrosequencing in plants.

    Science.gov (United States)

    How-Kit, Alexandre; Daunay, Antoine; Mazaleyrat, Nicolas; Busato, Florence; Daviaud, Christian; Teyssier, Emeline; Deleuze, Jean-François; Gallusci, Philippe; Tost, Jörg

    2015-07-01

Pyrosequencing permits accurate quantification of DNA methylation of specific regions, where the proportion of the C/T polymorphism induced by sodium bisulfite treatment of DNA reflects the DNA methylation level. The commercially available high-throughput locus-specific pyrosequencing instruments allow for the simultaneous analysis of 96 samples, but restrict DNA methylation analysis to CpG dinucleotide sites, which can be limiting in many biological systems. In contrast to mammals, where DNA methylation occurs nearly exclusively on CpG dinucleotides, plant genomes harbor DNA methylation also in other sequence contexts, including CHG and CHH motifs, which cannot be evaluated by these pyrosequencing instruments due to software limitations. Here, we present a complete pipeline for accurate CpG and non-CpG cytosine methylation analysis at single-base resolution using high-throughput locus-specific pyrosequencing. The devised approach includes the design and validation of PCR amplification on bisulfite-treated DNA and of pyrosequencing assays, as well as the quantification of the methylation level at every cytosine from the raw peak intensities of the pyrograms by two newly developed Visual Basic Applications. Our method gives accurate and reproducible results, as exemplified by the cytosine methylation analysis of the promoter regions of two tomato genes (NOR and CNR) encoding transcription regulators of fruit ripening during different stages of fruit development. Our results confirmed a significant and temporally coordinated loss of DNA methylation on specific cytosines during the early stages of fruit development in both promoters, as previously shown by WGBS. The manuscript thus describes the first high-throughput locus-specific DNA methylation analysis in plants using pyrosequencing.
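The core quantification step is simple: after bisulfite conversion, unmethylated cytosines read as T, so the methylation level at a cytosine is the C fraction of the combined C+T peak signal at that position. A minimal sketch with hypothetical peak intensities:

```python
def methylation_percent(c_peak, t_peak):
    """Per-cytosine methylation from bisulfite pyrosequencing peak heights.

    Unmethylated C reads as T after bisulfite treatment and PCR, so the
    methylation level is the C fraction of the C+T signal at that position.
    """
    return 100.0 * c_peak / (c_peak + t_peak)

# Hypothetical raw peak intensities at three cytosines (CpG, CHG, CHH).
positions = {"CpG_+12": (820, 190), "CHG_+37": (305, 710), "CHH_+58": (55, 940)}
for name, (c, t) in positions.items():
    print(f"{name}: {methylation_percent(c, t):.1f}% methylated")
```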

  9. OptoDyCE: Automated system for high-throughput all-optical dynamic cardiac electrophysiology

    Science.gov (United States)

    Klimas, Aleksandra; Yu, Jinzhu; Ambrosi, Christina M.; Williams, John C.; Bien, Harold; Entcheva, Emilia

    2016-02-01

In the last two decades, numerous drug withdrawals from the market were due to cardiac toxicity, where unintended interactions with ion channels disrupt the heart's normal electrical function. Consequently, all new drugs must undergo preclinical testing for cardiac liability, adding to an already expensive and lengthy process. Recognition that proarrhythmic effects often result from drug action on multiple ion channels demonstrates a need for integrative and comprehensive measurements. Additionally, patient-specific therapies relying on emerging technologies employing stem-cell-derived cardiomyocytes (e.g. induced pluripotent stem-cell-derived cardiomyocytes, iPSC-CMs) require better screening methods to become practical. However, a high-throughput, cost-effective approach for cellular cardiac electrophysiology has not been feasible. Optical techniques for manipulation and recording provide a contactless means of dynamic, high-throughput testing of cells and tissues. Here, we consider the requirements for all-optical electrophysiology for drug testing, and we implement and validate OptoDyCE, a fully automated system for all-optical cardiac electrophysiology. We demonstrate the high-throughput capabilities using multicellular samples in 96-well format by combining optogenetic actuation with simultaneous fast, high-resolution optical sensing of voltage or intracellular calcium. The system can also be implemented using iPSC-CMs and other cell types by delivery of optogenetic drivers, or through the modular use of dedicated light-sensitive somatic cells in conjunction with non-modified cells. OptoDyCE provides a truly modular and dynamic screening system, capable of fully automated acquisition of high-content information integral for improved discovery and development of new drugs and biologics, as well as providing a means of better understanding electrical disturbances in the heart.

  10. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    Science.gov (United States)

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with the highest product purity requiring suboptimal values for other criteria, but also allowed AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.

  11. High-Throughput Fabrication of Nanocone Substrates through Polymer Injection Moulding For SERS Analysis in Microfluidic Systems

    DEFF Research Database (Denmark)

    Viehrig, Marlitt; Matteucci, Marco; Thilsted, Anil H.

analysis. Metal-capped silicon nanopillars, fabricated through a maskless ion etch, are state-of-the-art for on-chip SERS substrates. A dense cluster of high-aspect-ratio polymer nanocones was achieved by using high-throughput polymer injection moulding over a large area, replicating a silicon nanopillar structure. Gold-capped polymer nanocones display similar SERS sensitivity to silicon nanopillars, while being easily integrable into microfluidic chips.

  12. A high-throughput method for GMO multi-detection using a microfluidic dynamic array

    NARCIS (Netherlands)

    Brod, F.C.A.; Dijk, van J.P.; Voorhuijzen, M.M.; Dinon, A.Z.; Guimarães, L.H.S.; Scholtens, I.M.J.; Arisi, A.C.M.; Kok, E.J.

    2014-01-01

The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organism (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly with the growth of the

  13. The application of the high throughput sequencing technology in the transposable elements.

    Science.gov (United States)

    Liu, Zhen; Xu, Jian-hong

    2015-09-01

High-throughput sequencing technology has dramatically improved the efficiency of DNA sequencing and decreased its cost to a great extent. Meanwhile, this technology usually offers better specificity, higher sensitivity and greater accuracy. Therefore, it has been applied to research on genetic variation, transcriptomics and epigenomics. Recently, this technology has been widely employed in the study of transposable elements and has achieved fruitful results. In this review, we summarize the application of high-throughput sequencing technology to transposable elements, including the estimation of transposon content, preference of target sites and distribution, insertion polymorphism and population frequency, identification of rare copies, transposon horizontal transfers and transposon tagging. We also briefly introduce the major common sequencing strategies and algorithms, their advantages and disadvantages, and the corresponding solutions. Finally, we envision the developing trends of high-throughput sequencing technology, especially third-generation sequencing, and its application to transposon studies in the future, hopefully providing a comprehensive understanding and reference for related research.
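For one of the listed applications, estimating transposon content, a first-order approach is to compute the fraction of reads that align to a transposable-element library, which under uniform genome sampling approximates the TE fraction of the genome. A toy sketch of that calculation (the alignment itself is assumed done elsewhere):

```python
def transposon_content(read_hits, genome_size_bp):
    """Rough transposon content from reads aligned to a TE consensus library.

    `read_hits` maps read IDs to True/False for 'aligned to any TE'. If
    reads sample the genome uniformly, the TE read fraction approximates
    the TE fraction of the genome.
    """
    frac = sum(read_hits.values()) / len(read_hits)
    return frac, frac * genome_size_bp

# Toy stand-in for a parsed alignment of 100k reads (40% TE-derived).
hits = {f"read_{i}": (i % 10 < 4) for i in range(100_000)}
frac, te_bp = transposon_content(hits, genome_size_bp=2.3e9)
print(f"TE fraction ~ {frac:.1%} (~{te_bp / 1e6:.0f} Mb of a 2.3 Gb genome)")
```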

  14. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    Science.gov (United States)

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
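Robustness of the kind described here is often achieved by wrapping each workflow step in bounded retries, so transient failures on a shared system do not kill a long pipeline. A generic sketch (not the authors' implementation; the example command is illustrative only):

```python
import subprocess
import time

def run_with_retries(cmd, max_attempts=3, backoff_s=60):
    """Run one workflow step, retrying transient failures with backoff.

    On shared HPC systems, steps can fail sporadically (filesystem hiccups,
    preemption); bounded retries keep the pipeline running unattended while
    still surfacing persistent errors.
    """
    for attempt in range(1, max_attempts + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        print(f"attempt {attempt}/{max_attempts} failed: {result.stderr.strip()}")
        if attempt < max_attempts:
            time.sleep(backoff_s * attempt)  # linear backoff between retries
    raise RuntimeError(f"step failed after {max_attempts} attempts: {cmd}")

# Example: align one exome sample (command is illustrative only).
# run_with_retries(["bwa", "mem", "ref.fa", "sample_R1.fq.gz", "sample_R2.fq.gz"])
```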

  15. HTSstation: a web application and open-access libraries for high-throughput sequencing data analysis.

    Science.gov (United States)

    David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.

  16. Probing biolabels for high throughput biosensing via synchrotron radiation SEIRA technique

    Energy Technology Data Exchange (ETDEWEB)

Hornemann, Andrea, E-mail: andrea.hornemann@ptb.de; Hoehl, Arne, E-mail: arne.hoehl@ptb.de; Ulm, Gerhard, E-mail: gerhard.ulm@ptb.de; Beckhoff, Burkhard, E-mail: burkhard.beckhoff@ptb.de [Physikalisch-Technische Bundesanstalt, Abbestr. 2-12, 10587 Berlin (Germany)]; Eichert, Diane, E-mail: diane.eichert@elettra.eu [Elettra-Sincrotrone Trieste S.C.p.A., Strada Statale 14, Area Science Park, 34149 Trieste (Italy)]; Flemig, Sabine, E-mail: sabine.flemig@bam.de [BAM Bundesanstalt für Materialforschung und -prüfung, Richard-Willstätter-Str. 10, 12489 Berlin (Germany)]

    2016-07-27

Bio-diagnostic assays of high complexity rely on nanoscaled assay recognition elements that can provide unique selectivity and design-enhanced sensitivity features. High-throughput performance requires the simultaneous detection of various analytes combined with appropriate bioassay components. Nanoparticle-induced sensitivity enhancement and subsequent multiplex-capable Surface-Enhanced InfraRed Absorption (SEIRA) assay formats fit these purposes well. SEIRA constitutes an ideal platform to isolate the vibrational signatures of targeted bioassay and active molecules. The potential of several targeted biolabels, here fluorophore-labeled antibody conjugates, chemisorbed onto low-cost biocompatible gold nano-aggregate substrates, has been explored for use in assay platforms. Dried films were analyzed by synchrotron-radiation-based FTIR/SEIRA spectro-microscopy, and the resulting complex hyperspectral datasets were submitted to automated statistical analysis, namely Principal Components Analysis (PCA). The relationships between molecular fingerprints were brought out to highlight their spectral discrimination capabilities. We demonstrate that robust spectral encoding via SEIRA fingerprints opens up new opportunities for fast, reliable and multiplexed high-end screening, not only in biodiagnostics but also in in vitro biochemical imaging.

  18. Thin film solar modules: the low cost, high throughput and versatile alternative to Si wafers

    Energy Technology Data Exchange (ETDEWEB)

    Hegedus, S. [Delaware Univ., Inst. of Energy Conversion, Newark, DE (United States)

    2006-07-01

    Thin film solar cells (TFSC) have passed adolescence and are ready to make a substantial contribution to the world's electricity generation. They can have advantages over c-Si solar modules in ease of large-area, lower-cost manufacturing and in several types of applications. Factors which limit TFSC module performance relative to champion cell performance are discussed, along with the importance of increased throughput and yield. The consensus of several studies is that all TFSC can achieve costs below 1 $/W if manufactured at sufficiently large scale (>100 MW) using parallel lines of cloned equipment with high material utilization and spray-on encapsulants. There is significant new commercial interest in TFSC from small investors and large corporations, validating the thin film approach. Unique characteristics are discussed which give TFSC an advantage over c-Si in two specific markets: small rural solar home systems and building-integrated photovoltaic installations. TFSC have outperformed c-Si in annual energy production (kWh/kW), have demonstrated outdoor durability comparable to c-Si, and are being used in MW-scale installations worldwide. The merits of the thin film approach cannot be judged on the basis of efficiency alone but must also account for module performance and potential for low cost. TFSC advocates should promote their unique virtues compared to c-Si: lower cost, higher kWh/kW output, higher battery-charging current, attractive visual appearance, flexible substrates, long-term stability comparable to c-Si, and multiple pathways for deposition with room for innovation and evolutionary improvement. There is a huge market for TFSC even at today's efficiency if costs can be reduced. A brief window of opportunity exists for TFSC over the next few years due to the Si shortage. The demonstrated capabilities and advantages of TFSC must be proclaimed more persistently to funding decision-makers and customers without minimizing the remaining challenges.

  19. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or which fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  20. In-field High Throughput Phenotyping and Cotton Plant Growth Analysis Using LiDAR.

    Science.gov (United States)

    Sun, Shangpeng; Li, Changying; Paterson, Andrew H; Jiang, Yu; Xu, Rui; Robertson, Jon S; Snider, John L; Chee, Peng W

    2018-01-01

    Plant breeding programs and a wide range of plant science applications would greatly benefit from the development of in-field high throughput phenotyping technologies. In this study, a terrestrial LiDAR-based high throughput phenotyping system was developed. A 2D LiDAR was applied to scan plants from overhead in the field, and an RTK-GPS was used to provide spatial coordinates. Precise 3D models of scanned plants were reconstructed based on the LiDAR and RTK-GPS data. The ground plane of the 3D model was separated by the RANSAC algorithm, and a Euclidean clustering algorithm was applied to remove noise generated by weeds. After that, clean 3D surface models of cotton plants were obtained, from which three plot-level morphologic traits, including canopy height, projected canopy area, and plant volume, were derived. Canopy heights ranging from the 85th percentile to the maximum height were computed based on the histogram of the z coordinate for all measured points; projected canopy area was derived by projecting all points onto the ground plane; and a trapezoidal-rule-based algorithm was proposed to estimate plant volume. Results of validation experiments showed good agreement between LiDAR measurements and manual measurements for maximum canopy height, projected canopy area, and plant volume, with R²-values of 0.97, 0.97, and 0.98, respectively. The developed system was used to scan the whole field repeatedly over the period from 43 to 109 days after planting. Growth trends and growth rate curves for all three derived morphologic traits were established over the monitoring period for each cultivar. Overall, four different cultivars showed similar growth trends and growth rate patterns. Each cultivar continued to grow until ~88 days after planting, and from then on varied little. However, the actual values were cultivar specific. Correlation analysis between morphologic traits and final yield was conducted over the monitoring period. When considering each cultivar individually
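
    The three plot-level traits above admit a compact sketch. The following illustrative Python (not the authors' implementation) assumes a cleaned point cloud points of shape (N, 3) with the ground plane at z = 0:

```python
import numpy as np
from scipy.spatial import ConvexHull

def plot_level_traits(points, percentile=85, n_slices=20):
    """Canopy height, projected canopy area, and trapezoidal-rule volume."""
    z = points[:, 2]
    canopy_height = np.percentile(z, percentile)       # e.g. 85th-percentile height
    projected_area = ConvexHull(points[:, :2]).volume  # a 2D hull's "volume" is its area

    # Trapezoidal-rule volume: slice the canopy into horizontal layers,
    # take the hull area of each layer, and integrate area over height.
    edges = np.linspace(0.0, z.max(), n_slices + 1)
    areas = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        layer = points[(z >= lo) & (z < hi), :2]
        areas.append(ConvexHull(layer).volume if len(layer) >= 3 else 0.0)
    volume = np.trapz(areas, dx=edges[1] - edges[0])
    return canopy_height, projected_area, volume
```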

  1. Improvement of IBAD-MgO texturing for high throughput of buffered substrate

    Energy Technology Data Exchange (ETDEWEB)

    Ito, T., E-mail: t-ito@istec.or.jp [Superconductivity Research Laboratory, ISTEC, 1-10-13, Shinonome, Koto-ku, Tokyo 135-0062 (Japan); Takahashi, Y.; Matsuse, K.; Kuriki, R.; Tokumaru, M.; Yoshizumi, M.; Izumi, T. [Superconductivity Research Laboratory, ISTEC, 1-10-13, Shinonome, Koto-ku, Tokyo 135-0062 (Japan)

    2011-11-15

    The requirements from the market on two important factors, performance and cost, need to be satisfied for commercialization of coated conductors. Highly biaxial grain texturing at a high production rate must be realized from the perspective of buffer layer processing. The IBAD-MgO process is one of the major techniques capable of satisfying those requirements. The structure of our buffered substrate is IBS-GZO/IBAD-MgO/RF-sputter-LaMnO₃/PLD-CeO₂. The PLD-CeO₂ process is the rate-limiting and cost-dominant one in this architecture. It is proposed that the self-texturing CeO₂ layer thickness could be reduced by optimization of the MgO processing, due to higher MgO texturing and/or effective growth of self-texturing CeO₂. The influence of the IBAD beam conditions and deposition time has been studied to optimize the IBAD conditions. Optimized IBAD conditions were decided from the viewpoints of in-plane grain texturing and the stability needed to obtain high texturing in fabrication. The Δφ value of the CeO₂ layer was improved from 4-5° to 3-3.5° by the optimization. This buffered substrate gave high and uniform I_c values of 524-565 A/cm-width for a 50 m long GdBCO (1.5 μm) tape, indicating a uniform distribution of Δφ(CeO₂). This improvement of Δφ(CeO₂) makes it possible to reduce the CeO₂ thickness down to 300 nm without making Δφ(CeO₂) > 5°, which improves CeO₂ throughput from 10 m/h to 30 m/h. A 50 m long patch sample showed a more uniform Δφ distribution, around 4°, even at the high CeO₂ throughput of 30 m/h. A highly and uniformly textured CeO₂-buffered substrate 100 m in length was obtained cost-effectively by optimization of the IBAD-MgO processing.

  2. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Directory of Open Access Journals (Sweden)

    Craig A Gedye

    Full Text Available Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are, however, currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage-specific or stem cell markers.

  3. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Science.gov (United States)

    Gedye, Craig A; Hussain, Ali; Paterson, Joshua; Smrke, Alannah; Saini, Harleen; Sirskyj, Danylo; Pereira, Keira; Lobo, Nazleen; Stewart, Jocelyn; Go, Christopher; Ho, Jenny; Medrano, Mauricio; Hyatt, Elzbieta; Yuan, Julie; Lauriault, Stevan; Meyer, Mona; Kondratyev, Maria; van den Beucken, Twan; Jewett, Michael; Dirks, Peter; Guidos, Cynthia J; Danska, Jayne; Wang, Jean; Wouters, Bradly; Neel, Benjamin; Rottapel, Robert; Ailles, Laurie E

    2014-01-01

    Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are however currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell markers.
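
    A minimal sketch of the unsupervised hierarchical clustering step, assuming a samples-by-antigens matrix of normalized staining intensities (the toy data and parameter choices are illustrative, not the published pipeline):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
profiles = rng.random((24, 363))  # toy stand-in: 24 samples x 363 antigens

# Average-linkage clustering on correlation distance between surface profiles.
Z = linkage(profiles, method="average", metric="correlation")
clusters = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into 3 groups
print(clusters)  # cluster label assigned to each sample
```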

  4. A high volume cost efficient production macrostructuring process. [for silicon solar cell surface treatment

    Science.gov (United States)

    Chitre, S. R.

    1978-01-01

    The paper presents an experimentally developed surface macro-structuring process suitable for high-volume production of silicon solar cells. The process lends itself easily to automation for high throughput to meet low-cost solar array goals. The tetrahedron structure observed is 0.5-12 μm high. The surface has minimal pitting, with virtually no or very few undeveloped areas across the surface. This process has been developed for (100)-oriented, as-cut silicon. Chemi-etched, hydrophobic and lapped surfaces were successfully texturized. A cost analysis per SAMICS is presented.

  5. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies: small pore microchanne...

  6. A high throughput data acquisition and processing model for applications based on GPUs

    International Nuclear Information System (INIS)

    Nieto, J.; Arcas, G. de; Ruiz, M.; Castro, R.; Vega, J.; Guillen, P.

    2015-01-01

    Highlights: • Implementation of a direct communication path between a data acquisition NI FlexRIO device and an NVIDIA GPU device. • Customization of a Linux kernel open driver (NI FlexRIO) and a C API interface to work with NVIDIA RDMA GPUDirect. • Performance evaluation against the traditional model that uses CPU memory for buffer allocation. - Abstract: There is an increasing interest in the use of GPU technologies for real-time analysis in fusion devices. The availability of high-bandwidth interfaces has made them a very cost-effective alternative for high-volume data analysis and simulation, and commercial products are available for some areas of interest. However, from the point of view of their application in real-time scenarios, there are still some issues under analysis, such as the possibility of improving the data throughput inside a discrete system consisting of data acquisition (DAQ) devices and GPUs. This paper addresses the possibility of using peer-to-peer data communication between DAQ devices and GPUs sharing the same PCI Express bus to implement continuous real-time acquisition and processing systems in which data transfers require minimum CPU intervention. This technology eliminates unnecessary system memory copies and lowers CPU overhead, avoiding the bottleneck that arises when the system stages data through main system memory.

  7. A high throughput data acquisition and processing model for applications based on GPUs

    Energy Technology Data Exchange (ETDEWEB)

    Nieto, J., E-mail: jnieto@sec.upm.es [Instrumentation and Applied Acoustic Research Group, Technical University of Madrid (UPM), Madrid (Spain); Arcas, G. de; Ruiz, M. [Instrumentation and Applied Acoustic Research Group, Technical University of Madrid (UPM), Madrid (Spain); Castro, R.; Vega, J. [Data acquisition Group EURATOM/CIEMAT Association for Fusion, Madrid (Spain); Guillen, P. [Instrumentation and Applied Acoustic Research Group, Technical University of Madrid (UPM), Madrid (Spain)

    2015-10-15

    Highlights: • Implementation of a direct communication path between a data acquisition NI FlexRIO device and an NVIDIA GPU device. • Customization of a Linux kernel open driver (NI FlexRIO) and a C API interface to work with NVIDIA RDMA GPUDirect. • Performance evaluation against the traditional model that uses CPU memory for buffer allocation. - Abstract: There is an increasing interest in the use of GPU technologies for real-time analysis in fusion devices. The availability of high-bandwidth interfaces has made them a very cost-effective alternative for high-volume data analysis and simulation, and commercial products are available for some areas of interest. However, from the point of view of their application in real-time scenarios, there are still some issues under analysis, such as the possibility of improving the data throughput inside a discrete system consisting of data acquisition (DAQ) devices and GPUs. This paper addresses the possibility of using peer-to-peer data communication between DAQ devices and GPUs sharing the same PCI Express bus to implement continuous real-time acquisition and processing systems in which data transfers require minimum CPU intervention. This technology eliminates unnecessary system memory copies and lowers CPU overhead, avoiding the bottleneck that arises when the system stages data through main system memory.

  8. Performance evaluation of hybrid VLC using device cost and power over data throughput criteria

    Science.gov (United States)

    Lee, C. C.; Tan, C. S.; Wong, H. Y.; Yahya, M. B.

    2013-09-01

    Visible light communication (VLC) technology has attracted attention in both academia and industry lately, driven by the development of light-emitting diode (LED) technology for solid-state lighting (SSL). It has great potential to gradually replace radio frequency (RF) wireless technology because it offers unregulated and unlicensed bandwidth to meet the future demand for indoor wireless access by real-time, bandwidth-demanding applications. However, VLC was found to provide an intrusive uplink channel that gives rise to unpleasant irradiance from the user device, which can interfere with the VLC downlink and limits user mobility as a result of the small coverage (field of view) of VLC. To address this potential problem, a hybrid VLC system which integrates VLC (for downlink) and RF (for uplink) technology is proposed. It offers a non-intrusive RF back channel that provides high-throughput VLC and maintains compatibility with conventional RF devices. To deploy the hybrid VLC system in the market, it must save energy and cost in order to attain an economic advantage over existing architectures that employ fluorescent or LED lights with RF technology. In this paper, a performance evaluation of the proposed hybrid system was carried out in terms of device cost and power consumption against data throughput. Based on our simulation, the hybrid VLC system was found to reduce device cost by 3% and power consumption by 68% compared to fluorescent lights with RF technology. When compared to LED lights with RF technology, our proposed hybrid system achieves device cost savings as high as 47% and reduces power consumption by 49%. Such promising results demonstrate that the hybrid VLC system is a feasible solution and paves the way for greater cost savings and energy efficiency compared with the current RF architecture, even with the increasing requirement for indoor area coverage.

  9. A high-throughput method for GMO multi-detection using a microfluidic dynamic array.

    Science.gov (United States)

    Brod, Fábio Cristiano Angonesi; van Dijk, Jeroen P; Voorhuijzen, Marleen M; Dinon, Andréia Zilio; Guimarães, Luis Henrique S; Scholtens, Ingrid M J; Arisi, Ana Carolina Maisonnave; Kok, Esther J

    2014-02-01

    The ever-increasing production of genetically modified crops generates a demand for high-throughput DNA-based methods for the enforcement of genetically modified organism (GMO) labelling requirements. The application of standard real-time PCR will become increasingly costly with the growth of the number of GMOs that are potentially present in an individual sample. The present work presents the results of an innovative approach to genetically modified crop analysis by DNA-based methods, namely the use of a microfluidic dynamic array as a high-throughput multi-detection system. In order to evaluate the system, six test samples with an increasing degree of complexity were prepared, preamplified and subsequently analysed in the Fluidigm system. Twenty-eight assays targeting different DNA elements, GM events and species-specific reference genes were used in the experiment. The large majority of the assays tested presented expected results. The power of low-level detection was assessed, and elements present at concentrations as low as 0.06% were successfully detected. The approach proposed in this work presents the Fluidigm system as a suitable and promising platform for GMO multi-detection.

  10. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    Science.gov (United States)

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  11. High-throughput screening of effective siRNAs using luciferase-linked chimeric mRNA.

    Directory of Open Access Journals (Sweden)

    Shen Pang

    Full Text Available The use of siRNAs to knock down gene expression can potentially be an approach to treat various diseases. To avoid siRNA toxicity, the less transcriptionally active H1 pol III promoter, rather than the U6 promoter, was proposed for siRNA expression. To identify highly efficacious siRNA sequences, extensive screening is required, since current computer programs may not render ideal results. Here, we used CCR5 gene silencing as a model to investigate a rapid and efficient screening approach. We constructed a chimeric luciferase-CCR5 gene for high-throughput screening of siRNA libraries. After screening approximately 900 shRNA clones, 12 siRNA sequences were identified. Sequence analysis demonstrated that most (11 of the 12) of these siRNA sequences did not match those identified by available siRNA prediction algorithms. Significant inhibition of CCR5 in a T-lymphocyte cell line and primary T cells by these identified siRNAs was confirmed using the siRNA lentiviral vectors to infect these cells. The inhibition of CCR5 expression significantly protected cells from R5 HIV-1 JRCSF infection. These results indicated that the high-throughput screening method allows efficient identification of siRNA sequences to inhibit the target genes at low levels of expression.

  12. Evaluation of Simple and Inexpensive High-Throughput Methods for Phytic Acid Determination

    DEFF Research Database (Denmark)

    Raboy, Victor; Johnson, Amy; Bilyeu, Kristin

    2017-01-01

    High-throughput/low-cost/low-tech methods for phytic acid determination that are sufficiently accurate and reproducible would be of value in plant genetics, crop breeding and in the food and feed industries. Variants of two candidate methods, those described by Vaintraub and Lapteva (Anal Biochem...... and legume flours regardless of endogenous phytic acid levels or matrix constituents....

  13. Application of ToxCast High-Throughput Screening and ...

    Science.gov (United States)

    Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.

  14. Multilayer Porous Crucibles for the High Throughput Salt Separation from Uranium Deposits

    International Nuclear Information System (INIS)

    Kwon, S. W.; Park, K. M.; Kim, J. G.; Kim, I. T.; Seo, B. K.; Moon, J. G.

    2013-01-01

    Solid cathode processing is necessary to separate the salt from the cathode, since the uranium deposit in a solid cathode contains electrolyte salt. A physical separation process, such as distillation, is more attractive than a chemical or dissolution process because physical processes generate much less secondary waste. A distillation process was employed for the cathode processing owing to the advantages of minimal generation of secondary waste, a compact unit process, and simple, low-cost equipment. The basis for vacuum distillation separation is the difference in vapor pressures between salt and uranium. A solid cathode deposit is heated in a heating region and salt vaporizes, while nonvolatile uranium remains behind. It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites. The evaporation rate of the LiCl-KCl eutectic salt in a vacuum distiller is not high enough to keep up with the generation capacity of uranium dendrites in an electrorefiner. Therefore, a wide evaporation area or a high distillation temperature is necessary for successful salt separation. In this study, it was attempted to enlarge the throughput of the salt distiller with multilayer porous crucibles for the separation of adhered salt in the uranium deposits generated from the electrorefiner. The feasibility of the porous crucibles was tested by salt distillation experiments. A salt distiller with multilayer porous crucibles was proposed, and the feasibility of liquid salt separation was examined to increase throughput. It was found that effective separation of salt from uranium deposits was possible with the multilayer porous crucibles.

  15. Multilayer Porous Crucibles for the High Throughput Salt Separation from Uranium Deposits

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, S. W.; Park, K. M.; Kim, J. G.; Kim, I. T.; Seo, B. K.; Moon, J. G. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-05-15

    Solid cathode processing is necessary to separate the salt from the cathode, since the uranium deposit in a solid cathode contains electrolyte salt. A physical separation process, such as distillation, is more attractive than a chemical or dissolution process because physical processes generate much less secondary waste. A distillation process was employed for the cathode processing owing to the advantages of minimal generation of secondary waste, a compact unit process, and simple, low-cost equipment. The basis for vacuum distillation separation is the difference in vapor pressures between salt and uranium. A solid cathode deposit is heated in a heating region and salt vaporizes, while nonvolatile uranium remains behind. It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites. The evaporation rate of the LiCl-KCl eutectic salt in a vacuum distiller is not high enough to keep up with the generation capacity of uranium dendrites in an electrorefiner. Therefore, a wide evaporation area or a high distillation temperature is necessary for successful salt separation. In this study, it was attempted to enlarge the throughput of the salt distiller with multilayer porous crucibles for the separation of adhered salt in the uranium deposits generated from the electrorefiner. The feasibility of the porous crucibles was tested by salt distillation experiments. A salt distiller with multilayer porous crucibles was proposed, and the feasibility of liquid salt separation was examined to increase throughput. It was found that effective separation of salt from uranium deposits was possible with the multilayer porous crucibles.

  16. Drosophila melanogaster as a High-Throughput Model for Host–Microbiota Interactions

    Directory of Open Access Journals (Sweden)

    Gregor Reid

    2017-04-01

    Full Text Available Microbiota research often assumes that differences in abundance and identity of microorganisms have unique influences on host physiology. To test this concept mechanistically, germ-free mice are colonized with microbial communities to assess causation. Due to the cost, infrastructure challenges, and time-consuming nature of germ-free mouse models, an alternative approach is needed to investigate host–microbial interactions. Drosophila melanogaster (fruit flies) can be used as a high throughput in vivo screening model of host–microbiome interactions as they are affordable, convenient, and replicable. D. melanogaster were essential in discovering components of the innate immune response to pathogens. However, axenic D. melanogaster can easily be generated for microbiome studies without the need for ethical considerations. The simplified microbiota structure enables researchers to evaluate permutations of how each microbial species within the microbiota contribute to host phenotypes of interest. This enables the possibility of thorough strain-level analysis of host and microbial properties relevant to physiological outcomes. Moreover, a wide range of mutant D. melanogaster strains can be affordably obtained from public stock centers. Given this, D. melanogaster can be used to identify candidate mechanisms of host–microbe symbioses relevant to pathogen exclusion, innate immunity modulation, diet, xenobiotics, and probiotic/prebiotic properties in a high throughput manner. This perspective comments on the most promising areas of microbiota research that could immediately benefit from using the D. melanogaster model.

  17. Drosophila melanogaster as a High-Throughput Model for Host-Microbiota Interactions.

    Science.gov (United States)

    Trinder, Mark; Daisley, Brendan A; Dube, Josh S; Reid, Gregor

    2017-01-01

    Microbiota research often assumes that differences in abundance and identity of microorganisms have unique influences on host physiology. To test this concept mechanistically, germ-free mice are colonized with microbial communities to assess causation. Due to the cost, infrastructure challenges, and time-consuming nature of germ-free mouse models, an alternative approach is needed to investigate host-microbial interactions. Drosophila melanogaster (fruit flies) can be used as a high throughput in vivo screening model of host-microbiome interactions as they are affordable, convenient, and replicable. D. melanogaster were essential in discovering components of the innate immune response to pathogens. However, axenic D. melanogaster can easily be generated for microbiome studies without the need for ethical considerations. The simplified microbiota structure enables researchers to evaluate permutations of how each microbial species within the microbiota contribute to host phenotypes of interest. This enables the possibility of thorough strain-level analysis of host and microbial properties relevant to physiological outcomes. Moreover, a wide range of mutant D. melanogaster strains can be affordably obtained from public stock centers. Given this, D. melanogaster can be used to identify candidate mechanisms of host-microbe symbioses relevant to pathogen exclusion, innate immunity modulation, diet, xenobiotics, and probiotic/prebiotic properties in a high throughput manner. This perspective comments on the most promising areas of microbiota research that could immediately benefit from using the D. melanogaster model.

  18. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    Technological advances in mass spectrometry and meticulous method development have produced several shotgun lipidomic approaches capable of characterizing lipid species by direct analysis of total lipid extracts. Shotgun lipidomics by hybrid quadrupole time-of-flight mass spectrometry allows... the absolute quantification of hundreds of molecular glycerophospholipid species, glycerolipid species, sphingolipid species and sterol lipids. Future applications in clinical cohort studies demand detailed lipid molecule information and the application of high-throughput lipidomics platforms. In this review... we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by microfluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer...

  19. DAVID Knowledgebase: a gene-centered database integrating heterogeneous gene annotation resources to facilitate high-throughput gene functional analysis

    Directory of Open Access Journals (Sweden)

    Baseler Michael W

    2007-11-01

    Full Text Available Abstract Background Due to the complex and distributed nature of biological research, our current biological knowledge is spread over many redundant annotation databases maintained by many independent groups. Analysts usually need to visit many of these bioinformatics databases in order to integrate comprehensive annotation information for their genes, which becomes one of the bottlenecks, particularly for the analytic task associated with a large gene list. Thus, a highly centralized and ready-to-use gene-annotation knowledgebase is in demand for high throughput gene functional analysis. Description The DAVID Knowledgebase is built around the DAVID Gene Concept, a single-linkage method to agglomerate tens of millions of gene/protein identifiers from a variety of public genomic resources into DAVID gene clusters. The grouping of such identifiers improves the cross-reference capability, particularly across NCBI and UniProt systems, enabling more than 40 publicly available functional annotation sources to be comprehensively integrated and centralized by the DAVID gene clusters. The simple, pair-wise, text format files which make up the DAVID Knowledgebase are freely downloadable for various data analysis uses. In addition, a well organized web interface allows users to query different types of heterogeneous annotations in a high-throughput manner. Conclusion The DAVID Knowledgebase is designed to facilitate high throughput gene functional analysis. For a given gene list, it not only provides the quick accessibility to a wide range of heterogeneous annotation data in a centralized location, but also enriches the level of biological information for an individual gene. Moreover, the entire DAVID Knowledgebase is freely downloadable or searchable at http://david.abcc.ncifcrf.gov/knowledgebase/.
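
    The single-linkage agglomeration behind the DAVID Gene Concept can be sketched as a union-find pass over identifier cross-reference pairs; the pairs below are invented for illustration:

```python
from collections import defaultdict

parent = {}

def find(x):
    """Return the cluster representative for identifier x (path halving)."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

# Invented cross-reference pairs; real input would be tens of millions of them.
pairs = [("ENSG001", "P12345"), ("P12345", "NM_0001"), ("ENSG002", "Q99999")]
for a, b in pairs:
    union(a, b)

clusters = defaultdict(set)
for ident in list(parent):
    clusters[find(ident)].add(ident)
print(list(clusters.values()))
# e.g. [{'ENSG001', 'P12345', 'NM_0001'}, {'ENSG002', 'Q99999'}]
```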

  20. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    Science.gov (United States)

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application adapted for large numbers of images. It provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and the noisy background that often plagues automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse-quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model.
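
    A rough Python sketch of the Snakes-based measurement described above (illustrative only: SpheroidSizer itself is a separate tool, and the smoothing and snake parameters here are assumptions):

```python
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

def spheroid_measures(image, center, radius):
    """Fit a snake around one spheroid; return major/minor axes (px) and volume."""
    s = np.linspace(0, 2 * np.pi, 200)
    init = np.column_stack([center[0] + radius * np.sin(s),
                            center[1] + radius * np.cos(s)])  # circular initial contour
    snake = active_contour(gaussian(image, 3), init,
                           alpha=0.015, beta=10, gamma=0.001)
    d = snake - snake.mean(axis=0)
    # For an ellipse sampled in its parameter, covariance eigenvalues are a^2/2
    # and b^2/2, so full axis lengths are approximately 2*sqrt(2*eigenvalue).
    minor, major = 2 * np.sqrt(2 * np.linalg.eigvalsh(np.cov(d.T)))
    volume = (np.pi / 6) * major * minor ** 2  # common ellipsoid approximation
    return major, minor, volume
```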

  1. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or tens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little use on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which enables much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and the exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants, as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  2. Modeling Steroidogenesis Disruption Using High-Throughput ...

    Science.gov (United States)

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD-validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  3. Low-cost activation analysis at small research reactors

    CERN Document Server

    Westphal, G P; Lemmel, H; Niedermaier, M R; Joestl, K; Schröder, P; Böck, H H; Schachner, H; Klapfer, E

    2003-01-01

    A software implementation of a loss-free counting multichannel analyzer, storing immediately into the multimegabyte memory of a low-cost 486- or Pentium-type PC, enables the real-time control of a rabbit system as well as the collection of up to 1000 pairs of simultaneously recorded loss-corrected and non-corrected spectra of 16 k channels each, in a true sequence without time gaps in between, at throughput rates of up to 200 kc/s. Intended for activation analysis of short-lived isomeric transitions, the system makes possible peak-to-background optimizations and separations of lines with different half-lives without a priori knowledge of sample composition, by summing up appropriate numbers of spectra over appropriate intervals of time. By automatically adapting the noise filtering time to individual pulse intervals, the Preloaded Digital Filter (PLDF) combines low- to medium-rate resolutions comparable to those of high-quality Gaussian amplifiers with throughput rates of up to 100 kc/s, and high-rate reso...

  4. High-throughput technology for novel SO2 oxidation catalysts

    International Nuclear Information System (INIS)

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO₂ oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO₂ to SO₃. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO₂ oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method of monitoring the SO₂ conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO₃ from the product gas before the UV-Vis analysis. We also overview some elements used for prescreening and those remaining after the screening of the first catalyst generations. (topical review)
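
    Assuming Beer-Lambert proportionality between SO₂ concentration and its UV absorbance, the monitored conversion reduces to a ratio of inlet and outlet absorbances; a hypothetical sketch (names and numbers are ours, not from the review):

```python
def so2_conversion(a_in, a_out):
    """Fractional SO2 conversion from inlet/outlet UV-Vis absorbances."""
    return 1.0 - a_out / a_in

print(f"{so2_conversion(0.80, 0.12):.1%}")  # -> 85.0%
```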

  5. Development of a method for efficient cost-effective screening of Aspergillus niger mutants having increased production of glucoamylase.

    Science.gov (United States)

    Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping

    2017-05-01

    To develop an efficient cost-effective screening process to improve production of glucoamylase in Aspergillus niger. The cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased the throughput of the samples compared to traditional flask cultivation. There was a close negative correlation between glucoamylase activity and the pH of the fermentation broth. A novel high-throughput analysis method using Methyl Orange was developed. When compared to the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 by statistical analysis was obtained. Using this novel screening method, we acquired a strain with an activity of 2.2 × 10³ U ml⁻¹, a 70% higher yield of glucoamylase than its parent strain.
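
    The validation step, correlating the Methyl Orange readout with the reference 4-nitrophenyl α-D-glucopyranoside assay, amounts to a simple correlation computation; the activity values below are invented for illustration:

```python
import numpy as np

methyl_orange = np.array([0.21, 0.35, 0.48, 0.60, 0.74, 0.91])      # assay readout
reference = np.array([310.0, 520.0, 700.0, 905.0, 1080.0, 1350.0])  # U ml^-1

r = np.corrcoef(methyl_orange, reference)[0, 1]
print(f"correlation coefficient r = {r:.2f}")  # a high r supports the new assay
```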

  6. Quartz-Seq2: a high-throughput single-cell RNA-sequencing method that effectively uses limited sequence reads.

    Science.gov (United States)

    Sasagawa, Yohei; Danno, Hiroki; Takada, Hitomi; Ebisawa, Masashi; Tanaka, Kaori; Hayashi, Tetsutaro; Kurisaki, Akira; Nikaido, Itoshi

    2018-03-09

    High-throughput single-cell RNA-seq methods assign limited unique molecular identifier (UMI) counts as gene expression values to single cells from shallow sequence reads and detect limited gene counts. We thus developed a high-throughput single-cell RNA-seq method, Quartz-Seq2, to overcome these issues. Our improvements in the reaction steps make it possible to effectively convert initial reads to UMI counts, at a rate of 30-50%, and detect more genes. To demonstrate the power of Quartz-Seq2, we analyzed approximately 10,000 transcriptomes from in vitro embryonic stem cells and an in vivo stromal vascular fraction with a limited number of reads.
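
    Conceptually, converting reads to UMI counts means counting the distinct molecular identifiers observed per cell and gene, so PCR duplicates collapse to one molecule; a toy sketch (the record layout is our assumption, not Quartz-Seq2's actual data format):

```python
from collections import defaultdict

reads = [
    ("cell01", "Pou5f1", "AACGT"),
    ("cell01", "Pou5f1", "AACGT"),   # PCR duplicate: same UMI, counted once
    ("cell01", "Pou5f1", "GGTCA"),
    ("cell02", "Nanog",  "TTACG"),
]

umis = defaultdict(set)
for cell, gene, umi in reads:
    umis[(cell, gene)].add(umi)

counts = {key: len(s) for key, s in umis.items()}
print(counts)  # {('cell01', 'Pou5f1'): 2, ('cell02', 'Nanog'): 1}
```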

  7. Annual Report, Fall 2016: Identifying Cost Effective Tank Waste Characterization Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Reboul, S. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); DiPrete, D. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-12-12

    This report documents the activities that were performed during the second year of a project undertaken to improve the cost effectiveness and timeliness of SRNL’s tank closure characterization practices. The activities performed during the first year of the project were previously reported in SRNL-STI-2015-00144. The scope of the second year activities was divided into the following three primary tasks: 1) develop a technical basis and strategy for improving the cost effectiveness and schedule of SRNL’s tank closure characterization program; 2) initiate the design and assembly of a new waste removal system for improving the throughput and reducing the personnel dose associated with extraction chromatography radiochemical separations; and 3) develop and perform feasibility testing of three alternative radiochemical separation protocols holding promise for improving high resource demand/time consuming tank closure sample analysis methods.

  8. Solion ion source for high-efficiency, high-throughput solar cell manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Koo, John, E-mail: john-koo@amat.com; Binns, Brant; Miller, Timothy; Krause, Stephen; Skinner, Wesley; Mullin, James [Applied Materials, Inc., Varian Semiconductor Equipment Business Unit, 35 Dory Road, Gloucester, Massachusetts 01930 (United States)

    2014-02-15

    In this paper, we introduce the Solion ion source for high-throughput solar cell doping. As the source power is increased to enable higher throughput, negative effects degrade the lifetime of the plasma chamber and the extraction electrodes. In order to improve efficiency, we have explored a wide range of electron energies and determined the conditions which best suit production. To extend the lifetime of the source we have developed an in situ cleaning method using only existing hardware. With these combinations, source lifetimes of >200 h for phosphorus and >100 h for boron ion beams have been achieved while maintaining 1100 cell-per-hour production.

  9. Screening for methicillin-resistant Staphylococcus aureus in clinical swabs using a high-throughput real-time PCR-based method

    DEFF Research Database (Denmark)

    Ornskov, D; Kolmos, B; Bendix Horn, P

    2008-01-01

    Since 2005, all patients and healthcare personnel have been screened for MRSA colonisation, involving analysis of 300-400 samples daily. To deal with this number of samples, a PCR-based method customised for high-throughput analysis and a system for fast reporting of MRSA carrier status were developed. Swab... samples were incubated overnight in a selective tryptone soya broth and were analysed by PCR the following day. Using this strategy, non-colonised individuals were identified within 24 h, while MRSA-positive samples were analysed further by traditional microbiological methods to determine the resistance... pattern. This is a cost-effective approach, as the greatest expense in hospitals involves the isolation of patients of unknown MRSA status. The method was evaluated by testing 2194 clinical samples, with a sensitivity and specificity of 100% and 94%, respectively. The analytical sensitivity was 97...

  10. Cytotoxicity Test Based on Human Cells Labeled with Fluorescent Proteins: Fluorimetry, Photography, and Scanning for High-Throughput Assay.

    Science.gov (United States)

    Kalinina, Marina A; Skvortsov, Dmitry A; Rubtsova, Maria P; Komarova, Ekaterina S; Dontsova, Olga A

    2018-06-01

    High- and medium-throughput assays are now routine methods for drug screening and toxicology investigations on mammalian cells. However, a simple and cost-effective analysis of cytotoxicity that can be carried out with commonly used laboratory equipment is still required. The developed cytotoxicity assays are based on human cell lines stably expressing eGFP, tdTomato, mCherry, or Katushka2S fluorescent proteins. Red fluorescent proteins exhibit a higher signal-to-noise ratio, due to less interference by medium autofluorescence, in comparison to green fluorescent protein. Measurements have been performed on a fluorescence scanner, a plate fluorimeter, and a camera photodocumentation system. For a 96-well plate assay, the sensitivity per well and the measurement duration were 250 cells and 15 min for the scanner, 500 cells and 2 min for the plate fluorimeter, and 1000 cells and less than 1 min for the camera detection. These sensitivities are similar to those of commonly used MTT (tetrazolium dye) assays. The scanner and camera used here had not previously been applied to cytotoxicity evaluation. An image processing scheme for the high-resolution scanner is proposed that significantly diminishes the number of control wells, even for a library containing fluorescent substances. The suggested cytotoxicity assay has been verified by measurements of the cytotoxicity of several well-known cytotoxic drugs and further applied to test a set of novel bacteriotoxic compounds in a medium-throughput format. The fluorescent signal of living cells is detected without disturbing them or adding any reagents, thus allowing time-dependent cytotoxicity effects to be investigated on the same sample of cells. A fast, simple and cost-effective assay is suggested for cytotoxicity evaluation based on mammalian cells expressing fluorescent proteins and commonly used laboratory equipment.

  11. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    Science.gov (United States)

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
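
    A hedged numpy sketch of the guided-PCA idea as described above (this is not the gPCA R package): the statistic compares variance along the batch-guided first principal direction with variance along the ordinary first principal direction, and a permutation test assesses whether the observed value is extreme:

```python
import numpy as np

def gpca_delta(X, batch):
    """delta close to 1 suggests batch drives the dominant variance in X."""
    Xc = X - X.mean(axis=0)
    Y = np.eye(batch.max() + 1)[batch]  # one-hot batch indicator (n x b)
    v_g = np.linalg.svd(Y.T @ Xc, full_matrices=False)[2][0]  # guided direction
    v_u = np.linalg.svd(Xc, full_matrices=False)[2][0]        # unguided direction
    return np.var(Xc @ v_g) / np.var(Xc @ v_u)

def gpca_pvalue(X, batch, n_perm=500, seed=0):
    """Permutation p-value: how often shuffled batch labels beat the observed delta."""
    rng = np.random.default_rng(seed)
    d0 = gpca_delta(X, batch)
    null = [gpca_delta(X, rng.permutation(batch)) for _ in range(n_perm)]
    return d0, float(np.mean([d >= d0 for d in null]))

# Usage: delta, p = gpca_pvalue(data_matrix, np.array([0, 0, 1, 1, ...]))
```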

  12. Management of High-Throughput DNA Sequencing Projects: Alpheus.

    Science.gov (United States)

    Miller, Neil A; Kingsmore, Stephen F; Farmer, Andrew; Langley, Raymond J; Mudge, Joann; Crow, John A; Gonzalez, Alvaro J; Schilkey, Faye D; Kim, Ryan J; van Velkinburgh, Jennifer; May, Gregory D; Black, C Forrest; Myers, M Kathy; Utsey, John P; Frost, Nicholas S; Sugarbaker, David J; Bueno, Raphael; Gullans, Stephen R; Baxter, Susan M; Day, Steve W; Retzel, Ernest F

    2008-12-26

    High-throughput DNA sequencing has enabled systems biology to begin to address areas in health, agricultural and basic biological research. Concomitant with the opportunities is an absolute necessity to manage significant volumes of high-dimensional and inter-related data and analysis. Alpheus is an analysis pipeline, database and visualization software for use with massively parallel DNA sequencing technologies that feature multi-gigabase throughput characterized by relatively short reads, such as Illumina-Solexa (sequencing-by-synthesis), Roche-454 (pyrosequencing) and Applied Biosystems' SOLiD (sequencing-by-ligation). Alpheus enables alignment to reference sequence(s), detection of variants and enumeration of sequence abundance, including expression levels in transcriptome sequence. Alpheus is able to detect several types of variants, including non-synonymous and synonymous single nucleotide polymorphisms (SNPs), insertions/deletions (indels), premature stop codons, and splice isoforms. Variant detection is aided by the ability to filter variant calls based on consistency, expected allele frequency, sequence quality, coverage, and variant type in order to minimize false positives while maximizing the identification of true positives. Alpheus also enables comparisons of genes with variants between cases and controls or bulk segregant pools. Sequence-based differential expression comparisons can be developed, with data export to SAS JMP Genomics for statistical analysis.
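
    A small illustration in the spirit of the variant-call filtering Alpheus performs; the record fields and thresholds are our own assumptions:

```python
from dataclasses import dataclass

@dataclass
class Variant:
    position: int
    coverage: int          # reads covering the site
    alt_reads: int         # reads supporting the variant allele
    mean_quality: float    # mean base quality of supporting reads

def passes(v, min_cov=10, min_freq=0.2, min_qual=20.0):
    """Keep calls with sufficient coverage, allele frequency, and base quality."""
    return (v.coverage >= min_cov
            and v.alt_reads / v.coverage >= min_freq
            and v.mean_quality >= min_qual)

calls = [Variant(101, 42, 20, 31.0), Variant(205, 8, 5, 30.0), Variant(390, 50, 4, 33.0)]
print([v.position for v in calls if passes(v)])  # [101]
```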

  13. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analy...

  14. WormSizer: high-throughput analysis of nematode size and shape.

    Directory of Open Access Journals (Sweden)

    Brad T Moore

    Full Text Available The fundamental phenotypes of growth rate, size and morphology are the result of complex interactions between genotype and environment. We developed a high-throughput software application, WormSizer, which computes size and shape of nematodes from brightfield images. Existing methods for estimating volume either coarsely model the nematode as a cylinder or assume the worm shape or opacity is invariant. Our estimate is more robust to changes in morphology or optical density as it only assumes radial symmetry. This open-source software is written as a plugin for the well-known image-processing framework Fiji/ImageJ. It may therefore be extended easily. We evaluated the technical performance of this framework, and we used it to analyze growth and shape of several canonical Caenorhabditis elegans mutants in a developmental time series. We confirm quantitatively that a Dumpy (Dpy) mutant is short and fat and that a Long (Lon) mutant is long and thin. We show that daf-2 insulin-like receptor mutants are larger than wild-type upon hatching but grow slowly, and WormSizer can distinguish dauer larvae from normal larvae. We also show that a Small (Sma) mutant is actually smaller than wild-type at all stages of larval development. WormSizer works with Uncoordinated (Unc) and Roller (Rol) mutants as well, indicating that it can be used with mutants despite behavioral phenotypes. We used our complete data set to perform a power analysis, giving users a sense of how many images are needed to detect different effect sizes. Our analysis confirms and extends existing phenotypic characterization of well-characterized mutants, demonstrating the utility and robustness of WormSizer.
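
    The volume estimate under the radial-symmetry assumption amounts to integrating circular cross-sections along the worm's midline. A minimal sketch of that calculation (our illustration; WormSizer itself is a Fiji/ImageJ plugin written in Java):

    ```python
    import numpy as np

    def worm_volume_um3(radii_px, step_px, um_per_px):
        """Worm volume assuming only radial symmetry: stack circular
        cross-sections along the midline, V = sum(pi * r_i**2 * ds)."""
        r = np.asarray(radii_px, float) * um_per_px   # radius at each midline point
        ds = step_px * um_per_px                      # arc-length step between points
        return float(np.sum(np.pi * r ** 2) * ds)
    ```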

  15. Cost-effectiveness analysis of sandhill crane habitat management

    Science.gov (United States)

    Kessler, Andrew C.; Merchant, James W.; Shultz, Steven D.; Allen, Craig R.

    2013-01-01

    Invasive species often threaten native wildlife populations and strain the budgets of agencies charged with wildlife management. We demonstrate the potential of cost-effectiveness analysis to improve the efficiency and value of efforts to enhance sandhill crane (Grus canadensis) roosting habitat. We focus on the central Platte River in Nebraska (USA), a region of international ecological importance for migrating avian species including sandhill cranes. Cost-effectiveness analysis is a valuation process designed to compare alternative actions based on the cost of achieving a pre-determined objective. We estimated costs for removal of invasive vegetation using geographic information system simulations and calculated benefits as the increase in area of sandhill crane roosting habitat. We generated cost effectiveness values for removing invasive vegetation on 7 land parcels and for the entire central Platte River to compare the cost-effectiveness of management at specific sites and for the central Platte River landscape. Median cost effectiveness values for the 7 land parcels evaluated suggest that costs for creating 1 additional hectare of sandhill crane roosting habitat totaled US $1,595. By contrast, we found that creating an additional hectare of sandhill crane roosting habitat could cost as much as US $12,010 for some areas in the central Platte River, indicating substantial cost savings can be achieved by using a cost effectiveness analysis to target specific land parcels for management. Cost-effectiveness analysis, used in conjunction with geographic information systems, can provide decision-makers with a new tool for identifying the most economically efficient allocation of resources to achieve habitat management goals.

  16. Nonlinear mixed effects dose response modeling in high throughput drug screens: application to melanoma cell line analysis.

    Science.gov (United States)

    Ding, Kuan-Fu; Petricoin, Emanuel F; Finlay, Darren; Yin, Hongwei; Hendricks, William P D; Sereduk, Chris; Kiefer, Jeffrey; Sekulic, Aleksandar; LoRusso, Patricia M; Vuori, Kristiina; Trent, Jeffrey M; Schork, Nicholas J

    2018-01-12

    Cancer cell lines are often used in high throughput drug screens (HTS) to explore the relationship between cell line characteristics and responsiveness to different therapies. Many current analysis methods infer relationships by focusing on one aspect of cell line drug-specific dose-response curves (DRCs), the concentration causing 50% inhibition of a phenotypic endpoint (IC50). Such methods may overlook DRC features and do not simultaneously leverage information about drug response patterns across cell lines, potentially increasing false positive and negative rates in drug response associations. We consider the application of two methods, each rooted in nonlinear mixed effects (NLME) models, that test the relationships between estimated cell line DRCs and factors that might mitigate response. Both methods leverage estimation and testing techniques that consider the simultaneous analysis of different cell lines to draw inferences about any one cell line. One of the methods is designed to provide an omnibus test of the differences between cell line DRCs that is not focused on any one aspect of the DRC (such as the IC50 value). We simulated different settings and compared the different methods on the simulated data. We also compared the proposed methods against traditional IC50-based methods using 40 melanoma cell lines whose transcriptomes, proteomes, and, importantly, BRAF and related mutation profiles were available. Ultimately, we find that the NLME-based methods are more robust, powerful, and, for the omnibus test, more flexible than traditional methods. Their application to the melanoma cell lines reveals insights into factors that may be clinically useful.
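
    For context, the traditional per-cell-line baseline that the NLME methods improve on fits each DRC independently, e.g. with a four-parameter logistic (Hill) curve, and extracts the IC50. A minimal sketch with made-up data (a true NLME fit would additionally share parameters across cell lines through random effects):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(conc, top, bottom, ic50, hill):
        """Four-parameter logistic dose-response curve."""
        return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

    conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])   # drug concentration (made up)
    resp = np.array([98.0, 92.0, 60.0, 22.0, 8.0])   # % viability (made up)
    (top, bottom, ic50, hill), _ = curve_fit(four_pl, conc, resp,
                                             p0=[100.0, 0.0, 1.0, 1.0],
                                             maxfev=5000)
    print(f"IC50 = {ic50:.2f}")
    ```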

  17. Effectiveness and cost-effectiveness of antidepressants in primary care: a multiple treatment comparison meta-analysis and cost-effectiveness model.

    Directory of Open Access Journals (Sweden)

    Joakim Ramsberg

    Full Text Available OBJECTIVE: To determine effectiveness and cost-effectiveness over a one-year time horizon of pharmacological first line treatment in primary care for patients with moderate to severe depression. DESIGN: A multiple treatment comparison meta-analysis was employed to determine the relative efficacy in terms of remission of 10 antidepressants (citalopram, duloxetine, escitalopram, fluoxetine, fluvoxamine, mirtazapine, paroxetine, reboxetine, sertraline and venlafaxine). The estimated remission rates were then applied in a decision-analytic model in order to estimate costs and quality of life with different treatments at one year. DATA SOURCES: Meta-analyses of remission rates from randomised controlled trials, and cost and quality-of-life data from published sources. RESULTS: The most favourable pharmacological treatment in terms of remission was escitalopram with an 8- to 12-week probability of remission of 0.47. Despite a high acquisition cost, this clinical effectiveness translated into escitalopram being both more effective and having a lower total cost than all other comparators from a societal perspective. From a healthcare perspective, the cost per QALY of escitalopram was €3732 compared with venlafaxine. CONCLUSION: Of the investigated antidepressants, escitalopram has the highest probability of remission and is the most effective and cost-effective pharmacological treatment in a primary care setting, when evaluated over a one-year time horizon. Small differences in remission rates may be important when assessing costs and cost-effectiveness of antidepressants.
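
    The headline comparison rests on standard incremental cost-effectiveness arithmetic. A minimal sketch with made-up numbers chosen only to land near the €3732/QALY figure quoted above:

    ```python
    def icer(cost_new, effect_new, cost_ref, effect_ref):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        return (cost_new - cost_ref) / (effect_new - effect_ref)

    # Made-up inputs: 150 EUR extra cost and 0.0402 extra QALYs per patient
    # give an ICER of about 3731 EUR/QALY, the magnitude quoted above.
    print(icer(1150.0, 0.8000, 1000.0, 0.7598))
    ```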

  18. Crop 3D-a LiDAR based platform for 3D high-throughput crop phenotyping.

    Science.gov (United States)

    Guo, Qinghua; Wu, Fangfang; Pang, Shuxin; Zhao, Xiaoqian; Chen, Linhai; Liu, Jin; Xue, Baolin; Xu, Guangcai; Li, Le; Jing, Haichun; Chu, Chengcai

    2018-03-01

    With a growing population and shrinking arable land, breeding has been considered an effective way to address the food crisis. As an important part of breeding, high-throughput phenotyping can accelerate the breeding process effectively. Light detection and ranging (LiDAR) is an active remote sensing technology capable of acquiring three-dimensional (3D) data accurately, and it has great potential in crop phenotyping. Given that crop phenotyping based on LiDAR technology is not common in China, we developed a high-throughput crop phenotyping platform, named Crop 3D, which integrates a LiDAR sensor, a high-resolution camera, a thermal camera and a hyperspectral imager. Compared with traditional crop phenotyping techniques, Crop 3D can acquire multi-source phenotypic data over the whole crop growing period and extract plant height, plant width, leaf length, leaf width, leaf area, leaf inclination angle and other parameters for plant biology and genomics analysis. In this paper, we described the designs, functions and testing results of the Crop 3D platform, and briefly discussed the potential applications and future development of the platform in phenotyping. We concluded that platforms integrating LiDAR and traditional remote sensing techniques might be the future trend of crop high-throughput phenotyping.
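
    As a small illustration of the kind of trait extraction such a platform performs, plant height can be estimated from a LiDAR point cloud by differencing robust upper and lower elevation quantiles. This is our own sketch, not Crop 3D's algorithm:

    ```python
    import numpy as np

    def plant_height(points, ground_quantile=0.02, top_quantile=0.98):
        """Estimate plant height from an (N, 3) LiDAR point cloud of x, y, z
        coordinates; robust quantiles damp the effect of stray returns."""
        z = np.asarray(points, float)[:, 2]
        return float(np.quantile(z, top_quantile) - np.quantile(z, ground_quantile))
    ```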

  19. High-Throughput Printing Process for Flexible Electronics

    Science.gov (United States)

    Hyun, Woo Jin

    Printed electronics is an emerging field for manufacturing electronic devices with low cost and minimal material waste for a variety of applications including displays, distributed sensing, smart packaging, and energy management. Moreover, its compatibility with roll-to-roll production formats and flexible substrates is desirable for continuous, high-throughput production of flexible electronics. Despite the promise, however, the roll-to-roll production of printed electronics is quite challenging due to web movement hindering accurate ink registration and high-fidelity printing. In this talk, I will present a promising strategy for roll-to-roll production using a novel printing process that we term SCALE (Self-aligned Capillarity-Assisted Lithography for Electronics). By utilizing capillarity of liquid inks on nano/micro-structured substrates, the SCALE process facilitates high-resolution and self-aligned patterning of electrically functional inks with greatly improved printing tolerance. I will show the fabrication of key building blocks (e.g. transistor, resistor, capacitor) for electronic circuits using the SCALE process on plastics.

  20. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    Directory of Open Access Journals (Sweden)

    Gafurov Andrey

    2018-01-01

    Full Text Available The specific nature of high-rise investment projects entailing long-term construction, high risks, etc. implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved algorithm of cost-benefit analysis for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the algorithm of cost-benefit analysis for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the “Project analysis scenario” flowchart, improving quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping for better cost-benefit project analysis, given the broad range of risks in high-rise construction; analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in implementation of high-rise projects.

  1. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    Science.gov (United States)

    Gafurov, Andrey; Skotarenko, Oksana; Plotnikov, Vladimir

    2018-03-01

    The specific nature of high-rise investment projects entailing long-term construction, high risks, etc. implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved algorithm of cost-benefit analysis for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the algorithm of cost-benefit analysis for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the "Project analysis scenario" flowchart, improving quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping for better cost-benefit project analysis, given the broad range of risks in high-rise construction; analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in implementation of high-rise projects.

  2. A high throughput mass spectrometry screening analysis based on two-dimensional carbon microfiber fractionation system.

    Science.gov (United States)

    Ma, Biao; Zou, Yilin; Xie, Xuan; Zhao, Jinhua; Piao, Xiangfan; Piao, Jingyi; Yao, Zhongping; Quinto, Maurizio; Wang, Gang; Li, Donghao

    2017-06-09

    A novel high-throughput, solvent-saving and versatile integrated two-dimensional microscale carbon fiber/active carbon fiber system (2DμCFs) that allows a simple and rapid separation of compounds into low-polar, medium-polar and high-polar fractions, has been coupled with ambient ionization-mass spectrometry (ESI-Q-TOF-MS and ESI-QqQ-MS) for screening and quantitative analyses of real samples. 2DμCFs led to a substantial interference reduction and minimization of ionization suppression effects, thus increasing the sensitivity and the screening capabilities of the subsequent MS analysis. The method has been applied to the analysis of Schisandra chinensis extracts, obtaining with a single injection a simultaneous determination of 33 compounds presenting different polarities, such as organic acids, lignans, and flavonoids in less than 7 min, at low pressures and using small solvent amounts. The method was also validated using 10 model compounds, giving limits of detection (LODs) ranging from 0.3 to 30 ng mL⁻¹, satisfactory recoveries (from 75.8 to 93.2%) and reproducibilities (relative standard deviations, RSDs, from 1.40 to 8.06%). Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Cost-effectiveness analysis of the introduction of rotavirus vaccine in Iran.

    Science.gov (United States)

    Javanbakht, Mehdi; Moradi-Lakeh, Maziar; Yaghoubi, Mohsen; Esteghamati, Abdoulreza; Mansour Ghanaie, Roxana; Mahmoudi, Sussan; Shamshiri, Ahmad-Reza; Zahraei, Seyed Mohsen; Baxter, Louise; Shakerian, Sareh; Chaudhri, Irtaza; Fleming, Jessica A; Munier, Aline; Baradaran, Hamid R

    2015-05-07

    Although the mortality from diarrheal diseases has been decreasing dramatically in Iran, it still represents an important proportion of disease burden in children. Rotavirus vaccines are among the most effective strategies against diarrheal diseases in specific epidemiological conditions. This study aimed to evaluate the cost-effectiveness of the introduction of rotavirus vaccine (3 doses of pentavalent RotaTeq (RV5)) in Iran, from the viewpoints of Iran's health system and society. The TRIVAC decision support model was used to calculate total incremental costs, life years (LYs) gained, and disability-adjusted life years (DALYs) averted due to the vaccination program. Necessary input data were collected from the most valid accessible sources as well as a systematic review and meta-analysis of epidemiological studies. We used WHO guidelines to estimate vaccination cost. An annual discount rate of 3% was considered for both health gains and costs. A deterministic sensitivity analysis was performed to test the robustness of the model's results. Our results indicated that total DALYs potentially lost due to rotavirus diarrhea within 10 years would be 138,161, of which 76,591 could be prevented by rotavirus vaccine. The total vaccination cost for 10 cohorts was estimated to be US$ 499.91 million. Also, US$ 470.61 million would be saved because of preventing outpatient visits and inpatient admissions (cost-saving from the society perspective). We estimated a cost per DALY averted of US$ 2868 for RV5 vaccination, which corresponds to a highly cost-effective strategy from the government perspective. In the sensitivity analysis, all scenarios tested were still cost-saving or highly cost-effective from the society perspective, except in the least favorable scenario and the low vaccine efficacy and disease incidence scenario. Based on the findings, introduction of rotavirus vaccine is a highly cost-effective strategy from the government perspective. Introducing the vaccine to …

  4. High-throughput and automated diagnosis of antimicrobial resistance using a cost-effective cellphone-based micro-plate reader

    Science.gov (United States)

    Feng, Steve; Tseng, Derek; di Carlo, Dino; Garner, Omai B.; Ozcan, Aydogan

    2016-12-01

    Routine antimicrobial susceptibility testing (AST) can prevent deaths due to bacteria and reduce the spread of multi-drug-resistance, but cannot be regularly performed in resource-limited-settings due to technological challenges, high-costs, and lack of trained professionals. We demonstrate an automated and cost-effective cellphone-based 96-well microtiter-plate (MTP) reader, capable of performing AST without the need for trained diagnosticians. Our system includes a 3D-printed smartphone attachment that holds and illuminates the MTP using a light-emitting-diode array. An inexpensive optical fiber-array enables the capture of the transmitted light of each well through the smartphone camera. A custom-designed application sends the captured image to a server to automatically determine well-turbidity, with results returned to the smartphone in ~1 minute. We tested this mobile-reader using MTPs prepared with 17 antibiotics targeting Gram-negative bacteria on clinical isolates of Klebsiella pneumoniae, containing highly-resistant antimicrobial profiles. Using 78 patient isolate test-plates, we demonstrated that our mobile-reader meets the FDA-defined AST criteria, with a well-turbidity detection accuracy of 98.21%, minimum-inhibitory-concentration accuracy of 95.12%, and a drug-susceptibility interpretation accuracy of 99.23%, with no very major errors. This mobile-reader could eliminate the need for trained diagnosticians to perform AST, reduce the cost-barrier for routine testing, and assist in spatio-temporal tracking of bacterial resistance.

  5. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    Science.gov (United States)

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
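
    A minimal sketch of superpixel-based Random Forest segmentation of the kind described, using SLIC superpixels and mean-color features; the feature set and training protocol of the actual system are richer, and all names here are ours:

    ```python
    import numpy as np
    from skimage.segmentation import slic
    from sklearn.ensemble import RandomForestClassifier

    def _superpixel_features(image, segments):
        """Mean color of each superpixel as a simple feature vector."""
        return np.array([image[segments == s].mean(axis=0)
                         for s in np.unique(segments)])

    def segment_plant(image, train_image, train_mask, n_segments=400):
        """Classify SLIC superpixels as plant vs. background with a Random
        Forest trained on one hand-labeled image (train_mask: boolean pixels)."""
        seg_tr = slic(train_image, n_segments=n_segments)
        X_tr = _superpixel_features(train_image, seg_tr)
        # superpixel label = majority vote of its pixels in the training mask
        y_tr = np.array([train_mask[seg_tr == s].mean() > 0.5
                         for s in np.unique(seg_tr)])
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_tr, y_tr)
        seg = slic(image, n_segments=n_segments)
        pred = clf.predict(_superpixel_features(image, seg)).astype(bool)
        return np.isin(seg, np.unique(seg)[pred])   # boolean plant mask
    ```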

  6. High Throughput Multispectral Image Processing with Applications in Food Science.

    Directory of Open Access Journals (Sweden)

    Panagiotis Tsakanikas

    Full Text Available Recently, machine vision has been gaining attention in food science as well as in the food industry concerning food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Towards these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility, low-cost information extraction and faster quality assessment, without human intervention. The outcome of image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.

  7. High Throughput Multispectral Image Processing with Applications in Food Science.

    Science.gov (United States)

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Recently, machine vision has been gaining attention in food science as well as in the food industry concerning food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Towards these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity, data reproducibility, low-cost information extraction and faster quality assessment, without human intervention. The outcome of image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.
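
    The segmentation core reduces to fitting a Gaussian mixture model to pixel spectra and assigning each pixel to a mixture component. A minimal sketch (the paper's unsupervised band-selection scheme is its own method, so an optional band subset is simply passed in here):

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def segment_multispectral(cube, n_classes=3, bands=None, seed=0):
        """Cluster the pixels of an (H, W, B) multispectral cube with a GMM
        and return an (H, W) map of class labels."""
        h, w, b = cube.shape
        X = cube.reshape(-1, b).astype(float)
        if bands is not None:        # optional band subset; the paper selects
            X = X[:, list(bands)]    # bands with its own unsupervised scheme
        gmm = GaussianMixture(n_components=n_classes, covariance_type="full",
                              random_state=seed).fit(X)
        return gmm.predict(X).reshape(h, w)
    ```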

  8. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.

  9. Commentary: Roles for Pathologists in a High-throughput Image Analysis Team.

    Science.gov (United States)

    Aeffner, Famke; Wilson, Kristin; Bolon, Brad; Kanaly, Suzanne; Mahrt, Charles R; Rudmann, Dan; Charles, Elaine; Young, G David

    2016-08-01

    Historically, pathologists perform manual evaluation of H&E- or immunohistochemically-stained slides, which can be subjective, inconsistent, and, at best, semiquantitative. As the complexity of staining and demand for increased precision of manual evaluation increase, the pathologist's assessment will include automated analyses (i.e., "digital pathology") to increase the accuracy, efficiency, and speed of diagnosis and hypothesis testing and as an important biomedical research and diagnostic tool. This commentary introduces the many roles for pathologists in designing and conducting high-throughput digital image analysis. Pathology review is central to the entire course of a digital pathology study, including experimental design, sample quality verification, specimen annotation, analytical algorithm development, and report preparation. The pathologist performs these roles by reviewing work undertaken by technicians and scientists with training and expertise in image analysis instruments and software. These roles require regular, face-to-face interactions between team members and the lead pathologist. Traditional pathology training is suitable preparation for entry-level participation on image analysis teams. The future of pathology is very exciting, with the expanding utilization of digital image analysis set to expand pathology roles in research and drug development with increasing and new career opportunities for pathologists. © 2016 by The Author(s).

  10. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are low-throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification, and are hence limited to counting a low number of foci per cell (5 foci per nucleus) as the quantification process is extremely labour-intensive. Therefore we have developed high-throughput instrumentation and a computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged, in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended-maxima-transform-based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to the 2D techniques.
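
    In the spirit of the 3D extended-maxima approach described, a simplified sketch; the h parameter and any preprocessing are assumptions that would be tuned to the imaging conditions:

    ```python
    import numpy as np
    from skimage.morphology import h_maxima
    from skimage.measure import label

    def count_foci(nucleus_stack, h=0.1):
        """Count foci in a 3D (z, y, x) intensity stack of one nucleus using
        an extended-maxima (h-maxima) transform followed by connected-component
        labeling; each connected maxima region counts as one focus."""
        stack = nucleus_stack.astype(float)
        maxima = h_maxima(stack, h)        # binary mask of maxima deeper than h
        return int(label(maxima).max())    # number of connected components
    ```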

  11. Toward a Low-Cost System for High-Throughput Image-Based Phenotyping of Root System Architecture

    Science.gov (United States)

    Davis, T. W.; Schneider, D. J.; Cheng, H.; Shaw, N.; Kochian, L. V.; Shaff, J. E.

    2015-12-01

    Root system architecture is being studied more closely for improved nutrient acquisition, stress tolerance and carbon sequestration by relating the genetic material that corresponds to preferential physical features. This information can help direct plant breeders in addressing the growing concerns regarding the global demand on crops and fossil fuels. Supporting this incentive requires making high-throughput image-based phenotyping of plant roots, at the individual-plant scale, simpler and more affordable. Our goal is to create an affordable and portable product for simple image collection, processing and management that will extend root phenotyping to institutions with limited funding (e.g., in developing countries). Thus, a new integrated system has been developed using the Raspberry Pi single-board computer. Similar to other 3D-based imaging platforms, the system utilizes a stationary camera to photograph a rotating crop root system (e.g., rice, maize or sorghum) that is suspended either in a gel or on a mesh (for hydroponics). In contrast, the new design takes advantage of powerful open-source hardware and software to reduce the system costs, simplify the imaging process, and manage the large datasets produced by the high-resolution photographs. A newly designed graphical user interface (GUI) unifies the system controls (e.g., adjusting camera and motor settings and orchestrating the motor motion with image capture), making it easier to accommodate a variety of experiments. During each imaging session, integral metadata necessary for reproducing experiment results are collected (e.g., plant type and age, growing conditions and treatments, camera settings) using hierarchical data format files. These metadata are searchable within the GUI and can be selected and extracted for further analysis. The GUI also supports an image previewer that performs limited image processing (e.g., thresholding and cropping). Root skeletonization, 3D reconstruction and …

  12. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  13. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Full Text Available Abstract Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: (1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; (2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and (3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high throughput, flexible bioinformatics pipelines.

  14. High-throughput gated photon counter with two detection windows programmable down to 70 ps width

    Energy Technology Data Exchange (ETDEWEB)

    Boso, Gianluca; Tosi, Alberto, E-mail: alberto.tosi@polimi.it; Zappa, Franco [Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, Piazza Leonardo Da Vinci 32, 20133 Milano (Italy); Mora, Alberto Dalla [Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo Da Vinci 32, 20133 Milano (Italy)

    2014-01-15

    We present the design and characterization of a high-throughput gated photon counter able to count electrical pulses occurring within two well-defined and programmable detection windows. We extensively characterized and validated this instrument up to 100 Mcounts/s and with detection window width down to 70 ps. This instrument is suitable for many applications and proves to be a cost-effective and compact alternative to time-correlated single-photon counting equipment, thanks to its easy configurability, user-friendly interface, and fully adjustable settings via a Universal Serial Bus (USB) link to a remote computer.

  15. High-throughput gated photon counter with two detection windows programmable down to 70 ps width

    International Nuclear Information System (INIS)

    Boso, Gianluca; Tosi, Alberto; Zappa, Franco; Mora, Alberto Dalla

    2014-01-01

    We present the design and characterization of a high-throughput gated photon counter able to count electrical pulses occurring within two well-defined and programmable detection windows. We extensively characterized and validated this instrument up to 100 Mcounts/s and with detection window width down to 70 ps. This instrument is suitable for many applications and proves to be a cost-effective and compact alternative to time-correlated single-photon counting equipment, thanks to its easy configurability, user-friendly interface, and fully adjustable settings via a Universal Serial Bus (USB) link to a remote computer

  16. Robust high-throughput batch screening method in 384-well format with optical in-line resin quantification.

    Science.gov (United States)

    Kittelmann, Jörg; Ottens, Marcel; Hubbuch, Jürgen

    2015-04-15

    High-throughput batch screening technologies have become an important tool in downstream process development. Although continued miniaturization reduces time and sample consumption, no screening process has yet been described in the 384-well microplate format. Several processes are established in the 96-well dimension to investigate protein-adsorbent interactions, utilizing between 6.8 and 50 μL of resin per well. However, because sample consumption scales with resin volume and throughput with the number of experiments per microplate, these processes are limited in the costs and time they can save. In this work, a new method for in-well resin quantification by optical means, applicable in the 384-well format and at resin volumes as small as 0.1 μL, is introduced. An HTS batch isotherm process is described, utilizing this new method in combination with optical sample volume quantification for screening of isotherm parameters in 384-well microplates. Results are qualified by confidence bounds determined by bootstrap analysis and a comprehensive Monte Carlo study of error propagation. This new approach opens the door to a variety of screening processes in the 384-well format on HTS stations, higher quality screening data and an increase in throughput. Copyright © 2015 Elsevier B.V. All rights reserved.
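
    A minimal sketch of isotherm-parameter estimation with bootstrap confidence bounds of the kind described; the Langmuir model form, names, and defaults are our assumptions, not the paper's exact procedure:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(c, q_max, k_d):
        """Langmuir isotherm: bound protein q as a function of concentration c."""
        return q_max * c / (k_d + c)

    def fit_isotherm_bootstrap(c, q, n_boot=500, seed=0):
        """Fit (q_max, k_d) and estimate 95% confidence bounds by resampling
        (c, q) pairs with replacement."""
        c, q = np.asarray(c, float), np.asarray(q, float)
        p0 = [q.max(), np.median(c)]
        p_hat, _ = curve_fit(langmuir, c, q, p0=p0, maxfev=5000)
        rng, boots = np.random.default_rng(seed), []
        for _ in range(n_boot):
            idx = rng.integers(0, len(c), len(c))
            try:
                boots.append(curve_fit(langmuir, c[idx], q[idx],
                                       p0=p0, maxfev=5000)[0])
            except RuntimeError:           # skip non-converging resamples
                continue
        lo, hi = np.percentile(boots, [2.5, 97.5], axis=0)
        return p_hat, lo, hi
    ```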

  17. Parallel workflow for high-throughput (>1,000 samples/day) quantitative analysis of human insulin-like growth factor 1 using mass spectrometric immunoassay.

    Directory of Open Access Journals (Sweden)

    Paul E Oran

    Full Text Available Insulin-like growth factor 1 (IGF1) is an important biomarker for the management of growth hormone disorders. Recently there has been rising interest in deploying mass spectrometric (MS) methods of detection for measuring IGF1. However, widespread clinical adoption of any MS-based IGF1 assay will require increased throughput and speed to justify the costs of analyses, and robust industrial platforms that are reproducible across laboratories. Presented here is an MS-based quantitative IGF1 assay with performance rating of >1,000 samples/day, and a capability of quantifying IGF1 point mutations and posttranslational modifications. The throughput of the IGF1 mass spectrometric immunoassay (MSIA) benefited from a simplified sample preparation step, IGF1 immunocapture in a tip format, and high-throughput MALDI-TOF MS analysis. The Limit of Detection and Limit of Quantification of the resulting assay were 1.5 μg/L and 5 μg/L, respectively, with intra- and inter-assay precision CVs of less than 10%, and good linearity and recovery characteristics. The IGF1 MSIA was benchmarked against commercially available IGF1 ELISA via Bland-Altman method comparison test, resulting in a slight positive bias of 16%. The IGF1 MSIA was employed in an optimized parallel workflow utilizing two pipetting robots and MALDI-TOF-MS instruments synced into one-hour phases of sample preparation, extraction and MSIA pipette tip elution, MS data collection, and data processing. Using this workflow, high-throughput IGF1 quantification of 1,054 human samples was achieved in approximately 9 hours. This rate of assaying is a significant improvement over existing MS-based IGF1 assays, and is on par with that of the enzyme-based immunoassays. Furthermore, a mutation was detected in ∼1% of the samples (SNP: rs17884626), creating an A→T substitution at position 67 of IGF1, demonstrating the capability of IGF1 MSIA to detect point mutations and posttranslational modifications.
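
    The Bland-Altman comparison mentioned above reduces to the mean relative difference between paired measurements plus its 95% limits of agreement. A minimal sketch:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Mean relative bias (%) of assay a vs. assay b and the 95% limits
        of agreement, computed on paired measurements."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        rel = 100.0 * (a - b) / ((a + b) / 2.0)
        bias, spread = rel.mean(), 1.96 * rel.std(ddof=1)
        return bias, bias - spread, bias + spread
    ```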

  18. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    Directory of Open Access Journals (Sweden)

    Rok Gaber

    2013-11-01

    Full Text Available To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs a ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity.

  19. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    Science.gov (United States)

    Gaber, Rok; Majerle, Andreja; Jerala, Roman; Benčina, Mojca

    2013-01-01

    To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs a ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity. PMID:24287545

  20. A Customizable Flow Injection System for Automated, High Throughput, and Time Sensitive Ion Mobility Spectrometry and Mass Spectrometry Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Daniel J. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Tfaily, Malak M. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Moore, Ronald J. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; LaMarche, Brian L. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Zheng, Xueyun [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Fillmore, Thomas L. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Chu, Rosalie K. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Weitz, Karl K. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Monroe, Matthew E. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Kelly, Ryan T. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Smith, Richard D. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States; Baker, Erin S. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland, WA 99352, United States

    2017-12-13

    To better understand disease conditions and environmental perturbations, multi-omic studies (i.e. proteomic, lipidomic, metabolomic, etc. analyses) are vastly increasing in popularity. In a multi-omic study, a single sample is typically extracted in multiple ways and numerous analyses are performed using different instruments. Thus, one sample becomes many analyses, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injection. While some FIA systems have been created to address these challenges, many have limitations such as high consumable costs, low pressure capabilities, limited pressure monitoring and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at diverse flow rates (~50 nL/min to 500 µL/min) to accommodate low- and high-flow instrument sources. This system can also operate at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system. The results from these studies showed a highly robust platform, providing consistent performance over many days without carryover as long as washing buffers specific to each molecular analysis were utilized.

  1. Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)

    Energy Technology Data Exchange (ETDEWEB)

    Hura, Greg L.; Menon, Angeli L.; Hammel, Michal; Rambo, Robert P.; Poole II, Farris L.; Tsutakawa, Susan E.; Jenney Jr, Francis E.; Classen, Scott; Frankel, Kenneth A.; Hopkins, Robert C.; Yang, Sungjae; Scott, Joseph W.; Dillard, Bret D.; Adams, Michael W. W.; Tainer, John A.

    2009-07-20

    We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.
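
    One of the global structural parameters such a pipeline extracts is the radius of gyration, conventionally obtained from a Guinier fit, ln I(q) = ln I0 − (Rg²/3)q², over the low-q region. The sketch below is a generic illustration of that step under the usual q·Rg ≲ 1.3 validity check, not the pipeline's actual code:

    ```python
    import numpy as np

    def guinier_rg(q, intensity, q_rg_max=1.3):
        """Radius of gyration and I0 from a Guinier fit on the low-q region,
        shrinking the fit window until the q*Rg <= ~1.3 validity limit holds."""
        q, intensity = np.asarray(q, float), np.asarray(intensity, float)
        for n in range(len(q), 4, -1):
            slope, ln_i0 = np.polyfit(q[:n] ** 2, np.log(intensity[:n]), 1)
            if slope < 0:
                rg = np.sqrt(-3.0 * slope)
                if q[n - 1] * rg <= q_rg_max:
                    return rg, np.exp(ln_i0)
        raise ValueError("no valid Guinier region found")
    ```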

  2. Micro-scaled high-throughput digestion of plant tissue samples for multi-elemental analysis

    Directory of Open Access Journals (Sweden)

    Husted Søren

    2009-09-01

    Full Text Available Abstract Background Quantitative multi-elemental analysis by inductively coupled plasma (ICP) spectrometry depends on a complete digestion of solid samples. However, fast and thorough sample digestion is a challenging analytical task which constitutes a bottleneck in modern multi-elemental analysis. Additional obstacles may be that sample quantities are limited and elemental concentrations low. In such cases, digestion in small volumes with minimum dilution and contamination is required in order to obtain high accuracy data. Results We have developed a micro-scaled microwave digestion procedure and optimized it for accurate elemental profiling of plant materials (1-20 mg dry weight). A commercially available 64-position rotor with 5 ml disposable glass vials, originally designed for microwave-based parallel organic synthesis, was used as a platform for the digestion. The novel micro-scaled method was successfully validated by the use of various certified reference materials (CRM) with matrices rich in starch, lipid or protein. When the micro-scaled digestion procedure was applied on single rice grains or small batches of Arabidopsis seeds (1 mg, corresponding to approximately 50 seeds), the obtained elemental profiles closely matched those obtained by conventional analysis using digestion in large volume vessels. Accumulated elemental contents derived from separate analyses of rice grain fractions (aleurone, embryo and endosperm) closely matched the total content obtained by analysis of the whole rice grain. Conclusion A high-throughput micro-scaled method has been developed which enables digestion of small quantities of plant samples for subsequent elemental profiling by ICP-spectrometry. The method constitutes a valuable tool for screening of mutants and transformants. In addition, the method facilitates studies of the distribution of essential trace elements between and within plant organs which is relevant for, e.g., breeding programmes aiming at …

  3. Bayesian cost-effectiveness analysis with the R package BCEA

    CERN Document Server

    Baio, Gianluca; Heath, Anna

    2017-01-01

    The book provides a description of the process of health economic evaluation and modelling for cost-effectiveness analysis, particularly from the perspective of a Bayesian statistical approach. Some relevant theory and introductory concepts are presented using practical examples and two running case studies. The book also describes in detail how to perform health economic evaluations using the R package BCEA (Bayesian Cost-Effectiveness Analysis). BCEA can be used to post-process the results of a Bayesian cost-effectiveness model and perform advanced analyses producing standardised and highly customisable outputs. It presents all the features of the package, including its many functions and their practical application, as well as its user-friendly web interface. The book is a valuable resource for statisticians and practitioners working in the field of health economics wanting to simplify and standardise their workflow, for example in the preparation of dossiers in support of marketing authorisation, or acade...
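
    For context, a core output that BCEA post-processes from a Bayesian model is the cost-effectiveness acceptability curve (CEAC): the posterior probability that the incremental net benefit k·ΔE − ΔC is positive at each willingness-to-pay threshold k. A minimal Python sketch of that calculation (illustrative only; BCEA itself is an R package with far richer functionality):

    ```python
    import numpy as np

    def ceac(delta_cost, delta_effect, k_grid):
        """Cost-effectiveness acceptability curve: for each willingness-to-pay
        k, the posterior probability that k*dE - dC > 0, estimated from
        posterior samples of incremental cost and effect."""
        dc = np.asarray(delta_cost, float)
        de = np.asarray(delta_effect, float)
        return np.array([np.mean(k * de - dc > 0.0) for k in k_grid])

    # e.g. prob = ceac(dc_draws, de_draws, np.linspace(0, 50_000, 101))
    ```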

  4. A high-throughput microfluidic dental plaque biofilm system to visualize and quantify the effect of antimicrobials

    Science.gov (United States)

    Nance, William C.; Dowd, Scot E.; Samarian, Derek; Chludzinski, Jeffrey; Delli, Joseph; Battista, John; Rickard, Alexander H.

    2013-01-01

    Objectives Few model systems are amenable to developing multi-species biofilms in parallel under environmentally germane conditions. This is a problem when evaluating the potential real-world effectiveness of antimicrobials in the laboratory. One such antimicrobial is cetylpyridinium chloride (CPC), which is used in numerous over-the-counter oral healthcare products. The aim of this work was to develop a high-throughput microfluidic system that is combined with a confocal laser scanning microscope (CLSM) to quantitatively evaluate the effectiveness of CPC against oral multi-species biofilms grown in human saliva. Methods Twenty-four-channel BioFlux microfluidic plates were inoculated with pooled human saliva and fed filter-sterilized saliva for 20 h at 37°C. The bacterial diversity of the biofilms was evaluated by bacterial tag-encoded FLX amplicon pyrosequencing (bTEFAP). The antimicrobial/anti-biofilm effect of CPC (0.5%–0.001% w/v) was examined using Live/Dead stain, CLSM and 3D imaging software. Results The analysis of biofilms by bTEFAP demonstrated that they contained genera typically found in human dental plaque. These included Aggregatibacter, Fusobacterium, Neisseria, Porphyromonas, Streptococcus and Veillonella. Using Live/Dead stain, clear gradations in killing were observed when the biofilms were treated with CPC between 0.5% and 0.001% w/v. At 0.5% (w/v) CPC, 90% of the total signal was from dead/damaged cells. Below this concentration range, less killing was observed. In the 0.5%–0.05% (w/v) range CPC penetration/killing was greatest and biofilm thickness was significantly reduced. Conclusions This work demonstrates the utility of a high-throughput microfluidic–CLSM system to grow multi-species oral biofilms, which are compositionally similar to naturally occurring biofilms, to assess the effectiveness of antimicrobials. PMID:23800904

  5. High-Throughput Analysis With 96-Capillary Array Electrophoresis and Integrated Sample Preparation for DNA Sequencing Based on Laser Induced Fluorescence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Gang [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    The purpose of this research was to improve fluorescence detection for multiplexed capillary array electrophoresis, extend its use beyond genomic analysis, and develop an integrated micro-sample preparation system for high-throughput DNA sequencing. The authors first demonstrated multiplexed capillary zone electrophoresis (CZE) and micellar electrokinetic chromatography (MEKC) separations in a 96-capillary array system with laser-induced fluorescence detection. Migration times of four kinds of fluoresceins and six polyaromatic hydrocarbons (PAHs) are normalized to one of the capillaries using two internal standards. The relative standard deviations (RSD) after normalization are 0.6-1.4% for the fluoresceins and 0.1-1.5% for the PAHs. Quantitative calibration of the separations based on peak areas is also performed, again with substantial improvement over the raw data. This opens up the possibility of performing massively parallel separations for high-throughput chemical analysis for process monitoring, combinatorial synthesis, and clinical diagnosis. The authors further improved the fluorescence detection by step laser scanning. A computer-controlled galvanometer scanner is adapted for scanning a focused laser beam across a 96-capillary array for laser-induced fluorescence detection. The signal at a single photomultiplier tube is temporally sorted to distinguish among the capillaries. The limit of detection for fluorescein is 3 × 10⁻¹¹ M (S/N = 3) for 5 mW of total laser power scanned at 4 Hz. The observed cross-talk among capillaries is 0.2%. Advantages include the efficient utilization of light due to the high duty-cycle of step scan, good detection performance due to the reduction of stray light, ruggedness due to the small mass of the galvanometer mirror, low cost due to the simplicity of components, and flexibility due to the independent paths for excitation and emission.
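
    Normalizing migration times to a reference capillary with two internal standards amounts to a per-capillary linear mapping of the time axis. A minimal sketch with illustrative numbers:

    ```python
    import numpy as np

    def normalize_times(times, std_obs, std_ref):
        """Map one capillary's migration times onto a reference capillary by
        solving t_ref = a * t_obs + b from the two internal standards'
        observed times (std_obs) and reference times (std_ref)."""
        a = (std_ref[1] - std_ref[0]) / (std_obs[1] - std_obs[0])
        b = std_ref[0] - a * std_obs[0]
        return a * np.asarray(times, float) + b

    # analyte at 212 s in a capillary whose standards ran at (95, 180) s,
    # versus (100, 190) s in the reference capillary:
    print(normalize_times([212.0], (95.0, 180.0), (100.0, 190.0)))  # ~[223.9]
    ```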

  6. A high-throughput colorimetric assay for glucose detection based on glucose oxidase-catalyzed enlargement of gold nanoparticles

    Science.gov (United States)

    Xiong, Yanmei; Zhang, Yuyan; Rong, Pengfei; Yang, Jie; Wang, Wei; Liu, Dingbin

    2015-09-01

    We developed a simple high-throughput colorimetric assay to detect glucose based on the glucose oxidase (GOx)-catalysed enlargement of gold nanoparticles (AuNPs). Compared with the currently available glucose kit method, the AuNP-based assay provides higher clinical sensitivity at lower cost, indicating its great potential to be a powerful tool for clinical screening of glucose.

  7. High throughput screening method for assessing heterogeneity of microorganisms

    NARCIS (Netherlands)

    Ingham, C.J.; Sprenkels, A.J.; van Hylckama Vlieg, J.E.T.; Bomer, Johan G.; de Vos, W.M.; van den Berg, Albert

    2006-01-01

    The invention relates to the field of microbiology. Provided is a method which is particularly powerful for High Throughput Screening (HTS) purposes. More specifically, a high-throughput method for determining heterogeneity or interactions of microorganisms is provided.

  8. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    Science.gov (United States)

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative about biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, the throughput for such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample volume requirements and reduced user intervention time compared to the manual instruments have improved turnover of calorimetry experiments in a high-throughput format where 25 or more runs can be conducted per day. The cost and effort to maintain high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. This facility has led to higher throughput of data, which has been leveraged into grant support, attracting new faculty hires, and has led to some exciting publications. © 2016 Elsevier Inc. All rights reserved.

  9. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Directory of Open Access Journals (Sweden)

    Ankush Prashar

    Full Text Available The rapid development of genomic technology has made high throughput genotyping widely accessible, but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that can be discriminated. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.
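
    The normalization step described here, expressing genotype temperatures as differences from image means, is straightforward to compute. A minimal sketch with toy records (not the authors' pipeline; image IDs, genotype names and temperatures are invented):

    # Sketch: normalize canopy temperatures by subtracting the mean of the
    # image each measurement came from, removing image-to-image drift.
    import numpy as np

    # Hypothetical records: (image_id, genotype, canopy_temperature_C)
    records = [
        ("img1", "G1", 24.8), ("img1", "G2", 25.6), ("img1", "G3", 25.1),
        ("img2", "G1", 27.9), ("img2", "G2", 28.8), ("img2", "G3", 28.2),
    ]

    temps_by_image = {}
    for img, _, t in records:
        temps_by_image.setdefault(img, []).append(t)
    image_means = {img: np.mean(ts) for img, ts in temps_by_image.items()}

    # Difference from the image mean leaves the genotypic contrast.
    for img, genotype, t in records:
        print(f"{genotype} in {img}: {t - image_means[img]:+.2f} C vs image mean")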

  10. Some Observations on Cost-Effectiveness Analysis in Education.

    Science.gov (United States)

    Geske, Terry G.

    1979-01-01

    The general nature of cost-effectiveness analysis is discussed, analytical frameworks for conducting cost-effectiveness studies are described, and some of the problems inherent in measuring educational costs and in assessing program effectiveness are addressed. (Author/IRT)

  11. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    … into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format after the high-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom-made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can …

  12. Bacterial diversity of the Colombian fermented milk "Suero Costeño" assessed by culturing and high-throughput sequencing and DGGE analysis of 16S rRNA gene amplicons.

    Science.gov (United States)

    Motato, Karina Edith; Milani, Christian; Ventura, Marco; Valencia, Francia Elena; Ruas-Madiedo, Patricia; Delgado, Susana

    2017-12-01

    "Suero Costeño" (SC) is a traditional soured cream elaborated from raw milk in the Northern-Caribbean coast of Colombia. The natural microbiota that characterizes this popular Colombian fermented milk is unknown, although several culturing studies have previously been attempted. In this work, the microbiota associated with SC from three manufacturers in two regions, "Planeta Rica" (Córdoba) and "Caucasia" (Antioquia), was analysed by means of culturing methods in combination with high-throughput sequencing and DGGE analysis of 16S rRNA gene amplicons. The bacterial ecosystem of SC samples was revealed to be composed of lactic acid bacteria belonging to the Streptococcaceae and Lactobacillaceae families; the proportions and genera varying among manufacturers and region of elaboration. Members of the Lactobacillus acidophilus group, Lactocococcus lactis, Streptococcus infantarius and Streptococcus salivarius characterized this artisanal product. In comparison with culturing, the use of molecular in deep culture-independent techniques provides a more realistic picture of the overall bacterial communities residing in SC. Besides the descriptive purpose, these approaches will facilitate a rational strategy to follow (culture media and growing conditions) for the isolation of indigenous strains that allow standardization in the manufacture of SC. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Offering lung cancer screening to high-risk Medicare beneficiaries saves lives and is cost-effective: an actuarial analysis.

    Science.gov (United States)

    Pyenson, Bruce S; Henschke, Claudia I; Yankelevitz, David F; Yip, Rowena; Dec, Ellynne

    2014-08-01

    By a wide margin, lung cancer is the most significant cause of cancer death in the United States and worldwide. The incidence of lung cancer increases with age, and Medicare beneficiaries are often at increased risk. Because of its demonstrated effectiveness in reducing mortality, lung cancer screening with low-dose computed tomography (LDCT) imaging will be covered without cost-sharing starting January 1, 2015, by nongrandfathered commercial plans. Medicare is considering coverage for lung cancer screening. To estimate the cost and cost-effectiveness (ie, cost per life-year saved) of LDCT lung cancer screening of the Medicare population at high risk for lung cancer. Medicare costs, enrollment, and demographics were used for this study; they were derived from the 2012 Centers for Medicare & Medicaid Services (CMS) beneficiary files and were forecast to 2014 based on CMS and US Census Bureau projections. Standard life and health actuarial techniques were used to calculate the cost and cost-effectiveness of lung cancer screening. The cost, incidence rates, mortality rates, and other parameters chosen by the authors were taken from actual Medicare data, and the modeled screenings are consistent with Medicare processes and procedures. Approximately 4.9 million high-risk Medicare beneficiaries would meet criteria for lung cancer screening in 2014. Without screening, Medicare patients newly diagnosed with lung cancer have an average life expectancy of approximately 3 years. Based on our analysis, the average annual cost of LDCT lung cancer screening in Medicare is estimated to be $241 per person screened. LDCT screening for lung cancer in Medicare beneficiaries aged 55 to 80 years with a history of ≥30 pack-years of smoking and who had smoked within 15 years is low cost, at approximately $1 per member per month. This assumes that 50% of these patients were screened. Such screening is also highly cost-effective, at <$19,000 per life-year saved. If all eligible Medicare

  14. Life in the fast lane: high-throughput chemistry for lead generation and optimisation.

    Science.gov (United States)

    Hunter, D

    2001-01-01

    The pharmaceutical industry has come under increasing pressure due to regulatory restrictions on the marketing and pricing of drugs, competition, and the escalating costs of developing new drugs. These forces can be addressed by the identification of novel targets, reductions in the development time of new drugs, and increased productivity. Emphasis has been placed on identifying and validating new targets and on lead generation: the response from industry has been very evident in genomics and high throughput screening, where new technologies have been applied, usually coupled with a high degree of automation. The combination of numerous new potential biological targets and the ability to screen large numbers of compounds against many of these targets has generated the need for large diverse compound collections. To address this requirement, high-throughput chemistry has become an integral part of the drug discovery process. Copyright 2002 Wiley-Liss, Inc.

  15. A conifer-friendly high-throughput α-cellulose extraction method for δ13C and δ18O stable isotope ratio analysis

    Science.gov (United States)

    Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.

    2012-12-01

    Wood stable isotope ratios (δ13C and δ18O) offer insight into water source and plant water use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the Southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed, and these perform well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while minimizing labor requirements to 2-3 days per batch of samples.

  16. High-throughput verification of transcriptional starting sites by Deep-RACE

    DEFF Research Database (Denmark)

    Olivarius, Signe; Plessy, Charles; Carninci, Piero

    2009-01-01

    We present a high-throughput method for investigating the transcriptional starting sites of genes of interest, which we named Deep-RACE (Deep–rapid amplification of cDNA ends). Taking advantage of the latest sequencing technology, it allows the parallel analysis of multiple genes and is free...

  17. Digital Biomass Accumulation Using High-Throughput Plant Phenotype Data Analysis.

    Science.gov (United States)

    Rahaman, Md Matiur; Ahsan, Md Asif; Gillani, Zeeshan; Chen, Ming

    2017-09-01

    Biomass is an important phenotypic trait in functional ecology and growth analysis. The typical methods for measuring biomass are destructive, and they require numerous individuals to be cultivated for repeated measurements. With the advent of image-based high-throughput plant phenotyping facilities, non-destructive biomass measuring methods have attempted to overcome this problem. Thus, the estimation of plant biomass of individual plants from their digital images is becoming more important. In this paper, we propose an approach to biomass estimation based on image-derived phenotypic traits. Several image-based biomass studies treat plant biomass estimation as simply a linear function of the projected plant area in images. However, we modeled the plant volume as a function of plant area, plant compactness, and plant age to generalize the linear biomass model. The obtained results confirm the proposed model and can explain most of the observed variance during image-derived biomass estimation. Moreover, only a small difference was observed between actual and estimated digital biomass, which indicates that our proposed approach can be used to estimate digital biomass accurately.
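
    The generalized model described in this record amounts to a multiple regression of digital volume on area, compactness, and age. A minimal sketch with invented data, assuming ordinary least squares (the abstract does not specify the fitting procedure or units):

    # Sketch of the generalized biomass model: volume ~ area + compactness + age.
    import numpy as np

    # Hypothetical measurements. Columns: projected area (px^2),
    # compactness (0-1), age (days).
    X = np.array([
        [1200, 0.62, 10],
        [2100, 0.58, 14],
        [3400, 0.55, 18],
        [5200, 0.51, 22],
        [7600, 0.49, 26],
    ], dtype=float)
    volume = np.array([15.0, 28.0, 47.0, 75.0, 112.0])  # digital volume units

    # Add an intercept column and solve the least-squares problem.
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, volume, rcond=None)

    predicted = A @ coef
    print("coefficients:", np.round(coef, 4))
    print("residuals:", np.round(volume - predicted, 2))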

  18. Subtyping of swine influenza viruses using a high-throughput real time PCR platform

    DEFF Research Database (Denmark)

    Goecke, Nicole Bakkegård; Krog, Jesper Schak; Hjulsager, Charlotte Kristiane

    … test and subsequent subtyping is performed by real time RT-PCR (RT-qPCR), but several assays are needed to cover the wide range of circulating subtypes, which is expensive, resource- and time-demanding. To mitigate these restrictions, the high-throughput qPCR platform BioMark (Fluidigm) has been explored. The results revealed that the performance of the dynamic chip was similar to conventional real time analysis. Discussion and conclusion. Application of the chip for subtyping of swine influenza has resulted in a significant reduction in time, cost and working hours. Thereby, it is possible to offer diagnostic services with reduced price and turnover time, which will facilitate the choice of vaccines and in turn lead to a reduction in antibiotic use.

  19. Chromatographic Monoliths for High-Throughput Immunoaffinity Isolation of Transferrin from Human Plasma

    Directory of Open Access Journals (Sweden)

    Irena Trbojević-Akmačić

    2016-06-01

    Full Text Available Changes in protein glycosylation are related to different diseases and have potential as diagnostic and prognostic disease biomarkers. Transferrin (Tf) glycosylation changes are a common marker for congenital disorders of glycosylation. However, the biological interindividual variability of Tf N-glycosylation and the genes involved in glycosylation regulation are not known. Therefore, a high-throughput Tf isolation method and large-scale glycosylation studies are needed in order to address these questions. Due to their unique chromatographic properties, the use of chromatographic monoliths enables a very fast analysis cycle, thus significantly increasing sample preparation throughput. Here, we describe the characterization of novel immunoaffinity-based monolithic columns in a 96-well plate format for specific high-throughput purification of human Tf from blood plasma. We optimized the isolation and glycan preparation procedure for subsequent ultra performance liquid chromatography (UPLC) analysis of Tf N-glycosylation, and increased the sensitivity approximately threefold compared to initial experimental conditions, with very good reproducibility.

  20. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    Science.gov (United States)

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimations of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e, reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) da...

  1. High-Throughput Quantification of Nanoparticle Degradation Using Computational Microscopy and Its Application to Drug Delivery Nanocapsules

    KAUST Repository

    Ray, Aniruddha

    2017-04-25

    Design and synthesis of degradable nanoparticles are very important in drug delivery and biosensing fields. Although accurate assessment of nanoparticle degradation rate would improve the characterization and optimization of drug delivery vehicles, current methods rely on estimating the size of the particles at discrete points over time using, for example, electron microscopy or dynamic light scattering (DLS), among other techniques, all of which have drawbacks and practical limitations. There is a significant need for a high-throughput and cost-effective technology to accurately monitor nanoparticle degradation as a function of time and using small amounts of sample. To address this need, here we present two different computational imaging-based methods for monitoring and quantification of nanoparticle degradation. The first method is suitable for discrete testing, where a computational holographic microscope is designed to track the size changes of protease-sensitive protein-core nanoparticles following degradation, by periodically sampling a subset of particles mixed with proteases. In the second method, a sandwich structure was utilized to observe, in real-time, the change in the properties of liquid nanolenses that were self-assembled around degrading nanoparticles, permitting continuous monitoring and quantification of the degradation process. These cost-effective holographic imaging based techniques enable high-throughput monitoring of the degradation of any type of nanoparticle, using an extremely small amount of sample volume that is at least 3 orders of magnitude smaller than what is required by, for example, DLS-based techniques.

  2. Novel high-throughput cell-based hybridoma screening methodology using the Celigo Image Cytometer.

    Science.gov (United States)

    Zhang, Haohai; Chan, Leo Li-Ying; Rice, William; Kassam, Nasim; Longhi, Maria Serena; Zhao, Haitao; Robson, Simon C; Gao, Wenda; Wu, Yan

    2017-08-01

    Hybridoma screening is a critical step for antibody discovery, which necessitates prompt identification of potential clones from hundreds to thousands of hybridoma cultures against the desired immunogen. Technical issues associated with ELISA- and flow cytometry-based screening limit accuracy and diminish high-throughput capability, increasing time and cost. Conventional ELISA screening with coated antigen is also impractical for difficult-to-express hydrophobic membrane antigens or multi-chain protein complexes. Here, we demonstrate a novel high-throughput screening methodology employing the Celigo Image Cytometer, which avoids nonspecific signals by contrasting antibody binding signals directly on living cells, with and without recombinant antigen expression. The image cytometry-based high-throughput screening method was optimized by detecting the binding of hybridoma supernatants to the recombinant antigen CD39 expressed on Chinese hamster ovary (CHO) cells. Next, the sensitivity of the image cytometer was demonstrated by serial dilution of purified CD39 antibody. Celigo was used to measure antibody affinities of commercial and in-house antibodies to membrane-bound CD39. This cell-based screening procedure can be completely accomplished within one day, significantly improving the throughput and efficiency of hybridoma screening. Furthermore, measuring direct antibody binding to living cells eliminated both false positive and false negative hits. The image cytometry method was highly sensitive and versatile, and could detect positive antibody in supernatants at concentrations as low as ~5 ng/mL, with concurrent determination of the Kd binding affinity coefficient. We propose that this screening method will greatly facilitate antibody discovery and screening technologies. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. High-throughput screening of saliva for early detection of oral cancer: a pilot study.

    Science.gov (United States)

    Szanto, I; Mark, L; Bona, A; Maasz, G; Sandor, B; Gelencser, G; Turi, Z; Gallyas, F

    2012-04-01

    The success of tumour therapy depends considerably on early diagnosis. Therefore, we aimed to develop a widely available, cheap, non-invasive, high-throughput method suitable for screening at least high-risk populations for early signs of malignant transformation in the oral cavity. First, in order to identify suitable tumour marker candidates, we compared the protein patterns of five selected saliva samples obtained from healthy controls and tumour patients after electrophoretic separation, excised the bands that were consistently up-regulated in the tumour patients only, and performed matrix-assisted laser-desorption ionisation (MALDI)-time of flight (TOF) tandem mass spectrometry (MS/MS) analysis of the proteins in these bands after in-gel tryptic digestion. From the panel of proteins identified, we chose annexin 1 and peroxiredoxin 2 for further studies based on their presence in the saliva of all five oral cancer patients only. Then, we performed a homology search of protein databases using the primary sequence of each in silico tryptic fragment peptide of these two proteins as bait, and selected a unique peptide for each. Finally, we performed targeted MALDI-TOF MS peptide analysis in a blinded fashion on all samples obtained from 20 healthy controls and 22 tumour patients for the presence of these peptides. We found both peptides present in the saliva samples of all cancer patients only. Even though these tumour markers should be validated in a wider population, our results indicate that targeted MALDI-TOF MS analysis of unique peptides of putative saliva protein tumour biomarkers could be the method of choice for cost-efficient, high-throughput screening for the early detection of oral cancer.

  4. Cyber-T web server: differential analysis of high-throughput data.

    Science.gov (United States)

    Kayala, Matthew A; Baldi, Pierre

    2012-07-01

    The Bayesian regularization method for high-throughput differential analysis, described in Baldi and Long (A Bayesian framework for the analysis of microarray expression data: regularized t-test and statistical inferences of gene changes. Bioinformatics 2001; 17: 509-519) and implemented in the Cyber-T web server, is one of the most widely validated. Cyber-T implements a t-test using a Bayesian framework to compute a regularized variance of the measurements associated with each probe under each condition. This regularized estimate is derived by flexibly combining the empirical measurements with a prior, or background, derived from pooling measurements associated with probes in the same neighborhood. This approach flexibly addresses problems associated with low replication levels and technology biases, not only for DNA microarrays, but also for other technologies, such as protein arrays, quantitative mass spectrometry and next-generation sequencing (RNA-seq). Here we present an update to the Cyber-T web server, incorporating several useful new additions and improvements. Several preprocessing data normalization options are included, such as logarithmic and variance stabilizing normalization (VSN) transforms. To augment two-sample t-tests, a one-way analysis of variance is implemented. Several methods for multiple tests correction, including standard frequentist methods and a probabilistic mixture model treatment, are available. Diagnostic plots allow visual assessment of the results. The web server provides comprehensive documentation and example data sets. The Cyber-T web server, with R source code and data sets, is publicly available at http://cybert.ics.uci.edu/.
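
    The regularized variance at the heart of Cyber-T can be written compactly. The sketch below paraphrases the Baldi-Long formula as we read the 2001 paper (it is not the server's code): a probe's empirical variance is shrunk toward a background variance pooled from probes in the same intensity neighborhood.

    # Sketch of a Baldi-Long style regularized variance (our paraphrase).
    import numpy as np

    def regularized_var(x: np.ndarray, sigma0_sq: float, v0: float) -> float:
        """Combine empirical variance with a pooled neighborhood prior.

        x         : replicate measurements for one probe under one condition
        sigma0_sq : background variance pooled from neighboring probes
        v0        : pseudo-counts controlling the weight of the prior
        """
        n = len(x)
        s_sq = x.var(ddof=1) if n > 1 else 0.0
        return (v0 * sigma0_sq + (n - 1) * s_sq) / (v0 + n - 2)

    # Toy example: 3 replicates, prior worth 10 pseudo-observations.
    x = np.array([7.9, 8.3, 8.1])
    print(f"regularized variance: {regularized_var(x, sigma0_sq=0.05, v0=10):.4f}")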

  5. Ontology-based meta-analysis of global collections of high-throughput public data.

    Directory of Open Access Journals (Sweden)

    Ilya Kupershmidt

    2010-09-01

    Full Text Available The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.

  6. Ontology-based meta-analysis of global collections of high-throughput public data.

    Science.gov (United States)

    Kupershmidt, Ilya; Su, Qiaojuan Jane; Grewal, Anoop; Sundaresh, Suman; Halperin, Inbal; Flynn, James; Shekar, Mamatha; Wang, Helen; Park, Jenny; Cui, Wenwu; Wall, Gregory D; Wisotzkey, Robert; Alag, Satnam; Akhtari, Saeid; Ronaghi, Mostafa

    2010-09-29

    The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.

  7. High-throughput metabolic state analysis: The missing link in integrated functional genomics of yeasts

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Moxley, Joel. F; Åkesson, Mats Fredrik

    2005-01-01

    … that achieve comparable throughput, effort and cost compared with DNA arrays. Our sample workup method enables simultaneous metabolite measurements throughout central carbon metabolism and amino acid biosynthesis, using a standard GC-MS platform that was optimized for this purpose. As an implementation proof-of-concept, we assayed metabolite levels in two yeast strains and two different environmental conditions in the context of metabolic pathway reconstruction. We demonstrate that these differential metabolite level data distinguish among sample types, such as typical metabolic fingerprinting or footprinting. More …

  8. Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.

    Science.gov (United States)

    Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong

    2008-04-01

    The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.

  9. A rapid enzymatic assay for high-throughput screening of adenosine-producing strains

    Science.gov (United States)

    Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei

    2015-01-01

    Adenosine is a major local regulator of tissue function and industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay of adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutrient components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
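
    The spike-and-recovery test mentioned above has a simple arithmetic core. A minimal sketch with invented readings (not the paper's data):

    # Sketch: percent recovery of a known adenosine spike added to broth.
    def percent_recovery(spiked: float, unspiked: float, spike_added: float) -> float:
        """Recovery (%) = (measured spiked sample - measured base sample) / spike."""
        return 100.0 * (spiked - unspiked) / spike_added

    base_mM   = 1.20   # adenosine measured in broth before spiking (hypothetical)
    spiked_mM = 2.15   # measured after adding a 1.00 mM spike (hypothetical)
    print(f"recovery: {percent_recovery(spiked_mM, base_mM, 1.00):.1f}%")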

  10. Cancer panomics: computational methods and infrastructure for integrative analysis of cancer high-throughput "omics" data

    DEFF Research Database (Denmark)

    Brunak, Søren; De La Vega, Francisco M.; Rätsch, Gunnar

    2014-01-01

    Targeted cancer treatment is becoming the goal of newly developed oncology medicines and has already shown promise in some spectacular cases such as the case of BRAF kinase inhibitors in BRAF-mutant (e.g. V600E) melanoma. These developments are driven by the advent of high-throughput sequencing, which continues to drop in cost, and which has enabled the sequencing of the genome, transcriptome, and epigenome of the tumors of a large number of cancer patients in order to discover the molecular aberrations that drive the oncogenesis of several types of cancer. Applying these technologies in the clinic promises to transform cancer treatment by identifying therapeutic vulnerabilities of each patient's tumor. These approaches will need to address the panomics of cancer--the integration of the complex combination of patient-specific characteristics that drive the development of each person's tumor …

  11. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies: small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  12. A high-throughput, multi-channel photon-counting detector with picosecond timing

    International Nuclear Information System (INIS)

    Lapington, J.S.; Fraser, G.W.; Miller, G.M.; Ashton, T.J.R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies: small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  13. Development of high-throughput analysis system using highly-functional organic polymer monoliths

    International Nuclear Information System (INIS)

    Umemura, Tomonari; Kojima, Norihisa; Ueki, Yuji

    2008-01-01

    The growing demand for high-throughput analysis in the current competitive life sciences and industries has promoted the development of high-speed HPLC techniques and tools. As one of such tools, monolithic columns have attracted increasing attention and interest in the last decade due to the low flow-resistance and excellent mass transfer, allowing for rapid separations and reactions at high flow rates with minimal loss of column efficiency. Monolithic materials are classified into two main groups: silica- and organic polymer-based monoliths, each with their own advantages and disadvantages. Organic polymer monoliths have several distinct advantages in life-science research, including wide pH stability, less irreversible adsorption, facile preparation and modification. Thus, we have so far tried to develop organic polymer monoliths for various chemical operations, such as separation, extraction, preconcentration, and reaction. In the present paper, recent progress in the development of organic polymer monoliths is discussed. Especially, the procedure for the preparation of methacrylate-based monoliths with various functional groups is described, where the influence of different compositional and processing parameters on the monolithic structure is also addressed. Furthermore, the performance of the produced monoliths is demonstrated through the results for (1) rapid separations of alkylbenzenes at high flow rates, (2) flow-through enzymatic digestion of cytochrome c on a trypsin-immobilized monolithic column, and (3) separation of the tryptic digest on a reversed-phase monolithic column. The flexibility and versatility of organic polymer monoliths will be beneficial for further enhancing analytical performance, and will open the way for new applications and opportunities both in scientific and industrial research. (author)

  14. Cost-effectiveness analysis of rotavirus vaccination in Argentina.

    Science.gov (United States)

    Urueña, Analía; Pippo, Tomás; Betelu, María Sol; Virgilio, Federico; Hernández, Laura; Giglio, Norberto; Gentile, Ángela; Diosque, Máximo; Vizzotti, Carla

    2015-05-07

    Rotavirus is a leading cause of severe diarrhea in children under 5. In Argentina, the most affected regions are the Northeast and Northwest, where hospitalizations and deaths are more frequent. This study estimated the cost-effectiveness of adding either of the two licensed rotavirus vaccines to the routine immunization schedule. The integrated TRIVAC vaccine cost-effectiveness model from the Pan American Health Organization's ProVac Initiative (Version 2.0) was used to assess health benefits, costs savings, life-years gained (LYGs), DALYs averted, and cost/DALY averted of vaccinating 10 successive cohorts, from the health care system and societal perspectives. Two doses of monovalent (RV1) rotavirus vaccine and three doses of pentavalent (RV5) rotavirus vaccine were each compared to a scenario assuming no vaccination. The price/dose was US$ 7.50 and US$ 5.15 for RV1 and RV5, respectively. We ran both a national and sub-national analysis, discounting all costs and benefits 3% annually. Our base case results were compared to a range of alternative univariate and multivariate scenarios. The number of LYGs was 5962 and 6440 for RV1 and RV5, respectively. The cost/DALY averted when compared to no vaccination from the health care system and societal perspective was: US$ 3870 and US$ 1802 for RV1, and US$ 2414 and US$ 358 for RV5, respectively. Equivalent figures for the Northeast were US$ 1470 and US$ 636 for RV1, and US$ 913 and US$ 80 for RV5. Therefore, rotavirus vaccination was more cost-effective in the Northeast compared to the whole country; and, in the Northwest, health service's costs saved outweighed the cost of introducing the vaccine. Vaccination with either vaccine compared to no vaccination was highly cost-effective based on WHO guidelines and Argentina's 2011 per capita GDP of US$ 9090. Key variables influencing results were vaccine efficacy, annual loss of efficacy, relative coverage of deaths, vaccine price, and discount rate. Compared to no

  15. A simpler sampling interface of venturi easy ambient sonic-spray ionization mass spectrometry for high-throughput screening enzyme inhibitors.

    Science.gov (United States)

    Liu, Ning; Liu, Yang; Yang, YuHan; He, Lan; Ouyang, Jin

    2016-03-24

    High-throughput screening (HTS) is often required in enzyme inhibitor drug screening. Mass spectrometry (MS) provides a powerful method for high-throughput screening of enzyme inhibitors because of its high speed, sensitivity and label-free operation. However, most MS methods need a complicated sampling interface system, and in these cases overall throughput is limited by sample loading. In this study, we develop a simple interface which couples a droplet-segmented system to a venturi easy ambient sonic-spray ionization mass spectrometer. It is fabricated using a single capillary that acts as both sampling probe and emitter, which simplifies the construction, reduces the cost and shortens the sampling time. Samples drawn in by the venturi effect are segmented into nanoliter plugs by air, and the plugs can then be detected by MS directly. This system eliminates the need for the flow injection commonly used in classic schemes. The new system is applied to screening angiotensin converting enzyme inhibitors. High throughput was achieved in analyzing 96 samples at 1.6 s per sample, with plug formation at 0.5 s per sample. Carry-over between samples was less than 5%, and the peak height RSD was 2.92% (n = 15). Dose-response curves of 3 known inhibitors were also measured to validate its potential in drug discovery; the calculated IC50 values agreed well with reported values. Copyright © 2016 Elsevier B.V. All rights reserved.
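
    Dose-response data of the kind mentioned at the end of this record are commonly analyzed with a four-parameter logistic fit. A hedged sketch with hypothetical data (the authors' exact fitting procedure is not stated in the abstract):

    # Sketch: estimate IC50 by fitting a four-parameter logistic curve.
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(dose, bottom, top, ic50, hill):
        """Four-parameter logistic: activity as a function of inhibitor dose."""
        return bottom + (top - bottom) / (1.0 + (dose / ic50) ** hill)

    # Hypothetical data: inhibitor concentration (uM) vs. residual activity (%).
    dose     = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
    activity = np.array([98.0, 95.0, 88.0, 70.0, 48.0, 25.0, 12.0, 6.0])

    params, _ = curve_fit(four_pl, dose, activity,
                          p0=[5.0, 100.0, 1.0, 1.0], maxfev=10000)
    bottom, top, ic50, hill = params
    print(f"estimated IC50 = {ic50:.2f} uM (Hill slope {hill:.2f})")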

  16. Bacterial Pathogens and Community Composition in Advanced Sewage Treatment Systems Revealed by Metagenomics Analysis Based on High-Throughput Sequencing

    Science.gov (United States)

    Lu, Xin; Zhang, Xu-Xiang; Wang, Zhu; Huang, Kailong; Wang, Yuan; Liang, Weigang; Tan, Yunfei; Liu, Bo; Tang, Junying

    2015-01-01

    This study used 454 pyrosequencing, Illumina high-throughput sequencing and metagenomic analysis to investigate bacterial pathogens and their potential virulence in a sewage treatment plant (STP) applying both conventional and advanced treatment processes. Pyrosequencing and Illumina sequencing consistently demonstrated that the Arcobacter genus occupied over 43.42% of the total abundance of potential pathogens in the STP. At the species level, the potential pathogens Arcobacter butzleri, Aeromonas hydrophila and Klebsiella pneumoniae dominated in raw sewage, which was also confirmed by quantitative real time PCR. Illumina sequencing also revealed the prevalence of various types of pathogenicity islands and virulence proteins in the STP. Most of the potential pathogens and virulence factors were eliminated in the STP, and the removal efficiency mainly depended on the oxidation ditch. Compared with sand filtration, magnetic resin seemed to achieve higher removal of most of the potential pathogens and virulence factors. However, the presence of residual A. butzleri in the final effluent still deserves attention. The findings indicate that sewage acts as an important source of environmental pathogens, but STPs can effectively control their spread in the environment. Joint use of high-throughput sequencing technologies is considered a reliable method for a deep and comprehensive overview of environmental bacterial virulence. PMID:25938416

  17. Reverse Phase Protein Arrays for High-throughput Toxicity Screening

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    High-throughput screening is extensively applied for the identification of drug targets and drug discovery, and it has recently found entry into toxicity testing. Reverse phase protein arrays (RPPAs) are widely used for quantification of protein markers. We reasoned that RPPAs can also be utilized beneficially in automated high-throughput toxicity testing. An advantage of using RPPAs is that, in addition to the baseline toxicity readout, they allow testing of multiple markers of toxicity, such as inflammatory responses, which do not necessarily culminate in cell death. We used transfection of siRNAs with known killing effects as a model system to demonstrate that RPPA-based protein quantification can serve as a substitute readout of cell viability, hereby reliably reflecting toxicity. In terms of automation, cell exposure, protein harvest, serial dilution and sample reformatting were performed using …

  18. Nanosphere Templating Through Controlled Evaporation: A High Throughput Method For Building SERS Substrates

    Science.gov (United States)

    Alexander, Kristen; Hampton, Meredith; Lopez, Rene; Desimone, Joseph

    2009-03-01

    When a pair of noble metal nanoparticles are brought close together, the plasmonic properties of the pair (known as a ``dimer'') give rise to intense electric field enhancements in the interstitial gap. These fields present a simple yet exquisitely sensitive system for performing single molecule surface-enhanced Raman spectroscopy (SM-SERS). Problems associated with current fabrication methods of SERS-active substrates include reproducibility issues, high cost of production and low throughput. In this study, we present a novel method for the high throughput fabrication of high quality SERS substrates. Using a polymer templating technique followed by the placement of thiolated nanoparticles through meniscus force deposition, we are able to fabricate large arrays of identical, uniformly spaced dimers in a quick, reproducible manner. Subsequent theoretical and experimental studies have confirmed the strong dependence of the SERS enhancement on both substrate geometry (e.g. dimer size, shape and gap size) and the polarization of the excitation source.

  19. Quantum dots for a high-throughput Pfu polymerase based multi-round polymerase chain reaction (PCR).

    Science.gov (United States)

    Sang, Fuming; Zhang, Zhizhou; Yuan, Lin; Liu, Deli

    2018-02-26

    Multi-round PCR is an important technique for obtaining enough target DNA from rare DNA resources, and is commonly used in many fields including forensic science, ancient DNA analysis and cancer research. However, multi-round PCR is often aborted, largely due to the accumulation of non-specific amplification during repeated amplifications. Here, we developed a Pfu polymerase based multi-round PCR technique assisted by quantum dots (QDs). Different PCR assays, DNA polymerases (Pfu and Taq), DNA sizes and GC contents were compared in this study. In the presence of QDs, PCR specificity could be retained even in the ninth round of amplification. Moreover, the longer and more complex the targets were, the earlier abortion occurred in multi-round PCR. However, no obvious enhancement of specificity was found in multi-round PCR using Taq DNA polymerase. Significantly, the fidelity of Pfu polymerase based multi-round PCR was not sacrificed in the presence of QDs. In addition, pre-incubation at 50 °C for an hour had no impact on multi-round PCR performance, which further confirms the hot-start effect that QDs exert in multi-round PCR. The findings of this study demonstrate that a cost-effective and promising multi-round PCR technique for large-scale and high-throughput sample analysis can be established with high specificity, sensitivity and accuracy.

  20. High-throughput screening (HTS) and modeling of the retinoid ...

    Science.gov (United States)

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.

  1. Using high-throughput barcode sequencing to efficiently map connectomes.

    Science.gov (United States)

    Peikon, Ian D; Kebschull, Justus M; Vagin, Vasily V; Ravens, Diana I; Sun, Yu-Chi; Brouzes, Eric; Corrêa, Ivan R; Bressan, Dario; Zador, Anthony M

    2017-07-07

    The function of a neural circuit is determined by the details of its synaptic connections. At present, the only available method for determining a neural wiring diagram with single synapse precision-a 'connectome'-is based on imaging methods that are slow, labor-intensive and expensive. Here, we present SYNseq, a method for converting the connectome into a form that can exploit the speed and low cost of modern high-throughput DNA sequencing. In SYNseq, each neuron is labeled with a unique random nucleotide sequence-an RNA 'barcode'-which is targeted to the synapse using engineered proteins. Barcodes in pre- and postsynaptic neurons are then associated through protein-protein crosslinking across the synapse, extracted from the tissue, and joined into a form suitable for sequencing. Although our failure to develop an efficient barcode joining scheme precludes the widespread application of this approach, we expect that with further development SYNseq will enable tracing of complex circuits at high speed and low cost. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
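
    Computationally, the end product of a barcode-joining scheme like SYNseq is just a table of pre/post barcode pairs. As an illustration only (toy data, not the paper's pipeline), counting pair occurrences in the sequenced reads yields a weighted connectivity matrix:

    # Sketch: build a connectivity table from sequenced barcode pairs.
    from collections import Counter

    # Hypothetical sequenced pairs: (presynaptic barcode, postsynaptic barcode)
    reads = [
        ("AACGT", "TTGCA"), ("AACGT", "TTGCA"), ("GGATC", "TTGCA"),
        ("AACGT", "CCGTA"), ("GGATC", "CCGTA"), ("GGATC", "CCGTA"),
    ]

    connectivity = Counter(reads)  # read count per (pre, post) neuron pair
    for (pre, post), n in sorted(connectivity.items()):
        print(f"{pre} -> {post}: {n} reads")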

  2. The cost-effectiveness and cost-utility of high-dose palliative radiotherapy for advanced non-small-cell lung cancer

    International Nuclear Information System (INIS)

    Coy, Peter; Schaafsma, Joseph; Schofield, John A.

    2000-01-01

    Purpose: To compute cost-effectiveness/cost-utility (CE/CU) ratios, from the treatment clinic and societal perspectives, for high-dose palliative radiotherapy treatment (RT) for advanced non-small-cell lung cancer (NSCLC) against best supportive care (BSC) as comparator, and thereby demonstrate a method for computing CE/CU ratios when randomized clinical trial (RCT) data cannot be generated. Methods and Materials: Unit cost estimates based on an earlier reported 1989-90 analysis of treatment costs at the Vancouver Island Cancer Centre, Victoria, British Columbia, Canada, are updated to 1997-1998 and then used to compute the incremental cost of an average dose of high-dose palliative RT. The incremental number of life days and quality-adjusted life days (QALDs) attributable to treatment are from earlier reported regression analyses of the survival and quality-of-life data from patients who enrolled prospectively in a lung cancer management cost-effectiveness study at the clinic over a 2-year period from 1990 to 1992. Results: The baseline CE and CU ratios are $9245 Cdn per life year (LY) and $12,836 per quality-adjusted life year (QALY), respectively, from the clinic perspective; and $12,253/LY and $17,012/QALY, respectively, from the societal perspective. Multivariate sensitivity analysis for the CE ratio produces a range of $5513-28,270/LY from the clinic perspective, and $7307-37,465/LY from the societal perspective. Similar calculations for the CU ratio produce a range of $7205-37,134/QALY from the clinic perspective, and $9550-49,213/QALY from the societal perspective. Conclusion: The cost effectiveness and cost utility of high-dose palliative RT for advanced NSCLC compares favorably with the cost effectiveness of other forms of treatment for NSCLC, of treatments of other forms of cancer, and of many other commonly used medical interventions; and lies within the US $50,000/QALY benchmark often cited for cost-effective care
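
    For readers unfamiliar with CE/CU ratios, the core arithmetic is simple: incremental cost divided by incremental life-years or quality-adjusted life-years. A back-of-envelope sketch with invented numbers (not the study's figures):

    # Sketch of CE/CU ratio arithmetic for RT vs. best supportive care.
    incremental_cost_cad = 6500.0   # extra cost of RT vs. BSC, hypothetical
    life_days_gained     = 256.0    # incremental survival, hypothetical
    qald_gained          = 185.0    # quality-adjusted life-days, hypothetical

    cost_per_ly   = incremental_cost_cad / (life_days_gained / 365.25)
    cost_per_qaly = incremental_cost_cad / (qald_gained / 365.25)
    print(f"CE ratio: ${cost_per_ly:,.0f}/LY, CU ratio: ${cost_per_qaly:,.0f}/QALY")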

  3. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate a considerable amount of data which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as open-source software (https://github.com/pmadanecki/htdp).
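
    HTDP itself is a Java GUI tool; for readers who want a feel for the class of operation it automates, here is a minimal Python sketch of criteria-file-driven filtering of a tab-delimited table. The file names and the GENE column are hypothetical, not HTDP's API:

    # Sketch: filter a tab-delimited table using an external criteria file.
    import csv

    # Load gene symbols of interest from a criteria file (one per line).
    with open("genes_of_interest.txt") as f:
        wanted_genes = {line.strip() for line in f if line.strip()}

    # Stream a tab-delimited table, keeping only rows matching the criteria.
    with open("variants.tsv") as src, open("filtered.tsv", "w", newline="") as dst:
        reader = csv.DictReader(src, delimiter="\t")
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames, delimiter="\t")
        writer.writeheader()
        for row in reader:
            if row["GENE"] in wanted_genes:
                writer.writerow(row)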

  4. Cost-effectiveness Analysis with Influence Diagrams.

    Science.gov (United States)

    Arias, M; Díez, F J

    2015-01-01

    Cost-effectiveness analysis (CEA) is used increasingly in medicine to determine whether the health benefit of an intervention is worth the economic cost. Decision trees, the standard decision modeling technique for non-temporal domains, can only perform CEA for very small problems. To develop a method for CEA in problems involving several dozen variables. We explain how to build influence diagrams (IDs) that explicitly represent cost and effectiveness. We propose an algorithm for evaluating cost-effectiveness IDs directly, i.e., without expanding an equivalent decision tree. The evaluation of an ID returns a set of intervals for the willingness to pay - separated by cost-effectiveness thresholds - and, for each interval, the cost, the effectiveness, and the optimal intervention. The algorithm that evaluates the ID directly is in general much more efficient than the brute-force method, which is in turn more efficient than the expansion of an equivalent decision tree. Using OpenMarkov, an open-source software tool that implements this algorithm, we have been able to perform CEAs on several IDs whose equivalent decision trees contain millions of branches. IDs can perform CEA on large problems that cannot be analyzed with decision trees.
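
    The interval output described above follows from the standard net-monetary-benefit decision rule: for each willingness to pay lambda, the optimal intervention maximizes NMB = lambda * effectiveness - cost, and scanning lambda reveals the thresholds where the optimum switches. A minimal sketch with hypothetical interventions (this is the decision rule, not OpenMarkov's evaluation algorithm):

    # Sketch: find willingness-to-pay intervals by scanning net monetary benefit.
    interventions = {               # hypothetical (cost, effectiveness) pairs
        "no treatment": (0.0, 1.0),
        "drug A":       (4000.0, 1.5),
        "drug B":       (9000.0, 1.7),
    }

    def optimal(lmbda: float) -> str:
        """Intervention maximizing NMB = lambda * effectiveness - cost."""
        return max(interventions,
                   key=lambda k: lmbda * interventions[k][1] - interventions[k][0])

    previous = None
    for lmbda in range(0, 40001, 100):   # scan willingness to pay
        best = optimal(lmbda)
        if best != previous:
            print(f"lambda >= {lmbda}: optimal intervention is {best}")
            previous = best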

  5. High-throughput spectrometer designs in a compact form-factor: principles and applications

    Science.gov (United States)

    Norton, S. M.

    2013-05-01

    Many compact, portable Raman spectrometers have entered the market in the past few years with applications in narcotics and hazardous material identification, as well as verification applications in pharmaceuticals and security screening. Often, the required compact form-factor has forced designers to sacrifice throughput and sensitivity for portability and low-cost. We will show that a volume phase holographic (VPH)-based spectrometer design can achieve superior throughput and thus sensitivity over conventional Czerny-Turner reflective designs. We will look in depth at the factors influencing throughput and sensitivity and illustrate specific VPH-based spectrometer examples that highlight these design principles.

  6. Creation of a small high-throughput screening facility.

    Science.gov (United States)

    Flak, Tod

    2009-01-01

    The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.

  7. A high-throughput method for assessing chemical toxicity using a Caenorhabditis elegans reproduction assay

    International Nuclear Information System (INIS)

    Boyd, Windy A.; McBride, Sandra J.; Rice, Julie R.; Snyder, Daniel W.; Freedman, Jonathan H.

    2010-01-01

    The National Research Council has outlined the need for non-mammalian toxicological models to test the potential health effects of a large number of chemicals while also reducing the use of traditional animal models. The nematode Caenorhabditis elegans is an attractive alternative model because of its well-characterized and evolutionarily conserved biology, low cost, and suitability for high-throughput screening. A high-throughput method is described for quantifying the reproductive capacity of C. elegans exposed to chemicals for 48 h, from the last larval stage (L4) to adulthood, using a COPAS Biosort. Initially, the effects of exposure conditions that could influence reproduction were defined. Concentrations of DMSO vehicle ≤ 1% did not affect reproduction. Previous studies indicated that C. elegans may be influenced by exposure to low-pH conditions. At pHs greater than 4.5, C. elegans reproduction was not affected; however, below this pH there was a significant decrease in the number of offspring. Cadmium chloride was chosen as a model toxicant to verify that automated measurements were comparable to those of traditional observational studies. EC50 values for cadmium from automated measurements (176-192 μM) were comparable to those previously reported for a 72-h exposure using manual counting (151 μM). The toxicity of seven test toxicants on C. elegans reproduction correlated strongly with rodent lethality, suggesting that this assay may be useful in predicting the potential toxicity of chemicals in other organisms.
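
    The EC50 estimation step lends itself to a compact illustration: fit a decreasing Hill curve to offspring counts versus toxicant concentration and read off the half-maximal concentration. This is a sketch only; the concentrations and counts below are invented, not the study's data.

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(conc, top, ec50, slope):
            # Offspring count as a decreasing Hill function of concentration.
            return top / (1.0 + (conc / ec50) ** slope)

        conc = np.array([10.0, 30.0, 100.0, 180.0, 300.0, 600.0])   # uM (hypothetical)
        offspring = np.array([98.0, 92.0, 70.0, 48.0, 25.0, 8.0])   # mean per worm

        params, _ = curve_fit(hill, conc, offspring, p0=(100.0, 150.0, 1.0))
        print(f"estimated EC50 = {params[1]:.0f} uM")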

  8. The main challenges that remain in applying high-throughput sequencing to clinical diagnostics.

    Science.gov (United States)

    Loeffelholz, Michael; Fofanov, Yuriy

    2015-01-01

    Over the last 10 years, the quality, price and availability of high-throughput sequencing instruments have improved to the point that this technology may be close to becoming a routine tool in the diagnostic microbiology laboratory. Two groups of challenges, however, have to be resolved in order to move this powerful research technology into routine use in the clinical microbiology laboratory. The computational/bioinformatics challenges include data storage cost and privacy concerns, requiring analysis to be performed without access to cloud storage or expensive computational infrastructure. The logistical challenges include interpretation of complex results and acceptance and understanding of the advantages and limitations of this technology by the medical community. This article focuses on the approaches to address these challenges, such as file formats, algorithms, data collection, reporting and good laboratory practices.

  9. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.

  10. Quantitative description on structure-property relationships of Li-ion battery materials for high-throughput computations

    Science.gov (United States)

    Wang, Youwei; Zhang, Wenqing; Chen, Lidong; Shi, Siqi; Liu, Jianjun

    2017-12-01

    Li-ion batteries are a key technology for addressing the global challenges of clean renewable energy and environmental pollution. Their contemporary applications, in portable electronic devices, electric vehicles, and large-scale power grids, stimulate the development of high-performance battery materials with high energy density, high power, good safety, and long lifetime. High-throughput calculations provide a practical strategy to discover new battery materials and optimize the performance of known ones. Most cathode materials screened by previous high-throughput calculations cannot meet the requirements of practical applications because only the capacity, voltage, and volume change of the bulk material were considered. It is important to include more structure-property relationships in high-throughput calculations, such as point defects, surfaces and interfaces, doping and metal mixing, and nanosize effects. In this review, we establish quantitative descriptions of structure-property relationships in Li-ion battery materials in terms of intrinsic bulk parameters, which can be applied in future high-throughput calculations to screen Li-ion battery materials. Based on these parameterized structure-property relationships, a possible high-throughput computational screening flow path is proposed to obtain high-performance battery materials.

  11. Optimization and high-throughput screening of antimicrobial peptides.

    Science.gov (United States)

    Blondelle, Sylvie E; Lohner, Karl

    2010-01-01

    While a well-established process for lead-compound discovery in for-profit companies, high-throughput screening is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries combined with easy and less expensive access to new technologies has greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole-cell-based assays. In particular, the urgent need for new antimicrobial compounds to overcome the rapid rise of drug-resistant microorganisms, where multiple-target or cell-based assays are often required, has forced scientists to focus on high-throughput technologies. Based on their existence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope for discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability of large amounts of high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences as well as the latest structural knowledge and optimization processes aimed at improving peptide selectivity.

  12. High-throughput tandem mass spectrometry multiplex analysis for newborn urinary screening of creatine synthesis and transport disorders, Triple H syndrome and OTC deficiency.

    Science.gov (United States)

    Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela

    2014-09-25

    Creatine synthesis and transport disorders, Triple H syndrome, and ornithine transcarbamylase deficiency are treatable inborn errors of metabolism. Early screening of patients was found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. A high-throughput methodology was devised for the simultaneous analysis of creatine, guanidineacetic acid, orotic acid, uracil, creatinine, and respective internal standards, using both positive and negative electrospray ionization modes, depending on the compound. The precision and accuracy were suitable for screening for these inherited disorders by biochemical laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Nonintravenous rescue medications for pediatric status epilepticus: A cost-effectiveness analysis.

    Science.gov (United States)

    Sánchez Fernández, Iván; Gaínza-Lein, Marina; Loddenkemper, Tobias

    2017-08-01

    To quantify the cost-effectiveness of rescue medications for pediatric status epilepticus: rectal diazepam, nasal midazolam, buccal midazolam, intramuscular midazolam, and nasal lorazepam. Decision analysis model populated with effectiveness data from the literature and cost data from publicly available market prices. The primary outcome was cost per seizure stopped ($/SS). One-way sensitivity analyses and second-order Monte Carlo simulations evaluated the robustness of the results across wide variations of the input parameters. The most cost-effective rescue medication was buccal midazolam (incremental cost-effectiveness ratio [ICER]: $13.16/SS), followed by nasal midazolam (ICER: $38.19/SS). Nasal lorazepam (ICER: -$3.8/SS), intramuscular midazolam (ICER: -$64/SS), and rectal diazepam (ICER: -$2,246.21/SS) are never more cost-effective than the other options at any willingness to pay. One-way sensitivity analysis showed the following: (1) at its current effectiveness, rectal diazepam would become the most cost-effective option only if its cost were $6 or less, and (2) at its current cost, rectal diazepam would become the most cost-effective option only if its effectiveness were higher than 0.89 (and only with a very high willingness to pay of $2,859/SS to $31,447/SS). Second-order Monte Carlo simulations showed the following: (1) nasal midazolam and intramuscular midazolam were the more effective options; (2) the more cost-effective option was buccal midazolam for a willingness to pay from $14/SS to $41/SS and nasal midazolam for a willingness to pay above $41/SS; (3) cost-effectiveness overlapped for buccal midazolam, nasal lorazepam, intramuscular midazolam, and nasal midazolam; and (4) rectal diazepam was not cost-effective at any willingness to pay, and this conclusion remained extremely robust to wide variations of the input parameters. For pediatric status epilepticus, buccal midazolam and nasal midazolam are the most cost-effective nonintravenous rescue medications.
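
    The ICER arithmetic behind the reported figures is simple: order the options by effectiveness and divide the incremental cost by the incremental effectiveness between successive options. The sketch below uses made-up (cost, probability-of-seizure-stopped) pairs, not the study's inputs; note that a negative ICER, as with rectal diazepam above, signals a dominated option (more costly yet less effective than an alternative, or vice versa).

        # Hypothetical (cost in $, effectiveness as probability of seizure stopped).
        medications = {
            "rectal diazepam":  (150.0, 0.70),
            "buccal midazolam": (25.0, 0.75),
            "nasal midazolam":  (35.0, 0.80),
        }

        # Rank by effectiveness, then compute each ICER against the previous option.
        ranked = sorted(medications.items(), key=lambda kv: kv[1][1])
        for (lo, (c_lo, e_lo)), (hi, (c_hi, e_hi)) in zip(ranked, ranked[1:]):
            icer = (c_hi - c_lo) / (e_hi - e_lo)
            print(f"{hi} vs {lo}: ICER = ${icer:.2f} per additional seizure stopped")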

  14. Application of high-throughput sequencing in understanding human oral microbiome related with health and disease

    OpenAIRE

    Chen, Hui; Jiang, Wen

    2014-01-01

    The oral microbiome is one of the most diverse habitats in the human body and is closely related to oral health and disease. As sequencing techniques have developed, high-throughput sequencing has become a popular approach for oral microbial analysis. Oral bacterial profiles have been studied to explore the relationship between microbial diversity and oral diseases such as caries and periodontal disease. This review describes the application of high-throughput sequencing for characterization...

  15. Cost-effective analysis of PET application in NSCLC

    International Nuclear Information System (INIS)

    Gu Aichun; Liu Jianjun; Sun Xiaoguang; Shi Yiping; Huang Gang

    2006-01-01

    Objective: To evaluate the cost-effectiveness of PET and CT for the diagnosis of non-small cell lung cancer (NSCLC) in China. Methods: Using a decision analysis method, the diagnostic efficiency of PET and CT for the diagnosis of NSCLC in China was analysed, and the values of cost for accurate diagnosis (CAD), cost for accurate staging (CAS), and cost for effective therapy (CAT) were calculated. Results: (1) For accurate diagnosis, CT was much more cost-effective than PET. (2) For accurate staging, CT was still more cost-effective than PET. (3) For the overall diagnostic and therapeutic cost, PET was more cost-effective than CT. (4) PET was preferable to CT for the diagnosis of stage I NSCLC. Conclusion: For the management of NSCLC patients in China, CT is more cost-effective for screening, whereas PET is more cost-effective for clinical staging and monitoring therapeutic effect. (authors)

  16. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    Science.gov (United States)

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  17. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  18. Zebrafish: A marvel of high-throughput biology for 21st century toxicology.

    Science.gov (United States)

    Bugel, Sean M; Tanguay, Robert L; Planchart, Antonio

    2014-09-07

    The evolutionary conservation of genomic, biochemical and developmental features between zebrafish and humans is gradually coming into focus with the end result that the zebrafish embryo model has emerged as a powerful tool for uncovering the effects of environmental exposures on a multitude of biological processes with direct relevance to human health. In this review, we highlight advances in automation, high-throughput (HT) screening, and analysis that leverage the power of the zebrafish embryo model for unparalleled advances in our understanding of how chemicals in our environment affect our health and wellbeing.

  19. A high-throughput protocol for mutation scanning of the BRCA1 and BRCA2 genes

    International Nuclear Information System (INIS)

    Hondow, Heather L; Fox, Stephen B; Mitchell, Gillian; Scott, Rodney J; Beshay, Victoria; Wong, Stephen Q; Dobrovic, Alexander

    2011-01-01

    Detection of mutations by DNA sequencing can be facilitated by scanning methods to identify amplicons which may have mutations. Current scanning methods used for the detection of germline sequence variants are laborious as they require post-PCR manipulation. High resolution melting (HRM) is a cost-effective rapid screening strategy, which readily detects heterozygous variants by melting curve analysis of PCR products. It is well suited to screening genes such as BRCA1 and BRCA2 as germline pathogenic mutations in these genes are always heterozygous. Assays for the analysis of all coding regions and intron-exon boundaries of BRCA1 and BRCA2 were designed and optimised. A final set of 94 assays which ran under identical amplification conditions were chosen for BRCA1 (36) and BRCA2 (58). Significant attention was placed on primer design to enable reproducible detection of mutations within the amplicon while minimising unnecessary detection of polymorphisms. Deoxyinosine residues were incorporated into primers that overlay intronic polymorphisms. Multiple 384-well plates were used to facilitate high throughput. 169 BRCA1 and 239 BRCA2 known sequence variants were used to test the amplicons. We also performed an extensive blinded validation of the protocol with 384 separate patient DNAs. All heterozygous variants were detected with the optimised assays. This is the first HRM approach to screen the entire coding region of the BRCA1 and BRCA2 genes using one set of reaction conditions in a multi-plate 384-well format using specifically designed primers. The parallel screening of a relatively large number of samples enables better detection of sequence variants. HRM has the advantage of decreasing the necessary sequencing by more than 90%. This markedly reduced cost of sequencing will result in BRCA1 and BRCA2 mutation testing becoming accessible to individuals who currently do not undergo mutation testing because of the significant costs involved.

  20. High throughput two-step ultrasonic spray deposited CH3NH3PbI3 thin film layer for solar cell application

    Science.gov (United States)

    Lan, Ding-Hung; Hong, Shao-Huan; Chou, Li-Hui; Wang, Xiao-Feng; Liu, Cheng-Liang

    2018-06-01

    Organometal halide perovskite materials have demonstrated tremendous advances in the photovoltaic field recently because of their advantageous features of simple fabrication and high power conversion efficiency. To meet the demand for high-throughput, cost-effective fabrication, we present a wet-process method that enables probing of the parameters for perovskite layer deposition through two-step sequential ultrasonic spray-coating. This paper describes a detailed investigation of the effects of modifying the spray precursor solution (PbI2 and CH3NH3I precursor concentrations and solvents used) and the post-annealing conditions (temperature and time), which can be tuned to create optimal film quality and improve device efficiency. Through this systematic optimization, the inverted planar perovskite solar cells show reproducible photovoltaic properties with a best power conversion efficiency (PCE) of 10.40% and an average PCE of 9.70 ± 0.40%. A continuous spray-coating technique for the rapid fabrication of a total of 16 perovskite films was demonstrated, providing a viable alternative for the high-throughput production of perovskite solar cells.

  1. A Customizable Flow Injection System for Automated, High Throughput, and Time Sensitive Ion Mobility Spectrometry and Mass Spectrometry Measurements.

    Science.gov (United States)

    Orton, Daniel J; Tfaily, Malak M; Moore, Ronald J; LaMarche, Brian L; Zheng, Xueyun; Fillmore, Thomas L; Chu, Rosalie K; Weitz, Karl K; Monroe, Matthew E; Kelly, Ryan T; Smith, Richard D; Baker, Erin S

    2018-01-02

    To better understand disease conditions and environmental perturbations, multiomic studies combining proteomic, lipidomic, and metabolomic analyses are vastly increasing in popularity. In a multiomic study, a single sample is typically extracted in multiple ways, and various analyses are performed using different instruments, most often based upon mass spectrometry (MS). Thus, one sample becomes many measurements, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injections. While some FIA systems have been created to address these challenges, many have limitations such as costly consumables, low pressure capabilities, limited pressure monitoring, and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at a range of flow rates (∼50 nL/min to 500 μL/min) to accommodate both low- and high-flow MS ionization sources. This system also functions at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system, and results showed a highly robust and reproducible platform capable of providing consistent performance over many days without carryover, as long as washing buffers specific to each molecular analysis were utilized.

  2. Making choices in health: WHO guide to cost effectiveness analysis

    National Research Council Canada - National Science Library

    Tan Torres Edejer, Tessa

    2003-01-01

    Table of contents (excerpt): Part One, Methods for Generalized Cost-Effectiveness Analysis: 1. What is Generalized Cost-Effectiveness Analysis? 2. Undertaking ...

  3. High-Throughput Screening and Quantitation of Target Compounds in Biofluids by Coated Blade Spray-Mass Spectrometry.

    Science.gov (United States)

    Tascon, Marcos; Gómez-Ríos, Germán Augusto; Reyes-Garcés, Nathaly; Poole, Justen; Boyacı, Ezel; Pawliszyn, Janusz

    2017-08-15

    Most contemporary methods of screening and quantitating controlled substances and therapeutic drugs in biofluids typically require laborious, time-consuming, and expensive analytical workflows. In recent years, our group has worked toward developing microextraction (μe)-mass spectrometry (MS) technologies that merge all of the tedious steps of the classical methods into a simple, efficient, and low-cost methodology. Unquestionably, the automation of these technologies allows for faster sample throughput, greater reproducibility, and radically reduced analysis times. Coated blade spray (CBS) is a μe technology engineered for extracting/enriching analytes of interest in complex matrices, and it can be directly coupled with MS instruments to achieve efficient screening and quantitative analysis. In this study, we introduced CBS as a technology that can be arranged to perform either rapid diagnostics (single vial) or the high-throughput (96-well plate) analysis of biofluids. Furthermore, we demonstrate that performing 96-CBS extractions at the same time allows the total analysis time to be reduced to less than 55 s per sample. Aiming to validate the versatility of CBS, substances comprising a broad range of molecular weights, moieties, protein binding, and polarities were selected. Thus, the high-throughput (HT)-CBS technology was used for the concomitant quantitation of 18 compounds (mixture of anabolics, β-2 agonists, diuretics, stimulants, narcotics, and β-blockers) spiked in human urine and plasma samples. Excellent precision (∼2.5%), accuracy (≥90%), and linearity (R² ≥ 0.99) were attained for all the studied compounds, and the limits of quantitation (LOQs) were within the range of 0.1 to 10 ng·mL⁻¹ for plasma and 0.25 to 10 ng·mL⁻¹ for urine. The results reported in this paper confirm CBS's great potential for achieving sub-sixty-second analyses of target compounds in a broad range of fields such as those related to clinical diagnosis, food, the

  4. Quantitative in vitro-to-in vivo extrapolation in a high-throughput environment

    International Nuclear Information System (INIS)

    Wetmore, Barbara A.

    2015-01-01

    High-throughput in vitro toxicity screening provides an efficient way to identify potential biological targets for environmental and industrial chemicals while conserving limited testing resources. However, reliance on the nominal chemical concentrations in these in vitro assays as an indicator of bioactivity may misrepresent potential in vivo effects of these chemicals due to differences in clearance, protein binding, bioavailability, and other pharmacokinetic factors. Development of high-throughput in vitro hepatic clearance and protein binding assays and refinement of quantitative in vitro-to-in vivo extrapolation (QIVIVE) methods have provided key tools to predict xenobiotic steady state pharmacokinetics. Using a process known as reverse dosimetry, knowledge of the chemical steady state behavior can be incorporated with HTS data to determine the external in vivo oral exposure needed to achieve internal blood concentrations equivalent to those eliciting bioactivity in the assays. These daily oral doses, known as oral equivalents, can be compared to chronic human exposure estimates to assess whether in vitro bioactivity would be expected at the dose-equivalent level of human exposure. This review will describe the use of QIVIVE methods in a high-throughput environment and the promise they hold in shaping chemical testing priorities and, potentially, high-throughput risk assessment strategies.
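
    Under an assumption of linear steady-state kinetics, the reverse-dosimetry step reduces to a one-line conversion: divide the in vitro bioactive concentration by the steady-state blood concentration predicted for a unit dose rate. The numbers below are hypothetical; in practice the steady-state prediction comes from pharmacokinetic models such as those in the EPA's "httk" R package mentioned elsewhere in these records.

        def oral_equivalent(ac50_uM, css_uM_per_mg_kg_day):
            # Daily oral dose (mg/kg/day) whose steady-state blood concentration
            # equals the in vitro bioactive concentration; assumes linear kinetics.
            return ac50_uM / css_uM_per_mg_kg_day

        # Example: assay AC50 of 3 uM, predicted Css of 1.5 uM per 1 mg/kg/day.
        print(f"oral equivalent = {oral_equivalent(3.0, 1.5):.1f} mg/kg/day")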

  5. DESIGN OF LOW EPI AND HIGH THROUGHPUT CORDIC CELL TO IMPROVE THE PERFORMANCE OF MOBILE ROBOT

    Directory of Open Access Journals (Sweden)

    P. VELRAJKUMAR

    2014-04-01

    This paper focuses on a pass-transistor-logic-based design that yields a low energy-per-instruction (EPI), high-throughput COordinate Rotation DIgital Computer (CORDIC) cell for robotic exploration applications. The basic components of the CORDIC cell, namely the register, multiplexer, and proposed adder, are designed using pass-transistor logic (PTL). The proposed adder is implemented in a bit-parallel iterative CORDIC circuit, designed with the DSCH2 VLSI CAD tool; layouts are generated with the Microwind 3 VLSI CAD tool. The propagation delay, area, and power dissipation are calculated from the simulated results for the proposed adder-based CORDIC cell, and the EPI, throughput, and effect of temperature are calculated from the generated layout. The output parameters of the generated layout are analysed using the BSIM4 advanced analyzer. The simulated results of the proposed adder-based CORDIC circuit are compared with those of other adder-based CORDIC circuits. From this analysis, it was found that the proposed adder-based CORDIC circuit dissipates less power, responds faster, and achieves low EPI and high throughput.

  6. Fundamental understanding and development of low-cost, high-efficiency silicon solar cells

    Energy Technology Data Exchange (ETDEWEB)

    ROHATGI,A.; NARASIMHA,S.; MOSCHER,J.; EBONG,A.; KAMRA,S.; KRYGOWSKI,T.; DOSHI,P.; RISTOW,A.; YELUNDUR,V.; RUBY,DOUGLAS S.

    2000-05-01

    The overall objectives of this program are (1) to develop rapid and low-cost processes for manufacturing that can improve yield, throughput, and performance of silicon photovoltaic devices, (2) to design and fabricate high-efficiency solar cells on promising low-cost materials, and (3) to improve the fundamental understanding of advanced photovoltaic devices. Several rapid and potentially low-cost technologies are described in this report that were developed and applied toward the fabrication of high-efficiency silicon solar cells.

  7. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection

    DEFF Research Database (Denmark)

    Ernstsen, Christina Lundgaard; Login, Frédéric H.; Jensen, Helene Halkjær

    2017-01-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacteria...

  8. Cost-effectiveness of increasing access to mammography through mobile mammography for older women.

    Science.gov (United States)

    Naeim, Arash; Keeler, Emmett; Bassett, Lawrence W; Parikh, Jay; Bastani, Roshan; Reuben, David B

    2009-02-01

    To compare the costs of mobile and stationary mammography and examine the incremental cost-effectiveness of using mobile mammography to increase screening rates. A cost-effectiveness analysis was performed using effectiveness data from a randomized clinical trial and modeling of costs associated with the mobile mammography intervention. The trial involved 60 community-based meal sites, senior centers, and clubs. Four hundred ninety-nine individuals were enrolled in the study, of whom 463 had outcome data available for analysis. Costs were calculated for stationary and mobile mammography, as well as costs due to differences in technology and film versus digital. Incremental cost-effectiveness (cost per additional screen) was modeled, and sensitivity analysis was performed by altering efficiency (throughput) and effectiveness based on subgroup data from the randomized trial. The estimated annual costs were $435,162 for a stationary unit, $539,052 for a mobile film unit, and $456,392 for a mobile digital unit. Assuming mobile units are less efficient (50% annual volume), the cost per screen was $41 for a stationary unit, $86 for a mobile film unit, and $102 for a mobile digital unit. The incremental costs per additional screen were $207 for a mobile film unit and $264 for a mobile digital unit over a stationary unit. Although mobile mammography is a more effective way to screen older women, the absolute cost per screen of mobile units is higher, whereas the reimbursement is no different. Financial barriers may impede the widespread use of this approach.

  9. MetaUniDec: High-Throughput Deconvolution of Native Mass Spectra

    Science.gov (United States)

    Reid, Deseree J.; Diesing, Jessica M.; Miller, Matthew A.; Perry, Scott M.; Wales, Jessica A.; Montfort, William R.; Marty, Michael T.

    2018-04-01

    The expansion of native mass spectrometry (MS) methods for both academic and industrial applications has created a substantial need for analysis of large native MS datasets. Existing software tools are poorly suited for high-throughput deconvolution of native electrospray mass spectra from intact proteins and protein complexes. The UniDec Bayesian deconvolution algorithm is uniquely well suited for high-throughput analysis due to its speed and robustness but was previously tailored towards individual spectra. Here, we optimized UniDec for deconvolution, analysis, and visualization of large data sets. This new module, MetaUniDec, centers around the hierarchical data format 5 (HDF5) for storing datasets, which significantly improves speed, portability, and file size. It also includes code optimizations to improve speed and a new graphical user interface for visualization, interaction, and analysis of data. To demonstrate the utility of MetaUniDec, we applied the software to analyze automated collision voltage ramps with a small bacterial heme protein and large lipoprotein nanodiscs. Upon increasing collisional activation, bacterial heme-nitric oxide/oxygen binding (H-NOX) protein shows a discrete loss of bound heme, and nanodiscs show a continuous loss of lipids and charge. By using MetaUniDec to track changes in peak area or mass as a function of collision voltage, we explore the energetic profile of collisional activation in an ultra-high mass range Orbitrap mass spectrometer.

  10. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
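
    As a rough illustration of the kind of analysis Freud provides (in fast, parallel C++), the sketch below computes a radial distribution function by brute force with numpy for particles in a periodic cubic box; this is illustrative only and is not Freud's API.

        import numpy as np

        def rdf(points, box_length, bins=50):
            # g(r) for a periodic cubic box, normalized against the ideal gas.
            n = len(points)
            r_max = box_length / 2.0
            deltas = points[:, None, :] - points[None, :, :]
            deltas -= box_length * np.round(deltas / box_length)   # minimum image
            dist = np.linalg.norm(deltas, axis=-1)[~np.eye(n, dtype=bool)]
            hist, edges = np.histogram(dist, bins=bins, range=(0.0, r_max))
            density = n / box_length ** 3
            shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
            return edges[:-1], hist / (shell_vol * density * n)

        positions = np.random.rand(200, 3) * 10.0   # 200 random particles, L = 10
        r, g = rdf(positions, box_length=10.0)      # g -> 1 at large r for random data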

  11. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    Science.gov (United States)

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable even though it requires minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers, between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 +/- 6.0% between clades and of 21.0 +/- 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.
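
    The multivariate clustering step described above can be sketched with standard tools: score each individual for the presence or absence of each marker, use the fraction of mismatched markers as the pairwise genetic distance, and cluster hierarchically. The marker matrix below is randomly generated, standing in for scored RAF profiles.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(0)
        markers = rng.integers(0, 2, size=(30, 49))    # 30 insects x 49 binary markers

        distances = pdist(markers, metric="hamming")   # fraction of mismatched markers
        tree = linkage(distances, method="average")    # UPGMA-style clustering
        clades = fcluster(tree, t=3, criterion="maxclust")  # cut the tree into 3 clades
        print(clades)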

  12. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and imaging system consisting of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.

  13. High throughput imaging cytometer with acoustic focussing.

    Science.gov (United States)

    Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter

    2015-10-31

    We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint.

  14. High-throughput GPU-based LDPC decoding

    Science.gov (United States)

    Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin

    2010-08-01

    Low-density parity-check (LDPC) codes are linear block codes known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems such as DVB-S2, WiMAX, Wi-Fi, and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphics-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize this practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared to its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.

  15. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    Science.gov (United States)

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...

  16. A Cost-Effectiveness Analysis Model for Evaluating and Planning Secondary Vocational Programs

    Science.gov (United States)

    Kim, Jin Eun

    1977-01-01

    This paper conceptualizes a cost-effectiveness analysis and describes a cost-effectiveness analysis model for secondary vocational programs. It generates three kinds of cost-effectiveness measures: program effectiveness, cost efficiency, and cost-effectiveness and/or performance ratio. (Author)

  17. High-throughput simultaneous analysis of RNA, protein, and lipid biomarkers in heterogeneous tissue samples.

    Science.gov (United States)

    Reiser, Vladimír; Smith, Ryan C; Xue, Jiyan; Kurtz, Marc M; Liu, Rong; Legrand, Cheryl; He, Xuanmin; Yu, Xiang; Wong, Peggy; Hinchcliffe, John S; Tanen, Michael R; Lazar, Gloria; Zieba, Renata; Ichetovkin, Marina; Chen, Zhu; O'Neill, Edward A; Tanaka, Wesley K; Marton, Matthew J; Liao, Jason; Morris, Mark; Hailman, Eric; Tokiwa, George Y; Plump, Andrew S

    2011-11-01

    With expanding biomarker discovery efforts and increasing costs of drug development, it is critical to maximize the value of mass-limited clinical samples. The main limitation of available methods is the inability to isolate and analyze, from a single sample, molecules requiring incompatible extraction methods. Thus, we developed a novel semiautomated method for tissue processing and tissue milling and division (TMAD). We used a SilverHawk atherectomy catheter to collect atherosclerotic plaques from patients requiring peripheral atherectomy. Tissue preservation by flash freezing was compared with immersion in RNAlater®, and tissue grinding by traditional mortar and pestle was compared with TMAD. Comparators were protein, RNA, and lipid yield and quality. Reproducibility of analyte yield from aliquots of the same tissue sample processed by TMAD was also measured. The quantity and quality of biomarkers extracted from tissue prepared by TMAD were at least as good as those extracted from tissue stored and prepared by traditional means. TMAD enabled parallel analysis of gene expression (quantitative reverse-transcription PCR, microarray), protein composition (ELISA), and lipid content (biochemical assay) from as little as 20 mg of tissue. The mean correlation in molecular composition (RNA, protein, or lipid) between aliquots of individual samples generated by TMAD was r = 0.97. We also demonstrated that it is feasible to use TMAD in a large-scale clinical study setting. The TMAD methodology described here enables semiautomated, high-throughput sampling of small amounts of heterogeneous tissue specimens by multiple analytical techniques, with generally improved quality of recovered biomolecules.

  18. Pathway Processor 2.0: a web resource for pathway-based analysis of high-throughput data.

    Science.gov (United States)

    Beltrame, Luca; Bianco, Luca; Fontana, Paolo; Cavalieri, Duccio

    2013-07-15

    Pathway Processor 2.0 is a web application designed to analyze high-throughput datasets, including but not limited to microarray and next-generation sequencing, using a pathway-centric logic. In addition to well-established methods such as Fisher's test and impact analysis, Pathway Processor 2.0 offers innovative methods that convert gene expression into pathway expression, leading to the identification of differentially regulated pathways in a dataset of choice. Pathway Processor 2.0 is available as a web service at http://compbiotoolbox.fmach.it/pathwayProcessor/. Sample datasets to test the functionality can be used directly from the application. duccio.cavalieri@fmach.it Supplementary data are available at Bioinformatics online.
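
    Of the methods named above, Fisher's test is the most compact to illustrate: it asks whether differentially expressed genes are over-represented in a pathway via a 2x2 contingency table. The counts below are hypothetical.

        from scipy.stats import fisher_exact

        de_in_pathway = 18       # differentially expressed genes in the pathway
        de_outside = 282         # differentially expressed genes outside it
        other_in_pathway = 42    # remaining pathway genes
        other_outside = 9658     # remaining measured genes

        odds_ratio, p_value = fisher_exact(
            [[de_in_pathway, de_outside], [other_in_pathway, other_outside]],
            alternative="greater",   # test for over-representation
        )
        print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")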

  19. Differential Gene Expression and Concentration-Response Modeling Workflow for High-Throughput Transcriptomic (HTTr) Data: Results From MCF7 Cells

    Science.gov (United States)

    Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...

  20. Machine learning in computational biology to accelerate high-throughput protein expression.

    Science.gov (United States)

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for analysis of further expression and solubility datasets. ebrunk@ucsd.edu or johanr@biotech.kth.se. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
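
    A minimal sketch of the feature-based prediction described above: compute aromaticity, hydropathy (GRAVY), and isoelectric point with Biopython and fit a classifier. The sequences and expression labels here are invented, and this is not the authors' pipeline (which is linked above); it only shows the shape of such a workflow.

        from Bio.SeqUtils.ProtParam import ProteinAnalysis
        from sklearn.linear_model import LogisticRegression

        def features(seq):
            # The three key properties the study highlights.
            pa = ProteinAnalysis(seq)
            return [pa.aromaticity(), pa.gravy(), pa.isoelectric_point()]

        fragments = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
                     "MWWFWPLLLHHHHRRRKKK",
                     "MASNTVSAQGGSNRPVRDFSNIQDVAQFLLFDPIWNEQPG"]
        expressed = [1, 0, 1]   # hypothetical outcomes: did the fragment express in E. coli?

        model = LogisticRegression().fit([features(s) for s in fragments], expressed)
        print(model.predict([features("MKKLLPTAAAGLLLLAAQPAMA")]))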

  1. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-04-01

    Lab-on-Chip, the miniaturization of the chemical and analytical laboratory, is an endeavor that seems to come out of science fiction yet is slowly becoming a reality. It is a multidisciplinary field that combines different areas of science and engineering. Within these areas, microfluidics is a specialized field that deals with the behavior, control, and manipulation of small volumes of fluids. Agglutination assays are rapid, single-step, low-cost immunoassays that use microspheres to detect a wide variety of molecules and pathogens via a specific antigen-antibody interaction. Agglutination assays are particularly suitable for the miniaturization and automation that two-phase microfluidics can offer, a combination that can help tackle the ever-pressing need for high-throughput screening in blood banks, epidemiology, food banks, and the diagnosis of infectious diseases. In this thesis, we present a two-phase microfluidic system capable of incubating and quantifying agglutination assays. The microfluidic channel is a simple fabrication solution, using laboratory tubing. The assays are incubated by highly efficient passive mixing with a sample-to-answer time of 2.5 min, a 5-10 fold improvement over traditional agglutination assays. The system has a user-friendly interface that does not require droplet generators: a pipette is used to continuously insert assays on demand, with no down-time between experiments, at 360 assays/h. System parameters are explored using the streptavidin-biotin interaction as a model assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light-scattering quantification of agglutination assays in a two-phase flow format. The approach can potentially be applied to other biomarkers, which we demonstrate using C-reactive protein (CRP) assays. Using our system, we can take a commercially available CRP qualitative slide

  2. High-throughput analysis of endogenous fruit glycosyl hydrolases using a novel chromogenic hydrogel substrate assay

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Lausen, Thomas Frederik

    2017-01-01

    A broad range of enzyme activities can be found in many different fruits and fruiting bodies, but there is a lack of methods by which many samples can be handled in a high-throughput and efficient manner. In particular, plant polysaccharide-degrading enzymes, glycosyl hydrolases (GHs), play... led to a more profound understanding of the importance of GH activity and regulation, current methods for determining glycosyl hydrolase activity are lacking in throughput and fail to keep up with the data output of transcriptome research. Here we present the use of a versatile, easy...

  3. An RNA-Based Fluorescent Biosensor for High-Throughput Analysis of the cGAS-cGAMP-STING Pathway.

    Science.gov (United States)

    Bose, Debojit; Su, Yichi; Marcus, Assaf; Raulet, David H; Hammond, Ming C

    2016-12-22

    In mammalian cells, the second messenger (2'-5',3'-5') cyclic guanosine monophosphate-adenosine monophosphate (2',3'-cGAMP) is produced by the cytosolic DNA sensor cGAMP synthase (cGAS) and subsequently bound by the stimulator of interferon genes (STING) to trigger the interferon response. Thus, the cGAS-cGAMP-STING pathway plays a critical role in pathogen detection, as well as in pathophysiological conditions including cancer and autoimmune disorders. However, studying and targeting this immune signaling pathway has been challenging due to the absence of tools for high-throughput analysis. We have engineered an RNA-based fluorescent biosensor that responds to 2',3'-cGAMP. The resulting "mix-and-go" cGAS activity assay shows excellent statistical reliability as a high-throughput screening (HTS) assay and distinguishes between direct and indirect cGAS inhibitors. Furthermore, the biosensor enables quantitation of 2',3'-cGAMP in mammalian cell lysates. We envision this biosensor-based assay as a resource to study the cGAS-cGAMP-STING pathway in the context of infectious diseases, cancer immunotherapy, and autoimmune diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.
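
    The statistical reliability of an HTS assay is conventionally summarized by the Z' factor, computed from positive and negative control wells; values above 0.5 indicate an excellent assay. The control readings below are invented for illustration, as the abstract does not give raw values.

        import numpy as np

        def z_prime(positive, negative):
            # Z' = 1 - 3 * (sd_pos + sd_neg) / |mean_pos - mean_neg|
            spread = 3.0 * (np.std(positive) + np.std(negative))
            return 1.0 - spread / abs(np.mean(positive) - np.mean(negative))

        pos = np.array([980, 1010, 1005, 990, 1002])   # signal with active cGAS
        neg = np.array([102, 95, 99, 110, 101])        # signal without enzyme
        print(f"Z' = {z_prime(pos, neg):.2f}")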

  4. Synthetic Biomaterials to Rival Nature's Complexity-a Path Forward with Combinatorics, High-Throughput Discovery, and High-Content Analysis.

    Science.gov (United States)

    Zhang, Douglas; Lee, Junmin; Kilian, Kristopher A

    2017-10-01

    Cells in tissue receive a host of soluble and insoluble signals in a context-dependent fashion, where integration of these cues through a complex network of signal transduction cascades will define a particular outcome. Biomaterials scientists and engineers are tasked with designing materials that can at least partially recreate this complex signaling milieu towards new materials for biomedical applications. In this progress report, recent advances in high throughput techniques and high content imaging approaches that are facilitating the discovery of efficacious biomaterials are described. From microarrays of synthetic polymers, peptides and full-length proteins, to designer cell culture systems that present multiple biophysical and biochemical cues in tandem, it is discussed how the integration of combinatorics with high content imaging and analysis is essential to extracting biologically meaningful information from large scale cellular screens to inform the design of next generation biomaterials. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Cost-effectiveness analysis of thermotherapy versus pentavalent antimonials for the treatment of cutaneous leishmaniasis.

    Science.gov (United States)

    Cardona-Arias, Jaiberth Antonio; López-Carvajal, Liliana; Tamayo Plata, Mery Patricia; Vélez, Iván Darío

    2017-05-01

    The treatment of cutaneous leishmaniasis is toxic, has contraindications, and is costly. The objective of this study was to estimate the cost-effectiveness of thermotherapy versus pentavalent antimonials for the treatment of cutaneous leishmaniasis. Effectiveness was defined as the proportion of patients healed and safety as the occurrence of adverse effects; these parameters were estimated from a controlled clinical trial and a meta-analysis. A standard costing was conducted. Average and incremental cost-effectiveness ratios were estimated. The uncertainty regarding effectiveness, safety, and costs was assessed through sensitivity analyses. The total costs were $66,807 with Glucantime and $14,079 with thermotherapy. The therapeutic effectiveness rates were 64.2% for thermotherapy and 85.1% for Glucantime. The average cost-effectiveness ratios ranged between $721 and $1275 for Glucantime and between $187 and $390 for thermotherapy. Based on the meta-analysis, thermotherapy may be a dominant strategy. The excellent cost-effectiveness ratio of thermotherapy shows the relevance of its inclusion in treatment guidelines. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  6. Targeted Capture and High-Throughput Sequencing Using Molecular Inversion Probes (MIPs).

    Science.gov (United States)

    Cantsilieris, Stuart; Stessman, Holly A; Shendure, Jay; Eichler, Evan E

    2017-01-01

    Molecular inversion probes (MIPs) in combination with massively parallel DNA sequencing represent a versatile, yet economical tool for targeted sequencing of genomic DNA. Several thousand genomic targets can be selectively captured using long oligonucleotides containing unique targeting arms and universal linkers. The ability to append sequencing adaptors and sample-specific barcodes allows large-scale pooling and subsequent high-throughput sequencing at relatively low cost per sample. Here, we describe a "wet bench" protocol detailing the capture and subsequent sequencing of >2000 genomic targets from 192 samples, representative of a single lane on the Illumina HiSeq 2000 platform.

  7. The high throughput virtual slit enables compact, inexpensive Raman spectral imagers

    Science.gov (United States)

    Gooding, Edward; Deutsch, Erik R.; Huehnerhoff, Joseph; Hajian, Arsen R.

    2018-02-01

    Raman spectral imaging is increasingly becoming the tool of choice for field-based applications such as threat, narcotics and hazmat detection; air, soil and water quality monitoring; and material ID. Conventional fiber-coupled point source Raman spectrometers effectively interrogate a small sample area and identify bulk samples via spectral library matching. However, these devices are very slow at mapping over macroscopic areas. In addition, the spatial averaging performed by instruments that collect binned spectra, particularly when used in combination with orbital raster scanning, tends to dilute the spectra of trace particles in a mixture. Our design, employing free space line illumination combined with area imaging, reveals both the spectral and spatial content of heterogeneous mixtures. This approach is well suited to applications such as detecting trace particles of explosives and narcotics in fingerprints. The patented High Throughput Virtual Slit [1] is an innovative optical design that enables compact, inexpensive handheld Raman spectral imagers. HTVS-based instruments achieve significantly higher spectral resolution than can be obtained with conventional designs of the same size. Alternatively, they can be used to build instruments with comparable resolution to large spectrometers, but substantially smaller size, weight and unit cost, all while maintaining high sensitivity. When used in combination with laser line imaging, this design eliminates sample photobleaching and unwanted photochemistry while greatly enhancing mapping speed, all with high selectivity and sensitivity. We will present spectral image data and discuss applications that are made possible by low cost HTVS-enabled instruments.

  8. High-throughput optical system for HDES hyperspectral imager

    Science.gov (United States)

    Václavík, Jan; Melich, Radek; Pintr, Pavel; Pleštil, Jan

    2015-01-01

    Affordable, long-wave infrared hyperspectral imaging calls for the use of an uncooled FPA with high-throughput optics. This paper describes the design of the optical part of a stationary hyperspectral imager covering a spectral range of 7-14 μm with a field of view of 20°×10°. The imager employs a push-broom method implemented with a scanning mirror. High throughput and a demand for simplicity and rigidity led to a fully refractive design with highly aspheric surfaces and off-axis positioning of the detector array. The design was optimized to exploit the machinability of infrared materials by the SPDT method and simple assembly.

  9. Crystal Symmetry Algorithms in a High-Throughput Framework for Materials

    Science.gov (United States)

    Taylor, Richard

    The high-throughput framework AFLOW, developed and used successfully over the last decade, is improved to include fully integrated software for crystallographic symmetry characterization. The standards used in the symmetry algorithms conform to the conventions and prescriptions given in the International Tables for Crystallography (ITC). A standard cell choice with standard origin is selected, and the space group, point group, Bravais lattice, crystal system, lattice system, and representative symmetry operations are determined. Following the conventions of the ITC, the Wyckoff sites are also determined, and their labels and site symmetry are provided. The symmetry code makes no assumptions about the input cell orientation, origin, or reduction, and has been integrated into the AFLOW high-throughput framework for materials discovery by adding to the existing code base and making use of existing classes and functions. The software is written in object-oriented C++ for flexibility and reuse. A performance analysis and an examination of the algorithms' scaling with cell size and symmetry are also reported.
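
    A minimal illustration of this kind of symmetry characterization, using the open-source spglib library as a stand-in (AFLOW's own integrated C++ code is not shown here); the rocksalt cell below is a hypothetical example input.

```python
# Space-group detection for a crystal cell, sketched with spglib.
# Illustration only: this is not AFLOW's implementation.
import numpy as np
import spglib

# Conventional cubic cell of rocksalt NaCl (hypothetical example input).
lattice = 4.1 * np.eye(3)  # rows are lattice vectors, in Angstrom
positions = [              # fractional coordinates
    [0.0, 0.0, 0.0], [0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5],  # Na
    [0.5, 0.0, 0.0], [0.0, 0.5, 0.0], [0.0, 0.0, 0.5], [0.5, 0.5, 0.5],  # Cl
]
numbers = [11] * 4 + [17] * 4  # atomic numbers (Na, Cl)

# The detected space group is independent of cell orientation and origin choice.
print(spglib.get_spacegroup((lattice, positions, numbers), symprec=1e-5))
# expected output: 'Fm-3m (225)'
```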

  10. A high throughput architecture for a low complexity soft-output demapping algorithm

    Science.gov (United States)

    Ali, I.; Wasenmüller, U.; Wehn, N.

    2015-11-01

    Iterative channel decoders such as Turbo-Code and LDPC decoders show exceptional performance and therefore they are a part of many wireless communication receivers nowadays. These decoders require a soft input, i.e., the logarithmic likelihood ratio (LLR) of the received bits, with a typical quantization of 4 to 6 bits. For computing the LLR values from a received complex symbol, a soft demapper is employed in the receiver. The implementation cost of traditional soft-output demapping methods is relatively large in high-order modulation systems, and therefore low-complexity demapping algorithms are indispensable in low-power receivers. In the presence of multiple wireless communication standards, where each standard defines multiple modulation schemes, there is a need for an efficient demapper architecture covering all the flexibility requirements of these standards. Another challenge associated with hardware implementation of the demapper is to achieve very high throughput in doubly iterative systems, for instance MIMO and code-aided synchronization. In this paper, we present a comprehensive communication and hardware performance evaluation of low-complexity soft-output demapping algorithms to select the best algorithm for implementation. The main goal of this work is to design a high-throughput, flexible, and area-efficient architecture. We describe architectures to execute the investigated algorithms and implement them on an FPGA device to evaluate their hardware performance. The work resulted in a hardware architecture, based on the best low-complexity algorithm identified, that delivers a high throughput of 166 Msymbols/second for Gray-mapped 16-QAM modulation on a Virtex-5. This efficient architecture occupies only 127 slice registers, 248 slice LUTs, and 2 DSP48Es.
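
    For readers unfamiliar with soft-output demapping, the sketch below shows the widely used max-log-MAP approximation for Gray-mapped 16-QAM, one member of the low-complexity algorithm family evaluated above; the particular constellation labeling and noise values are illustrative assumptions, not the paper's architecture.

```python
# Max-log-MAP soft-output demapping for Gray-mapped 16-QAM (a sketch).
import numpy as np

# Gray mapping: bits (b0, b1) select the I level, (b2, b3) the Q level.
levels = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}
points, labels = [], []
for bI, lI in levels.items():
    for bQ, lQ in levels.items():
        points.append(complex(lI, lQ))
        labels.append(bI + bQ)          # 4-bit label of this constellation point
points = np.array(points)
labels = np.array(labels)               # shape (16, 4)

def llr_maxlog(y, noise_var):
    """LLRs of the 4 bits for received symbol y, max-log-MAP approximation."""
    d2 = np.abs(y - points) ** 2 / noise_var   # scaled squared distances
    llrs = []
    for b in range(4):
        m0 = d2[labels[:, b] == 0].min()       # best hypothesis with bit = 0
        m1 = d2[labels[:, b] == 1].min()       # best hypothesis with bit = 1
        llrs.append(m1 - m0)                   # LLR = log P(b=0)/P(b=1), max-log
    return np.array(llrs)

print(llr_maxlog(2.7 - 0.9j, noise_var=0.5))   # hypothetical received symbol
```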

  11. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high-throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented here by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. A probability density estimate is then determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function, which identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic for visualizing the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high-throughput statistical inference.
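
    As a small illustration of the diagnostic idea, the sketch below computes scaled quantile residuals for a trial CDF: if the trial CDF matches the data, its values at the sorted sample behave like uniform order statistics, whose means and variances are known in closed form. The normal trial CDF and sample are assumptions; the authors' full scoring function is not reproduced.

```python
# Scaled quantile residuals (SQR) as a goodness-of-fit diagnostic (a sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = np.sort(rng.normal(size=500))   # sample to be diagnosed
u = stats.norm.cdf(x)               # trial CDF evaluated at the sorted sample

n = len(u)
k = np.arange(1, n + 1)
mean = k / (n + 1)                                # E[u_(k)] for uniform order stats
var = k * (n - k + 1) / ((n + 1) ** 2 * (n + 2))  # Var[u_(k)]
sqr = (u - mean) / np.sqrt(var)                   # scaled quantile residuals

# For a well-matched trial CDF the SQR values stay O(1);
# systematic excursions flag under- or over-fitting.
print(np.abs(sqr).max())
```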

  12. Micellar Surfactant Association in the Presence of a Glucoside-based Amphiphile Detected via High-Throughput Small Angle X-ray Scattering

    Energy Technology Data Exchange (ETDEWEB)

    Stanic, Vesna [Brazilian Synchrotron Light Source, Campinas (Brazil); Broadbent, Charlotte [Columbia Univ., New York, NY (United States). Engineering Dept.; DiMasi, Elaine [Brookhaven National Lab. (BNL), Upton, NY (United States). Photon Sciences Division; Galleguillos, Ramiro [Lubrizol Advanced Materials, Cleveland, OH (United States); Woodward, Valerie [Lubrizol Advanced Materials, Cleveland, OH (United States)

    2016-11-14

    The interactions of mixtures of anionic and amphoteric surfactants with sugar amphiphiles were studied via high-throughput small-angle x-ray scattering (SAXS). The sugar amphiphile was a caprate, caprylate, and oleate mixed ester of methyl glucoside (MeGCCO). Surfactant combinations with desirable physical properties are sought, and they must be identified in a cost-effective manner that can access the large phase space of possible molecular combinations. X-ray scattering patterns obtained via high-throughput SAXS can probe a combinatorial sample space and reveal the incorporation of MeGCCO into the micelles and the molecular associations between surfactant molecules. Such data make it possible to efficiently assess the effects of the new amphiphiles in a formulation. A specific finding of this study is that formulations containing comparatively monodisperse and homogeneous surfactant mixtures can be reliably tuned by the addition of NaCl, which swells the surfactant micelles with a monotonic dependence on salt concentration. In contrast, the presence of multiple different surfactants destroys clear correlations with NaCl concentration, even in otherwise similar series of formulations.

  13. Bulk segregant analysis by high-throughput sequencing reveals a novel xylose utilization gene from Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Jared W Wenger

    2010-05-01

    Full Text Available Fermentation of xylose is a fundamental requirement for the efficient production of ethanol from lignocellulosic biomass sources. Although they aggressively ferment hexoses, it has long been thought that native Saccharomyces cerevisiae strains cannot grow fermentatively or non-fermentatively on xylose. Population surveys have uncovered a few naturally occurring strains that are weakly xylose-positive, and some S. cerevisiae have been genetically engineered to ferment xylose, but no strain, either natural or engineered, has yet been reported to ferment xylose as efficiently as glucose. Here, we used a medium-throughput screen to identify Saccharomyces strains that can increase in optical density when xylose is presented as the sole carbon source. We identified 38 strains that have this xylose utilization phenotype, including strains of S. cerevisiae, other sensu stricto members, and hybrids between them. All the S. cerevisiae xylose-utilizing strains we identified are wine yeasts, and for those that could produce meiotic progeny, the xylose phenotype segregates as a single gene trait. We mapped this gene by Bulk Segregant Analysis (BSA) using tiling microarrays and high-throughput sequencing. The gene is a putative xylitol dehydrogenase, which we name XDH1, and is located in the subtelomeric region of the right end of chromosome XV in a region not present in the S288c reference genome. We further characterized the xylose phenotype by performing gene expression microarrays and by genetically dissecting the endogenous Saccharomyces xylose pathway. We have demonstrated that natural S. cerevisiae yeasts are capable of utilizing xylose as the sole carbon source, characterized the genetic basis for this trait as well as the endogenous xylose utilization pathway, and demonstrated the feasibility of BSA using high-throughput sequencing.

  14. Cost-effectiveness and value of information analysis of nutritional support for preventing pressure ulcers in high-risk patients: implement now, research later.

    Science.gov (United States)

    Tuffaha, Haitham W; Roberts, Shelley; Chaboyer, Wendy; Gordon, Louisa G; Scuffham, Paul A

    2015-04-01

    Pressure ulcers are a major cause of mortality, morbidity, and increased healthcare cost. Nutritional support may reduce the incidence of pressure ulcers in hospitalised patients who are at risk of pressure ulcer and malnutrition. To evaluate the cost-effectiveness of nutritional support in preventing pressure ulcers in high-risk hospitalised patients, and to assess the value of further research to inform the decision to implement this intervention, using value of information analysis (VOI). The analysis was from the perspective of Queensland Health, Australia, using a decision model with evidence derived from a systematic review and meta-analysis. Resources were valued using 2014 prices and the time horizon of the analysis was one year. Monte Carlo simulation was used to estimate net monetary benefits (NB) and to calculate VOI measures. Compared with the standard hospital diet, nutritional support was cost saving at AU$425 per patient, and more effective, with an average 0.005 quality-adjusted life years (QALY) gained. At a willingness-to-pay of AU$50,000 per QALY, the incremental NB was AU$675 per patient, with a probability of 87% that nutritional support is cost-effective. The expected value of perfect information was AU$5 million, and the expected value of perfect parameter information was highest for the relative risk of developing a pressure ulcer, at AU$2.5 million. For a future trial investigating the relative effectiveness of the interventions, the expected net benefit of research would be maximised at AU$100,000 with 1,200 patients in each arm if nutritional support was perfectly implemented. The opportunity cost of withholding the decision to implement the intervention until the results of the future study are available would be AU$14 million. Nutritional support is cost-effective in preventing pressure ulcers in high-risk hospitalised patients compared with standard diet. Future research to reduce decision uncertainty is worthwhile; however, given the
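
    The headline figure can be checked with the standard net-monetary-benefit identity NB = lambda * delta_QALY - delta_cost, where a cost saving enters as a negative incremental cost:

```python
# Worked check of the reported incremental net monetary benefit (NB),
# using the abstract's own figures.
wtp = 50_000        # willingness-to-pay per QALY (AU$)
delta_qaly = 0.005  # incremental QALYs gained
delta_cost = -425   # incremental cost (negative: nutritional support saves AU$425)

nb = wtp * delta_qaly - delta_cost
print(nb)  # 675.0, matching the AU$675 per patient reported above
```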

  15. High throughput generated micro-aggregates of chondrocytes stimulate cartilage formation in vitro and in vivo

    Directory of Open Access Journals (Sweden)

    LS Moreira Teixeira

    2012-06-01

    Full Text Available Cell-based cartilage repair strategies such as matrix-induced autologous chondrocyte implantation (MACI) could be improved by enhancing cell performance. We hypothesised that micro-aggregates of chondrocytes generated in high-throughput prior to implantation in a defect could stimulate cartilaginous matrix deposition and remodelling. To address this issue, we designed a micro-mould to enable controlled high-throughput formation of micro-aggregates. Morphology, stability, gene expression profiles and chondrogenic potential of micro-aggregates of human and bovine chondrocytes were evaluated and compared to single-cells cultured in micro-wells and in 3D after encapsulation in Dextran-Tyramine (Dex-TA) hydrogels in vitro and in vivo. We successfully formed micro-aggregates of human and bovine chondrocytes with highly controlled size, stability and viability within 24 hours. Micro-aggregates of 100 cells presented a superior balance in Collagen type I and Collagen type II gene expression over single cells and micro-aggregates of 50 and 200 cells. Matrix metalloproteinases 1, 9 and 13 mRNA levels were decreased in micro-aggregates compared to single-cells. Histological and biochemical analysis demonstrated enhanced matrix deposition in constructs seeded with micro-aggregates cultured in vitro and in vivo, compared to single-cell seeded constructs. Whole genome microarray analysis and single gene expression profiles using human chondrocytes confirmed increased expression of cartilage-related genes when chondrocytes were cultured in micro-aggregates. In conclusion, we succeeded in controlled high-throughput formation of micro-aggregates of chondrocytes. Compared to single cell-seeded constructs, seeding of constructs with micro-aggregates greatly improved neo-cartilage formation. Therefore, micro-aggregation prior to chondrocyte implantation in current MACI procedures, may effectively accelerate hyaline cartilage formation.

  16. High throughput generated micro-aggregates of chondrocytes stimulate cartilage formation in vitro and in vivo.

    Science.gov (United States)

    Moreira Teixeira, L S; Leijten, J C H; Sobral, J; Jin, R; van Apeldoorn, A A; Feijen, J; van Blitterswijk, C; Dijkstra, P J; Karperien, M

    2012-06-05

    Cell-based cartilage repair strategies such as matrix-induced autologous chondrocyte implantation (MACI) could be improved by enhancing cell performance. We hypothesised that micro-aggregates of chondrocytes generated in high-throughput prior to implantation in a defect could stimulate cartilaginous matrix deposition and remodelling. To address this issue, we designed a micro-mould to enable controlled high-throughput formation of micro-aggregates. Morphology, stability, gene expression profiles and chondrogenic potential of micro-aggregates of human and bovine chondrocytes were evaluated and compared to single-cells cultured in micro-wells and in 3D after encapsulation in Dextran-Tyramine (Dex-TA) hydrogels in vitro and in vivo. We successfully formed micro-aggregates of human and bovine chondrocytes with highly controlled size, stability and viability within 24 hours. Micro-aggregates of 100 cells presented a superior balance in Collagen type I and Collagen type II gene expression over single cells and micro-aggregates of 50 and 200 cells. Matrix metalloproteinases 1, 9 and 13 mRNA levels were decreased in micro-aggregates compared to single-cells. Histological and biochemical analysis demonstrated enhanced matrix deposition in constructs seeded with micro-aggregates cultured in vitro and in vivo, compared to single-cell seeded constructs. Whole genome microarray analysis and single gene expression profiles using human chondrocytes confirmed increased expression of cartilage-related genes when chondrocytes were cultured in micro-aggregates. In conclusion, we succeeded in controlled high-throughput formation of micro-aggregates of chondrocytes. Compared to single cell-seeded constructs, seeding of constructs with micro-aggregates greatly improved neo-cartilage formation. Therefore, micro-aggregation prior to chondrocyte implantation in current MACI procedures, may effectively accelerate hyaline cartilage formation.

  17. High Throughput Screening of Valganciclovir in Acidic Microenvironments of Polyester Thin Films

    Directory of Open Access Journals (Sweden)

    Teilo Schaller

    2015-04-01

    Full Text Available Ganciclovir and valganciclovir are antiviral agents used for the treatment of cytomegalovirus retinitis. The conventional method for administering ganciclovir in cytomegalovirus retinitis patients is repeated intravitreal injections. In order to obviate the possible detrimental effects of repeated intraocular injections, to improve compliance and to eliminate systemic side effects, we investigated tuning the release of the ganciclovir pro-drug valganciclovir from thin films of poly(lactic-co-glycolic acid) (PLGA), polycaprolactone (PCL), or mixtures of both, as a step towards prototyping periocular valganciclovir implants. To investigate the drug release, we established and evaluated a high-throughput fluorescence-based quantification screening assay for the detection of valganciclovir. Our protocol allows quantifying as little as 20 ng of valganciclovir in 96-well polypropylene plates and a 50× faster analysis compared to traditional HPLC measurements; this improvement can hence be extrapolated to other polyester-matrix thin-film formulations using a high-throughput approach. The acidic microenvironment within the polyester matrix was found to protect valganciclovir from degradation, with resultant increases in the half-life of the drug in the periocular implant to 100 days. Linear release profiles were obtained using the pure polyester polymers for the 10-day and 60-day formulations; however, gross phase separation of PCL and acid-terminated PLGA prevented tuning within these timeframes due to the phase separation of the polymer, valganciclovir, or both.

  18. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    High Performance Computing Modernization Program Kerberos Throughput Test Report. Daniel G. Gdula et al. Naval Research Laboratory, Washington, DC 20375-5320. Report NRL/MR/5524--17-9751.

  19. A cost-effectiveness analysis of two different antimicrobial stewardship programs.

    Science.gov (United States)

    Okumura, Lucas Miyake; Riveros, Bruno Salgado; Gomes-da-Silva, Monica Maria; Veroneze, Izelandia

    2016-01-01

    There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of antimicrobial stewardship programs. A 30-day Markov model was developed to analyze how cost-effective a Bundled Antimicrobial Stewardship Program implemented in a university hospital in Brazil was. Clinical data derived from a historical cohort that compared two different strategies of antimicrobial stewardship programs and had 30-day mortality as the main outcome. Selected costs included workload, cost of defined daily doses, length of stay, and laboratory and imaging resources used to diagnose infections. Data were analyzed by deterministic and probabilistic sensitivity analysis to assess the model's robustness, a tornado diagram, and a cost-effectiveness acceptability curve. The Bundled Strategy was more expensive (cost difference US$2,119.70) but more efficient (US$27,549.15 vs US$29,011.46). Deterministic and probabilistic sensitivity analyses suggested that critical variables did not alter the final incremental cost-effectiveness ratio. The Bundled Strategy had a higher probability of being cost-effective, which was endorsed by the cost-effectiveness acceptability curve. As health systems call for efficient technologies, this study concludes that the Bundled Antimicrobial Stewardship Program was more cost-effective, meaning that stewardship strategies with such characteristics would be of special interest from a societal and clinical perspective. Copyright © 2016 Elsevier Editora Ltda. All rights reserved.
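
    As background on the method, a cost-effectiveness acceptability curve is built from probabilistic sensitivity analysis by computing, at each willingness-to-pay value, the share of simulated incremental cost/effect pairs with positive net monetary benefit. The sketch below uses illustrative distributions, not the study's actual Markov model.

```python
# Cost-effectiveness acceptability curve from simulated PSA draws (a sketch).
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
d_cost = rng.normal(2119.70, 500.0, n)   # incremental cost, US$ (assumed spread)
d_effect = rng.normal(0.004, 0.002, n)   # incremental effect (assumed)

for wtp in (25_000, 50_000, 100_000, 250_000, 500_000):
    nmb = wtp * d_effect - d_cost        # incremental net monetary benefit
    print(wtp, (nmb > 0).mean())         # P(cost-effective) at this WTP
```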

  20. A cost-effectiveness analysis of two different antimicrobial stewardship programs

    Directory of Open Access Journals (Sweden)

    Lucas Miyake Okumura

    2016-05-01

    Full Text Available There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of antimicrobial stewardship programs. A 30-day Markov model was developed to analyze how cost-effective a Bundled Antimicrobial Stewardship Program implemented in a university hospital in Brazil was. Clinical data derived from a historical cohort that compared two different strategies of antimicrobial stewardship programs and had 30-day mortality as the main outcome. Selected costs included workload, cost of defined daily doses, length of stay, and laboratory and imaging resources used to diagnose infections. Data were analyzed by deterministic and probabilistic sensitivity analysis to assess the model's robustness, a tornado diagram, and a cost-effectiveness acceptability curve. The Bundled Strategy was more expensive (cost difference US$2,119.70) but more efficient (US$27,549.15 vs US$29,011.46). Deterministic and probabilistic sensitivity analyses suggested that critical variables did not alter the final incremental cost-effectiveness ratio. The Bundled Strategy had a higher probability of being cost-effective, which was endorsed by the cost-effectiveness acceptability curve. As health systems call for efficient technologies, this study concludes that the Bundled Antimicrobial Stewardship Program was more cost-effective, meaning that stewardship strategies with such characteristics would be of special interest from a societal and clinical perspective.

  1. Application of visual basic in high-throughput mass spectrometry-directed purification of combinatorial libraries.

    Science.gov (United States)

    Li, B; Chan, E C Y

    2003-01-01

    We present an approach to customize the sample submission process for high-throughput purification (HTP) of combinatorial parallel libraries using preparative liquid chromatography electrospray ionization mass spectrometry. In this study, Visual Basic and Visual Basic for Applications programs were developed using Microsoft Visual Basic 6 and Microsoft Excel 2000, respectively. These programs were subsequently applied to the seamless electronic submission and handling of data for HTP. Functions were incorporated so that medicinal chemists can perform on-line verification of purification status and on-line retrieval of post-purification data. The application of these user-friendly and cost-effective programs in our HTP technology has greatly increased our work efficiency by reducing paperwork and manual manipulation of data.

  2. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas: heterogeneous catalyst screening and single-cell study. First, we introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this technique in the discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in the oxidation of naphthalene, we demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm2 for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection.

  3. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    International Nuclear Information System (INIS)

    Hui Su

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas: heterogeneous catalyst screening and single-cell study. First, we introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this technique in the discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in the oxidation of naphthalene, we demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm2 for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection.

  4. High-throughput screening with micro-x-ray fluorescence

    International Nuclear Information System (INIS)

    Havrilla, George J.; Miller, Thomasin C.

    2005-01-01

    Micro-x-ray fluorescence (MXRF) is a useful characterization tool for high-throughput screening of combinatorial libraries. Due to the increasing threat of use of chemical warfare (CW) agents both in military actions and against civilians by terrorist extremists, there is a strong push to improve existing methods and develop means for the detection of a broad spectrum of CW agents in a minimal amount of time to increase national security. This paper describes a combinatorial high-throughput screening technique for CW receptor discovery to aid in sensor development. MXRF can screen materials for elemental composition at the mesoscale level (tens to hundreds of micrometers). The key aspect of this work is the use of commercial MXRF instrumentation coupled with the inherent heteroatom elements within the target molecules of the combinatorial reaction to provide rapid and specific identification of lead species. The method is demonstrated by screening an 11-mer oligopeptide library for selective binding of the degradation products of the nerve agent VX. The identified oligopeptides can be used as selective molecular receptors for sensor development. The MXRF screening method is nondestructive, requires minimal sample preparation or special tags for analysis, and the screening time depends on the desired sensitivity.

  5. Throughput Analysis of Large Wireless Networks with Regular Topologies

    Directory of Open Access Journals (Sweden)

    Hong Kezhu

    2007-01-01

    Full Text Available The throughput of large wireless networks with regular topologies is analyzed under two medium-access control schemes: the synchronous array method (SAM) and slotted ALOHA. The regular topologies considered are square, hexagon, and triangle. Both nonfading channels and Rayleigh fading channels are examined. Furthermore, both omnidirectional antennas and directional antennas are considered. Our analysis shows that the SAM leads to a much higher network throughput than slotted ALOHA. The network throughput in this paper is measured in either bit-hops per second per Hertz per node or bit-meters per second per Hertz per node. The exact connection between the two measures is shown for each topology. With these two fundamental units, the network throughput shown in this paper can serve as a reliable benchmark for future works on the network throughput of large networks.
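
    For a regular topology with a uniform nearest-neighbor hop distance, the connection between the two measures reduces to a single scale factor: bit-meters are bit-hops multiplied by the hop length. A trivial sketch with hypothetical numbers:

```python
# Converting between the paper's two throughput units, assuming a regular
# topology where every hop spans the same physical distance.
def bit_hops_to_bit_meters(throughput_bit_hops: float, hop_distance_m: float) -> float:
    """Convert bit-hops/s/Hz/node to bit-meters/s/Hz/node for uniform hops."""
    return throughput_bit_hops * hop_distance_m

# Hypothetical example: 0.05 bit-hops/s/Hz/node with 30 m between neighbors.
print(bit_hops_to_bit_meters(0.05, hop_distance_m=30.0))  # 1.5 bit-m/s/Hz/node
```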

  6. Throughput Analysis of Large Wireless Networks with Regular Topologies

    Directory of Open Access Journals (Sweden)

    Kezhu Hong

    2007-04-01

    Full Text Available The throughput of large wireless networks with regular topologies is analyzed under two medium-access control schemes: the synchronous array method (SAM) and slotted ALOHA. The regular topologies considered are square, hexagon, and triangle. Both nonfading channels and Rayleigh fading channels are examined. Furthermore, both omnidirectional antennas and directional antennas are considered. Our analysis shows that the SAM leads to a much higher network throughput than slotted ALOHA. The network throughput in this paper is measured in either bit-hops per second per Hertz per node or bit-meters per second per Hertz per node. The exact connection between the two measures is shown for each topology. With these two fundamental units, the network throughput shown in this paper can serve as a reliable benchmark for future works on the network throughput of large networks.

  7. High-Throughput Fabrication Method for Producing a Silver-Nanoparticles-Doped Nanoclay Polymer Composite with Novel Synergistic Antibacterial Effects at the Material Interface.

    Science.gov (United States)

    Cai, Shaobo; Pourdeyhimi, Behnam; Loboa, Elizabeth G

    2017-06-28

    In this study, we report a high-throughput fabrication method at industrial pilot scale to produce a silver-nanoparticle-doped nanoclay-polylactic acid composite with a novel synergistic antibacterial effect. The obtained nanocomposite has a significantly lower affinity for bacterial adhesion, allowing the loading amount of silver nanoparticles to be tremendously reduced while maintaining satisfactory antibacterial efficacy at the material interface. This is a great advantage for many antibacterial applications in which cost is a consideration. Furthermore, unlike previously reported methods that require additional chemical reduction processes to produce the silver-nanoparticle-doped nanoclay, an in situ preparation method was developed in which silver nanoparticles were created simultaneously during the composite fabrication process by thermal reduction. This is the first report to show that the altered material surface submicron structures created by loading nanoclay enable the creation of a nanocomposite with significantly lower affinity for bacterial adhesion. This study provides a promising scalable approach to produce antibacterial polymeric products with minimal changes to industry-standard equipment, fabrication processes, or raw material input cost.

  8. A cost-effect analysis of an intervention against radon in homes

    Directory of Open Access Journals (Sweden)

    Hein Stigum

    2009-10-01

    Full Text Available Background: Radon is a radioactive gas that may leak into buildings from the ground. Radon exposure is a risk factor for lung cancer. An intervention against radon exposure in homes may consist of locating homes with high radon exposure (above 200 Bq/m3) and improving these, and of protecting future houses. The purpose of this paper is to calculate the costs and the effects of this intervention. Key words: radon exposure, lung cancer, cost-effect analysis, attributable risk, mathematical models. Methods: We performed a cost-effect analysis from the societal perspective, followed by an uncertainty and sensitivity analysis. The distribution of radon levels in Norwegian homes is lognormal with mean = 74.5 Bq/m3 and 7.6% above 200 Bq/m3. Results: The preventable attributable fraction of lung cancer due to radon was 3.8% (95% uncertainty interval: 0.6%, 8.3%). In cumulative present values, the intervention would cost $238 (145, 310) million and save 892 (133, 1981) lives; each life saved costs $0.27 (0.09, 0.9) million. The cost-effect ratio was sensitive to the radon risk, the radon exposure distribution, and the latency period of lung cancer; together these three parameters explained 90% of the variation in the cost-effect ratio. Conclusions: Reducing the radon concentration in present and future homes to below 200 Bq/m3 will cost $0.27 (0.09, 0.9) million per life saved. The uncertainty in the estimated cost per life is large, mainly due to uncertainty in the risk of lung cancer from radon. Based on estimates from road construction, Norwegian society has been willing to pay $1 million to save a life. We therefore conclude that the intervention against radon in homes is justifiable. The willingness to pay is also larger than the upper uncertainty limit of the cost per life, so our conclusion is robust against the uncertainties in the parameters.
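
    The headline ratio follows directly from the reported totals:

```python
# Worked check of the abstract's cost per life saved: cumulative present-value
# cost divided by the number of lives saved.
cost_million_usd = 238.0  # total intervention cost, present value ($ million)
lives_saved = 892.0

print(cost_million_usd / lives_saved)  # ~0.27 million $ per life saved
```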

  9. High-Throughput Screening of Chemical Effects on Steroidogenesis Using H295R Human Adrenocortical Carcinoma Cells.

    Science.gov (United States)

    Karmaus, Agnes L; Toole, Colleen M; Filer, Dayne L; Lewis, Kenneth C; Martin, Matthew T

    2016-04-01

    Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2060 chemical samples on steroidogenesis via high-performance liquid chromatography followed by tandem mass spectrometry quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a 3 stage screening strategy. The first stage established the maximum tolerated concentration (MTC; ≥ 70% viability) per sample. The second stage quantified changes in hormone levels at the MTC whereas the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were prestimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2060 chemical samples evaluated, 524 samples were selected for 6-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into 5 distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A distinct pattern was observed between imidazole and triazole fungicides suggesting potentially distinct mechanisms of action. From a chemical testing and prioritization perspective, this assay platform provides a robust model for high-throughput screening of chemicals for effects on steroidogenesis. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
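
    As an illustration of the concentration-response stage, the sketch below fits a four-parameter Hill curve to a hypothetical 6-point dilution series; the data and the use of SciPy's curve_fit are assumptions, not the study's actual analysis pipeline.

```python
# Fitting a Hill curve to a 6-point concentration-response series (a sketch).
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, bottom, ac50, n):
    """Four-parameter Hill model: response falls from top to bottom around AC50."""
    return bottom + (top - bottom) / (1.0 + (conc / ac50) ** n)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])          # uM (hypothetical)
response = np.array([98.0, 95.0, 80.0, 45.0, 20.0, 12.0])  # % of control

params, _ = curve_fit(hill, conc, response, p0=[100.0, 10.0, 2.0, 1.0])
top, bottom, ac50, n = params
print(f"AC50 ~ {ac50:.2f} uM, Hill slope ~ {n:.2f}")
```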

  10. Combinatorial approach toward high-throughput analysis of direct methanol fuel cells.

    Science.gov (United States)

    Jiang, Rongzhong; Rong, Charles; Chu, Deryn

    2005-01-01

    A 40-member array of direct methanol fuel cells (with stationary fuel and convective air supplies) was generated by electrically connecting the fuel cells in series. High-throughput analysis of these fuel cells was realized by fast screening of the voltage between the two terminals of each fuel cell at constant-current discharge. A large number of voltage-current curves (200) were obtained by screening the voltages through multiple small current steps. A Gaussian distribution was used to statistically analyze the large number of experimental data. The standard deviation (σ) of the voltages of these fuel cells increased linearly with discharge current. The voltage-current curves at various fuel concentrations were simulated with an empirical equation of voltage versus current and a linear equation of σ versus current. The simulated voltage-current curves fitted the experimental data well. With increasing methanol concentration from 0.5 to 4.0 M, the Tafel slope of the voltage-current curves (at σ = 0.0) changed from 28 to 91 mV/dec, the cell resistance from 2.91 to 0.18 Ω, and the power output from 3 to 18 mW/cm2.
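
    The empirical fit described above can be reproduced in outline: model the cell voltage as an open-circuit term minus a Tafel (logarithmic) loss and an ohmic loss, then regress the screened voltage-current data. The sample points below are hypothetical.

```python
# Extracting a Tafel slope and cell resistance from a polarization curve (a sketch).
import numpy as np
from scipy.optimize import curve_fit

def cell_voltage(i, e0, b, r):
    """Empirical polarization model: V(i) = E0 - b*log10(i) - R*i."""
    return e0 - b * np.log10(i) - r * i

# Hypothetical screened data points.
i = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 80.0])     # current, mA/cm^2
v = np.array([0.52, 0.49, 0.45, 0.41, 0.37, 0.29, 0.22])  # voltage, V

(e0, b, r), _ = curve_fit(cell_voltage, i, v, p0=[0.6, 0.06, 0.002])
print(f"Tafel slope ~ {1000 * b:.0f} mV/dec, resistance ~ {r:.4f} V/(mA/cm^2)")
```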

  11. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    Science.gov (United States)

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  12. High-throughput fragment screening by affinity LC-MS.

    Science.gov (United States)

    Duong-Thi, Minh-Dao; Bergström, Maria; Fex, Tomas; Isaksson, Roland; Ohlson, Sten

    2013-02-01

    Fragment screening, an emerging approach for hit finding in drug discovery, has recently been proven effective by its first approved drug, vemurafenib, for cancer treatment. Techniques such as nuclear magnetic resonance, surface plasmon resonance, and isothermal titration calorimetry, each with their own pros and cons, have been employed to screen fragment libraries. As an alternative approach, screening based on high-performance liquid chromatography separation has been developed. In this work, we present weak affinity LC/MS as a method to screen fragments under high-throughput conditions. Affinity-based capillary columns with immobilized thrombin were used to screen a collection of 590 compounds from a fragment library. The collection was divided into 11 mixtures (each containing 35 to 65 fragments) and screened by MS detection. The primary screening was performed rapidly (>3500 fragments per day). Thirty hits were defined, which subsequently entered a secondary screening using an active site-blocked thrombin column for confirmation of specificity. One hit showed selective binding to thrombin with an estimated dissociation constant (KD) in the 0.1 mM range. This study shows that affinity LC/MS is characterized by high throughput, ease of operation, and low consumption of target and fragments, and therefore it promises to be a valuable method for fragment screening.

  13. 3D-SURFER: software for high-throughput protein surface comparison and analysis.

    Science.gov (United States)

    La, David; Esquivel-Rodríguez, Juan; Venkatraman, Vishwesh; Li, Bin; Sael, Lee; Ueng, Stephen; Ahrendt, Steven; Kihara, Daisuke

    2009-11-01

    We present 3D-SURFER, a web-based tool designed to facilitate high-throughput comparison and characterization of proteins based on their surface shape. As each protein is effectively represented by a vector of 3D Zernike descriptors, comparison times for a query protein against the entire PDB take, on average, only a couple of seconds. The web interface has been designed to be as interactive as possible, with displays showing animated protein rotations, CATH codes and structural alignments using the CE program. In addition, geometrically interesting local features of the protein surface, such as pockets that often correspond to ligand binding sites, as well as protrusions and flat regions, can also be identified and visualized. 3D-SURFER is a web application that can be freely accessed at http://dragon.bio.purdue.edu/3d-surfer. Contact: dkihara@purdue.edu. Supplementary data are available at Bioinformatics online.
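
    The speed comes from the representation: once each surface is a fixed-length descriptor vector, a whole-database query is one vectorized distance computation. A minimal sketch with random stand-in vectors; the 121-component length and Euclidean metric are assumptions for illustration, not 3D-SURFER's internals.

```python
# Descriptor-vector comparison as used conceptually by shape-retrieval tools.
import numpy as np

rng = np.random.default_rng(7)
database = rng.random((10_000, 121))  # stand-in descriptor vectors, one per surface
query = rng.random(121)               # descriptor vector of the query surface

# One vectorized pass computes the distance from the query to every entry.
dist = np.linalg.norm(database - query, axis=1)
top5 = np.argsort(dist)[:5]
print(top5, dist[top5])               # indices of the 5 most similar surfaces
```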

  14. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are rather complex to design, and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimized system need not be the one with the highest product titers, but the one resulting in an overall superior process performance in up- and downstream processing. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  15. Ethoscopes: An open platform for high-throughput ethomics.

    Directory of Open Access Journals (Sweden)

    Quentin Geissmann

    2017-10-01

    Full Text Available Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform tracking and profiling of behavior in real time by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  16. Cost and effectiveness analysis on unmanned aerial vehicle (UAV) use at border security

    Science.gov (United States)

    Yilmaz, Bahadır.

    2013-06-01

    Drones and remotely piloted vehicles are types of unmanned aerial vehicles (UAVs). UAVs were first used in the Vietnam War and attracted great interest after Israel used them in the Bekaa Valley operations of 1982. Since then, UAVs have been used by different countries for different purposes, aided by emerging technology and investment. In this article, the benefits and disadvantages of UAVs in the context of national security are set out. In particular, their use in border security is evaluated on a cost-effectiveness basis. The cost analysis considers procurement and operational costs, and the effectiveness analysis relates UAV flight times to illegal crossings of people and drugs. Although the procurement cost of medium-level UAVs is low, their operational costs are high. For this reason, less costly alternative systems have been proposed for border security. As costs fall to acceptable levels with future high-technology products, UAVs will continue to be used in an increasing proportion of national and border security tasks.

  17. High-throughput theoretical design of lithium battery materials

    International Nuclear Information System (INIS)

    Ling Shi-Gang; Gao Jian; Xiao Rui-Juan; Chen Li-Quan

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials are of great significance for achieving the performance required of next-generation lithium batteries; representative screening criteria are higher energy density, better safety, and faster charge/discharge speed. (topical review)

  18. Optical tools for high-throughput screening of abrasion resistance of combinatorial libraries of organic coatings

    Science.gov (United States)

    Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.

    2002-02-01

    Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8×6 arrays of coating elements deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of the coatings is evaluated with respect to their resistance to wear abrasion, because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to predict quantitatively using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The automated system is optimized for the analysis of diffusively scattered light corresponding to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides the capability for reliable ranking of coating performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.

  19. 78 FR 6140 - Discount Rates for Cost-Effectiveness Analysis of Federal Programs

    Science.gov (United States)

    2013-01-29

    ... OFFICE OF MANAGEMENT AND BUDGET Discount Rates for Cost-Effectiveness Analysis of Federal Programs... in Appendix C are to be used for cost-effectiveness analysis, including lease-purchase analysis, as...) Discount Rates for Cost-Effectiveness, Lease Purchase, and Related Analyses Effective Dates. This appendix...

  20. GROMACS 4.5: A high-throughput and highly parallel open source molecular simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Pronk, Sander [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Pall, Szilard [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Schulz, Roland [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Larsson, Per [Univ. of Virginia, Charlottesville, VA (United States); Bjelkmar, Par [Science for Life Lab., Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden); Apostolov, Rossen [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Shirts, Michael R. [Univ. of Virginia, Charlottesville, VA (United States); Smith, Jeremy C. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kasson, Peter M. [Univ. of Virginia, Charlottesville, VA (United States); van der Spoel, David [Science for Life Lab., Stockholm (Sweden); Uppsala Univ., Uppsala (Sweden); Hess, Berk [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Lindahl, Erik [Science for Life Lab., Stockholm (Sweden); KTH Royal Institute of Technology, Stockholm (Sweden); Stockholm Univ., Stockholm (Sweden)

    2013-02-13

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on massive scale in clusters, web servers, distributed computing or cloud resources. Here we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations.

  1. Clinical validation of an ultra high-throughput spiral microfluidics for the detection and enrichment of viable circulating tumor cells.

    Directory of Open Access Journals (Sweden)

    Bee Luan Khoo

    Full Text Available Circulating tumor cells (CTCs) are cancer cells that can be isolated via liquid biopsy from blood and can be phenotypically and genetically characterized to provide critical information for guiding cancer treatment. Current analysis of CTCs is hindered by the throughput, selectivity and specificity of the devices or assays used in CTC detection and isolation. Here, we enriched and characterized putative CTCs from blood samples of patients with both advanced stage metastatic breast and lung cancers using a novel multiplexed spiral microfluidic chip. This system detected putative CTCs with high sensitivity (100%, n = 56; breast cancer samples: 12-1275 CTCs/ml; lung cancer samples: 10-1535 CTCs/ml) rapidly from clinically relevant blood volumes (7.5 ml) in under 5 min. Blood samples were completely separated into plasma, CTC and PBMC fractions, and each fraction was characterized with immunophenotyping (pan-cytokeratin/CD45, CD44/CD24, EpCAM), fluorescence in-situ hybridization (FISH; EML4-ALK) or targeted somatic mutation analysis. We used an ultra-sensitive mass spectrometry based system to highlight the presence of an EGFR-activating mutation in both isolated CTCs and plasma cell-free DNA (cf-DNA), and demonstrate concordance with the original tumor-biopsy samples. We have clinically validated our multiplexed microfluidic chip for the ultra high-throughput, low-cost and label-free enrichment of CTCs. Retrieved cells were unlabeled and viable, enabling potential propagation and real-time downstream analysis using next generation sequencing (NGS) or proteomic analysis.

  2. High-Throughput Near-Field Optical Nanoprocessing of Solution-Deposited Nanoparticles

    KAUST Repository

    Pan, Heng

    2010-07-27

    The application of nanoscale electrical and biological devices will benefit from the development of nanomanufacturing technologies that are high-throughput, low-cost, and flexible. Utilizing nanomaterials as building blocks and organizing them in a rational way constitutes an attractive approach towards this goal and has been pursued for the past few years. The optical near-field nanoprocessing of nanoparticles for high-throughput nanomanufacturing is reported. The method utilizes fluidically assembled microspheres as a near-field optical confinement structure array for laser-assisted nanosintering and nanoablation of nanoparticles. By taking advantage of the low processing temperature and reduced thermal diffusion in the nanoparticle film, a minimum feature size down to ≈100 nm is realized. In addition, smaller features (50 nm) are obtained by furnace annealing of laser-sintered nanodots at 400 °C. The electrical conductivity of sintered nanolines is also studied. Using nanoline electrodes separated by a submicrometer gap, organic field-effect transistors are subsequently fabricated with oxygen-stable semiconducting polymer. © 2010 Wiley-VCH Verlag GmbH and Co. KGaA, Weinheim.

  3. High-Throughput Block Optical DNA Sequence Identification.

    Science.gov (United States)

    Sagar, Dodderi Manjunatha; Korshoj, Lee Erik; Hanson, Katrina Bethany; Chowdhury, Partha Pratim; Otoupal, Peter Britton; Chatterjee, Anushree; Nagpal, Prashant

    2018-01-01

    Optical techniques for molecular diagnostics or DNA sequencing generally rely on small-molecule fluorescent labels, which utilize light with a wavelength of several hundred nanometers for detection. Developing a label-free optical DNA sequencing technique will require nanoscale focusing of light, a high-throughput and multiplexed identification method, and a data compression technique to rapidly identify sequences and analyze genomic heterogeneity for big datasets. Such a method should identify characteristic molecular vibrations using optical spectroscopy, especially in the "fingerprinting region" from ≈400-1400 cm-1. Here, surface-enhanced Raman spectroscopy is used to demonstrate label-free identification of DNA nucleobases with multiplexed 3D plasmonic nanofocusing. While nanometer-scale mode volumes prevent identification of single nucleobases within a DNA sequence, the block optical technique can identify A, T, G, and C content in DNA k-mers. The content of each nucleotide in a DNA block can be a unique and high-throughput method for identifying sequences, genes, and other biomarkers as an alternative to single-letter sequencing. Additionally, coupling two complementary vibrational spectroscopy techniques (infrared and Raman) can improve block characterization. These results pave the way for developing a novel, high-throughput block optical sequencing method with lossy genomic data compression using k-mer identification from multiplexed optical data acquisition. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
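
    The "block" idea can be made concrete: a k-mer is identified not by its letter order but by its nucleotide content, so two k-mers with the same composition share a block signature. A minimal sketch; the sequence and k are arbitrary examples.

```python
# Content-based "block" signatures of DNA k-mers (a sketch).
from collections import Counter

def block_signature(kmer: str) -> tuple:
    """Return (A, C, G, T) counts: the content-based block identity."""
    c = Counter(kmer.upper())
    return (c["A"], c["C"], c["G"], c["T"])

seq = "ATGGCGTACGTTAGC"  # hypothetical sequence
k = 5
for i in range(len(seq) - k + 1):
    kmer = seq[i : i + k]
    print(kmer, block_signature(kmer))
```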

  4. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Full Text Available Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments--regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
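
    Reconstructing the arithmetic in miniature may help. The sketch below is an illustrative reading of the method, not the authors' published code, and all values are invented: an efficiency-corrected relative quantity is computed against the universal reference sample, then divided by the input cell count instead of being normalized to a control gene.

    ```python
    def expression_per_cell(cq_sample, cq_reference, efficiency, n_cells):
        """Efficiency-corrected expression normalized to input quantity.

        efficiency: amplification efficiency E (2.0 = perfect doubling).
        The universal reference sample, run in every batch, cancels
        batch- and instrument-specific differences.
        """
        relative_quantity = efficiency ** (cq_reference - cq_sample)
        return relative_quantity / n_cells

    # Example: one target quantified against the commercial cDNA reference,
    # from an input of 10,000 cells (all numbers are made up).
    print(expression_per_cell(cq_sample=24.1, cq_reference=21.3,
                              efficiency=1.95, n_cells=1e4))
    ```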

  5. Alignment of time-resolved data from high throughput experiments.

    Science.gov (United States)

    Abidi, Nada; Franke, Raimo; Findeisen, Peter; Klawonn, Frank

    2016-12-01

    To better understand the dynamics of the underlying processes in cells, it is necessary to take measurements over a time course. Modern high-throughput technologies are often used for this purpose to measure the behavior of cell products like metabolites, peptides, proteins, miRNA or mRNA at different points in time. Compared to classical time series, the number of time points is usually very limited and the measurements are taken at irregular time intervals. The main reasons for this are the costs of the experiments and the fact that the dynamic behavior usually shows a strong reaction and fast changes shortly after a stimulus and then slowly converges to a certain stable state. Another reason might simply be missing values. It is common to repeat the experiments and to have replicates in order to carry out a more reliable analysis. The ideal assumptions that the initial stimulus really started at exactly the same time for all replicates and that the replicates are perfectly synchronized are seldom satisfied. Therefore, there is a need to first adjust or align the time-resolved data before further analysis is carried out. Dynamic time warping (DTW) is considered one of the common alignment techniques for time series data with equidistant time points. In this paper, we modified the DTW algorithm so that it can align sequences with measurements at different, non-equidistant time points with large gaps in between. This type of data is usually known as time-resolved data, characterized by irregular time intervals between measurements as well as non-identical time points for different replicates. The new algorithm can be easily used to align time-resolved data from high-throughput experiments and to overcome problems such as the scarcity of time points and noise in the measurements. We propose a modified method of DTW that adapts to the requirements imposed by time-resolved data through the use of monotone cubic interpolation splines. Our presented approach
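
    One plausible realization of the modified alignment, sketched under stated assumptions (SciPy's PchipInterpolator supplies the monotone cubic splines; the data and grid size are invented, and this is not the authors' exact algorithm): resample each irregularly sampled replicate onto a common dense grid, then run the classical DTW dynamic program on the resampled sequences.

    ```python
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    def align_replicates(t1, y1, t2, y2, n_grid=100):
        """Resample two irregularly sampled replicates with monotone cubic
        (PCHIP) splines, then compute their classical DTW distance."""
        grid = np.linspace(max(t1[0], t2[0]), min(t1[-1], t2[-1]), n_grid)
        a = PchipInterpolator(t1, y1)(grid)
        b = PchipInterpolator(t2, y2)(grid)

        # Standard DTW dynamic program on the resampled sequences.
        D = np.full((n_grid + 1, n_grid + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n_grid + 1):
            for j in range(1, n_grid + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n_grid, n_grid]

    # Two replicates measured at non-identical, non-equidistant time points.
    t1, y1 = np.array([0, 1, 2, 5, 10, 24]), np.array([0.1, 2.0, 1.6, 1.1, 0.9, 0.8])
    t2, y2 = np.array([0, 0.5, 3, 6, 12, 24]), np.array([0.2, 1.1, 1.9, 1.2, 0.9, 0.85])
    print(align_replicates(t1, y1, t2, y2))
    ```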

  6. Cost-minimization analysis: radiation treatment with and without a multi-leaf collimator

    International Nuclear Information System (INIS)

    Foroudi, Farshad; Lapsley, Helen; Manderson, Christine; Yeghiaian-Alvandi, Roland

    2000-01-01

    Purpose: To compare the costs of radiation treatment on a linear accelerator with a multileaf collimator (MLC) versus treatment on a linear accelerator without an MLC. The study was designed to determine whether the increased throughput of fields and decreased block cutting made the MLC cost effective from an institutional perspective. Methods and Materials: The number of fields, basic treatment equivalent, equivalent simple treatment visits, and blocks were prospectively collected for the four linear accelerators. Building, equipment, staffing, and service costs were all obtained in 1999 Australian dollars from the manufacturers and hospital department heads. The Joint Radiation Oncology Centre at Westmead and Nepean Hospitals, which are Australian public hospitals, runs as one unit, with the same staff, and currently operates five linear accelerators. Currently, four of the linear accelerators are used for general radiotherapy, operating for exactly the same hours; the final machine operates more limited hours and is used for specialized radiotherapy techniques and emergency cases. Results: The two machines with MLCs, on average, treated 5,169 fields each, while the two machines without MLCs treated 4,543 fields in a 3-month period, a 12% increase in throughput. The two non-MLC machines required 155 premounted trays (PMTs) in total, while the MLC machines required 17 PMTs. Linear accelerators with MLCs were demonstrably more efficient: while their capital costs were higher, the reduction in labor costs associated with block cutting and, particularly, the increased throughput more than offset these initial costs. The total cost of a radiation field with an MLC was found to be $A101.69, compared to $A106.98 without an MLC. A multiway sensitivity analysis showed the results to be robust. The worst-case scenario was a departmental savings of $A168,000 per year; the best-case scenario was a savings of $A680,000 per year. Conclusion: Under the conditions pertaining

  7. High throughput salt separation from uranium deposits

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, S.W.; Park, K.M.; Kim, J.G.; Kim, I.T.; Park, S.B., E-mail: swkwon@kaeri.re.kr [Korea Atomic Energy Research Inst. (Korea, Republic of)

    2014-07-01

    It is very important to increase the throughput of the salt separation system owing to the high uranium content of spent nuclear fuel and the high salt fraction of uranium dendrites in pyroprocessing. A multilayer porous crucible system was proposed in this study to increase the throughput of the salt distiller. An integrated sieve-crucible assembly was also investigated for the practical use of the porous crucible system. The salt evaporation behaviors were compared between the conventional nonporous crucible and the porous crucible. A two-step weight reduction took place in the porous crucible, whereas the salt weight reduced only at high temperature by distillation in the nonporous crucible. The first weight reduction in the porous crucible was caused by liquid salt penetrating out through the perforated crucible as the temperature was raised to the distillation temperature. Multilayer porous crucibles have the benefit of expanding the evaporation surface area. (author)

  8. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    Science.gov (United States)

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
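
    As a rough illustration of the export step (the record structure and field names here are hypothetical, not GlycoExtractor's actual schema), a set of processed profiles can be written as one connected JSON document or a single flat CSV rather than a collection of disconnected files:

    ```python
    import csv
    import json

    # Hypothetical processed profiles: peak number, peak area, glucose-unit value.
    profiles = {
        "sample_01": [{"peak": 1, "area": 1532.4, "gu": 5.62},
                      {"peak": 2, "area": 889.1, "gu": 6.38}],
        "sample_02": [{"peak": 1, "area": 1210.0, "gu": 5.60}],
    }

    with open("profiles.json", "w") as f:      # one connected document,
        json.dump(profiles, f, indent=2)       # not disconnected files

    with open("profiles.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["sample", "peak", "area", "gu"])
        for sample, peaks in profiles.items():
            for p in peaks:
                w.writerow([sample, p["peak"], p["area"], p["gu"]])
    ```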

  9. 77 FR 1743 - Discount Rates for Cost-Effectiveness Analysis of Federal Programs

    Science.gov (United States)

    2012-01-11

    ... OFFICE OF MANAGEMENT AND BUDGET Discount Rates for Cost-Effectiveness Analysis of Federal Programs... Appendix C are to be used for cost-effectiveness analysis, including lease-purchase analysis, as specified... (Revised December 2011) Discount Rates for Cost-Effectiveness, Lease Purchase, and Related Analyses...

  10. 76 FR 7881 - Discount Rates for Cost-Effectiveness Analysis of Federal Programs

    Science.gov (United States)

    2011-02-11

    ... OFFICE OF MANAGEMENT AND BUDGET Discount Rates for Cost-Effectiveness Analysis of Federal Programs... Appendix C are to be used for cost-effectiveness analysis, including lease-purchase analysis, as specified... (Revised December 2010) DISCOUNT RATES FOR COST-EFFECTIVENESS, LEASE PURCHASE, AND RELATED ANALYSES...
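
    These notices update the discount rates that put costs incurred in different years on a common footing for cost-effectiveness and lease-purchase analyses. As a generic illustration (the 3% rate and the cost streams below are placeholders, not the published rates), each year's cost is discounted to present value before programs are compared:

    ```python
    def present_value(costs, rate):
        """Discount a stream of annual costs (year 0 first) to present value."""
        return sum(c / (1 + rate) ** t for t, c in enumerate(costs))

    # Compare two program cost streams at an assumed 3% real discount rate.
    program_a = [100.0, 20.0, 20.0, 20.0]   # high up-front cost
    program_b = [40.0, 45.0, 45.0, 45.0]    # costs spread over time
    print(present_value(program_a, 0.03), present_value(program_b, 0.03))
    ```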

  11. A bioimage informatics platform for high-throughput embryo phenotyping.

    Science.gov (United States)

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2018-01-01

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest. © The Author 2016. Published by Oxford University Press.

  12. High throughput, low set-up time reconfigurable linear feedback shift registers

    NARCIS (Netherlands)

    Nas, R.J.M.; Berkel, van C.H.

    2010-01-01

    This paper presents a hardware design for a scalable, high throughput, configurable LFSR. High throughput is achieved by producing L consecutive outputs per clock cycle, with a clock cycle period that, for practical cases, increases only logarithmically with the block size L and the length of the LFSR.
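
    A software model may clarify the block-output idea (a sketch of a generic Fibonacci LFSR, not the paper's hardware design; the taps and seed are the textbook 16-bit maximal-length example): per "clock" the register emits L consecutive output bits, which the hardware would compute combinationally in a single cycle.

    ```python
    def lfsr_block(state, taps, width, L):
        """Produce L consecutive LFSR output bits per call (one 'clock').

        A Fibonacci LFSR over GF(2): the feedback bit is the XOR of the
        tapped bits. Hardware computes the L-step update as one
        combinational block; this sketch iterates the single-step update.
        """
        out = []
        for _ in range(L):
            out.append(state & 1)                       # output bit (LSB)
            fb = 0
            for t in taps:
                fb ^= (state >> t) & 1                  # XOR of tap bits
            state = (state >> 1) | (fb << (width - 1))  # shift in feedback
        return out, state

    # 16-bit maximal-length taps (x^16 + x^14 + x^13 + x^11 + 1), L = 8.
    bits, state = lfsr_block(state=0xACE1, taps=(0, 2, 3, 5), width=16, L=8)
    print(bits)
    ```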

  13. High throughput and accurate serum proteome profiling by integrated sample preparation technology and single-run data independent mass spectrometry analysis.

    Science.gov (United States)

    Lin, Lin; Zheng, Jiaxin; Yu, Quan; Chen, Wendong; Xing, Jinchun; Chen, Chenxi; Tian, Ruijun

    2018-03-01

    Mass spectrometry (MS)-based serum proteome analysis is extremely challenging due to its high complexity and dynamic range of protein abundances. Developing a high-throughput and accurate serum proteomic profiling approach capable of analyzing large cohorts is urgently needed for biomarker discovery. Herein, we report a streamlined workflow for fast and accurate proteomic profiling from 1 μL of blood serum. The workflow combined an integrated technique for highly sensitive and reproducible sample preparation and a new data-independent acquisition (DIA)-based MS method. Compared with the standard data-dependent acquisition (DDA) approach, the optimized DIA method doubled the number of detected peptides and proteins with better reproducibility. Without protein immunodepletion and prefractionation, the single-run DIA analysis enables quantitative profiling of over 300 proteins with a 50 min gradient time. The quantified proteins span more than five orders of magnitude of abundance range and contain over 50 FDA-approved disease markers. The workflow allowed us to analyze 20 serum samples per day, with about 358 protein groups per sample being identified. A proof-of-concept study on renal cell carcinoma (RCC) serum samples confirmed the feasibility of the workflow for large-scale serum proteomic profiling and disease-related biomarker discovery. Blood serum or plasma is the predominant specimen for clinical proteomic studies, while the analysis is extremely challenging for its high complexity. Many efforts have been made in the past in serum proteomics to maximize protein identifications, whereas few have been concerned with throughput and reproducibility. Here, we establish a rapid, robust and highly reproducible DIA-based workflow for streamlined serum proteomic profiling from 1 μL of serum. The workflow does not need protein depletion and pre-fractionation, while still being able to detect disease-relevant proteins accurately. The workflow is promising in clinical applications.

  14. The cost effectiveness of pandemic influenza interventions: a pandemic severity based analysis.

    Directory of Open Access Journals (Sweden)

    George J Milne

    Full Text Available BACKGROUND: The impact of a newly emerged influenza pandemic will depend on its transmissibility and severity. Understanding how these pandemic features impact on the effectiveness and cost effectiveness of alternative intervention strategies is important for pandemic planning. METHODS: A cost effectiveness analysis of a comprehensive range of social distancing and antiviral drug strategies intended to mitigate a future pandemic was conducted using a simulation model of a community of ∼30,000 in Australia. Six pandemic severity categories were defined based on case fatality ratio (CFR), using data from the 2009/2010 pandemic to relate hospitalisation rates to CFR. RESULTS: Intervention strategies combining school closure with antiviral treatment and prophylaxis are the most cost effective strategies in terms of cost per life year saved (LYS) for all severity categories. The cost component in the cost per LYS ratio varies depending on pandemic severity: for a severe pandemic (CFR of 2.5%) the cost is ∼$9 k per LYS; for a low severity pandemic (CFR of 0.1%) this strategy costs ∼$58 k per LYS; for a pandemic with very low severity similar to the 2009 pandemic (CFR of 0.03%) the cost is ∼$155 per LYS. With high severity pandemics (CFR >0.75%) the most effective attack rate reduction strategies are also the most cost effective. During low severity pandemics costs are dominated by productivity losses due to illness and social distancing interventions, while for high severity pandemics costs are dominated by hospitalisation costs and productivity losses due to death. CONCLUSIONS: The most cost effective strategies for mitigating an influenza pandemic involve combining sustained social distancing with the use of antiviral agents. For low severity pandemics the most cost effective strategies involve antiviral treatment, prophylaxis and short durations of school closure; while these are cost effective they are less effective than other strategies in

  15. The Cost Effectiveness of Pandemic Influenza Interventions: A Pandemic Severity Based Analysis

    Science.gov (United States)

    Milne, George J.; Halder, Nilimesh; Kelso, Joel K.

    2013-01-01

    Background The impact of a newly emerged influenza pandemic will depend on its transmissibility and severity. Understanding how these pandemic features impact on the effectiveness and cost effectiveness of alternative intervention strategies is important for pandemic planning. Methods A cost effectiveness analysis of a comprehensive range of social distancing and antiviral drug strategies intended to mitigate a future pandemic was conducted using a simulation model of a community of ∼30,000 in Australia. Six pandemic severity categories were defined based on case fatality ratio (CFR), using data from the 2009/2010 pandemic to relate hospitalisation rates to CFR. Results Intervention strategies combining school closure with antiviral treatment and prophylaxis are the most cost effective strategies in terms of cost per life year saved (LYS) for all severity categories. The cost component in the cost per LYS ratio varies depending on pandemic severity: for a severe pandemic (CFR of 2.5%) the cost is ∼$9 k per LYS; for a low severity pandemic (CFR of 0.1%) this strategy costs ∼$58 k per LYS; for a pandemic with very low severity similar to the 2009 pandemic (CFR of 0.03%) the cost is ∼$155 per LYS. With high severity pandemics (CFR >0.75%) the most effective attack rate reduction strategies are also the most cost effective. During low severity pandemics costs are dominated by productivity losses due to illness and social distancing interventions, while for high severity pandemics costs are dominated by hospitalisation costs and productivity losses due to death. Conclusions The most cost effective strategies for mitigating an influenza pandemic involve combining sustained social distancing with the use of antiviral agents. For low severity pandemics the most cost effective strategies involve antiviral treatment, prophylaxis and short durations of school closure; while these are cost effective they are less effective than other strategies in reducing the
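
    Both versions of this record rank strategies by cost per life year saved. The ratio itself is a simple incremental quotient, illustrated below with invented community totals (not the study's figures):

    ```python
    def cost_per_lys(cost_strategy, cost_baseline, ly_strategy, ly_baseline):
        """Incremental cost per life year saved versus a baseline."""
        return (cost_strategy - cost_baseline) / (ly_strategy - ly_baseline)

    # Hypothetical totals: the intervention costs more but saves life years.
    print(cost_per_lys(cost_strategy=4.2e6, cost_baseline=1.5e6,
                       ly_strategy=1200.0, ly_baseline=900.0))  # $ per LYS
    ```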

  16. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

    Insight into the epitopic recognition pattern of polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for the design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families.

  17. A high-throughput screening approach to discovering good forms of biologically inspired visual representation.

    Science.gov (United States)

    Pinto, Nicolas; Doukhan, David; DiCarlo, James J; Cox, David D

    2009-11-01

    While many models of biological object recognition share a common set of "broad-stroke" properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model--e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct "parts" have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphic cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision.
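
    The outer screening loop is straightforward to express. The sketch below (parameter ranges, scoring function, and model are placeholders, not the authors' actual model family or GPU implementation) draws many random instantiations, scores each on a recognition task, and keeps the best performers for further analysis:

    ```python
    import random

    SPACE = {                      # hypothetical parameter ranges
        "units_per_layer": [64, 128, 256, 512],
        "pool_size": [3, 5, 7, 9],
        "norm_exponent": [1.0, 1.5, 2.0],
    }

    def evaluate(params):
        """Placeholder for training and testing one model instantiation."""
        random.seed(str(sorted(params.items())))
        return random.random()     # stand-in recognition accuracy

    candidates = [{k: random.choice(v) for k, v in SPACE.items()}
                  for _ in range(1000)]
    scored = sorted(candidates, key=evaluate, reverse=True)
    print(scored[:5])              # promising models kept for further analysis
    ```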

  18. A high-throughput screening approach to discovering good forms of biologically inspired visual representation.

    Directory of Open Access Journals (Sweden)

    Nicolas Pinto

    2009-11-01

    Full Text Available While many models of biological object recognition share a common set of "broad-stroke" properties, the performance of any one model depends strongly on the choice of parameters in a particular instantiation of that model--e.g., the number of units per layer, the size of pooling kernels, exponents in normalization operations, etc. Since the number of such parameters (explicit or implicit) is typically large and the computational cost of evaluating one particular parameter set is high, the space of possible model instantiations goes largely unexplored. Thus, when a model fails to approach the abilities of biological visual systems, we are left uncertain whether this failure is because we are missing a fundamental idea or because the correct "parts" have not been tuned correctly, assembled at sufficient scale, or provided with enough training. Here, we present a high-throughput approach to the exploration of such parameter sets, leveraging recent advances in stream processing hardware (high-end NVIDIA graphic cards and the PlayStation 3's IBM Cell Processor). In analogy to high-throughput screening approaches in molecular biology and genetics, we explored thousands of potential network architectures and parameter instantiations, screening those that show promising object recognition performance for further analysis. We show that this approach can yield significant, reproducible gains in performance across an array of basic object recognition tasks, consistently outperforming a variety of state-of-the-art purpose-built vision systems from the literature. As the scale of available computational power continues to expand, we argue that this approach has the potential to greatly accelerate progress in both artificial vision and our understanding of the computational underpinning of biological vision.

  19. Comparing the normalization methods for the differential analysis of Illumina high-throughput RNA-Seq data.

    Science.gov (United States)

    Li, Peipei; Piao, Yongjun; Shon, Ho Sun; Ryu, Keun Ho

    2015-10-28

    Recently, rapid improvements in technology and decreases in sequencing costs have made RNA-Seq a widely used technique to quantify gene expression levels. Various normalization approaches have been proposed, owing to the importance of normalization in the analysis of RNA-Seq data. A comparison of recently proposed normalization methods is required to generate suitable guidelines for the selection of the most appropriate approach for future experiments. In this paper, we compared eight non-abundance (RC, UQ, Med, TMM, DESeq, Q, RPKM, and ERPKM) and two abundance-estimation normalization methods (RSEM and Sailfish). The experiments were based on real Illumina high-throughput RNA-Seq data of 35- and 76-nucleotide reads produced in the MAQC project, as well as simulated reads. Reads were mapped to the human genome obtained from the UCSC Genome Browser Database. For precise evaluation, we investigated the Spearman correlation between the normalization results from RNA-Seq and MAQC qRT-PCR values for 996 genes. Based on this work, we showed that out of the eight non-abundance-estimation normalization methods, RC, UQ, Med, TMM, DESeq, and Q gave similar normalization results for all data sets. For RNA-Seq of 35-nucleotide reads, RPKM showed the highest correlation, but for RNA-Seq of 76-nucleotide reads it showed the lowest correlation among the methods. ERPKM did not improve on RPKM. Between the two abundance-estimation normalization methods, for RNA-Seq of 35-nucleotide reads, higher correlation was obtained with Sailfish than with RSEM, and both were better than not using abundance-estimation methods. However, for RNA-Seq of 76-nucleotide reads, the results achieved by RSEM were similar to those obtained without abundance estimation, and were much better than those with Sailfish. Furthermore, we found that adding a poly-A tail increased alignment numbers but did not improve normalization results. Spearman correlation analysis revealed that RC, UQ
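
    For orientation, three of the simpler non-abundance methods reduce to choosing a per-sample scaling factor. A minimal sketch follows (illustrative formulas only; the compared implementations differ in detail, and TMM, DESeq, RPKM and the abundance-estimation methods need more than a single scaling factor):

    ```python
    import numpy as np

    def scale_normalize(counts, method="RC"):
        """Scale each sample (column) of a gene x sample count matrix.

        RC: total-count scaling; UQ: upper-quartile of nonzero counts;
        Med: median of nonzero counts.
        """
        if method == "RC":
            factors = counts.sum(axis=0)
        elif method == "UQ":
            factors = np.array([np.percentile(c[c > 0], 75) for c in counts.T])
        elif method == "Med":
            factors = np.array([np.median(c[c > 0]) for c in counts.T])
        return counts / factors * factors.mean()   # rescale to a common level

    counts = np.array([[10, 20], [100, 180], [0, 5], [50, 120]], dtype=float)
    print(scale_normalize(counts, "UQ"))
    ```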

  20. WE-E-BRE-07: High-Throughput Mapping of Proton Biologic Effect

    Energy Technology Data Exchange (ETDEWEB)

    Bronk, L; Guan, F; Kerr, M; Dinh, J; Titt, U; Mirkovic, D; Lin, S; Mohan, R; Grosshans, D [UT MD Anderson Cancer Center, Houston, TX (United States)

    2014-06-15

    Purpose: To systematically relate the relative biological effectiveness (RBE) of proton therapy to beam linear energy transfer (LET) and dose. Methods: Using a custom irradiation apparatus previously characterized by our group, H460 NSCLC cells were irradiated using a clinical 80 MeV spot-scanning proton beam. Utilizing this system allowed for high-throughput clonogenic assays performed in 96-well tissue culture plates as opposed to the traditional 6-well technique. Each column in the 96-well plate received a set LET-dose combination. By altering the total number of dose repaintings, numerous dose-LET configurations were examined to effectively generate surviving fraction (SF) data over the entire Bragg peak. The clonogenic assay was performed post-irradiation using an INCell Analyzer for colony quantification. SF data were fit to the linear-quadratic model for analysis. Results: Irradiation with increasing LETs resulted in decreased cell survival largely independent of dose. A significant correlation between LET and SF was identified by two-way ANOVA and the extra sum-of-squares F test. This trend was obscured at the lower LET values in the plateau region of the Bragg peak; however, it was clear for LET values at and beyond the Bragg peak. Data fits revealed the SF at a dose of 2 Gy (SF2) to be 0.48 for the lowest tested LET (1.55 keV/μm), 0.47 at the end of the plateau region (4.74 keV/μm) and 0.33 for protons at the Bragg peak (10.35 keV/μm). Beyond the Bragg peak we measured SF2s of 0.16 for 15.01 keV/μm, 0.02 for 16.79 keV/μm, and 0.004 for 18.06 keV/μm. Conclusion: We have shown that our methodology enables high-content automated screening for proton irradiations over a range of LETs. The observed decrease in cellular SF in high-LET regions confirms an increased RBE of the radiation and suggests further evaluation of proton RBE values is necessary to optimize clinical outcomes. Rosalie B. Hite Graduate Fellowship in Cancer Research, NIH Program Project Grant P01CA021239.
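
    The linear-quadratic fit mentioned above is a two-parameter curve fit, SF = exp(-(αD + βD²)). A minimal sketch with invented survival data (not the study's measurements):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def lq_model(dose, alpha, beta):
        """Linear-quadratic survival: SF = exp(-(alpha*D + beta*D^2))."""
        return np.exp(-(alpha * dose + beta * dose ** 2))

    # Hypothetical surviving fractions for one LET condition.
    dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0])
    sf = np.array([1.0, 0.70, 0.47, 0.18, 0.05])

    (alpha, beta), _ = curve_fit(lq_model, dose, sf, p0=(0.3, 0.03))
    print(f"alpha={alpha:.3f}, beta={beta:.3f}, "
          f"SF2={lq_model(2.0, alpha, beta):.2f}")
    ```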

  1. High-throughput quantitative biochemical characterization of algal biomass by NIR spectroscopy; multiple linear regression and multivariate linear regression analysis.

    Science.gov (United States)

    Laurens, L M L; Wolfrum, E J

    2013-12-18

    One of the challenges associated with microalgal biomass characterization and the comparison of microalgal strains and conversion processes is the rapid determination of the composition of algae. We have developed and applied a high-throughput screening technology based on near-infrared (NIR) spectroscopy for the rapid and accurate determination of algal biomass composition. We show that NIR spectroscopy can accurately predict the full composition using multivariate linear regression analysis of varying lipid, protein, and carbohydrate content of algal biomass samples from three strains. We also demonstrate a high quality of predictions of an independent validation set. A high-throughput 96-well configuration for spectroscopy gives equally good prediction relative to a ring-cup configuration, and thus, spectra can be obtained from as little as 10-20 mg of material. We found that lipids exhibit a dominant, distinct, and unique fingerprint in the NIR spectrum that allows for the use of single and multiple linear regression of respective wavelengths for the prediction of the biomass lipid content. This is not the case for carbohydrate and protein content, and thus, the use of multivariate statistical modeling approaches remains necessary.
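
    The prediction step is ordinary multivariate linear regression on the spectra. A self-contained sketch with synthetic data standing in for the NIR spectra and wet-chemistry lipid reference values (everything here is simulated, not the study's calibration set):

    ```python
    import numpy as np

    # Simulated training data: NIR spectra (samples x wavelengths) and
    # lipid content (%) from a reference chemical assay.
    rng = np.random.default_rng(0)
    X = rng.random((40, 200))            # 40 calibration samples, 200 wavelengths
    y = X[:, 50] * 30 + X[:, 120] * 10 + rng.normal(0, 0.5, 40)

    # Multivariate linear regression via least squares, with an intercept term.
    A = np.hstack([X, np.ones((40, 1))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Predict lipid content for a new, unseen spectrum.
    x_new = rng.random(200)
    print(np.dot(np.append(x_new, 1.0), coef))
    ```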

  2. Beyond cost-effectiveness: Using systems analysis for infectious disease preparedness.

    Science.gov (United States)

    Phelps, Charles; Madhavan, Guruprasad; Rappuoli, Rino; Colwell, Rita; Fineberg, Harvey

    2017-01-20

    Until the recent outbreaks, Ebola vaccines ranked low in decision makers' priority lists based on cost-effectiveness analysis and/or corporate profitability. Despite a relatively small number of Ebola-related cases and deaths (compared to other causes), Ebola vaccines suddenly leapt to highest priority among international health agencies and vaccine developers. Clearly, earlier cost-effectiveness analyses badly missed some factors affecting real-world decisions. Multi-criteria systems analysis can improve the evaluation and prioritization of vaccine development, and also of many other health policy and investment decisions. Neither cost-effectiveness nor cost-benefit analysis can capture important aspects of problems such as Ebola or the emerging threat of Zika, especially issues of inequality and disparity, issues that dominate the planning of many global health and economic organizations. Cost-benefit analysis requires assumptions about the specific value of life, an idea objectionable to many analysts and policy makers. Additionally, standard cost-effectiveness calculations cannot generally capture effects on people uninfected with Ebola, for example, but nevertheless affected through such factors as contagion, herd immunity, fear of dread disease, reduction of travel and commerce, and even the hope of disease eradication. Using SMART Vaccines, we demonstrate how systems analysis can visibly include important "other factors" and more usefully guide decision making and beneficially alter priority-setting processes. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Cost-effectiveness analysis of pneumococcal vaccination for infants in China.

    Science.gov (United States)

    Maurer, Kristin A; Chen, Huey-Fen; Wagner, Abram L; Hegde, Sonia T; Patel, Tejasi; Boulton, Matthew L; Hutton, David W

    2016-12-07

    Although China has a high burden of pneumococcal disease among young children, the government does not administer publicly-funded pneumococcal conjugate vaccines (PCV) through its Expanded Program on Immunization (EPI). We evaluated the cost-effectiveness of publicly-funded PCV-7, PCV-10, and PCV-13 vaccination programs for infants in China. Using a Markov model, we simulated a cohort of 16 million Chinese infants to estimate the impact of PCV-7, PCV-10, and PCV-13 vaccination programs from a societal perspective. We extrapolated health states to estimate the effects of the programs over a lifetime of 75 years. Parameters in the model were derived from a review of the literature. We found that PCV-7, PCV-10, and PCV-13 vaccination programs would all be cost-effective compared to no vaccination. However, PCV-13 had the lowest incremental cost-effectiveness ratio ($11,464/QALY vs $16,664/QALY for PCV-10 and $18,224/QALY for PCV-7) due to a reduction in overall costs. Our sensitivity analysis revealed that the incremental cost-effectiveness ratios were most sensitive to the utility of acute otitis media, the cost of PCV-13, and the incidence of pneumonia and acute otitis media. The Chinese government should take steps to reduce the burden of pneumococcal diseases among young children through the inclusion of a pneumococcal conjugate vaccine in its EPI. Although all vaccinations would be cost-effective, PCV-13 would save more costs to the healthcare system and would be the preferred strategy. Copyright © 2016 Elsevier Ltd. All rights reserved.
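
    A Markov cohort model of this kind advances the cohort through health states in yearly cycles, accumulating discounted costs and QALYs for each strategy. The sketch below is a deliberately tiny three-state illustration (states, transition probabilities, costs, utilities, and the discount rate are invented and far simpler than the published model):

    ```python
    import numpy as np

    # States: healthy, pneumococcal disease, dead (absorbing).
    P = np.array([[0.98, 0.015, 0.005],     # annual transition probabilities
                  [0.70, 0.20, 0.10],
                  [0.00, 0.00, 1.00]])
    cost = np.array([0.0, 1200.0, 0.0])     # cost per person-year in each state
    utility = np.array([1.0, 0.6, 0.0])     # QALY weight per state
    discount = 0.03

    pop = np.array([1.0, 0.0, 0.0])         # cohort starts healthy
    total_cost = total_qaly = 0.0
    for year in range(75):                  # lifetime horizon
        total_cost += pop @ cost / (1 + discount) ** year
        total_qaly += pop @ utility / (1 + discount) ** year
        pop = pop @ P                       # advance one Markov cycle
    print(total_cost, total_qaly)
    ```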

  4. Cost analysis and exploratory cost-effectiveness of youth-friendly sexual and reproductive health services in the Republic of Moldova

    Science.gov (United States)

    2014-01-01

    Background Youth-friendly sexual and reproductive health services (YFHS) have high priority in many countries. Yet, little is known about the cost and cost-effectiveness of good-quality YFHS in resource-limited settings. This paper retrospectively analyses the costs and potential cost-effectiveness of four well performing youth-friendly health centres (YFHC) in Moldova. This study assesses: (1) what were the costs of YFHSs at centre level, (2) how much would scaling up to a national good-quality YFHS programme cost, and (3) was the programme potentially cost-effective? Methods Four well performing YFHCs were selected for the study. YFHS costs were analysed per centre, funding source, service and person reached. The costing results were extrapolated to estimate the cost of a good-quality national YFHS programme in Moldova. A threshold analysis was carried out to estimate the impact level required for the YFHSs to break even (become cost saving). Results The average annual cost of a well performing YFHC was USD 26,000 in 2011. 58% was financed by the National Health Insurance Company and the rest (42%) by external donors. Personnel salaries were the largest expense category (47%). The annual implementation costs of good-quality YFHSs in all 38 YFHCs of Moldova were estimated to be USD 1.0 million. The results of the threshold analysis indicate the annual break-even impact points in a YFHC: 1) for STI services, >364 averted STIs; 2) for early pregnancy and contraceptive services, >178 averted unwanted pregnancies; and 3) for HIV services, only >0.65 averted new HIV infections. Conclusions The costing results highlight that: 1) significant additional resources would be required for implementation of a good-quality national YFHS programme, and 2) the four well performing YFHCs rely heavily on external funding (42%), which raises questions about the financial sustainability of the programme. At the same time, the results of the threshold analysis are encouraging. The result

  5. Cost analysis and exploratory cost-effectiveness of youth-friendly sexual and reproductive health services in the Republic of Moldova.

    Science.gov (United States)

    Kempers, Jari; Ketting, Evert; Lesco, Galina

    2014-07-21

    Youth-friendly sexual and reproductive health services (YFHS) have high priority in many countries. Yet, little is known about the cost and cost-effectiveness of good-quality YFHS in resource-limited settings. This paper retrospectively analyses the costs and potential cost-effectiveness of four well performing youth-friendly health centres (YFHC) in Moldova. This study assesses: (1) what were the costs of YFHSs at centre level, (2) how much would scaling up to a national good-quality YFHS programme cost, and (3) was the programme potentially cost-effective? Four well performing YFHCs were selected for the study. YFHS costs were analysed per centre, funding source, service and person reached. The costing results were extrapolated to estimate the cost of a good-quality national YFHS programme in Moldova. A threshold analysis was carried out to estimate the impact level required for the YFHSs to break even (become cost saving). The average annual cost of a well performing YFHC was USD 26,000 in 2011. 58% was financed by the National Health Insurance Company and the rest (42%) by external donors. Personnel salaries were the largest expense category (47%). The annual implementation costs of good-quality YFHSs in all 38 YFHCs of Moldova were estimated to be USD 1.0 million. The results of the threshold analysis indicate the annual break-even impact points in a YFHC: 1) for STI services, >364 averted STIs; 2) for early pregnancy and contraceptive services, >178 averted unwanted pregnancies; and 3) for HIV services, only >0.65 averted new HIV infections. The costing results highlight that: 1) significant additional resources would be required for implementation of a good-quality national YFHS programme, and 2) the four well performing YFHCs rely heavily on external funding (42%), which raises questions about the financial sustainability of the programme. At the same time, the results of the threshold analysis are encouraging. The results suggest that, together, the three SRH
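
    The break-even logic in both versions of this record is a simple ratio: a centre becomes cost saving once the treatment costs it averts exceed its annual budget. A back-of-envelope sketch (the per-case averted-cost figures are placeholders chosen only to roughly reproduce the reported thresholds from the USD 26,000 annual cost):

    ```python
    def break_even_impact(annual_centre_cost, cost_averted_per_case):
        """Cases a centre must avert per year to become cost saving."""
        return annual_centre_cost / cost_averted_per_case

    # Hypothetical averted treatment costs per case for three service types.
    for service, averted in [("STI", 72.0), ("unwanted pregnancy", 146.0),
                             ("HIV", 40000.0)]:
        print(service, break_even_impact(26000.0, averted))
    ```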

  6. A continuous high-throughput bioparticle sorter based on 3D traveling-wave dielectrophoresis.

    Science.gov (United States)

    Cheng, I-Fang; Froude, Victoria E; Zhu, Yingxi; Chang, Hsueh-Chia; Chang, Hsien-Chang

    2009-11-21

    We present a high throughput (maximum flow rate approximately 10 microl/min or linear velocity approximately 3 mm/s) continuous bio-particle sorter based on 3D traveling-wave dielectrophoresis (twDEP) at an optimum AC frequency of 500 kHz. The high throughput sorting is achieved with a sustained twDEP particle force normal to the continuous through-flow, which is applied over the entire chip by a single 3D electrode array. The design allows continuous fractionation of micron-sized particles into different downstream sub-channels based on differences in their twDEP mobility on both sides of the cross-over. Conventional DEP is integrated upstream to focus the particles into a single levitated queue to allow twDEP sorting by mobility difference and to minimize sedimentation and field-induced lysis. The 3D electrode array design minimizes the offsetting effect of nDEP (negative DEP with particle force towards regions with weak fields) on twDEP such that both forces increase monotonically with voltage to further increase the throughput. Effective focusing and separation of red blood cells from debris-filled heterogeneous samples are demonstrated, as well as size-based separation of poly-dispersed liposome suspensions into two distinct bands at 2.3 to 4.6 microm and 1.5 to 2.7 microm, at the highest throughput recorded in hand-held chips of 6 microl/min.

  7. A New High-Throughput Approach to Genotype Ancient Human Gastrointestinal Parasites.

    Science.gov (United States)

    Côté, Nathalie M L; Daligault, Julien; Pruvost, Mélanie; Bennett, E Andrew; Gorgé, Olivier; Guimaraes, Silvia; Capelli, Nicolas; Le Bailly, Matthieu; Geigl, Eva-Maria; Grange, Thierry

    2016-01-01

    Human gastrointestinal parasites are good indicators of the hygienic conditions and health status of past and present individuals and communities. While microscopic analysis of eggs in sediments of archeological sites often allows their taxonomic identification, this method is rarely effective at the species level, and requires both the survival of intact eggs and their proper identification. Genotyping via PCR-based approaches has the potential to achieve a precise species-level taxonomic determination. However, so far it has mostly been applied to individual eggs isolated from archeological samples. To increase the throughput and taxonomic accuracy, as well as reduce the costs of genotyping methods, we adapted a PCR-based approach coupled with next-generation sequencing to perform precise taxonomic identification of parasitic helminths directly from archeological sediments. Our study of twenty-five 100- to 7,200-year-old archeological samples proved this to be a powerful, reliable and efficient approach for species determination even in the absence of preserved eggs, either as a stand-alone method or as a complement to microscopic studies.

  8. High Throughput In Situ XAFS Screening of Catalysts

    International Nuclear Information System (INIS)

    Tsapatsaris, Nikolaos; Beesley, Angela M.; Weiher, Norbert; Tatton, Helen; Schroeder, Sven L. M.; Dent, Andy J.; Mosselmans, Frederick J. W.; Tromp, Moniek; Russu, Sergio; Evans, John; Harvey, Ian; Hayama, Shu

    2007-01-01

    We outline and demonstrate the feasibility of high-throughput (HT) in situ XAFS for synchrotron radiation studies. An XAS data acquisition and control system for the analysis of dynamic materials libraries under control of temperature and gaseous environments has been developed. The system is compatible with the 96-well industry standard and coupled to multi-stream quadrupole mass spectrometry (QMS) analysis of reactor effluents. An automated analytical workflow generates data quickly compared to traditional individual spectrum acquisition and analyses them in quasi-real time using an HT data analysis tool based on IFEFFIT. The system was used for the automated characterization of a library of 91 catalyst precursors containing ternary combinations of Cu, Pt, and Au on γ-Al2O3, and for the in situ characterization of Au catalysts supported on Al2O3 and TiO2.

  9. High-throughput screening of small molecule libraries using SAMDI mass spectrometry.

    Science.gov (United States)

    Gurard-Levin, Zachary A; Scholle, Michael D; Eisenberg, Adam H; Mrksich, Milan

    2011-07-11

    High-throughput screening is a common strategy used to identify compounds that modulate biochemical activities, but many approaches depend on cumbersome fluorescent reporters or antibodies and often produce false-positive hits. The development of "label-free" assays addresses many of these limitations, but current approaches still lack the throughput needed for applications in drug discovery. This paper describes a high-throughput, label-free assay that combines self-assembled monolayers with mass spectrometry, in a technique called SAMDI, as a tool for screening libraries of 100,000 compounds in one day. This method is fast, has high discrimination, and is amenable to a broad range of chemical and biological applications.

  10. High-throughput sequencing and pathway analysis reveal alteration of the pituitary transcriptome by 17α-ethynylestradiol (EE2) in female coho salmon, Oncorhynchus kisutch

    Energy Technology Data Exchange (ETDEWEB)

    Harding, Louisa B. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Schultz, Irvin R. [Battelle, Marine Sciences Laboratory – Pacific Northwest National Laboratory, 1529 West Sequim Bay Road, Sequim, WA 98382 (United States); Goetz, Giles W. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Luckenbach, J. Adam [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Young, Graham [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Goetz, Frederick W. [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, Manchester Research Station, P.O. Box 130, Manchester, WA 98353 (United States); Swanson, Penny, E-mail: penny.swanson@noaa.gov [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States)

    2013-10-15

    Highlights: •Studied impacts of ethynylestradiol (EE2) exposure on salmon pituitary transcriptome. •High-throughput sequencing, RNAseq, and pathway analysis were performed. •EE2 altered mRNAs for genes in circadian rhythm, GnRH, and TGFβ signaling pathways. •LH and FSH beta subunit mRNAs were most highly up- and down-regulated by EE2, respectively. •Estrogens may alter processes associated with reproductive timing in salmon. -- Abstract: Considerable research has been done on the effects of endocrine disrupting chemicals (EDCs) on reproduction and gene expression in the brain, liver and gonads of teleost fish, but information on impacts to the pituitary gland is still limited despite its central role in regulating reproduction. The aim of this study was to further our understanding of the potential effects of natural and synthetic estrogens on the brain–pituitary–gonad axis in fish by determining the effects of 17α-ethynylestradiol (EE2) on the pituitary transcriptome. We exposed sub-adult coho salmon (Oncorhynchus kisutch) to 0 or 12 ng EE2/L for up to 6 weeks and effects on the pituitary transcriptome of females were assessed using high-throughput Illumina® sequencing, RNA-Seq and pathway analysis. After 1 or 6 weeks, 218 and 670 contiguous sequences (contigs), respectively, were differentially expressed in pituitaries of EE2-exposed fish relative to control. Two of the most highly up- and down-regulated contigs were luteinizing hormone β subunit (241-fold and 395-fold at 1 and 6 weeks, respectively) and follicle-stimulating hormone β subunit (−3.4-fold at 6 weeks). Additional contigs related to gonadotropin synthesis and release were differentially expressed in EE2-exposed fish relative to controls. These included contigs involved in gonadotropin releasing hormone (GNRH) and transforming growth factor-β signaling. There was an over-representation of significantly affected contigs in 33 and 18 canonical pathways at 1 and 6 weeks

  11. High-throughput sequencing and pathway analysis reveal alteration of the pituitary transcriptome by 17α-ethynylestradiol (EE2) in female coho salmon, Oncorhynchus kisutch

    International Nuclear Information System (INIS)

    Harding, Louisa B.; Schultz, Irvin R.; Goetz, Giles W.; Luckenbach, J. Adam; Young, Graham; Goetz, Frederick W.; Swanson, Penny

    2013-01-01

    Highlights: •Studied impacts of ethynylestradiol (EE2) exposure on salmon pituitary transcriptome. •High-throughput sequencing, RNAseq, and pathway analysis were performed. •EE2 altered mRNAs for genes in circadian rhythm, GnRH, and TGFβ signaling pathways. •LH and FSH beta subunit mRNAs were most highly up- and down-regulated by EE2, respectively. •Estrogens may alter processes associated with reproductive timing in salmon. -- Abstract: Considerable research has been done on the effects of endocrine disrupting chemicals (EDCs) on reproduction and gene expression in the brain, liver and gonads of teleost fish, but information on impacts to the pituitary gland is still limited despite its central role in regulating reproduction. The aim of this study was to further our understanding of the potential effects of natural and synthetic estrogens on the brain–pituitary–gonad axis in fish by determining the effects of 17α-ethynylestradiol (EE2) on the pituitary transcriptome. We exposed sub-adult coho salmon (Oncorhynchus kisutch) to 0 or 12 ng EE2/L for up to 6 weeks and effects on the pituitary transcriptome of females were assessed using high-throughput Illumina® sequencing, RNA-Seq and pathway analysis. After 1 or 6 weeks, 218 and 670 contiguous sequences (contigs), respectively, were differentially expressed in pituitaries of EE2-exposed fish relative to control. Two of the most highly up- and down-regulated contigs were luteinizing hormone β subunit (241-fold and 395-fold at 1 and 6 weeks, respectively) and follicle-stimulating hormone β subunit (−3.4-fold at 6 weeks). Additional contigs related to gonadotropin synthesis and release were differentially expressed in EE2-exposed fish relative to controls. These included contigs involved in gonadotropin releasing hormone (GNRH) and transforming growth factor-β signaling. There was an over-representation of significantly affected contigs in 33 and 18 canonical pathways at 1 and 6 weeks

  12. 20180311 - Differential Gene Expression and Concentration-Response Modeling Workflow for High-Throughput Transcriptomic (HTTr) Data: Results From MCF7 Cells (SOT)

    Science.gov (United States)

    Increasing efficiency and declining cost of generating whole transcriptome profiles has made high-throughput transcriptomics a practical option for chemical bioactivity screening. The resulting data output provides information on the expression of thousands of genes and is amenab...

  13. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    International Nuclear Information System (INIS)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert; Gagnon, David; Gjoerup, Ole; Archambault, Jacques; Bullock, Peter A.

    2014-01-01

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF-1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitors of JCV DNA replication.

  14. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States); Gagnon, David [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Gjoerup, Ole [Molecular Oncology Research Institute, Tufts Medical Center, Boston, MA 02111 (United States); Archambault, Jacques [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Bullock, Peter A., E-mail: Peter.Bullock@tufts.edu [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States)

    2014-11-15

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF-1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitors of JCV DNA replication.

  15. Design of a High-Throughput Biological Crystallography Beamline for Superconducting Wiggler

    International Nuclear Information System (INIS)

    Tseng, P.C.; Chang, C.H.; Fung, H.S.; Ma, C.I.; Huang, L.J.; Jean, Y.C.; Song, Y.F.; Huang, Y.S.; Tsang, K.L.; Chen, C.T.

    2004-01-01

    We are constructing a high-throughput biological crystallography beamline, BL13B, which utilizes the radiation generated from a 3.2 Tesla, 32-pole superconducting multipole wiggler, for multi-wavelength anomalous diffraction (MAD), single-wavelength anomalous diffraction (SAD), and other related experiments. This beamline is a standard double crystal monochromator (DCM) x-ray beamline equipped with a collimating mirror (CM) and a focusing mirror (FM). Both the CM and FM are one meter long and made of Si substrate, and the CM is side-cooled by water. Based on detailed thermal analysis, liquid nitrogen (LN2) cooling for both crystals of the DCM has been adopted to optimize the energy resolution and photon beam throughput. This beamline will deliver, through a 100 μm diameter pinhole, a photon flux of greater than 10¹¹ photons/sec in the energy range from 6.5 keV to 19 keV, which is comparable to existing protein crystallography beamlines at bending magnet sources of high-energy storage rings.

  16. Green Infrastructure Siting and Cost Effectiveness Analysis

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Parcel scale green infrastructure siting and cost effectiveness analysis. You can find more details at the project's website.

  17. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    Science.gov (United States)

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The methodological approach involves linking hydro-chemical modelling with the economic costs of mitigation measures. The utility of the approach was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using STREAM-N and phosphorus using the INCA-P model. Both models were first run for baseline conditions and then their effectiveness for changes in land management was simulated. Costs were based on farm income foregone, capital and operational expenditures. The costs and effects data were integrated using 'Risk Solver Platform' optimization in Excel to produce the most cost-effective combination of measures by which target nutrient reductions could be attained at a minimum economic cost. The analysis identified different combinations of measures as most cost-effective for the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic costs of measures for cost-effectiveness analysis of land and water management options. The methodological approach developed is not limited to the two pollutants and the selected agri-environmental measures considered in the paper; the approach can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
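
    The optimization at the core of the approach can be posed as a small linear program: choose implementation levels for each measure that minimize total cost subject to meeting both nutrient-reduction targets. A relaxed, continuous sketch using SciPy follows (all costs and effectiveness figures are invented; the study itself used 'Risk Solver Platform' in Excel):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical mitigation measures: annual cost and modelled reductions
    # of nitrogen and phosphorus loads (kg/yr) at full implementation.
    cost = np.array([12000.0, 8000.0, 15000.0, 5000.0])     # per measure
    n_red = np.array([900.0, 300.0, 1200.0, 150.0])
    p_red = np.array([10.0, 45.0, 30.0, 20.0])
    target_n, target_p = 1500.0, 60.0                       # required reductions

    # Choose implementation levels x in [0, 1] minimizing cost subject to
    # meeting both nutrient targets (a continuous relaxation of the
    # combinatorial selection described above).
    res = linprog(c=cost,
                  A_ub=-np.vstack([n_red, p_red]),
                  b_ub=-np.array([target_n, target_p]),
                  bounds=[(0, 1)] * 4)
    print(res.x, res.fun)
    ```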

  18. ExSTA: External Standard Addition Method for Accurate High-Throughput Quantitation in Targeted Proteomics Experiments.

    Science.gov (United States)

    Mohammed, Yassene; Pan, Jingxi; Zhang, Suping; Han, Jun; Borchers, Christoph H

    2018-03-01

    Targeted proteomics using MRM with stable-isotope-labeled internal-standard (SIS) peptides is the current method of choice for protein quantitation in complex biological matrices. Better quantitation can be achieved with the internal standard-addition method, where successive increments of synthesized natural form (NAT) of the endogenous analyte are added to each sample, a response curve is generated, and the endogenous concentration is determined at the x-intercept. Internal NAT-addition, however, requires multiple analyses of each sample, resulting in increased sample consumption and analysis time. To compare the following three methods, an MRM assay for 34 high-to-moderate abundance human plasma proteins is used: classical internal SIS-addition, internal NAT-addition, and external NAT-addition-generated in buffer using NAT and SIS peptides. Using endogenous-free chicken plasma, the accuracy is also evaluated. The internal NAT-addition outperforms the other two in precision and accuracy. However, the curves derived by internal vs. external NAT-addition differ by only ≈3.8% in slope, providing comparable accuracies and precision with good CV values. While the internal NAT-addition method may be "ideal", this new external NAT-addition can be used to determine the concentration of high-to-moderate abundance endogenous plasma proteins, providing a robust and cost-effective alternative for clinical analyses or other high-throughput applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
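
    The x-intercept readout of the standard-addition method is straightforward to reproduce. The sketch below fits a hypothetical response curve and reports the endogenous level as the magnitude of the fitted x-intercept; the spiked amounts and peak-area ratios are illustrative, not data from this study.

```python
# Hedged sketch of the standard-addition calculation: fit response vs. added
# NAT and read the endogenous concentration off the x-intercept.
import numpy as np

spiked = np.array([0.0, 5.0, 10.0, 20.0, 40.0])    # amount of NAT added (fmol)
ratio  = np.array([0.21, 0.42, 0.63, 1.05, 1.90])  # measured NAT/SIS peak-area ratio

slope, intercept = np.polyfit(spiked, ratio, 1)    # linear response curve
x_intercept = -intercept / slope                   # where the fitted line crosses y = 0
endogenous  = -x_intercept                         # standard addition: magnitude of x-intercept
print(f"endogenous level ≈ {endogenous:.2f} fmol")
```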

  19. High-throughput in vivo genotoxicity testing: an automated readout system for the somatic mutation and recombination test (SMART).

    Directory of Open Access Journals (Sweden)

    Benoit Lombardot

    Genotoxicity testing is an important component of toxicity assessment. As illustrated by the European registration, evaluation, authorization, and restriction of chemicals (REACH) directive, it concerns all the chemicals used in industry. The commonly used in vivo mammalian tests appear to be ill adapted to tackle the large compound sets involved, due to throughput, cost, and ethical issues. The somatic mutation and recombination test (SMART) represents a more scalable alternative, since it uses Drosophila, which develops faster and requires less infrastructure. Despite these advantages, the manual scoring of the hairs on Drosophila wings required for the SMART limits its usage. To overcome this limitation, we have developed an automated SMART readout. It consists of automated imaging, followed by an image analysis pipeline that measures individual wing genotoxicity scores. Finally, we have developed a wing score-based dose-dependency approach that can provide genotoxicity profiles. We have validated our method using 6 compounds, obtaining profiles almost identical to those obtained from manual measures, even for low-genotoxicity compounds such as urethane. The automated SMART, with its faster and more reliable readout, fulfills the need for a high-throughput in vivo test. The flexible imaging strategy we describe and the analysis tools we provide should facilitate the optimization and dissemination of our methods.

  20. HTTK R Package v1.4 - JSS Article on HTTK: R Package for High-Throughput Toxicokinetics

    Data.gov (United States)

    U.S. Environmental Protection Agency — httk: High-Throughput Toxicokinetics Functions and data tables for simulation and statistical analysis of chemical toxicokinetics ("TK") using data obtained from...

  1. Use of ultra-high pressure liquid chromatography coupled to high resolution mass spectrometry for fast screening in high throughput doping control.

    Science.gov (United States)

    Musenga, Alessandro; Cowan, David A

    2013-05-03

    We describe a sensitive, comprehensive and fast screening method based on liquid chromatography-high resolution mass spectrometry for the detection of a large number of analytes in sports samples. UHPLC coupled to high resolution mass spectrometry with polarity switching capability is applied for the rapid screening of a large number of analytes in human urine samples. Full scan data are acquired alternating both positive and negative ionisation. Collision-induced dissociation with positive ionisation is also performed to produce fragment ions to improve selectivity for some analytes. Data are reviewed as extracted ion chromatograms based on narrow mass/charge windows (±5 ppm). A simple sample preparation method was developed, using direct enzymatic hydrolysis of glucuronide conjugates, followed by solid phase extraction with mixed mode ion-exchange cartridges. Within a 10 min run time (including re-equilibration) the method presented allows for the analysis of a large number of analytes from most of the classes in the World Anti-Doping Agency (WADA) Prohibited List, including anabolic agents, β2-agonists, hormone antagonists and modulators, diuretics, stimulants, narcotics, glucocorticoids and β-blockers, and does so while meeting the WADA sensitivity requirements. The high throughput of the method and the fast sample pre-treatment reduces analysis cost and increases productivity. The method presented has been used for the analysis of over 5000 samples in about one month and proved to be reliable. Copyright © 2013 Elsevier B.V. All rights reserved.
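
    Reviewing data as extracted ion chromatograms with ±5 ppm windows reduces to a small m/z calculation. A minimal sketch follows; the analyte names and exact masses are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: compute the m/z bounds of a +/- 5 ppm extraction window.
def ppm_window(mz, tol_ppm=5.0):
    """Return (low, high) m/z bounds for a +/- tol_ppm extracted ion chromatogram."""
    delta = mz * tol_ppm / 1e6
    return mz - delta, mz + delta

for name, mz in [("salbutamol [M+H]+", 240.1594), ("furosemide [M-H]-", 329.0004)]:
    lo, hi = ppm_window(mz)
    print(f"{name}: extract {lo:.4f} to {hi:.4f}")
```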

  2. Droplet electrospray ionization mass spectrometry for high throughput screening for enzyme inhibitors.

    Science.gov (United States)

    Sun, Shuwen; Kennedy, Robert T

    2014-09-16

    High throughput screening (HTS) is important for identifying molecules with desired properties. Mass spectrometry (MS) is potentially powerful for label-free HTS due to its high sensitivity, speed, and resolution. Segmented flow, where samples are manipulated as droplets separated by an immiscible fluid, is an intriguing format for high throughput MS because it can be used to reliably and precisely manipulate nanoliter volumes and can be directly coupled to electrospray ionization (ESI) MS for rapid analysis. In this study, we describe a "MS Plate Reader" that couples standard multiwell plate HTS workflow to droplet ESI-MS. The MS plate reader can reformat 3072 samples from eight 384-well plates into nanoliter droplets segmented by an immiscible oil at 4.5 samples/s and sequentially analyze them by MS at 2 samples/s. Using the system, a label-free screen for cathepsin B modulators against 1280 chemicals was completed in 45 min with a high Z-factor (>0.72) and no false positives (24 of 24 hits confirmed). The assay revealed 11 structures not previously linked to cathepsin inhibition. For even larger scale screening, reformatting and analysis could be conducted simultaneously, which would enable more than 145,000 samples to be analyzed in 1 day.
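
    The Z-factor quoted above is the standard plate-quality statistic, Z = 1 - 3(sd_pos + sd_neg)/|mean_pos - mean_neg|, with values above 0.5 usually read as an excellent assay. A small sketch on simulated control wells (all numbers invented):

```python
# Hedged sketch of the Z-factor calculation on simulated control wells.
import numpy as np

rng = np.random.default_rng(0)
pos = rng.normal(100.0, 4.0, size=32)   # positive-control wells
neg = rng.normal(20.0, 4.0, size=32)    # negative-control wells

def z_factor(pos, neg):
    """Z = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    spread = 3.0 * (np.std(pos, ddof=1) + np.std(neg, ddof=1))
    return 1.0 - spread / abs(pos.mean() - neg.mean())

print(f"Z-factor: {z_factor(pos, neg):.2f}")   # > 0.5 indicates a robust screen
```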

  3. High-throughput screening of small-molecule adsorption in MOF-74

    Science.gov (United States)

    Thonhauser, T.; Canepa, P.

    2014-03-01

    Using high-throughput screening coupled with state-of-the-art van der Waals density functional theory, we investigate the adsorption properties of four important molecules, H2, CO2, CH4, and H2O, in MOF-74-M with M = Be, Mg, Al, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Sr, Zr, Nb, Ru, Rh, Pd, La, W, Os, Ir, and Pt. We show that high-throughput techniques can aid in speeding up the development and refinement of effective materials for hydrogen storage, carbon capture, and gas separation. The exploration of the configurational adsorption space allows us to extract crucial information concerning, for example, the competition of water with CO2 for the adsorption binding sites. We find that only a few noble metals (Rh, Pd, Os, Ir, and Pt) favor the adsorption of CO2 and hence are potential candidates for effective carbon-capture materials. Our findings further reveal significant differences in the binding characteristics of H2, CO2, CH4, and H2O within the MOF structure, indicating that molecular blends can be successfully separated by these nano-porous materials. Supported by DOE DE-FG02-08ER46491.

  4. Fabrication of combinatorial nm-planar electrode array for high throughput evaluation of organic semiconductors

    International Nuclear Information System (INIS)

    Haemori, M.; Edura, T.; Tsutsui, K.; Itaka, K.; Wada, Y.; Koinuma, H.

    2006-01-01

    We have fabricated a combinatorial nm-planar electrode array by using photolithography and chemical mechanical polishing processes for high throughput electrical evaluation of organic devices. Sub-nm precision was achieved with respect to the average level difference between each pair of electrodes and a dielectric layer. The insulating property between the electrodes is high enough to measure I-V characteristics of organic semiconductors. Bottom-contact field-effect-transistors (FETs) of pentacene were fabricated on this electrode array by use of molecular beam epitaxy. It was demonstrated that the array could be used as a pre-patterned device substrate for high throughput screening of the electrical properties of organic semiconductors.

  5. GROMACS 4.5: a high-throughput and highly parallel open source molecular simulation toolkit.

    Science.gov (United States)

    Pronk, Sander; Páll, Szilárd; Schulz, Roland; Larsson, Per; Bjelkmar, Pär; Apostolov, Rossen; Shirts, Michael R; Smith, Jeremy C; Kasson, Peter M; van der Spoel, David; Hess, Berk; Lindahl, Erik

    2013-04-01

    Molecular simulation has historically been a low-throughput technique, but faster computers and increasing amounts of genomic and structural data are changing this by enabling large-scale automated simulation of, for instance, many conformers or mutants of biomolecules with or without a range of ligands. At the same time, advances in performance and scaling now make it possible to model complex biomolecular interaction and function in a manner directly testable by experiment. These applications share a need for fast and efficient software that can be deployed on a massive scale in clusters, web servers, distributed computing or cloud resources. Here, we present a range of new simulation algorithms and features developed during the past 4 years, leading up to the GROMACS 4.5 software package. The software now automatically handles wide classes of biomolecules, such as proteins, nucleic acids and lipids, and comes with all commonly used force fields for these molecules built-in. GROMACS supports several implicit solvent models, as well as new free-energy algorithms, and the software now uses multithreading for efficient parallelization even on low-end systems, including Windows-based workstations. Together with hand-tuned assembly kernels and state-of-the-art parallelization, this provides extremely high performance and cost efficiency for high-throughput as well as massively parallel simulations. GROMACS is open source, free software available from http://www.gromacs.org. Supplementary data are available at Bioinformatics online.

  6. High-throughput differentiation of heparin from other glycosaminoglycans by pyrolysis mass spectrometry.

    Science.gov (United States)

    Nemes, Peter; Hoover, William J; Keire, David A

    2013-08-06

    Sensors with high chemical specificity and enhanced sample throughput are vital to screening food products and medical devices for chemical or biochemical contaminants that may pose a threat to public health. For example, the rapid detection of oversulfated chondroitin sulfate (OSCS) in heparin could prevent reoccurrence of heparin adulteration that caused hundreds of severe adverse events including deaths worldwide in 2007-2008. Here, rapid pyrolysis is integrated with direct analysis in real time (DART) mass spectrometry to rapidly screen major glycosaminoglycans, including heparin, chondroitin sulfate A, dermatan sulfate, and OSCS. The results demonstrate that, compared to traditional liquid chromatography-based analyses, pyrolysis mass spectrometry achieved at least 250-fold higher sample throughput and was compatible with samples volume-limited to about 300 nL. Pyrolysis yielded an abundance of fragment ions (e.g., 150 different m/z species), many of which were specific to the parent compound. Using multivariate and statistical data analysis models, these data enabled facile differentiation of the glycosaminoglycans with high throughput. After method development was completed, authentically contaminated samples obtained during the heparin crisis by the FDA were analyzed in a blinded manner for OSCS contamination. The lower limits of differentiation and detection were 0.1% (w/w) OSCS in heparin and 100 ng/μL (20 ng) OSCS in water, respectively. For quantitative purposes the linear dynamic range spanned approximately 3 orders of magnitude. Moreover, this chemical readout was successfully employed to find clues in the manufacturing history of the heparin samples that can be used for surveillance purposes. The presented technology and data analysis protocols are anticipated to be readily adaptable to other chemical and biochemical agents and volume-limited samples.

  7. Fine mapping of a Phytophthora-resistance gene RpsWY in soybean (Glycine max L.) by high-throughput genome-wide sequencing.

    Science.gov (United States)

    Cheng, Yanbo; Ma, Qibin; Ren, Hailong; Xia, Qiuju; Song, Enliang; Tan, Zhiyuan; Li, Shuxian; Zhang, Gengyun; Nian, Hai

    2017-05-01

    Using a combination of phenotypic screening, genetic and statistical analyses, and high-throughput genome-wide sequencing, we have finely mapped a dominant Phytophthora resistance gene in soybean cultivar Wayao. Phytophthora root rot (PRR) caused by Phytophthora sojae is one of the most important soil-borne diseases in many soybean-production regions in the world. Identifying resistance gene(s) and incorporating them into elite varieties is an effective way to protect soybean from this disease through breeding. Two soybean populations of 191 F2 individuals and 196 F7:8 recombinant inbred lines (RILs) were developed to map the Rps gene by crossing a susceptible cultivar Huachun 2 with the resistant cultivar Wayao. Genetic analysis of the F2 population indicated that PRR resistance in Wayao was controlled by a single dominant gene, temporarily named RpsWY, which was mapped on chromosome 3. A high-density genetic linkage bin map was constructed using 3469 recombination bins of the RILs to explore the candidate genes by high-throughput genome-wide sequencing. The results of genotypic analysis showed that the RpsWY gene was located in bin 401 between 4466230 and 4502773 bp on chromosome 3 through lines 71 and 100 of the RILs. Four predicted genes (Glyma03g04350, Glyma03g04360, Glyma03g04370, and Glyma03g04380) were found at the narrowed region of 36.5 kb in bin 401. These results suggest that high-throughput genome-wide resequencing is an effective method to finely map PRR candidate genes.

  8. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale L

    2005-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  9. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale

    2004-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  10. A CRISPR CASe for High-Throughput Silencing

    Directory of Open Access Journals (Sweden)

    Jacob eHeintze

    2013-10-01

    Manipulation of gene expression on a genome-wide level is one of the most important systematic tools in the post-genome era. Such manipulations have largely been enabled by expression cloning approaches using sequence-verified cDNA libraries, large-scale RNA interference libraries (shRNA or siRNA) and zinc finger nuclease technologies. More recently, the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) and CRISPR-associated (Cas9)-mediated gene editing technology has been described that holds great promise for future use of this technology in genomic manipulation. It was suggested that the CRISPR system has the potential to be used in high-throughput, large-scale loss of function screening. Here we discuss some of the challenges in engineering of CRISPR/Cas genomic libraries and some of the aspects that need to be addressed in order to use this technology on a high-throughput scale.

  11. Cost-effectiveness analysis of antithyroid drug therapy, 131I therapy and subtotal thyroidectomy for Graves' disease

    International Nuclear Information System (INIS)

    Yano, Fuzuki; Watanabe, Sadahiro; Hayashi, Katsumi; Kita, Tamotsu; Yamamoto, Masayoshi; Kosuda, Shigeru; Tanaka, Yuji

    2007-01-01

    The objective of this study was to assess the cost-effectiveness of antithyroid drug (ATD) therapy vs. radioiodine therapy (RIT) vs. subtotal thyroidectomy (STT) by calculating expected lifelong cost and utility based on Graves' disease patients' responses to questionnaires, using a decision-tree sensitivity analysis and relevant variables. The decision-tree sensitivity analysis to determine expected lifelong cost and utility in Graves' disease patients was designed on the basis of 4 competing strategies: (1) ATD therapy plus RIT strategy, (2) ATD therapy plus STT strategy, (3) low-fixed-dose (185 MBq) RIT alone strategy, and (4) high-fixed-dose (370 MBq) RIT alone strategy. One-way sensitivity analysis was designed in the ATD therapy plus RIT strategy, for replacement with RIT in place of ATD, ranging from a 1% incidence of ATD side effects to 30%. The low-fixed-dose RIT alone strategy was least costly, and the high-fixed-dose RIT alone strategy most costly. The lifelong utility of the high-fixed-dose RIT alone strategy with a 5% rate of discounting was highest (lifelong utility for 30 years: 15.2/patient), and the utility of the ATD plus RIT strategy with 1% side effects of the ATD was lowest (14.1/patient). The cost-effectiveness ratio was lowest (¥5,008/utility) in the low-fixed-dose RIT alone strategy. In conclusion, the low-fixed-dose RIT alone strategy is the preferred treatment in view of its cost-effectiveness ratio, and RIT should be used more widely in Japan. (author)

  12. Analysis of high-throughput biological data using their rank values.

    Science.gov (United States)

    Dembélé, Doulaye

    2018-01-01

    High-throughput biological technologies are routinely used to generate gene expression profiling or cytogenetics data. To achieve high performance, methods available in the literature have become more specialized and often require high computational resources. Here, we propose a new versatile method based on data-ordering rank values. We use linear algebra and the Perron-Frobenius theorem, and extend a method presented earlier for finding differentially expressed genes to the detection of recurrent copy number aberrations. A result derived from the proposed method is a one-sample Student's t-test based on rank values. The proposed method is, to our knowledge, the only one that applies to both gene expression profiling and cytogenetics data sets. This new method is fast, deterministic, and requires a low computational load. Probabilities are associated with genes to allow a statistically significant subset selection in the data set. Stability scores are also introduced as quality parameters. The performance and comparative analyses were carried out using real data sets. The proposed method can be accessed through an R package available from the CRAN (Comprehensive R Archive Network) website: https://cran.r-project.org/web/packages/fcros.
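
    The rank-based one-sample t-test derived in the paper can be illustrated in a few lines. The fragment below is only a sketch of the idea on simulated data; the full procedure lives in the fcros R package cited above.

```python
# Hedged sketch: rank each comparison, rescale ranks to (0, 1), and test each
# gene's mean scaled rank against the null value of 0.5.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
expr = rng.normal(0.0, 1.0, size=(1000, 6))  # 1000 genes x 6 pairwise fold changes
expr[0] += 2.0                               # one artificially up-regulated gene

ranks  = expr.argsort(axis=0).argsort(axis=0) + 1   # per-column rank values
scaled = ranks / (expr.shape[0] + 1)                # rescale into the interval (0, 1)

t, p = stats.ttest_1samp(scaled, popmean=0.5, axis=1)
print(f"spiked gene p = {p[0]:.2e}, typical gene p = {p[1]:.2f}")
```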

  13. A cell-based high-throughput screening assay for radiation susceptibility using automated cell counting

    International Nuclear Information System (INIS)

    Hodzic, Jasmina; Dingjan, Ilse; Maas, Mariëlle JP; Meulen-Muileman, Ida H van der; Menezes, Renee X de; Heukelom, Stan; Verheij, Marcel; Gerritsen, Winald R; Geldof, Albert A; Triest, Baukelien van; Beusechem, Victor W van

    2015-01-01

    Radiotherapy is one of the mainstays in the treatment for cancer, but its success can be limited due to inherent or acquired resistance. Mechanisms underlying radioresistance in various cancers are poorly understood and available radiosensitizers have shown only modest clinical benefit. There is thus a need to identify new targets and drugs for more effective sensitization of cancer cells to irradiation. Compound and RNA interference high-throughput screening technologies allow comprehensive enterprises to identify new agents and targets for radiosensitization. However, the gold standard assay to investigate radiosensitivity of cancer cells in vitro, the colony formation assay (CFA), is unsuitable for high-throughput screening. We developed a new high-throughput screening method for determining radiation susceptibility. Fast and uniform irradiation of batches up to 30 microplates was achieved using a Perspex container and a clinically employed linear accelerator. The readout was done by automated counting of fluorescently stained nuclei using the Acumen eX3 laser scanning cytometer. Assay performance was compared to that of the CFA and the CellTiter-Blue homogeneous uniform-well cell viability assay. The assay was validated in a whole-genome siRNA library screening setting using PC-3 prostate cancer cells. On 4 different cancer cell lines, the automated cell counting assay produced radiation dose response curves that followed a linear-quadratic equation and that exhibited a better correlation to the results of the CFA than did the cell viability assay. Moreover, the cell counting assay could be used to detect radiosensitization by silencing DNA-PKcs or by adding caffeine. In a high-throughput screening setting, using 4 Gy irradiated and control PC-3 cells, the effects of DNA-PKcs siRNA and non-targeting control siRNA could be clearly discriminated. We developed a simple assay for radiation susceptibility that can be used for high-throughput screening. This will aid
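
    The dose-response curves mentioned above follow the linear-quadratic model, S(D) = exp(-(alpha*D + beta*D^2)). A minimal fitting sketch on invented survival data:

```python
# Hedged sketch: fit the linear-quadratic model to made-up survival fractions.
import numpy as np
from scipy.optimize import curve_fit

dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])        # Gy
surv = np.array([1.0, 0.80, 0.55, 0.21, 0.06, 0.013])  # surviving fraction

def lq(d, alpha, beta):
    return np.exp(-(alpha * d + beta * d ** 2))

(alpha, beta), _ = curve_fit(lq, dose, surv, p0=(0.2, 0.02))
print(f"alpha = {alpha:.3f} /Gy, beta = {beta:.4f} /Gy^2")
```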

  14. Cost-Effectiveness Analysis of Three Leprosy Case Detection Methods in Northern Nigeria

    Science.gov (United States)

    Ezenduka, Charles; Post, Erik; John, Steven; Suraj, Abdulkarim; Namadi, Abdulahi; Onwujekwe, Obinna

    2012-01-01

    Background Despite several leprosy control measures in Nigeria, child proportion and disability grade 2 cases remain high while new cases have not significantly reduced, suggesting continuous spread of the disease. Hence, there is the need to review detection methods to enhance identification of early cases for effective control and prevention of permanent disability. This study evaluated the cost-effectiveness of three leprosy case detection methods in Northern Nigeria to identify the most cost-effective approach for detection of leprosy. Methods A cross-sectional study was carried out to evaluate the additional benefits of using several case detection methods in addition to routine practice in two north-eastern states of Nigeria. Primary and secondary data were collected from routine practice records and the Nigerian Tuberculosis and Leprosy Control Programme of 2009. The methods evaluated were Rapid Village Survey (RVS), Household Contact Examination (HCE) and Traditional Healers incentive method (TH). Effectiveness was measured as number of new leprosy cases detected and cost-effectiveness was expressed as cost per case detected. Costs were measured from both providers' and patients' perspectives. Additional costs and effects of each method were estimated by comparing each method against routine practise and expressed as incremental cost-effectiveness ratio (ICER). All costs were converted to the U.S. dollar at the 2010 exchange rate. Univariate sensitivity analysis was used to evaluate uncertainties around the ICER. Results The ICER for HCE was $142 per additional case detected at all contact levels and it was the most cost-effective method. At ICER of $194 per additional case detected, THs method detected more cases at a lower cost than the RVS, which was not cost-effective at $313 per additional case detected. Sensitivity analysis showed that varying the proportion of shared costs and subsistent wage for valuing unpaid time did not significantly change the

  15. A cost-effectiveness and cost-utility analysis of radiosurgery vs. resection for single-brain metastases

    International Nuclear Information System (INIS)

    Mehta, Minesh; Noyes, William; Craig, Bruce; Lamond, John; Auchter, Richard; French, Molly; Johnson, Mark; Levin, Allan; Badie, Behnam; Robbins, Ian; Kinsella, Timothy

    1997-01-01

    Purpose: The median survival of well-selected patients with single-brain metastases treated with whole-brain irradiation and resection or radiosurgery is comparable, although a randomized trial of these two modalities has not been performed. In this era of cost containment, it is imperative that health-care professionals make fiscally prudent decisions. The present environment necessitates a critical appraisal of apparently equi-efficacious therapeutic modalities, and it is within this context that we present a comparison of the actual costs of resection and radiosurgery for brain metastases. Methods and Materials: Survival and quality of life outcome data for radiation alone or with surgery were obtained from two randomized trials, and radiosurgical results were obtained from a multiinstitutional analysis that specifically evaluated patients meeting surgical criteria. Only linear accelerator radiosurgery data were considered. Cost analysis was performed from a societal viewpoint, and the following parameters were evaluated: actual cost, cost ratios, cost effectiveness, incremental cost effectiveness, cost utility, incremental cost utility, and national cost burden. The computerized billing records for all patients undergoing resection or radiosurgery for single-brain metastases from January 1989 to July 1994 were reviewed. A total of 46 resections and 135 radiosurgery procedures were performed. During the same time period, 454 patients underwent whole-brain radiation alone. An analysis of the entire bill was performed for each procedure, and each itemized cost was assigned a proportionate figure. The relative cost ratios of resection and radiosurgery were compared using the Wilcoxon rank sum test. Cost effectiveness of each modality, defined as the cost per year of median survival, was evaluated. Incremental cost effectiveness, defined as the additional cost per year of incremental gain in median survival, compared to the next least expensive modality, was also

  16. Cost-effectiveness and public health impact of alternative influenza vaccination strategies in high-risk adults.

    Science.gov (United States)

    Raviotta, Jonathan M; Smith, Kenneth J; DePasse, Jay; Brown, Shawn T; Shim, Eunha; Nowalk, Mary Patricia; Wateska, Angela; France, Glenson S; Zimmerman, Richard K

    2017-10-09

    High-dose trivalent inactivated influenza vaccine (HD-IIV3) or recombinant trivalent influenza vaccine (RIV) may increase influenza vaccine effectiveness (VE) in adults with conditions that place them at high risk for influenza complications. This analysis models the public health impact and cost-effectiveness (CE) of these vaccines for 50-64-year-olds. A Markov model CE analysis compared 5 strategies in 50-64-year-olds: no vaccination; only standard-dose IIV3 offered (SD-IIV3 only), only quadrivalent influenza vaccine offered (SD-IIV4 only); high-risk patients receiving HD-IIV3, others receiving SD-IIV3 (HD-IIV3 & SD-IIV3); and high-risk patients receiving HD-IIV3, others receiving SD-IIV4 (HD-IIV3 & SD-IIV4). In a secondary analysis, RIV replaced HD-IIV3. Parameters were obtained from U.S. databases, the medical literature and extrapolations from VE estimates. Effectiveness was measured as 3%/year discounted quality adjusted life year (QALY) losses avoided. The least expensive strategy was SD-IIV3 only, with total costs of $99.84/person. The SD-IIV4 only strategy cost an additional $0.91/person, or $37,700/QALY gained. The HD-IIV3 & SD-IIV4 strategy cost $1.06 more than SD-IIV4 only, or $71,500/QALY gained. No vaccination and HD-IIV3 & SD-IIV3 strategies were dominated. Results were sensitive to influenza incidence, vaccine cost, standard-dose VE in the entire population and high-dose VE in high-risk patients. The CE of RIV for high-risk patients was dependent on as yet unknown parameter values. Based on available data, using high-dose influenza vaccine or RIV in middle-aged, high-risk patients may be an economically favorable vaccination strategy with public health benefits. Clinical trials of these vaccines in this population may be warranted. Copyright © 2017 Elsevier Ltd. All rights reserved.
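
    A Markov CE model of this kind advances a cohort through health states in yearly cycles and accumulates discounted QALYs. The sketch below shows the mechanics only; the states, transition probabilities, and utilities are invented placeholders, not the study's inputs.

```python
# Hedged sketch of a Markov cohort model with 3%/yr QALY discounting.
import numpy as np

# Rows = current state, columns = next state (annual transition probabilities).
P = np.array([
    [0.89, 0.10, 0.01],   # well -> well / influenza / dead
    [0.94, 0.05, 0.01],   # influenza -> well / influenza / dead
    [0.00, 0.00, 1.00],   # dead is absorbing
])
utility = np.array([1.00, 0.70, 0.00])   # QALY weight per year in each state

cohort, qalys, discount = np.array([1.0, 0.0, 0.0]), 0.0, 0.03
for year in range(30):
    cohort = cohort @ P
    qalys += (cohort @ utility) / (1 + discount) ** (year + 1)
print(f"discounted QALYs per person over 30 years: {qalys:.2f}")
```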

  17. High throughput route selection in multi-rate wireless mesh networks

    Institute of Scientific and Technical Information of China (English)

    WEI Yi-fei; GUO Xiang-li; SONG Mei; SONG Jun-de

    2008-01-01

    Most existing Ad-hoc routing protocols use the shortest path algorithm with a hop count metric to select paths. It is appropriate in single-rate wireless networks, but has a tendency to select paths containing long-distance links that have low data rates and reduced reliability in multi-rate networks. This article introduces a high throughput routing algorithm utilizing the multi-rate capability and some mesh characteristics in wireless fidelity (WiFi) mesh networks. It uses the medium access control (MAC) transmission time as the routing metric, which is estimated by the information passed up from the physical layer. When the proposed algorithm is adopted, the Ad-hoc on-demand distance vector (AODV) routing can be improved as high throughput AODV (HT-AODV). Simulation results show that HT-AODV is capable of establishing a route that has high data-rate, short end-to-end delay and great network throughput.
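
    The core idea, shortest paths over estimated per-link transmission times rather than hop counts, can be sketched with a plain Dijkstra search. The topology and link times below are invented; this is not the HT-AODV implementation itself.

```python
# Hedged sketch: route selection by minimum total transmission time (ms).
import heapq

def best_route(graph, src, dst):
    """Dijkstra over per-link transmission-time weights."""
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue                      # stale heap entry
        for nbr, t in graph[node]:
            nd = d + t
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1], dist[dst]

mesh = {                                  # node -> [(neighbour, est. tx time in ms)]
    "A": [("B", 2.0), ("C", 9.0)],        # A-C is a single long, low-rate hop
    "B": [("A", 2.0), ("C", 2.5)],
    "C": [("A", 9.0), ("B", 2.5)],
}
print(best_route(mesh, "A", "C"))         # prefers A-B-C (4.5 ms) over A-C (9.0 ms)
```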

  18. Metabolic enzyme microarray coupled with miniaturized cell-culture array technology for high-throughput toxicity screening.

    Science.gov (United States)

    Lee, Moo-Yeal; Dordick, Jonathan S; Clark, Douglas S

    2010-01-01

    Due to poor drug candidate safety profiles that are often identified late in the drug development process, the clinical progression of new chemical entities to pharmaceuticals remains hindered, thus resulting in the high cost of drug discovery. To accelerate the identification of safer drug candidates and improve the clinical progression of drug candidates to pharmaceuticals, it is important to develop high-throughput tools that can provide early-stage predictive toxicology data. In particular, in vitro cell-based systems that can accurately mimic the human in vivo response and predict the impact of drug candidates on human toxicology are needed to accelerate the assessment of drug candidate toxicity and human metabolism earlier in the drug development process. In vitro techniques that provide a high degree of human toxicity prediction will perhaps be even more important in the cosmetic and chemical industries in Europe, as animal toxicity testing is being phased out entirely in the immediate future. We have developed a metabolic enzyme microarray (the Metabolizing Enzyme Toxicology Assay Chip, or MetaChip) and a miniaturized three-dimensional (3D) cell-culture array (the Data Analysis Toxicology Assay Chip, or DataChip) for high-throughput toxicity screening of target compounds and their metabolic enzyme-generated products. The human or rat MetaChip contains an array of encapsulated metabolic enzymes that is designed to emulate the metabolic reactions in the human or rat liver. The human or rat DataChip contains an array of 3D human or rat cells encapsulated in alginate gels for cell-based toxicity screening. By combining the DataChip with the complementary MetaChip, in vitro toxicity results are obtained that correlate well with in vivo rat data.

  19. Radiation metabolomics: a window to high throughput radiation biodosimetry

    International Nuclear Information System (INIS)

    Rana, Poonam

    2016-01-01

    In the event of an intentional or accidental release of ionizing radiation in a densely populated area, timely assessment and triage of the general population for radiation exposure is critical. In particular, a significant number of victims may sustain radiation injury, which increases mortality and worsens the overall prognosis of victims of radiation trauma. The availability of a high-throughput, noninvasive in vivo biodosimetry tool for assessing radiation exposure is of particular importance for timely diagnosis of radiation injury. In this study, we describe the potential of NMR techniques in evaluating radiation injury. NMR is a highly versatile technique that has been used extensively in diverse fields of science since its discovery. NMR and the biomedical sciences have gone hand in hand since its application to clinical imaging as MRI and to the metabolic profiling of biofluids. We have established an NMR-based metabonomic and in vivo spectroscopy approach to identify metabolic profiles that constitute a metabolic fingerprint of radiation exposure. NMR spectroscopy experiments were conducted on urine and serum samples collected from mice irradiated with different doses of radiation. Additionally, in vivo NMR spectroscopy was performed on different regions of the brain post irradiation in an animal model. A number of metabolites associated with energy metabolism, gut flora metabolites, osmolytes, amino acids and membrane metabolism were identified in the serum and urine metabolome. Our results illustrate a metabolic fingerprint of radiation exposure that elucidates perturbed physiological functions. Quantitative as well as multivariate analysis/assessment of these metabolites demonstrated dose- and time-dependent toxicological effects. In vivo spectroscopy of the brain showed radiation-induced changes in the hippocampus region, indicating that whole-body radiation had a striking effect on brain metabolism as well. The results of the present work lay a

  20. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    Science.gov (United States)

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays. DE DeGroot, RS Thomas, and SO Simmons, National Center for Computational Toxicology, US EPA, Research Triangle Park, NC, USA. The EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  1. Geochip: A high throughput genomic tool for linking community structure to functions

    Energy Technology Data Exchange (ETDEWEB)

    Van Nostrand, Joy D.; Liang, Yuting; He, Zhili; Li, Guanghe; Zhou, Jizhong

    2009-01-30

    GeoChip is a comprehensive functional gene array that targets key functional genes involved in the geochemical cycling of N, C, and P, sulfate reduction, metal resistance and reduction, and contaminant degradation. Studies have shown the GeoChip to be a sensitive, specific, and high-throughput tool for microbial community analysis that has the power to link geochemical processes with microbial community structure. However, several challenges remain regarding the development and applications of microarrays for microbial community analysis.

  2. A Cost-effectiveness Analysis of Early vs Late Tracheostomy.

    Science.gov (United States)

    Liu, C Carrie; Rudmik, Luke

    2016-10-01

    The timing of tracheostomy in critically ill patients requiring mechanical ventilation is controversial. An important consideration that is currently missing in the literature is an evaluation of the economic impact of an early tracheostomy strategy vs a late tracheostomy strategy. To evaluate the cost-effectiveness of the early tracheostomy strategy vs the late tracheostomy strategy. This economic analysis was performed using a decision tree model with a 90-day time horizon. The economic perspective was that of the US health care third-party payer. The primary outcome was the incremental cost per tracheostomy avoided. Probabilities were obtained from meta-analyses of randomized clinical trials. Costs were obtained from the published literature and the Healthcare Cost and Utilization Project database. A multivariate probabilistic sensitivity analysis was performed to account for uncertainty surrounding mean values used in the reference case. The reference case demonstrated that the cost of the late tracheostomy strategy was $45,943.81 for 0.36 of effectiveness. The cost of the early tracheostomy strategy was $31,979.12 for 0.19 of effectiveness. The incremental cost-effectiveness ratio for the late tracheostomy strategy compared with the early tracheostomy strategy was $82,145.24 per tracheostomy avoided. With a willingness-to-pay threshold of $50,000, the early tracheostomy strategy is cost-effective with 56% certainty. The adoption of an early vs a late tracheostomy strategy depends on the priorities of the decision-maker. Up to a willingness-to-pay threshold of $80,000 per tracheostomy avoided, the early tracheostomy strategy has a higher probability of being the more cost-effective intervention.
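
    The reference-case figures above can be checked with the usual ICER arithmetic, ICER = (C_A - C_B) / (E_A - E_B), where effectiveness is reported as the probability of avoiding tracheostomy:

```python
# ICER arithmetic using the figures reported in the abstract above.
def icer(cost_a, eff_a, cost_b, eff_b):
    """Incremental cost per additional unit of effect, strategy A vs. strategy B."""
    return (cost_a - cost_b) / (eff_a - eff_b)

late_cost, late_eff   = 45_943.81, 0.36
early_cost, early_eff = 31_979.12, 0.19
print(f"${icer(late_cost, late_eff, early_cost, early_eff):,.2f} per tracheostomy avoided")
# -> $82,145.24, matching the reported incremental cost-effectiveness ratio
```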

  3. Cost-effectiveness analysis in minimally invasive spine surgery.

    Science.gov (United States)

    Al-Khouja, Lutfi T; Baron, Eli M; Johnson, J Patrick; Kim, Terrence T; Drazin, Doniel

    2014-06-01

    Medical care has been evolving with the increased influence of a value-based health care system. As a result, more emphasis is being placed on ensuring cost-effectiveness and utility in the services provided to patients. This study looks at this development in respect to minimally invasive spine surgery (MISS) costs. A literature review using PubMed, the Cost-Effectiveness Analysis (CEA) Registry, and the National Health Service Economic Evaluation Database (NHS EED) was performed. Papers were included in the study if they reported costs associated with minimally invasive spine surgery (MISS). If there was no mention of cost, CEA, cost-utility analysis (CUA), quality-adjusted life year (QALY), quality, or outcomes, then the article was excluded. Fourteen studies reporting costs associated with MISS in 12,425 patients (3675 undergoing minimally invasive procedures and 8750 undergoing open procedures) were identified through PubMed, the CEA Registry, and NHS EED. The percent cost difference between minimally invasive and open approaches ranged from 2.54% to 33.68%, all indicating cost saving with a minimally invasive surgical approach. Average length of stay (LOS) for minimally invasive surgery ranged from 0.93 days to 5.1 days compared with 1.53 days to 12 days for an open approach. All studies reporting estimated blood loss (EBL) reported lower volume loss with an MISS approach (range 10-392.5 ml) than with an open approach (range 55-535.5 ml). There are currently an insufficient number of studies published reporting the costs of MISS. Of the studies published, none have followed a standardized method of reporting and analyzing cost data. Preliminary findings analyzing the 14 studies showed both cost saving and better outcomes in MISS compared with an open approach. However, more Level I CEA/CUA studies including cost/QALY evaluations with specifics of the techniques utilized need to be reported in a standardized manner to make more accurate conclusions on the cost effectiveness of

  4. High-throughput phenotyping and genomic selection: the frontiers of crop breeding converge.

    Science.gov (United States)

    Cabrera-Bosquet, Llorenç; Crossa, José; von Zitzewitz, Jarislav; Serret, María Dolors; Araus, José Luis

    2012-05-01

    Genomic selection (GS) and high-throughput phenotyping have recently been captivating the interest of the crop breeding community from both the public and private sectors world-wide. Both approaches promise to revolutionize the prediction of complex traits, including growth, yield and adaptation to stress. Whereas high-throughput phenotyping may help to improve understanding of crop physiology, the most powerful techniques for high-throughput field phenotyping are empirical rather than analytical, and in that respect comparable to genomic selection. Despite the fact that the two methodological approaches represent the extremes of what is understood as the breeding process (phenotype versus genome), they both consider the targeted traits (e.g. grain yield, growth, phenology, plant adaptation to stress) as a black box instead of dissecting them as a set of secondary traits (i.e. physiological) putatively related to the target trait. Both GS and high-throughput phenotyping have in common their empirical approach enabling breeders to use genome profile or phenotype without understanding the underlying biology. This short review discusses the main aspects of both approaches and focuses on the case of genomic selection of maize flowering traits and near-infrared spectroscopy (NIRS) and plant spectral reflectance as high-throughput field phenotyping methods for complex traits such as crop growth and yield. © 2012 Institute of Botany, Chinese Academy of Sciences.
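
    The "black box" prediction step in genomic selection is typically a whole-genome regression of phenotype on markers. The sketch below uses ridge regression on simulated data as a stand-in for RR-BLUP-style prediction; the panel size, marker coding, and trait model are all invented.

```python
# Hedged sketch of genomic prediction: trait values regressed on genome-wide
# SNP markers, with cross-validated predictive accuracy as the figure of merit.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_lines, n_markers = 300, 2000
X = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)  # SNPs coded 0/1/2
true_effects = rng.normal(0.0, 0.05, size=n_markers)
y = X @ true_effects + rng.normal(0.0, 1.0, size=n_lines)        # trait = genetics + noise

scores = cross_val_score(Ridge(alpha=100.0), X, y, cv=5, scoring="r2")
print(f"cross-validated predictive r^2: {scores.mean():.2f}")
```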

  5. Ethical objections against including life-extension costs in cost-effectiveness analysis: a consistent approach.

    Science.gov (United States)

    Gandjour, Afschin; Müller, Dirk

    2014-10-01

    One of the major ethical concerns regarding cost-effectiveness analysis in health care has been the inclusion of life-extension costs ("it is cheaper to let people die"). For this reason, many analysts have opted to rule out life-extension costs from the analysis. However, surprisingly little has been written in the health economics literature regarding this ethical concern and the resulting practice. The purpose of this work was to present a framework and potential solution for ethical objections against life-extension costs. This work found three levels of ethical concern: (i) with respect to all life-extension costs (disease-related and -unrelated); (ii) with respect to disease-unrelated costs only; and (iii) regarding disease-unrelated costs plus disease-related costs not influenced by the intervention. Excluding all life-extension costs for ethical reasons would require-for reasons of consistency-a simultaneous exclusion of savings from reducing morbidity. At the other extreme, excluding only disease-unrelated life-extension costs for ethical reasons would require-again for reasons of consistency-the exclusion of health gains due to treatment of unrelated diseases. Therefore, addressing ethical concerns regarding the inclusion of life-extension costs necessitates fundamental changes in the calculation of cost effectiveness.

  6. HTTK: R Package for High-Throughput Toxicokinetics

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concent...

  7. An improved set of standards for finding cost for cost-effectiveness analysis.

    Science.gov (United States)

    Barnett, Paul G

    2009-07-01

    Guidelines have helped standardize methods of cost-effectiveness analysis, allowing different interventions to be compared and enhancing the generalizability of study findings. There is agreement that all relevant services be valued from the societal perspective using a long-term time horizon and that more exact methods be used to cost services most affected by the study intervention. Guidelines are not specific enough with respect to costing methods, however. The literature was reviewed to identify the problems associated with the 4 principal methods of cost determination. Microcosting requires direct measurement and is ordinarily reserved for costing novel interventions. Analysts should include nonwage labor cost, person-level and institutional overhead, and the cost of development, set-up activities, supplies, space, and screening. Activity-based cost systems show promise for finding accurate costs of all services provided, but are not widely adopted. Quality must be evaluated and the generalizability of cost estimates to other settings must be considered. Administrative cost estimates, chiefly cost-adjusted charges, are widely used, but the analyst must consider items excluded from the available system. Gross costing methods determine the quantity of services used and employ a unit cost. If the intervention will affect the characteristics of a service, the method should not assume that the service is homogeneous. Questions are posed for future reviews of the quality of costing methods. The analyst must avoid inappropriate assumptions, especially those that bias the analysis by exclusion of costs that are affected by the intervention under study.

  8. Dimensioning storage and computing clusters for efficient High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Scientific experiments are producing huge amounts of data, and they continue increasing the size of their datasets and the total volume of data. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of Scientific Data Centres has shifted from coping efficiently with PetaByte scale storage to delivering quality data-processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful s...

  9. Dimensioning storage and computing clusters for efficient high throughput computing

    International Nuclear Information System (INIS)

    Accion, E; Bria, A; Bernabeu, G; Caubet, M; Delfino, M; Espinal, X; Merino, G; Lopez, F; Martinez, F; Planas, E

    2012-01-01

    Scientific experiments are producing huge amounts of data, and the size of their datasets and total volume of data continue to increase. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of scientific data centers has shifted from efficiently coping with PetaByte scale storage to delivering quality data-processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.
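
    The dimensioning argument is, at its core, a balance between the compute farm's aggregate I/O demand and the bandwidth the storage layer can actually deliver. A back-of-envelope sketch with invented figures:

```python
# Hedged sketch: first-order check that storage bandwidth can feed the farm.
cores          = 4000     # concurrently running batch slots
io_per_job_mbs = 5.0      # average streaming rate per running job (MB/s)
disk_servers   = 60
per_server_mbs = 400.0    # deliverable bandwidth per disk server (MB/s)

demand = cores * io_per_job_mbs
supply = disk_servers * per_server_mbs
print(f"demand {demand / 1e3:.1f} GB/s vs supply {supply / 1e3:.1f} GB/s")
if demand > supply:
    print("storage is the bottleneck: jobs will idle, wasting CPU cycles")
```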

  10. Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening

    Science.gov (United States)

    William R. Kenealy; Thomas W. Jeffries

    2003-01-01

    High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml⁻¹. The assay is linear for sugar...

  11. Cost-effectiveness analysis of algae energy production in the EU

    International Nuclear Information System (INIS)

    Kovacevic, Vujadin; Wesseler, Justus

    2010-01-01

    Today's society relies heavily on fossil fuels as a main energy source. Global energy demand increase, energy security and climate change are the main drivers of the transition towards alternative energy sources. This paper analyses algal biodiesel production for EU road transportation and compares it to fossil fuels and 1st generation biofuels. A cost-effectiveness analysis was used to aggregate private and external costs and derive the social cost of each fuel. The following externalities were internalized: emissions (GHG and non-GHG), food prices impact, pesticides/fertilizers use and security of supply. Currently the social cost of producing algal biodiesel, at 52.3 EUR GJ⁻¹, is higher than that of rapeseed biodiesel (36.0 EUR GJ⁻¹) and fossil fuels (15.8 EUR GJ⁻¹). Biotechnology development, high crude oil prices and high carbon value are the key features of the scenario where algal biodiesel outcompetes all other fuels. A substantial investment into the biotechnology sector and comprehensive environmental research and policy are required to make that scenario a reality. (author)

  12. Economic analysis of three interventions of different intensity in improving school implementation of a government healthy canteen policy in Australia: costs, incremental and relative cost effectiveness.

    Science.gov (United States)

    Reilly, Kathryn L; Reeves, Penny; Deeming, Simon; Yoong, Sze Lin; Wolfenden, Luke; Nathan, Nicole; Wiggers, John

    2018-03-20

    No evaluations of the cost or cost effectiveness of interventions to increase school implementation of food availability policies have been reported. Government and non-government agency decisions regarding the extent of investment required to enhance school implementation of such policies are unsupported by such evidence. This study sought to: i) determine the cost and cost-effectiveness of three interventions in improving school implementation of an Australian government healthy canteen policy; and ii) determine the relative cost-effectiveness of the interventions in improving school implementation of such a policy. An analysis of the cost and cost-effectiveness of three implementation interventions of varying support intensity, relative to usual implementation support, conducted during 2013-2015 was undertaken. Secondly, an indirect comparison of the trials was undertaken to determine the most cost-effective of the three strategies. The economic analysis was based on the cost of delivering the interventions by health service delivery staff to increase the proportion of schools 'adherent' with the policy. The total costs per school were $166,971, $70,926 and $75,682 for the high, medium and low intensity interventions respectively. Compared to usual support, the cost-effectiveness ratios for each of the three interventions were: A$2982 (high intensity), A$2627 (medium intensity) and A$4730 (low intensity) per percent increase in the proportion of schools reporting 'adherence'. Indirect comparison between the 'high' and 'medium intensity' interventions showed no statistically significant difference in cost-effectiveness. The results indicate that while the cost profiles of the interventions varied substantially, the cost-effectiveness did not. This result is valuable to policy makers seeking cost-effective solutions that can be delivered within budget.

  13. High-throughput screening of ionic conductivity in polymer membranes

    International Nuclear Information System (INIS)

    Zapata, Pedro; Basak, Pratyay; Carson Meredith, J.

    2009-01-01

    Combinatorial and high-throughput techniques have been successfully used for efficient and rapid property screening in multiple fields. The use of these techniques can be an advantageous new approach to assay ionic conductivity and accelerate the development of novel materials in research areas such as fuel cells. A high-throughput ionic conductivity (HTC) apparatus is described and applied to screening candidate polymer electrolyte membranes for fuel cell applications. The device uses a miniature four-point probe for rapid, automated point-to-point AC electrochemical impedance measurements in both liquid and humid air environments. The conductivity of Nafion 112 HTC validation standards was within 1.8% of the manufacturer's specification. HTC screening of 40 novel Kynar poly(vinylidene fluoride) (PVDF)/acrylic polyelectrolyte (PE) membranes focused on varying the Kynar type (5x) and PE composition (8x) using reduced sample sizes. Two factors were found to be significant in determining the proton conducting capacity: (1) Kynar PVDF series: membranes containing a particular Kynar PVDF type exhibited statistically identical mean conductivity as other membranes containing different Kynar PVDF types that belong to the same series or family. (2) Maximum effective amount of polyelectrolyte: increments in polyelectrolyte content from 55 wt% to 60 wt% showed no statistically significant effect in increasing conductivity. In fact, some membranes experienced a reduction in conductivity.

  14. Predictive Power of Machine Learning for Optimizing Solar Water Heater Performance: The Potential Application of High-Throughput Screening

    Directory of Open Access Journals (Sweden)

    Hao Li

    2017-01-01

    Predicting the performance of a solar water heater (SWH) is challenging due to the complexity of the system. Fortunately, knowledge-based machine learning can provide a fast and precise prediction method for SWH performance. With the predictive power of machine learning models, we can further solve a more challenging question: how to cost-effectively design a high-performance SWH? Here, we summarize our recent studies and propose a general framework for SWH design using a machine learning-based high-throughput screening (HTS) method. The design of a water-in-glass evacuated tube solar water heater (WGET-SWH) is selected as a case study to show the potential application of machine learning-based HTS to the design and optimization of solar energy systems.
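
    The HTS loop proposed above trains a surrogate model on measured heaters and then screens a large space of hypothetical designs. The sketch below substitutes a random forest and invented design features (tube count, tilt angle, tank volume) for the published models; it illustrates the workflow, not the authors' exact method.

```python
# Hedged sketch of machine-learning-based high-throughput screening (HTS):
# train a surrogate on "measured" designs, then rank many candidate designs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
low, high = [10, 10, 100], [60, 70, 500]            # tubes, tilt (deg), tank (L)
X_train = rng.uniform(low, high, size=(200, 3))     # stand-in for measured heaters
y_train = (0.4 * X_train[:, 0]                      # invented heat-gain proxy
           - 0.01 * (X_train[:, 1] - 45) ** 2
           + 0.05 * X_train[:, 2]
           + rng.normal(0, 2, 200))

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

candidates = rng.uniform(low, high, size=(100_000, 3))   # hypothetical design space
best = candidates[np.argsort(model.predict(candidates))[-5:]]
print("top predicted designs (tubes, tilt deg, tank L):\n", best.round(1))
```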

  15. Cost, affordability and cost-effectiveness of strategies to control tuberculosis in countries with high HIV prevalence

    Directory of Open Access Journals (Sweden)

    Williams Brian G

    2005-12-01

    Background: The HIV epidemic has caused a dramatic increase in tuberculosis (TB) in East and southern Africa. Several strategies have the potential to reduce the burden of TB in high HIV prevalence settings, and cost and cost-effectiveness analyses can help to prioritize them when budget constraints exist. However, published cost and cost-effectiveness studies are limited. Methods: Our objective was to compare the cost, affordability and cost-effectiveness of seven strategies for reducing the burden of TB in countries with high HIV prevalence. A compartmental difference equation model of TB and HIV and recent cost data were used to assess the costs (year 2003 US$ prices) and effects (TB cases averted, deaths averted, DALYs gained) of these strategies in Kenya during the period 2004–2023. Results: The three lowest cost and most cost-effective strategies were improving TB cure rates, improving TB case detection rates, and improving both together. The incremental cost of combined improvements to case detection and cure was below US$15 million per year (7.5% of year 2000 government health expenditure); the mean cost per DALY gained of these three strategies ranged from US$18 to US$34. Antiretroviral therapy (ART) had the highest incremental costs, which by 2007 could be as large as total government health expenditures in year 2000. ART could also gain more DALYs than the other strategies, at a cost per DALY gained of around US$260 to US$530. Both the costs and effects of treatment for latent tuberculosis infection (TLTI) for HIV+ individuals were low; the cost per DALY gained ranged from about US$85 to US$370. Averting one HIV infection for less than US$250 would be as cost-effective as improving TB case detection and cure rates to WHO target levels. Conclusion: To reduce the burden of TB in high HIV prevalence settings, the immediate goal should be to increase TB case detection rates and, to the extent possible, improve TB cure rates, preferably
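
    The compartmental difference-equation machinery behind such a model steps population fractions through yearly updates. The toy below is far simpler than the study's TB/HIV model and uses arbitrary parameters; it only illustrates how raising the case-detection rate feeds back into prevalence.

```python
# Hedged toy model: two compartments, yearly difference equations, constant
# population (removals re-enter the susceptible pool). Parameters are invented.
def simulate(detect_rate, years=20, beta=1.0, recover=0.4, mort=0.1, cure=0.85):
    S, A = 0.99, 0.01                            # susceptible / active-TB fractions
    for _ in range(years):
        new_inf = beta * S * A                   # new active cases this year
        removed = (detect_rate * cure + recover + mort) * A
        S += removed - new_inf                   # cured, recovered, replaced deaths
        A += new_inf - removed
    return A

for dr in (0.45, 0.70):                          # baseline vs. improved case detection
    print(f"case detection {dr:.0%}: active-TB prevalence after 20y = {simulate(dr):.2%}")
```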

  16. High throughput generation and trapping of individual agarose microgel using microfluidic approach

    KAUST Repository

    Shi, Yang; Gao, Xinghua; Chen, Longqing; Zhang, Min; Ma, Jingyun; Zhang, Xixiang; Qin, Jianhua

    2013-02-28

    Microgels are biocompatible polymeric materials that are widely used as micro-carriers in materials synthesis, drug delivery and cell biology applications. However, high-throughput generation of individual microgels for on-site analysis in a microdevice remains a challenge. Here, we present a simple and stable droplet microfluidic system that achieves high-throughput generation and trapping of individual agarose microgels, based on the synergistic effect of surface tension and hydrodynamic forces in microchannels, and apply it to real-time 3-D cell culture. The system consists of droplet generators with a flow-focusing T-junction and an array of individual trap structures. The whole process, including independent agarose microgel formation, immobilization in the trapping array and in situ gelation by cooling, is carried out entirely on the integrated microdevice. The performance of this system was demonstrated by encapsulating and culturing adenoid cystic carcinoma (ACCM) cells in the gelated agarose microgels. The approach is simple and easy to operate; it can generate micro-carriers with different components in parallel and monitor cell behavior in a 3D matrix in real-time. It can also be extended to applications in materials synthesis and tissue engineering. © 2013 Springer-Verlag Berlin Heidelberg.

  17. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures for evaluating the statistical significance of local trend scores have limited its application to high-throughput time series data, e.g., data from studies based on next-generation sequencing technology. By extending the theory for the tail probability of the range of a sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with a delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and by the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach that integrates the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use in the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data, where we found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package, which now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
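
    As a point of reference for what the Markov-chain formulae replace, the sketch below implements a deliberately simplified local trend score (sign-of-change agreement with the best positively scoring local run) together with its slow permutation p-value; it illustrates the idea, not the eLSA implementation.

    ```python
    import numpy as np

    def trend(x):
        return np.sign(np.diff(x))                    # +1 up, -1 down, 0 flat

    def local_trend_score(x, y):
        s = trend(x) * trend(y)                       # +1 agree, -1 disagree
        best = run = 0.0
        for v in s:                                   # best local run of agreement
            run = max(0.0, run + v)
            best = max(best, run)
        return best / len(s)

    def permutation_pvalue(x, y, n_perm=2000, seed=0):
        rng = np.random.default_rng(seed)
        observed = local_trend_score(x, y)
        hits = sum(local_trend_score(rng.permutation(x), y) >= observed
                   for _ in range(n_perm))
        return (hits + 1) / (n_perm + 1)
    ```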

  18. Validation of a Microscale Extraction and High Throughput UHPLC-QTOF-MS Analysis Method for Huperzine A in Huperzia

    Science.gov (United States)

    Cuthbertson, Daniel; Piljac-Žegarac, Jasenka; Lange, Bernd Markus

    2011-01-01

    Herein we report an improved method for the microscale extraction of huperzine A (HupA), an acetylcholinesterase-inhibiting alkaloid, from as little as 3 mg of tissue homogenate from the clubmoss Huperzia squarrosa (G. Forst.) Trevis, with 99.95% recovery. We also validated a novel UHPLC-QTOF-MS method for the high-throughput analysis of H. squarrosa extracts in only 6 min, which, in combination with the very low limit of detection (20 pg on column) and the wide linear range for quantification (20 to 10,000 pg on column), allows for highly efficient screening of extracts containing varying amounts of HupA. Utilization of this methodology has the potential to conserve valuable plant resources. PMID:22275140

  19. High throughput "omics" approaches to assess the effects of phytochemicals in human health studies

    Czech Academy of Sciences Publication Activity Database

    Ovesná, J.; Slabý, O.; Toussaint, O.; Kodíček, M.; Maršík, Petr; Pouchová, V.; Vaněk, Tomáš

    2008-01-01

    Roč. 99, E-S1 (2008), ES127-ES134 ISSN 0007-1145 R&D Projects: GA MŠk(CZ) 1P05OC054 Institutional research plan: CEZ:AV0Z50380511 Keywords : Nutrigenomics * Phytochemicals * High throughput platforms Subject RIV: GM - Food Processing Impact factor: 2.764, year: 2008

  1. High-throughput phenotyping of large wheat breeding nurseries using unmanned aerial system, remote sensing and GIS techniques

    Science.gov (United States)

    Haghighattalab, Atena

    Wheat breeders are in a race for genetic gain to secure the future nutritional needs of a growing population. Multiple barriers exist in the acceleration of crop improvement. Emerging technologies are reducing these obstacles. Advances in genotyping technologies have significantly decreased the cost of characterizing the genetic make-up of candidate breeding lines. However, this is just part of the equation. Field-based phenotyping informs a breeder's decision as to which lines move forward in the breeding cycle. This has long been the most expensive and time-consuming, though most critical, aspect of breeding. The grand challenge remains in connecting genetic variants to observed phenotypes, followed by predicting phenotypes based on the genetic composition of lines or cultivars. In this context, the current study was undertaken to investigate the utility of UAS in the assessment of field trials in wheat breeding programs. The major objective was to integrate remotely sensed data with geospatial analysis for high-throughput phenotyping of large wheat breeding nurseries. The initial step was to develop and validate a semi-automated high-throughput phenotyping pipeline using a low-cost UAS and NIR camera, image processing, and radiometric calibration to build orthomosaic imagery and 3D models. The relationship between plot-level data (vegetation indices and height) extracted from UAS imagery and manual measurements was examined and found to show a high correlation. Data derived from UAS imagery performed as well as manual measurements while exponentially increasing the amount of data available. The high-resolution, high-temporal-frequency HTP data extracted from this pipeline offered the opportunity to develop a within-season grain yield prediction model. Due to the variety of genotypes and environmental conditions, breeding trials are inherently spatial in nature and vary non-randomly across the field. This makes geographically weighted regression models a good choice as a
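
    For a sense of the plot-level extraction step, the sketch below computes NDVI from co-registered red and near-infrared rasters and averages it per plot. The array handling is schematic; the pipeline described above works from the calibrated orthomosaic with plot polygons and adds height from the 3D models.

    ```python
    import numpy as np

    def ndvi(nir, red, eps=1e-9):
        """Normalized difference vegetation index, computed per pixel."""
        nir = nir.astype(float)
        red = red.astype(float)
        return (nir - red) / (nir + red + eps)

    def plot_means(index_raster, plot_mask):
        """plot_mask: integer raster with one id per breeding plot (0 = background)."""
        return {int(i): float(index_raster[plot_mask == i].mean())
                for i in np.unique(plot_mask) if i != 0}
    ```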

  2. Space Link Extension Protocol Emulation for High-Throughput, High-Latency Network Connections

    Science.gov (United States)

    Tchorowski, Nicole; Murawski, Robert

    2014-01-01

    New space missions require higher data rates and new protocols to meet these requirements. These high-data-rate space communication links push the limitations not only of the space communication links, but also of the ground communication networks and protocols which forward user data to remote ground stations (GS) for transmission. The Consultative Committee for Space Data Systems (CCSDS) Space Link Extension (SLE) standard protocol is one protocol that has been proposed for use by the NASA Space Network (SN) Ground Segment Sustainment (SGSS) program. New protocol implementations must be carefully tested to ensure that they provide the required functionality, especially because of the remote nature of spacecraft. The SLE protocol standard has been tested in the NASA Glenn Research Center's SCENIC Emulation Lab in order to observe its operation under realistic network delay conditions. More specifically, the delay between the NASA Integrated Services Network (NISN) and spacecraft has been emulated. The round trip time (RTT) delay for the continental NISN network has been shown to be up to 120 ms; as such, the SLE protocol was tested with network delays ranging from 0 ms to 200 ms. Both a base network condition and an SLE connection were tested with these RTT delays, and the reaction of both network tests to the delay conditions was recorded. Throughput for both of these links was set at 1.2 Gbps. The results show that, in the presence of realistic network delay, the SLE link throughput is significantly reduced, while the base network throughput remained at the 1.2 Gbps specification. The decrease in SLE throughput has been attributed to the implementation's use of blocking calls. The decrease in throughput is not acceptable for high-data-rate links, as the link requires constant data flow in order for spacecraft and ground radios to stay synchronized, unless significant data is queued at the ground station. In cases where queuing the data is not an option

  3. Cost effectiveness analysis in radiopharmacy

    International Nuclear Information System (INIS)

    Carpentier, N.; Verbeke, S.; Ducloux, T.

    1999-01-01

    Objective: to evaluate the cost effectiveness of radiopharmaceuticals and their quality control. Materials and methods: this retrospective study was carried out in the Nuclear Medicine Department of the University Hospital of Limoges. Radiopharmaceutical costs were obtained by adding the price of the radiotracer, the materials, the equipment, the labour, the running expenses and the radioisotope. The costs of quality control were obtained by adding the price of labour, materials, equipment, running expenses and the cost of the quality control of the 99m Tc eluate. Results: during 1998, 2106 radiopharmaceuticals were prepared in the Nuclear Medicine Department. The mean cost of a radiopharmaceutical was 1430 francs (range: 846 to 4260). The mean cost of quality control was 163 francs (range: 84 to 343). Quality control increased the cost of a radiopharmaceutical by 11%. Conclusion: the technical methodology of quality control must be mastered to optimize the cost of this operation. (author)

  4. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis.

    Science.gov (United States)

    Du, Yushen; Wu, Nicholas C; Jiang, Lin; Zhang, Tianhao; Gong, Danyang; Shu, Sara; Wu, Ting-Ting; Sun, Ren

    2016-11-01

    Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available. To fully comprehend the diverse functions of a protein, it is essential to understand the functionality of individual residues. Current methods are highly dependent on evolutionary sequence conservation, which is

  5. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    Science.gov (United States)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
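
    The "skim first, model later" pattern the authors describe maps naturally onto a few dataframe operations. The sketch below is written with PySpark; the paths, field names and aggregation are invented for illustration and assume the raw records carry a proper timestamp column.

    ```python
    # Hedged sketch of skimming large monitoring data down to a relevant slice.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("metric-skim").getOrCreate()

    raw = spark.read.json("hdfs:///monitoring/raw/*.json")   # hypothetical layout

    skimmed = (raw
               .filter(F.col("service") == "eos")            # keep one data source
               .filter(F.col("value") > 0)                   # drop empty readings
               .groupBy("host", F.window("timestamp", "1 hour"))
               .agg(F.avg("value").alias("hourly_avg")))

    skimmed.write.mode("overwrite").parquet("hdfs:///monitoring/skimmed/eos_hourly")
    ```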

  6. Cost-Effectiveness Analysis of Different Genetic Testing Strategies for Lynch Syndrome in Taiwan.

    Directory of Open Access Journals (Sweden)

    Ying-Erh Chen

    Full Text Available Patients with Lynch syndrome (LS) have a significantly increased risk of developing colorectal cancer (CRC) and other cancers. Genetic screening for LS among patients with newly diagnosed CRC aims to identify mutations in the disease-causing genes (i.e., the DNA mismatch repair genes) in the patients, to offer genetic testing to relatives of the patients with the mutations, and then to provide early prevention for the relatives with the mutations. Several genetic tests are available for LS, such as DNA sequencing of the MMR genes and tumor testing using microsatellite instability and immunohistochemical analyses. Cost-effectiveness analyses of different genetic testing strategies for LS have been performed in several studies from different countries, such as the US and Germany. However, a cost-effectiveness analysis of such testing had not yet been performed in Taiwan. In this study, we evaluated the cost-effectiveness of four genetic testing strategies for LS described in previous studies, using population-specific parameters such as the mutation rates of the DNA mismatch repair genes and the treatment costs for CRC in Taiwan. The incremental cost-effectiveness ratios, based on discounted life years gained due to genetic screening, were calculated for the strategies relative to no screening and to the previous strategy. Using the World Health Organization standard, which was defined based on Taiwan's Gross Domestic Product per capita, the strategy based on immunohistochemistry as a genetic test followed by BRAF mutation testing was considered to be highly cost-effective relative to no screening. Our probabilistic sensitivity analysis results also suggest that this strategy has a probability of 0.939 of being cost-effective relative to no screening, based on the commonly used threshold of $50,000 to determine cost-effectiveness. To the best of our knowledge, this is the first cost-effectiveness analysis for evaluating different genetic testing
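
    The probabilistic sensitivity analysis behind a statement like "probability 0.939 of being cost-effective" can be sketched in a few lines: draw incremental costs and effects from their distributions and count how often the net monetary benefit is positive at a given willingness to pay. The distributions below are invented, not the study's inputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    d_cost = rng.normal(30_000, 8_000, n)     # incremental cost per person (hypothetical)
    d_effect = rng.normal(1.2, 0.4, n)        # incremental life years gained (hypothetical)

    for wtp in (20_000, 50_000, 100_000):     # willingness-to-pay thresholds
        nmb = wtp * d_effect - d_cost         # net monetary benefit per simulation
        print(wtp, (nmb > 0).mean())          # P(strategy is cost-effective)
    ```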

  7. Above Bonneville passage and propagation cost effectiveness analysis

    International Nuclear Information System (INIS)

    Paulsen, C.M.; Hyman, J.B.; Wernstedt, K.

    1993-05-01

    We have developed several models to evaluate the cost-effectiveness of alternative strategies to mitigate hydrosystem impacts on salmon and steelhead, and applied these models to areas of the Columbia River Basin. Our latest application evaluates the cost-effectiveness of proposed strategies that target mainstem survival (e.g., predator control, increases in water velocity) and subbasin propagation (e.g., habitat improvements, screening, hatchery production increases) for chinook salmon and steelhead stocks, in the portion of the Columbia Basin bounded by Bonneville, Chief Joseph, Dworshak, and Hells Canyon dams. At its core, the analysis primarily considers financial cost and biological effectiveness, but we have included other attributes which may be of concern to the region.

  8. Link Analysis of High Throughput Spacecraft Communication Systems for Future Science Missions

    Science.gov (United States)

    Simons, Rainee N.

    2015-01-01

    NASA's plan to launch several spacecraft into low Earth orbit (LEO) to support science missions in the next ten years and beyond requires downlink throughput on the order of several terabits per day. The ability to handle such a large volume of data far exceeds the capabilities of current systems. This paper proposes two solutions: first, a high-data-rate link between the LEO spacecraft and ground via relay satellites in geostationary orbit (GEO); second, a high-data-rate direct-to-ground link from LEO. The paper then presents results from computer simulations carried out for both types of links, taking into consideration spacecraft transmitter frequency, EIRP, and waveform; elevation-angle-dependent path loss through Earth's atmosphere; and ground station receiver G/T.
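
    Simulations of this kind start from a link budget. As a back-of-the-envelope illustration (all numbers are placeholders, not the paper's mission parameters), the received carrier-to-noise-density ratio in decibel terms is C/N0 = EIRP - L + G/T - k:

    ```python
    import math

    def fspl_db(freq_hz, range_m):
        """Free-space path loss: 20*log10(4*pi*d*f/c)."""
        c = 299_792_458.0
        return 20 * math.log10(4 * math.pi * range_m * freq_hz / c)

    eirp_dbw = 50.0                                  # spacecraft EIRP (hypothetical)
    gt_dbk = 30.0                                    # receiver figure of merit G/T
    k_dbw = -228.6                                   # Boltzmann constant, dBW/(K*Hz)
    loss_db = fspl_db(26e9, 2_000_000) + 3.0         # Ka-band slant path + atmosphere

    cn0_dbhz = eirp_dbw - loss_db + gt_dbk - k_dbw
    print(f"C/N0 = {cn0_dbhz:.1f} dB-Hz")
    ```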

  9. Cost-effectiveness analysis: what it really means for transfusion medicine decision making.

    Science.gov (United States)

    Custer, Brian; Hoch, Jeffrey S

    2009-01-01

    Some have suggested that "blood is different," and the role for cost-effectiveness is thus circumscribed. In this article, the authors start by reviewing key concepts in health economics and economic analysis methods. Examples are drawn from published blood safety studies. After explaining the underlying reasoning behind cost-effectiveness analysis, the authors point out how economic thinking is evident in some aspects of transfusion medicine. Some cost-effectiveness study results for blood safety are discussed to provide context, followed by consideration of prominent decisions that have been made in transfusion medicine field. In the last section, the authors conjecture as to why in some cases cost-effectiveness analysis appears to have greater impact than in others, noting the terrible price that is paid in mortality and morbidity when cost-effectiveness analysis is ignored. In this context, the implications of opportunity cost are discussed, and it is noted that opportunity cost should not be viewed as benefits forgone by concentrating on one aspect of blood safety and instead should be viewed as our societal willingness to misallocate resources to achieve less health for the same cost.

  10. The analysis of cost-effectiveness of implant and conventional fixed dental prosthesis.

    Science.gov (United States)

    Chun, June Sang; Har, Alix; Lim, Hyun-Pil; Lim, Hoi-Jeong

    2016-02-01

    This study analyzed the cost-effectiveness of the implant and the conventional fixed dental prosthesis (CFDP) from a single-treatment perspective. A Markov model for cost-effectiveness analysis of the implant and CFDP was run over a maximum of 50 years. Probabilistic sensitivity analysis was performed with 10,000 Monte Carlo simulations, and cost-effectiveness acceptability curves (CEAC) were also presented. Results from meta-analysis studies were used to determine the survival and complication rates of the implant and CFDP. Data regarding the cost of each treatment method were collected from a University Dental Hospital and Statistics Korea for 2013. Using the results of a patient satisfaction survey, the quality-adjusted prosthesis years (QAPY) of the implant and CFDP strategies were evaluated with an annual discount rate. When only direct costs were considered, implants were more cost-effective when the willingness to pay (WTP) exceeded 10,000 won at the 10th year after treatment, and were more cost-effective regardless of the WTP from the 20th year after the prosthodontic treatment. When indirect costs were added to direct costs, implants were more cost-effective only when the WTP exceeded 75,000 won at the 10th year and 35,000 won at the 20th year after prosthodontic treatment; otherwise the CFDP was more cost-effective. The cost-effectiveness tendency thus shifted from CFDP to implant as time passed.
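
    To make the mechanics concrete, here is a minimal discounted Markov cohort model in the spirit of the one described above. The states, transition probabilities, costs and utilities are invented placeholders, not the study's inputs.

    ```python
    # Hedged sketch: discounted Markov cohort model for a dental prosthesis.
    states = ["functional", "complication", "failed"]
    P = {"functional":   {"functional": 0.95, "complication": 0.04, "failed": 0.01},
         "complication": {"functional": 0.70, "complication": 0.25, "failed": 0.05},
         "failed":       {"failed": 1.0}}
    annual_cost = {"functional": 50.0, "complication": 400.0, "failed": 1200.0}
    utility = {"functional": 1.0, "complication": 0.8, "failed": 0.0}

    dist = {"functional": 1.0, "complication": 0.0, "failed": 0.0}
    discount = 0.05
    total_cost = total_qapy = 0.0
    for year in range(50):
        w = 1.0 / (1.0 + discount) ** year
        total_cost += w * sum(dist[s] * annual_cost[s] for s in states)
        total_qapy += w * sum(dist[s] * utility[s] for s in states)
        dist = {t: sum(dist[s] * P[s].get(t, 0.0) for s in states) for t in states}

    print(total_cost, total_qapy)   # feed into an ICER against the comparator
    ```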

  11. Breast cancer histopathology image analysis: a review

    NARCIS (Netherlands)

    Veta, M.; Pluim, J.P.W.; Diest, van P.J.; Viergever, M.A.

    2014-01-01

    This paper presents an overview of methods that have been proposed for the analysis of breast cancer histopathology images. This research area has become particularly relevant with the advent of whole slide imaging (WSI) scanners, which can perform cost-effective and high-throughput histopathology

  12. Costs and cost-effectiveness of periviable care.

    Science.gov (United States)

    Caughey, Aaron B; Burchfield, David J

    2014-02-01

    With increasing concerns regarding rapidly expanding healthcare costs, cost-effectiveness analysis allows assessment of whether marginal gains from new technology are worth the increased costs. Particular methodologic issues related to cost and cost-effectiveness analysis in the area of neonatal and periviable care include how costs are estimated, such as the use of charges and whether long-term costs are included; the challenges of measuring utilities; and whether to use a maternal, neonatal, or dual perspective in such analyses. A number of studies over the past three decades have examined the costs and the cost-effectiveness of neonatal and periviable care. Broadly, while neonatal care is costly, it is also cost effective as it produces both life-years and quality-adjusted life-years (QALYs). However, as the gestational age of the neonate decreases, the costs increase and the cost-effectiveness threshold is harder to achieve. In the periviable range of gestational age (22-24 weeks of gestation), whether the care is cost effective is questionable and is dependent on the perspective. Understanding the methodology and salient issues of cost-effectiveness analysis is critical for researchers, editors, and clinicians to accurately interpret results of the growing body of cost-effectiveness studies related to the care of periviable pregnancies and neonates. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. POEM is a cost-effective procedure: cost-utility analysis of endoscopic and surgical treatment options in the management of achalasia.

    Science.gov (United States)

    Miller, Heidi J; Neupane, Ruel; Fayezizadeh, Mojtaba; Majumder, Arnab; Marks, Jeffrey M

    2017-04-01

    Achalasia is a rare motility disorder of the esophagus. Treatment is palliative, with the goals of symptom remission and slowing the progression of the disease. Treatment options include per-oral endoscopic myotomy (POEM), laparoscopic Heller myotomy (LM) and endoscopic treatments such as pneumatic dilation (PD) and botulinum toxin type A injections (BI). We evaluate the economics and cost-effectiveness of treating achalasia. We performed a cost analysis for POEM, LM, PD and BI at our institution from 2011 to 2015. The cost of LM was set to 1, and the other procedures are presented relative to this reference. Cost-effectiveness was calculated based on cost, the number of interventions required for optimal results for dilations and injections, and the efficacy reported in the current literature. The incremental cost-effectiveness ratio was calculated by a cost-utility analysis using the quality-adjusted prosthesis-free measure of a quality-adjusted life year gained, defined as a symptom-free year in a patient with achalasia. The average number of interventions required was 2.3 dilations or two injections, for efficacies of 80 and 61%, respectively. POEM cost 1.058 times the cost of LM, and PD and BI cost 0.559 and 0.448 times the cost of LM. The annual cost per cure over a period of 4 years for POEM and LM was consistently equivalent, trending the same as PD, although the latter has a lower initial cost. The cost per cure of BI remains stable over 3 years and then doubles. The cost-effectiveness of POEM and LM is equivalent. Myotomy, either surgical or endoscopic, is more cost-effective than BI due to the high failure rates of the more economical intervention. When treatment is being considered, BI should be utilized in patients with less than 2 years' life expectancy. Pneumatic dilations are cost-effective and an acceptable approach to the treatment of achalasia, although myotomy has a lower relapse rate and is cost-effective compared to PD after 2 years.

  14. High throughput production of mouse monoclonal antibodies using antigen microarrays

    DEFF Research Database (Denmark)

    De Masi, Federico; Chiarella, P.; Wilhelm, H.

    2005-01-01

    Recent advances in proteomics research underscore the increasing need for high-affinity monoclonal antibodies, which are still generated with lengthy, low-throughput antibody production techniques. Here we present a semi-automated, high-throughput method of hybridoma generation and identification. Monoclonal antibodies were raised to different targets in single batch runs of 6-10 wk using multiplexed immunisations, automated fusion and cell-culture, and a novel antigen-coated microarray-screening assay. In a large-scale experiment, where eight mice were immunized with ten antigens each, we generated

  15. Use of high-throughput mass spectrometry to elucidate host pathogen interactions in Salmonella

    Energy Technology Data Exchange (ETDEWEB)

    Rodland, Karin D.; Adkins, Joshua N.; Ansong, Charles; Chowdhury, Saiful M.; Manes, Nathan P.; Shi, Liang; Yoon, Hyunjin; Smith, Richard D.; Heffron, Fred

    2008-12-01

    Capabilities in mass spectrometry are evolving rapidly, with recent improvements in sensitivity, data analysis, and, most important from the standpoint of this review, much higher throughput, allowing analysis of many samples in a single day. This short review describes how these improvements in mass spectrometry can be used to dissect host-pathogen interactions using Salmonella as a model system. This approach enabled direct identification of the majority of annotated Salmonella proteins, quantitation of expression changes under various in vitro growth conditions, and new insights into virulence and expression of Salmonella proteins within host cells. One of the most significant findings is that a very high percentage of all annotated genes (>20%) in Salmonella are regulated post-transcriptionally. In addition, new and unexpected interactions have been identified for several Salmonella virulence regulators that involve protein-protein interactions, suggesting additional functions of these regulators in coordinating virulence expression. Overall, high-throughput mass spectrometry provides a new view of pathogen-host interactions, emphasizing the protein products and defining how protein interactions determine the outcome of infection.

  16. High throughput electrospinning of high-quality nanofibers via an aluminum disk spinneret

    Science.gov (United States)

    Zheng, Guokuo

    In this work, a simple and efficient needleless high-throughput electrospinning process using an aluminum disk spinneret with 24 holes is described. Electrospun mats produced by this setup consisted of fine (nano-sized) fibers of the highest quality, while the productivity (yield) was many times that obtained from conventional single-needle electrospinning. The goal was to produce, at scale, nanofibers of the same or better quality than those produced with the single-needle laboratory setup, under varying concentration, voltage, and working distance. The fiber mats produced were either polymer or ceramic (such as molybdenum trioxide nanofibers). Through experimentation, the optimum process conditions were found to be 24 kV and a distance to collector of 15 cm. More dilute solutions resulted in smaller-diameter fibers. Comparing the morphologies of the MoO3 nanofibers produced by the traditional and the high-throughput setups showed that they were very similar. Moreover, the nanofiber production rate is nearly 10 times that of traditional needle electrospinning. Thus, the high-throughput process has the potential to become an industrial nanomanufacturing process, and the materials processed by it may be used as filtration devices, in tissue engineering, and as sensors.

  17. High-throughput phenotyping allows for QTL analysis of defense, symbiosis and development-related traits

    DEFF Research Database (Denmark)

    Hansen, Nina Eberhardtsen

    …high-throughput phenotyping of whole plants. Additionally, a system for automated confocal microscopy aiming at automated detection of infection thread formation, as well as detection of lateral root and nodule primordia, is being developed. The objective was to use both systems in genome-wide association studies and mutant … the analysis. Additional phenotyping of defense mutants revealed that MLO, which confers susceptibility towards Blumeria graminis in barley, is also a prime candidate for an S. trifoliorum susceptibility gene in Lotus.

  18. Toward Low-Cost, High-Energy Density, and High-Power Density Lithium-Ion Batteries

    Science.gov (United States)

    Li, Jianlin; Du, Zhijia; Ruther, Rose E.; AN, Seong Jin; David, Lamuel Abraham; Hays, Kevin; Wood, Marissa; Phillip, Nathan D.; Sheng, Yangping; Mao, Chengyu; Kalnaus, Sergiy; Daniel, Claus; Wood, David L.

    2017-09-01

    Reducing cost and increasing energy density are two barriers for the widespread application of lithium-ion batteries in electric vehicles. Although the cost of electric vehicle batteries has been reduced by 70% from 2008 to 2015, the current battery pack cost (US$268/kWh in 2015) is still more than twice the USABC target (US$125/kWh). Even though many advancements in cell chemistry have been realized since the lithium-ion battery was first commercialized in 1991, few major breakthroughs have occurred in the past decade. Therefore, future cost reduction will rely on cell manufacturing and broader market acceptance. This article discusses three major aspects of cost reduction: (1) quality control to minimize the scrap rate in cell manufacturing; (2) novel electrode processing and engineering to reduce processing cost and increase energy density and throughput; and (3) materials development and optimization for lithium-ion batteries with high energy density. Insights on increasing the energy and power densities of lithium-ion batteries are also addressed.

  19. High-Throughput Sequencing, a Versatile Weapon to Support Genome-Based Diagnosis in Infectious Diseases: Applications to Clinical Bacteriology

    Directory of Open Access Journals (Sweden)

    Ségolène Caboche

    2014-04-01

    Full Text Available The recent progress of high-throughput sequencing (HTS) technologies enables easy and cost-reduced access to whole genome sequencing (WGS) or re-sequencing. HTS associated with adapted, automatic and fast bioinformatics solutions for sequencing applications promises an accurate and timely identification and characterization of pathogenic agents. Many studies have demonstrated that data obtained from HTS analysis have allowed genome-based diagnosis, which has been consistent with phenotypic observations. These proofs of concept are probably the first steps toward the future of clinical microbiology. From concept to routine use, many parameters need to be considered to promote HTS as a powerful tool to help physicians and clinicians in microbiological investigations. This review highlights the milestones to be completed toward this purpose.

  1. A Novel Multiparametric Drug-Scoring Method for High-Throughput Screening of 3D Multicellular Tumor Spheroids Using the Celigo Image Cytometer.

    Science.gov (United States)

    Cribbes, Scott; Kessel, Sarah; McMenemy, Scott; Qiu, Jean; Chan, Leo Li-Ying

    2017-06-01

    Three-dimensional (3D) tumor models have been increasingly used to investigate and characterize cancer drug compounds. The ability to perform high-throughput screening of 3D multicellular tumor spheroids (MCTS) can highly improve the efficiency and cost-effectiveness of discovering potential cancer drug candidates. Previously, the Celigo Image Cytometer has demonstrated a novel method for high-throughput screening of 3D multicellular tumor spheroids. In this work, we employed the Celigo Image Cytometer to examine the effects of 14 cancer drug compounds on 3D MCTS of the glioblastoma cell line U87MG in 384-well plates. Using parameters such as MCTS diameter and invasion area, growth and invasion were monitored for 9 and 3 d, respectively. Furthermore, fluorescent staining with calcein AM, propidium iodide, Hoechst 33342, and caspase 3/7 was performed at day 9 posttreatment to measure viability and apoptosis. Using the kinetic and endpoint data generated, we created a novel multiparametric drug-scoring system for 3D MCTS that can be used to identify and classify potential drug candidates earlier in the drug discovery process. Furthermore, the combination of quantitative and qualitative image data can be used to delineate differences between drugs that induce cytotoxic and cytostatic effects. The 3D MCTS-based multiparametric scoring method described here can provide an alternative screening method to better qualify tested drug compounds.
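
    A multiparametric score of this kind typically normalizes each endpoint across the screen and combines them so that a single number ranks compounds. The sketch below is a generic illustration of that pattern, with invented endpoints and equal weights, not the published Celigo scoring scheme.

    ```python
    import numpy as np

    def normalized(v, invert=False):
        v = (v - v.min()) / (v.max() - v.min() + 1e-12)   # scale endpoint to 0..1
        return 1.0 - v if invert else v

    def drug_score(diameter, invasion, viability, apoptosis):
        parts = [normalized(diameter, invert=True),       # smaller spheroid = better
                 normalized(invasion, invert=True),       # less invasion = better
                 normalized(viability, invert=True),      # lower viability = better
                 normalized(apoptosis)]                   # more apoptosis = better
        return np.mean(parts, axis=0)                     # one score per compound
    ```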

  2. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  3. A cost-effectiveness analysis of two different repositioning strategies for the prevention of pressure ulcers.

    Science.gov (United States)

    Marsden, Grace; Jones, Katie; Neilson, Julie; Avital, Liz; Collier, Mark; Stansby, Gerard

    2015-12-01

    To assess the cost-effectiveness of two repositioning strategies and inform the 2014 National Institute for Health and Care Excellence clinical guideline recommendations on pressure ulcer prevention. Pressure ulcers are distressing events, caused when skin and underlying tissues are placed under pressure sufficient to impair blood supply. They can have a substantial impact on quality of life and have significant resource implications. Repositioning is a key prevention strategy, but can be resource intensive, leading to variation in practice. This economic analysis was conducted to identify the most cost-effective repositioning strategy for the prevention of pressure ulcers. The economic analysis took the form of a cost-utility model. The clinical inputs to the model were taken from a systematic review of clinical data. The population in the model was older people in a nursing home. The economic model was developed with members of the guideline development group and included costs borne by the UK National Health Service. Outcomes were expressed as costs and quality-adjusted life years. Despite being marginally more clinically effective, alternating 2- and 4-hourly repositioning is not a cost-effective use of UK National Health Service resources (compared with 4-hourly repositioning) for this high-risk group of patients at a cost-effectiveness threshold of £20,000 per quality-adjusted life year. These results were used to inform the clinical guideline recommendations for those who are at high risk of developing pressure ulcers. © 2015 John Wiley & Sons Ltd.

  4. Can delivery systems use cost-effectiveness analysis to reduce healthcare costs and improve value?

    Science.gov (United States)

    Savitz, Lucy A; Savitz, Samuel T

    2016-01-01

    Understanding costs and ensuring that we demonstrate value in healthcare is a foundational presumption as we transform the way we deliver and pay for healthcare in the U.S. With a focus on population health and payment reforms underway, there is increased pressure to examine cost-effectiveness in healthcare delivery. Cost-effectiveness analysis (CEA) is a type of economic analysis comparing the costs and effects (i.e. health outcomes) of two or more treatment options. The result is expressed as a ratio in which the denominator is the gain in health from a measure (e.g. years of life or quality-adjusted years of life) and the numerator is the incremental cost associated with that health gain. For higher-cost interventions, the lower the ratio of costs to effects, the higher the value. While CEA is not new, the approach continues to be refined with enhanced statistical techniques and standardized methods. This article describes the CEA approach and contrasts it with alternative approaches, in order for readers to fully appreciate caveats and concerns. As an economic evaluation tool, CEA can be easily misused owing to inappropriate assumptions, overreliance, and misapplication. Twelve issues to be considered in using CEA results to drive healthcare delivery decision-making are summarized. Appropriately recognizing both the strengths and the limitations of CEA is necessary for informed resource allocation in achieving the maximum value for healthcare services provided.

  5. Filtering high-throughput protein-protein interaction data using a combination of genomic features

    Directory of Open Access Journals (Sweden)

    Patil Ashwini

    2005-04-01

    Full Text Available Abstract Background Protein-protein interaction data used in the creation or prediction of molecular networks are usually obtained from large-scale or high-throughput experiments. These experimental data are liable to contain a large number of spurious interactions. Hence, there is a need to validate the interactions and filter out the incorrect data before using them in prediction studies. Results In this study, we use a combination of 3 genomic features – structurally known interacting Pfam domains, Gene Ontology annotations and sequence homology – as a means to assign reliability to the protein-protein interactions in Saccharomyces cerevisiae determined by high-throughput experiments. Using Bayesian network approaches, we show that protein-protein interactions from high-throughput data supported by one or more genomic features have a higher likelihood ratio and hence are more likely to be real interactions. Our method has a high sensitivity (90%) and good specificity (63%). We show that 56% of the interactions from high-throughput experiments in Saccharomyces cerevisiae have high reliability. We use the method to estimate the proportion of true interactions in the high-throughput protein-protein interaction data sets in Caenorhabditis elegans, Drosophila melanogaster and Homo sapiens to be 27%, 18% and 68%, respectively. Our results are available for searching and downloading at http://helix.protein.osaka-u.ac.jp/htp/. Conclusion A combination of genomic features that includes sequence, structure and annotation information is a good predictor of true interactions in large and noisy high-throughput data sets. The method has a very high sensitivity and good specificity and can be used to assign a likelihood ratio, corresponding to the reliability, to each interaction.
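
    The combination step can be pictured as a naive-Bayes product of per-feature likelihood ratios: each supporting feature multiplies the prior odds that an interaction is real. The values below are invented for illustration and are not the paper's trained likelihood ratios.

    ```python
    # LR = P(feature | true interaction) / P(feature | spurious interaction)
    feature_lr = {"known_Pfam_domain_pair": 9.0,   # hypothetical values
                  "shared_GO_annotation": 4.0,
                  "sequence_homology_support": 6.0}

    def interaction_lr(features):
        lr = 1.0
        for f in features:
            lr *= feature_lr.get(f, 1.0)           # independence assumption
        return lr

    def posterior_odds(features, prior_odds=0.01):
        return prior_odds * interaction_lr(features)

    print(interaction_lr({"known_Pfam_domain_pair", "shared_GO_annotation"}))  # 36.0
    ```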

  6. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas: heterogeneous catalyst screening and single-cell analysis. First, the author introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts, to explore the use of this technique in the discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in the oxidation of naphthalene, the author demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental setup can also screen solid catalysts via near-infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study single-cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, individual peaks in the fluorescence electropherograms were identified as serotonin released from the granular core on contact with the surrounding fluid.

  7. Wafer-Scale High-Throughput Ordered Growth of Vertically Aligned ZnO Nanowire Arrays

    KAUST Repository

    Wei, Yaguang

    2010-09-08

    This article presents an effective approach for patterned growth of vertically aligned ZnO nanowire (NW) arrays with high throughput and low cost at wafer scale without using cleanroom technology. Periodic hole patterns are generated using laser interference lithography on substrates coated with the photoresist SU-8. ZnO NWs are selectively grown through the holes via a low-temperature hydrothermal method without using a catalyst and with a superior control over orientation, location/density, and as-synthesized morphology. The development of textured ZnO seed layers for replacing single crystalline GaN and ZnO substrates extends the large-scale fabrication of vertically aligned ZnO NW arrays on substrates of other materials, such as polymers, Si, and glass. This combined approach demonstrates a novel method of manufacturing large-scale patterned one-dimensional nanostructures on various substrates for applications in energy harvesting, sensing, optoelectronics, and electronic devices. © 2010 American Chemical Society.

  8. High-throughput label-free detection of aggregate platelets with optofluidic time-stretch microscopy (Conference Presentation)

    Science.gov (United States)

    Jiang, Yiyue; Lei, Cheng; Yasumoto, Atsushi; Ito, Takuro; Guo, Baoshan; Kobayashi, Hirofumi; Ozeki, Yasuyuki; Yatomi, Yutaka; Goda, Keisuke

    2017-02-01

    According to WHO, approximately 10 million new cases of thrombotic disorders are diagnosed worldwide every year. In the U.S. and Europe, their related diseases kill more people than those from AIDS, prostate cancer, breast cancer and motor vehicle accidents combined. Although thrombotic disorders, especially arterial ones, mainly result from enhanced platelet aggregability in the vascular system, visual detection of platelet aggregates in vivo is not employed in clinical settings. Here we present a high-throughput label-free platelet aggregate detection method, aiming at the diagnosis and monitoring of thrombotic disorders in clinical settings. With optofluidic time-stretch microscopy with a spatial resolution of 780 nm and an ultrahigh linear scanning rate of 75 MHz, it is capable of detecting aggregated platelets in lysed blood which flows through a hydrodynamic-focusing microfluidic device at a high throughput of 10,000 particles/s. With digital image processing and statistical analysis, we are able to distinguish them from single platelets and other blood cells via morphological features. The detection results are compared with results of fluorescence-based detection (which is slow and inaccurate, but established). Our results indicate that the method holds promise for real-time, low-cost, label-free, and minimally invasive detection of platelet aggregates, which is potentially applicable to detection of platelet aggregates in vivo and to the diagnosis and monitoring of thrombotic disorders in clinical settings. This technique, if introduced clinically, may provide important clinical information in addition to that obtained by conventional techniques for thrombotic disorder diagnosis, including ex vivo platelet aggregation tests.

  9. Rapid Screening of Acetylcholinesterase Inhibitors by Effect-Directed Analysis Using LC × LC Fractionation, a High Throughput in Vitro Assay, and Parallel Identification by Time of Flight Mass Spectrometry.

    Science.gov (United States)

    Ouyang, Xiyu; Leonards, Pim E G; Tousova, Zuzana; Slobodnik, Jaroslav; de Boer, Jacob; Lamoree, Marja H

    2016-02-16

    Effect-directed analysis (EDA) is a useful tool for identifying bioactive compounds in complex samples. However, identification in EDA is usually challenging, mainly due to the limited separation power of liquid chromatography-based fractionation. In this study, comprehensive two-dimensional liquid chromatography (LC × LC) based microfractionation combined with parallel high-resolution time-of-flight (HR-ToF) mass spectrometric detection and a high-throughput acetylcholinesterase (AChE) assay was developed. The LC × LC fractionation method was validated using analytical standards, and a C18 and pentafluorophenyl (PFP) stationary phase combination was selected for the two-dimensional separation and fractionation in four 96-well plates. The method was successfully applied to identify AChE inhibitors in a wastewater treatment plant (WWTP) effluent. Good separation orthogonality (>0.9) was achieved, and three AChE inhibitors (tiapride, amisulpride, and lamotrigine), used as antipsychotic medicines, were identified and confirmed by two-dimensional retention alignment as well as by their AChE inhibition activity.

  10. Study on a digital pulse processing algorithm based on template-matching for high-throughput spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wen, Xianfei; Yang, Haori

    2015-06-01

    A major challenge in utilizing spectroscopy techniques for nuclear safeguards is to perform high-resolution measurements at an ultra-high throughput rate. Traditionally, piled-up pulses are rejected to ensure good energy resolution. To improve the throughput rate, high-pass filters are normally implemented to shorten pulses. However, this reduces the signal-to-noise ratio and causes degradation in energy resolution. In this work, a pulse pile-up recovery algorithm based on template matching was shown to be an effective approach to achieving high-throughput gamma-ray spectroscopy. First, the algorithm is discussed in detail. Second, the algorithm was successfully utilized to process simulated piled-up pulses from a scintillator detector. Third, the algorithm was implemented to analyze high-rate data from a NaI detector, a silicon drift detector and a HPGe detector. The promising results demonstrate the capability of this algorithm to achieve a high throughput rate without significant sacrifice in energy resolution. The performance of the template-matching algorithm was also compared with traditional shaping methods. - Highlights: • A detailed discussion of the template-matching algorithm was given. • The algorithm was tested on data from a NaI and a Si detector. • The algorithm was successfully implemented on high-rate data from a HPGe detector. • The performance of the algorithm was compared with traditional shaping methods. • The advantage of the algorithm in active interrogation was discussed.
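
    In outline, template-matching pile-up recovery models the digitized waveform as a sum of time-shifted copies of a known pulse template and solves for the amplitudes (i.e. energies). The sketch below shows the least-squares core with an invented template and shifts assumed known from a trigger; a full implementation would also fit the shifts.

    ```python
    import numpy as np

    def fit_amplitudes(waveform, template, shifts):
        """Least-squares amplitudes of template copies placed at the given shifts."""
        n = len(waveform)
        basis = np.zeros((n, len(shifts)))
        for j, s in enumerate(shifts):
            m = min(n - s, len(template))
            basis[s:s + m, j] = template[:m]
        amps, *_ = np.linalg.lstsq(basis, waveform, rcond=None)
        return amps

    t = np.arange(200)
    template = np.exp(-t / 30.0) * (1 - np.exp(-t / 3.0))  # toy pulse shape
    wave = np.pad(template, (0, 100))                      # first pulse at t = 0
    wave[40:] += 0.6 * np.pad(template, (0, 100))[:260]    # piled-up pulse at t = 40
    print(fit_amplitudes(wave, template, [0, 40]))         # ~[1.0, 0.6]
    ```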

  11. A novel hanging spherical drop system for the generation of cellular spheroids and high throughput combinatorial drug screening.

    Science.gov (United States)

    Neto, A I; Correia, C R; Oliveira, M B; Rial-Hermida, M I; Alvarez-Lorenzo, C; Reis, R L; Mano, J F

    2015-04-01

    We propose a novel hanging spherical drop system for anchoring arrays of droplets of cell suspension, based on the use of biomimetic superhydrophobic flat substrates with controlled positional adhesion and minimal contact with a solid substrate. By facing the platform downwards, it was possible to generate independent spheroid bodies in a high-throughput manner, in order to mimic in vivo tumour models on the lab-on-chip scale. To validate this system for drug screening purposes, the toxicity of the anti-cancer drug doxorubicin in cell spheroids was tested and compared to cells in 2D culture. The advantages presented by this platform, such as the feasibility of the system and the ability to control the size uniformity of the spheroids, emphasize its potential to be used as a new low-cost toolbox for high-throughput drug screening and for cell or tissue engineering.

  12. SNP-PHAGE – High throughput SNP discovery pipeline

    Directory of Open Access Journals (Sweden)

    Cregan Perry B

    2006-10-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs), as defined here, are single-base sequence changes or short insertions/deletions between or within individuals of a given species. As a result of their abundance and the availability of high-throughput analysis technologies, SNP markers have begun to replace other traditional markers such as restriction fragment length polymorphisms (RFLPs), amplified fragment length polymorphisms (AFLPs) and simple sequence repeats (SSRs, or microsatellite markers) for fine mapping and association studies in several species. For SNP discovery from chromatogram data, several bioinformatics programs have to be combined to generate an analysis pipeline. Results have to be stored in a relational database to facilitate interrogation through queries or to generate data for further analyses such as determination of linkage disequilibrium and identification of common haplotypes. Although these tasks are routinely performed by several groups, an integrated open-source SNP discovery pipeline that can be easily adapted by new groups interested in SNP marker development was until now unavailable. Results We developed SNP-PHAGE (SNP discovery Pipeline with additional features for identification of common haplotypes within a sequence tagged site (Haplotype Analysis) and GenBank (dbSNP) submissions. This tool was applied to analyzing sequence traces from diverse soybean genotypes to discover over 10,000 SNPs. The package was developed on the UNIX/Linux platform, written in Perl, and uses a MySQL database. Scripts to generate a user-friendly web interface are also provided, with common queries for preliminary data analysis. A machine learning tool developed by this group for increasing the efficiency of SNP discovery is integrated as a part of this package as an optional feature. The SNP-PHAGE package is being made available open source at http://bfgl.anri.barc.usda.gov/ML/snp-phage/. Conclusion SNP-PHAGE provides a bioinformatics

  13. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    Science.gov (United States)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad

    2015-05-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high-computing-density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency, and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).

  15. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Muzaffar, Shahzad; Knight, Robert

    2015-01-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG). (paper)

  16. Gold nanoparticle-mediated (GNOME) laser perforation: a new method for a high-throughput analysis of gap junction intercellular coupling.

    Science.gov (United States)

    Begandt, Daniela; Bader, Almke; Antonopoulos, Georgios C; Schomaker, Markus; Kalies, Stefan; Meyer, Heiko; Ripken, Tammo; Ngezahayo, Anaclet

    2015-10-01

    The present report evaluates the advantages of using the gold nanoparticle-mediated laser perforation (GNOME LP) technique as a computer-controlled cell optoperforation method to introduce Lucifer yellow (LY) into cells in order to analyze gap junction coupling in cell monolayers. To permeabilize GM-7373 endothelial cells grown in a 24-well plate with GNOME LP, a laser beam 88 μm in diameter was applied in the presence of gold nanoparticles and LY. After 10 min to allow dye uptake and diffusion through gap junctions, we observed an LY-positive cell band of 179 ± 8 μm width. The presence of the gap junction channel blocker carbenoxolone during optoperforation reduced the LY-positive band to 95 ± 6 μm. Additionally, a forskolin-related enhancement of gap junction coupling, recently found using the scrape loading technique, was also observed using GNOME LP. Further, automatic cell imaging and subsequent semi-automatic quantification of the images using a Java-based ImageJ plugin were performed in a high-throughput sequence. Moreover, GNOME LP was used on cells such as RBE4 rat brain endothelial cells, which cannot be mechanically scraped, as well as on three-dimensionally cultivated cells, opening the possibility of implementing the GNOME LP technique for analysis of gap junction coupling in tissues. We conclude that the GNOME LP technique allows a high-throughput automated analysis of gap junction coupling in cells. Moreover, this non-invasive technique could be used on monolayers that do not support mechanical scraping, as well as on cells in tissue, allowing an in vivo/ex vivo analysis of gap junction coupling.
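
    The quantification step (measuring the width of the LY-positive band) is done in the paper with a Java-based ImageJ plugin; a rough Python equivalent on a synthetic image, assuming a simple half-maximum threshold and a hypothetical pixel size:

        # Sketch: width of a dye-positive band from a thresholded image.
        # Synthetic array; the paper quantifies real images with a
        # Java-based ImageJ plugin.
        import numpy as np

        um_per_px = 2.0                        # assumed pixel size
        img = np.zeros((100, 200))
        img[:, 60:150] = 1000.0                # synthetic LY-positive band

        mask = img > img.max() / 2             # half-maximum threshold
        cols = np.where(mask.any(axis=0))[0]   # columns containing signal
        width_um = (cols.max() - cols.min() + 1) * um_per_px
        print(f"LY-positive band width: {width_um:.0f} um")   # 180 um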

  17. Management of frozen shoulder: a systematic review and cost-effectiveness analysis.

    Science.gov (United States)

    Maund, E; Craig, D; Suekarran, S; Neilson, Ar; Wright, K; Brealey, S; Dennis, L; Goodchild, L; Hanchard, N; Rangan, A; Richardson, G; Robertson, J; McDaid, C

    2012-01-01

    … A mixed-treatment comparison (MTC) was also undertaken. An economic decision model was intended, but was found to be implausible because of a lack of available evidence. Resource use was estimated from clinical advisors and combined with quality-adjusted life-years obtained through mapping to present tentative cost-effectiveness results. Thirty-one clinical effectiveness studies and one economic evaluation were included. The clinical effectiveness studies evaluated steroid injection, sodium hyaluronate, supervised neglect, physical therapy (mainly physiotherapy), acupuncture, MUA, distension and capsular release. Many of the studies identified were at high risk of bias. Because of variation in the interventions and comparators, few studies could be pooled in a meta-analysis. Based on single RCTs, and for some outcomes only, short-wave diathermy may be more effective than home exercise. High-grade mobilisation may be more effective than low-grade mobilisation in a population in which most patients have already had treatment. Data from two RCTs showed that there may be benefit from adding a single intra-articular steroid injection to home exercise in patients with frozen shoulder of … The analysis was based on only a subset of the evidence, which may explain why the findings are only partly supportive of the main analysis. No studies of patients' views about the treatments were identified. Average costs ranged from £36.16 for unguided steroid injections to £2,204 for capsular release. The findings of the mapping suggest a positive relationship between outcome and European Quality of Life-5 Dimensions (EQ-5D) score: a decreasing visual analogue scale score (less pain) was accompanied by an increasing (better) EQ-5D score. The one published economic evaluation suggested that low-grade mobilisation may be more cost-effective than high-grade mobilisation. Our tentative cost-effectiveness analysis suggested that steroid alone may be more cost-effective than steroid plus

  18. [Cost-effectiveness analysis of universal screening for thyroid disease in pregnant women in Spain].

    Science.gov (United States)

    Donnay Candil, Sergio; Balsa Barro, José Antonio; Álvarez Hernández, Julia; Crespo Palomo, Carlos; Pérez-Alcántara, Ferrán; Polanco Sánchez, Carlos

    2015-01-01

    To assess the cost-effectiveness of universal screening for thyroid disease in pregnant women in Spain as compared to high-risk screening and no screening. A decision-analytic model comparing the incremental cost per quality-adjusted life year (QALY) of universal screening versus high-risk screening and versus no screening was used for the pregnancy and postpartum period. Probabilities from randomized controlled trials were considered for adverse obstetrical outcomes. A Markov model was used to assess the lifetime period after the first postpartum year and account for development of overt hypothyroidism. The main assumptions in the model and use of resources were assessed by local clinical experts. The analysis considered direct healthcare costs only. Universal screening gained 0.011 QALYs over high-risk screening and 0.014 QALYs over no screening. Total direct costs per patient were €5,786 for universal screening, €5,791 for high-risk screening, and €5,781 for no screening. Universal screening was dominant compared to risk-based screening and a very cost-effective alternative as compared to no screening. Use of universal screening instead of high-risk screening would result in €2,653,854 annual savings for the Spanish National Health System. Universal screening for thyroid disease in pregnant women in the first trimester is dominant in Spain as compared to risk-based screening, and is cost-effective as compared to no screening (incremental cost-effectiveness ratio of €374 per QALY). Moreover, it allows diagnosing and treating cases of clinical and subclinical hypothyroidism that may not be detected when only high-risk women are screened. Copyright © 2014 SEEN. Published by Elsevier España, S.L.U. All rights reserved.
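
    The reported figures can be checked with the standard ICER arithmetic; a minimal sketch using the abstract's rounded per-patient costs (so the ratio only approximates the published €374/QALY):

        # Sanity check of the reported figures (abstract's rounded values,
        # so the ratio only approximates the published EUR 374/QALY).
        def icer(cost_a, cost_b, qaly_a, qaly_b):
            """Incremental cost per QALY of strategy A versus strategy B."""
            return (cost_a - cost_b) / (qaly_a - qaly_b)

        universal, high_risk, no_screen = 5786, 5791, 5781
        print(icer(universal, no_screen, 0.014, 0.0))   # ~357 EUR/QALY
        # Versus high-risk screening, universal costs less and gains QALYs,
        # i.e. it is dominant, so no ICER is reported for that comparison.
        print(universal < high_risk)                    # True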

  19. High Throughput Computing Impact on Meta Genomics (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    Energy Technology Data Exchange (ETDEWEB)

    Gore, Brooklin

    2011-10-12

    This presentation includes a brief background on High Throughput Computing, correlating gene transcription factors, optical mapping, genotype to phenotype mapping via QTL analysis, and current work on next gen sequencing.

  20. Tuberculosis screening of travelers to higher-incidence countries: A cost-effectiveness analysis

    Directory of Open Access Journals (Sweden)

    Menzies Dick

    2008-06-01

    Full Text Available Abstract Background Travelers to countries with high tuberculosis incidence can acquire infection during travel. We sought to compare four screening interventions for travelers from low-incidence countries who visit countries with varying tuberculosis incidence. Methods Decision analysis model: We considered hypothetical cohorts of 1,000 travelers, 21 years old, visiting Mexico, the Dominican Republic, or Haiti for three months. Travelers departed from and returned to the United States or Canada; they were born in the United States, Canada, or the destination countries. The time horizon was 20 years, with 3% annual discounting of future costs and outcomes. The analysis was conducted from the health care system perspective. Screening involved tuberculin skin testing post-travel (in three strategies, with baseline pre-travel tests in two) or chest radiography post-travel (one strategy). Returning travelers with tuberculin conversion (one strategy) or other evidence of latent tuberculosis (three strategies) were offered treatment. The main outcome was cost (in 2005 US dollars) per tuberculosis case prevented. Results For all travelers, a single post-trip tuberculin test was most cost-effective. The associated cost estimate per case prevented ranged from $21,406 for Haitian-born travelers to Haiti, to $161,196 for US-born travelers to Mexico. In all sensitivity analyses, the single post-trip tuberculin test remained most cost-effective. For US-born travelers to Haiti, this strategy was associated with cost savings for trips over 22 months. Screening was more cost-effective with increasing trip duration and infection risk, and less so with poorer treatment adherence. Conclusion A single post-trip tuberculin skin test was the most cost-effective strategy considered, for travelers from the United States or Canada. The analysis did not evaluate the use of interferon-gamma release assays, which would be most relevant for travelers who received BCG
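
    A minimal sketch of the summary measure used here, incremental cost per case prevented with 3% annual discounting over the 20-year horizon; all cohort costs and the cases-prevented figure are hypothetical placeholders, not the model's parameters:

        # Sketch: incremental cost per TB case prevented with 3% annual
        # discounting over a 20-year horizon. All inputs are hypothetical
        # placeholders, not the model's parameter values.
        def discounted(cash_flows, rate=0.03):
            return sum(c / (1 + rate) ** t for t, c in enumerate(cash_flows))

        screen    = [40_000] + [1_000] * 19   # per 1,000 travelers, year by year
        no_screen = [0] + [2_500] * 19        # downstream treatment costs
        cases_prevented = 1.2                 # per 1,000 travelers (hypothetical)

        incremental = discounted(screen) - discounted(no_screen)
        print(f"${incremental / cases_prevented:,.0f} per case prevented")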

  1. Low Cost Lithography Tool for High Brightness LED Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Andrew Hawryluk; Emily True

    2012-06-30

    The objective of this activity was to address the need for improved manufacturing tools for LEDs. Improvements include lower cost (both capital equipment cost reductions and cost-of-ownership reductions), better automation and better yields. To meet the DOE objective of $1-2/kilolumen, it will be necessary to develop these highly automated manufacturing tools. Lithography is used extensively in the fabrication of high-brightness LEDs, but the tools used to date are not scalable to high-volume manufacturing. This activity addressed the LED lithography process. During R&D and low-volume manufacturing, most LED companies use contact printers. However, several industries have shown that these printers are incompatible with high-volume manufacturing, and the LED industry needs to evolve to projection steppers. The need for projection lithography tools for LED manufacturing is identified in the Solid State Lighting Manufacturing Roadmap Draft, June 2009, which states that projection tools are needed by 2011. This work modified a stepper, originally designed for semiconductor manufacturing, for use in LED manufacturing, addressing improvements to yield, material handling, automation and throughput for LED manufacturing while reducing capital equipment cost.

  2. A high throughput screening assay for identifying glycation inhibitors on MALDI-TOF target.

    Science.gov (United States)

    Zhang, Qiuting; Tu, Zongcai; Wang, Hui; Fan, Liangliang; Huang, Xiaoqin; Xiao, Hui

    2015-03-01

    The Maillard reaction plays an important role in the food industry; however, the deleterious effects generated by advanced glycation end-products (AGEs) are well recognized. Many efforts have been made to seek new AGE inhibitors, in particular natural ones without adverse effects. We have developed a rapid, mass spectrometry-based, on-plate screening assay for novel AGE inhibitors. The glycation reaction, inhibition feedback, and the subsequent MALDI mass spectrometric analysis all occur on a single MALDI plate. At a 1:10 molar ratio of peptide to sugar, as little as 4 h of incubation made the screening test ready for analysis. DSP, inhibition, and IC50 were calculated to evaluate selected inhibitors, and the resulting inhibition efficiencies were consistent with available references. We demonstrated that this method provides a potential high-throughput screening assay to analyze and identify anti-glycation agents. Copyright © 2014 Elsevier Ltd. All rights reserved.
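
    IC50 estimation of the kind described is typically a log-logistic (Hill) fit to dose-inhibition data; a sketch with hypothetical concentrations and inhibition fractions, not the assay's DSP-derived values:

        # Sketch: IC50 from dose-inhibition data via a Hill fit. Toy
        # numbers; the assay derives inhibition from on-plate MALDI-TOF
        # signals rather than from these values.
        import numpy as np
        from scipy.optimize import curve_fit

        def hill(conc, ic50, slope):
            """Fractional inhibition as a function of concentration."""
            return conc**slope / (ic50**slope + conc**slope)

        conc  = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])  # mM
        inhib = np.array([0.02, 0.08, 0.20, 0.45, 0.70, 0.88, 0.96])

        (ic50, slope), _ = curve_fit(hill, conc, inhib, p0=(0.5, 1.0))
        print(f"IC50 = {ic50:.2f} mM, Hill slope = {slope:.2f}")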

  3. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics methodology based on the calculation of phase diagrams (CALPHAD). The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, we present our recent efforts to develop new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.

  4. Cost Effectiveness Analysis of Optimal Malaria Control Strategies in Kenya

    Directory of Open Access Journals (Sweden)

    Gabriel Otieno

    2016-03-01

    Full Text Available Malaria remains a leading cause of mortality and morbidity among children under five and pregnant women in sub-Saharan Africa, but it is preventable and controllable provided current recommended interventions are properly implemented. Better utilization of malaria intervention strategies will ensure value for money and produce health improvements in the most cost-effective way. The purpose of the value-for-money drive is to develop a better understanding (and better articulation) of costs and results so that more informed, evidence-based choices can be made. Cost-effectiveness analysis is carried out to inform decision makers on how to determine where to allocate resources for malaria interventions. This study carries out a cost-effectiveness analysis of one or all possible combinations of the optimal malaria control strategies (Insecticide Treated Bednets—ITNs, Treatment, Indoor Residual Spray—IRS and Intermittent Preventive Treatment for Pregnant Women—IPTp) for four different transmission settings in order to assess the extent to which the intervention strategies are beneficial and cost-effective. For the four different transmission settings in Kenya, the optimal solutions for the 15 strategies and their associated effectiveness were computed. Cost-effectiveness analysis using the Incremental Cost-Effectiveness Ratio (ICER) was done after ranking the strategies in order of increasing effectiveness (total infections averted). The findings show that for the endemic regions the combination of ITNs, IRS, and IPTp was the most cost-effective of all the combined strategies developed in this study for malaria disease control and prevention; for epidemic-prone areas it is the combination of treatment and IRS; for seasonal areas it is the use of ITNs plus treatment; and for low-risk areas it is the use of treatment only. Malaria transmission in Kenya can be minimized through tailor-made intervention strategies for malaria control
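
    A minimal sketch of ICER computation after ranking strategies by effectiveness, as described above; costs and infections averted are hypothetical placeholders, and dominance checks are omitted for brevity:

        # Sketch: ICERs after ranking strategies by total infections
        # averted. Costs and effects are hypothetical; dominance checks
        # (extended dominance, etc.) are omitted for brevity.
        strategies = [  # (name, total cost, infections averted)
            ("Treatment only",    1.0e6, 40_000),
            ("ITNs + treatment",  1.8e6, 70_000),
            ("ITNs + IRS + IPTp", 2.5e6, 95_000),
        ]
        strategies.sort(key=lambda s: s[2])   # increasing effectiveness

        prev_cost, prev_eff = 0.0, 0          # baseline: do nothing
        for name, cost, averted in strategies:
            icer = (cost - prev_cost) / (averted - prev_eff)
            print(f"{name:18s} ICER = ${icer:,.2f} per infection averted")
            prev_cost, prev_eff = cost, averted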

  5. High-throughput tri-colour flow cytometry technique to assess Plasmodium falciparum parasitaemia in bioassays

    DEFF Research Database (Denmark)

    Tiendrebeogo, Regis W; Adu, Bright; Singh, Susheel K

    2014-01-01

    BACKGROUND: Unbiased flow cytometry-based methods have become the technique of choice in many laboratories for high-throughput, accurate assessments of malaria parasites in bioassays. A method to quantify live parasites based on mitotracker red CMXRos was recently described, but consistent … distinction of early ring stages of Plasmodium falciparum from uninfected red blood cells (uRBC) remains a challenge. METHODS: Here, a high-throughput, three-parameter (tri-colour) flow cytometry technique based on mitotracker red dye, the nucleic acid dye coriphosphine O (CPO) and the leucocyte marker CD45 … for enumerating live parasites in bioassays was developed. The technique was applied to estimate the specific growth inhibition index (SGI) in the antibody-dependent cellular inhibition (ADCI) assay and compared to parasite quantification by microscopy and mitotracker red staining. The Bland-Altman analysis

  6. Uplink SDMA with Limited Feedback: Throughput Scaling

    Directory of Open Access Journals (Sweden)

    Jeffrey G. Andrews

    2008-01-01

    Full Text Available Combined space division multiple access (SDMA) and scheduling exploit both spatial multiplexing and multiuser diversity, increasing throughput significantly. Both SDMA and scheduling require feedback of multiuser channel state information (CSI). This paper focuses on uplink SDMA with limited feedback, which refers to efficient techniques for CSI quantization and feedback. To quantify the throughput of uplink SDMA and derive design guidelines, the throughput scaling with system parameters is analyzed. The specific parameters considered include the numbers of users, antennas, and feedback bits. Furthermore, different SNR regimes and beamforming methods are considered. The derived throughput scaling laws are observed to change for different SNR regimes. For instance, the throughput scales logarithmically with the number of users in the high SNR regime but double logarithmically in the low SNR regime. The analysis of throughput scaling suggests guidelines for scheduling in uplink SDMA. For example, to maximize throughput scaling, scheduling should use the criterion of minimum quantization errors for the high SNR regime and maximum channel power for the low SNR regime.
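
    How such scaling laws are measured can be illustrated by simulation; the sketch below is a stripped-down, single-antenna caricature of max-channel-power scheduling (not the paper's SDMA model, whose scaling laws differ), estimating average scheduled rate as the number of users grows at two SNRs:

        # Sketch: empirical throughput of max-channel-power scheduling
        # under Rayleigh fading, as a function of the number of users.
        # The scaling with K can then be read off from the printed table.
        import numpy as np

        rng = np.random.default_rng(0)

        def scheduled_rate(num_users, snr, trials=2000):
            # Exponential power gains = Rayleigh fading; schedule the user
            # with the largest instantaneous channel power.
            h2 = rng.exponential(size=(trials, num_users))
            return np.mean(np.log2(1.0 + snr * h2.max(axis=1)))

        for k in (2, 8, 32, 128):
            print(k, round(scheduled_rate(k, snr=0.1), 3),
                     round(scheduled_rate(k, snr=100.0), 3))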

  7. Cost-effectiveness analysis of radon remediation in schools

    International Nuclear Information System (INIS)

    Kennedy, C.A.; Gray, A.M.

    2000-01-01

    Indoor radon is an important source of radiation dose in the general population and has been recognised as a worldwide environmental and public health challenge. Governments in many Western and Eastern European and North American countries are undertaking active radon-risk reduction policies, including the remediation of existing residential and workplace building stocks (1). These endeavours include a priority of remediating school buildings. Epidemiological and technical radon research has produced information which has enabled attention to be turned to specific effectiveness and optimisation questions regarding radon identification and remediation programmes in buildings, including schools. Decision making about policy implementation has been an integral part of these programmes, and questions have been raised about the economic implications of the regulations and optimisation strategies for workplace action level policy (2,3). (The action level applied to schools is 400 Bq m-3.) No previous study has estimated the cost-effectiveness of a radon remediation programme for schools using the methodological framework now considered appropriate in the economic evaluation of health interventions. It is imperative that this should be done, in order that the resources required to obtain health gain from radon remediation in schools can be systematically compared with equivalent data for other health interventions and radon remediation programmes. In this study a cost-effectiveness analysis of radon remediation in schools was undertaken, using the best available national data and information from Northamptonshire on the costs and effectiveness of radon identification and remediation in schools, and the costs and health impact of lung cancer cases. A model based on data from Northamptonshire is presented (where 6.3% of residential stock is over 200 Bq m-3). The resultant cost-effectiveness ratio was £7,550 per life year gained (1997 prices). Results from the

  8. High-Throughput Cloning and Expression Library Creation for Functional Proteomics

    Science.gov (United States)

    Festa, Fernanda; Steel, Jason; Bian, Xiaofang; Labaer, Joshua

    2013-01-01

    The study of protein function usually requires the use of a cloned version of the gene for protein expression and functional assays. This strategy is particularly important when the information available regarding function is limited. The functional characterization of the thousands of newly identified proteins revealed by genomics requires faster methods than traditional single-gene experiments, creating the need for fast, flexible and reliable cloning systems. These collections of open reading frame (ORF) clones can be coupled with high-throughput proteomics platforms, such as protein microarrays and cell-based assays, to answer biological questions. In this tutorial we provide the background for DNA cloning, discuss the major high-throughput cloning systems (Gateway® Technology, Flexi® Vector Systems, and Creator™ DNA Cloning System) and compare them side by side. We also report an example of a high-throughput cloning study and its application in functional proteomics. This Tutorial is part of the International Proteomics Tutorial Programme (IPTP12). Details can be found at http://www.proteomicstutorials.org. PMID:23457047

  9. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    Science.gov (United States)

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform, "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons", enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). DIRT is a high-volume central repository and high-throughput RSA trait computation platform for plant scientists working on crop roots

  10. High-throughput peptide mass fingerprinting and protein macroarray analysis using chemical printing strategies

    International Nuclear Information System (INIS)

    Sloane, A.J.; Duff, J.L.; Hopwood, F.G.; Wilson, N.L.; Smith, P.E.; Hill, C.J.; Packer, N.H.; Williams, K.L.; Gooley, A.A.; Cole, R.A.; Cooley, P.W.; Wallace, D.B.

    2001-01-01

    We describe a 'chemical printer' that uses piezoelectric pulsing for rapid and accurate microdispensing of picolitre volumes of fluid for proteomic analysis of 'protein macroarrays'. Unlike positive-transfer and pin-transfer systems, our printer dispenses fluid in a non-contact process that ensures that the fluid source cannot be contaminated by substrate during a printing event. We demonstrate automated delivery of enzyme and matrix solutions for on-membrane protein digestion and subsequent peptide mass fingerprinting (pmf) analysis directly from the membrane surface using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS). This approach bypasses the more commonly used multi-step procedures, thereby permitting a more rapid procedure for protein identification. We also highlight the advantage of printing different chemistries onto an individual protein spot for multiple microscale analyses. This ability is particularly useful when detailed characterisation of rare and valuable samples is required. Using a combination of PNGase F and trypsin, we have mapped sites of N-glycosylation using on-membrane digestion strategies. We also demonstrate the ability to print multiple serum samples in a micro-ELISA format and rapidly screen a protein macroarray of human blood plasma for pathogen-derived antigens. We anticipate that the 'chemical printer' will be a major component of proteomic platforms for high-throughput protein identification and characterisation, with widespread applications in biomedical and diagnostic discovery

  11. Better informing decision making with multiple outcomes cost-effectiveness analysis under uncertainty in cost-disutility space.

    Science.gov (United States)

    McCaffrey, Nikki; Agar, Meera; Harlum, Janeane; Karnon, Jonathon; Currow, David; Eckermann, Simon

    2015-01-01

    Comparing multiple, diverse outcomes with cost-effectiveness analysis (CEA) is important, yet challenging in areas like palliative care where domains are unamenable to integration with survival. Generic multi-attribute utility values exclude important domains and non-health outcomes, while partial analyses, where outcomes are considered separately and their joint relationship under uncertainty is ignored, lead to incorrect inference regarding preferred strategies. The objective of this paper is to consider whether such decision making can be better informed with alternative presentation and summary measures, extending methods previously shown to have advantages in multiple-strategy comparison. A multiple-outcomes CEA of a home-based palliative care model (PEACH) relative to usual care is undertaken in cost-disutility (CDU) space and compared with analysis on the cost-effectiveness plane. Summary measures developed for comparing strategies across potential threshold values for multiple outcomes include: expected net loss (ENL) planes quantifying differences in expected net benefit; the ENL contour identifying preferred strategies minimising ENL and their expected value of perfect information; and cost-effectiveness acceptability planes showing the probability of strategies minimising ENL. Conventional analysis suggests PEACH is cost-effective when the threshold value per additional day at home (λ1) exceeds $1,068, or is dominated by usual care when only the proportion of home deaths is considered. In contrast, neither alternative dominates in CDU space, where cost and outcomes are jointly considered, with the optimal strategy depending on threshold values. For example, PEACH minimises ENL when λ1 = $2,000 and λ2 = $2,000 (the threshold value for dying at home), with a 51.6% chance of PEACH being cost-effective. Comparison in CDU space and the associated summary measures have distinct advantages over multiple-domain comparisons, aiding transparent and robust joint comparison of costs and multiple
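
    The ENL summary measure is simple to compute once threshold values are set: net benefit is the threshold-weighted sum of outcomes minus cost, and a strategy's ENL is the shortfall of its expected net benefit against the best alternative. A sketch with hypothetical outcome and cost values (not the PEACH trial's data):

        # Sketch: expected net loss (ENL) for two valued outcomes at given
        # threshold values. Outcome and cost figures are hypothetical, not
        # the PEACH trial's results.
        def net_benefit(l1, l2, days_home, p_home_death, cost):
            return l1 * days_home + l2 * p_home_death - cost

        strategies = {
            "PEACH":      dict(days_home=20.0, p_home_death=0.60, cost=30_000),
            "usual care": dict(days_home=14.0, p_home_death=0.45, cost=22_000),
        }

        l1, l2 = 2_000, 2_000   # $/day at home, $/home death (thresholds)
        nb = {k: net_benefit(l1, l2, **v) for k, v in strategies.items()}
        best = max(nb.values())
        for name, value in nb.items():
            print(f"{name:10s} ENL = ${best - value:,.0f}")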

  12. Maximizing gain in high-throughput screening using conformal prediction.

    Science.gov (United States)

    Svensson, Fredrik; Afzal, Avid M; Norinder, Ulf; Bender, Andreas

    2018-02-21

    Iterative screening has emerged as a promising approach to increase the efficiency of screening campaigns compared to traditional high-throughput approaches. By learning from a subset of the compound library, predictive models can infer which compounds to screen next, resulting in more efficient screening. One way to evaluate screening is to consider the cost of screening compared to the gain associated with finding an active compound. In this work, we introduce a conformal predictor coupled with a gain-cost function with the aim of maximising gain in iterative screening. Using this setup, we were able to show that, by evaluating the predictions on the training data, very accurate predictions of which settings will produce the highest gain on the test data can be made. We evaluate the approach on 12 bioactivity datasets from PubChem, training the models using 20% of the data. Depending on the settings of the gain-cost function, the settings generating the maximum gain were accurately identified in 8-10 out of the 12 datasets. Broadly, our approach can predict which strategy generates the highest gain based on the results of the cost-gain evaluation: to screen the compounds predicted to be active, to screen all the remaining data, or not to screen any additional compounds. When the algorithm indicates that the predicted active compounds should be screened, our approach also indicates what confidence level to apply in order to maximize gain. Hence, our approach facilitates decision-making and allocation of resources where they deliver the most value by indicating in advance the likely outcome of a screening campaign.
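
    A minimal sketch of the gain-cost evaluation described: compare the expected gain of screening nothing, screening everything, and screening only the predicted-active set at some confidence level. All library sizes, hit rates and unit gains/costs are hypothetical placeholders:

        # Sketch: gain-cost evaluation for iterative screening. Library
        # size, hit rates and unit gain/cost are hypothetical placeholders.
        def expected_gain(n_screened, n_hits, gain_per_hit, cost_per_compound):
            return n_hits * gain_per_hit - n_screened * cost_per_compound

        library, prevalence = 100_000, 0.01
        gain_hit, cost_cpd = 400.0, 2.0

        options = {
            "screen none": expected_gain(0, 0, gain_hit, cost_cpd),
            "screen all":  expected_gain(library, library * prevalence,
                                         gain_hit, cost_cpd),
            # predicted-active set at some conformal confidence level:
            # fewer compounds, enriched hit rate (both hypothetical)
            "screen predicted": expected_gain(8_000, 550, gain_hit, cost_cpd),
        }
        print(max(options, key=options.get), options)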

  13. Recent Advances in Nanobiotechnology and High-Throughput Molecular Techniques for Systems Biomedicine

    Science.gov (United States)

    Kim, Eung-Sam; Ahn, Eun Hyun; Chung, Euiheon; Kim, Deok-Ho

    2013-01-01

    Nanotechnology-based tools are beginning to emerge as promising platforms for quantitative high-throughput analysis of live cells and tissues. Despite unprecedented progress made over the last decade, a challenge still lies in integrating emerging nanotechnology-based tools into macroscopic biomedical apparatuses for practical purposes in biomedical sciences. In this review, we discuss the recent advances and limitations in the analysis and control of mechanical, biochemical, fluidic, and optical interactions in the interface areas of nanotechnology-based materials and living cells in both in vitro and in vivo settings. PMID:24258011

  14. Costs and cost-effectiveness of pediatric inguinal hernia repair in Uganda.

    Science.gov (United States)

    Eeson, Gareth; Birabwa-Male, Doreen; Pennington, Mark; Blair, Geoffrey K

    2015-02-01

    Surgically treatable diseases contribute approximately 11% of disability-adjusted life years (DALYs) worldwide, yet they remain a neglected public health priority in low- and middle-income countries (LMICs). Pediatric inguinal hernia is the most common congenital abnormality in newborns and a major cause of morbidity and mortality, yet elective repair remains largely unavailable in LMICs. This study aimed to determine the costs and cost-effectiveness of pediatric inguinal hernia repair (PIHR) in a low-resource setting. Medical costs of consecutive elective PIHRs were recorded prospectively at two centers in Uganda. Decision modeling was used to compare two treatment scenarios (adoption of PIHR and non-adoption) from a provider perspective. A Markov model was constructed to estimate health outcomes under each scenario. The robustness of the cost-effectiveness results in the base-case analysis was tested in one-way and probabilistic sensitivity analyses. The primary outcome of interest was cost per DALY averted by the intervention. Sixty-nine PIHRs were performed in 65 children (mean age 3.6 years). Mean cost per procedure was $86.68 US (95% CI 83.1-90.2 USD), and each procedure averted an average of 5.7 DALYs. The incremental cost-effectiveness ratio was $12.41 per DALY averted. The probability of cost-effectiveness was 95% at a cost-effectiveness threshold of $35 per averted DALY. Results were robust to sensitivity analysis under all considered scenarios. Elective PIHR is highly cost-effective for the treatment and prevention of complications of hernia disease, even in low-resource settings. PIHR should be prioritized in LMICs alongside other cost-effective interventions.

  15. High throughput miniature drug-screening platform using bioprinting technology

    International Nuclear Information System (INIS)

    Rodríguez-Dévora, Jorge I; Reyna, Daniel; Xu Tao; Zhang Bimeng; Shi Zhidong

    2012-01-01

    In the pharmaceutical industry, new drugs are tested to find appropriate compounds for therapeutic purposes for contemporary diseases. Unfortunately, novel compounds emerge at high cost, and current target evaluation processes have limited throughput, increasing the cost and time of drug development. This work shows the development of a novel inkjet-based deposition method for assembling a miniature drug-screening platform, which can realistically and inexpensively evaluate biochemical reactions in picoliter-scale volumes at a high speed. As proof of concept, using a modified Hewlett-Packard model 5360 compact disc printer, green fluorescent protein-expressing Escherichia coli cells along with alginate gel solution were arrayed on a coverslip chip at a repeatable volume of 180 ± 26 picoliters per droplet; subsequently, different antibiotic droplets were patterned on the spots of cells to evaluate the inhibition of bacteria for antibiotic screening. The proposed platform was compared to the current screening process, validating its effectiveness. The viability and basic function of the printed cells were evaluated, resulting in cell viability above 98% and insignificant or no DNA damage in transfected human kidney cells. Based on the reduction in investment and compound volume offered by this platform, the technique has the potential to improve the drug discovery process at its target evaluation stage. (paper)

  16. Raman-Activated Droplet Sorting (RADS) for Label-Free High-Throughput Screening of Microalgal Single-Cells.

    Science.gov (United States)

    Wang, Xixian; Ren, Lihui; Su, Yetian; Ji, Yuetong; Liu, Yaoping; Li, Chunyu; Li, Xunrong; Zhang, Yi; Wang, Wei; Hu, Qiang; Han, Danxiang; Xu, Jian; Ma, Bo

    2017-11-21

    Raman-activated cell sorting (RACS) has attracted increasing interest, yet throughput remains one major factor limiting its broader application. Here we present an integrated Raman-activated droplet sorting (RADS) microfluidic system for functional screening of live cells in a label-free and high-throughput manner, employing the astaxanthin (AXT)-synthesizing industrial microalga Haematococcus pluvialis (H. pluvialis) as a model. Raman microspectroscopy analysis of individual cells is carried out prior to their microdroplet encapsulation, which is then directly coupled to DEP-based droplet sorting. To validate the system, H. pluvialis cells containing different levels of AXT were mixed and underwent RADS. AXT-hyperproducing cells were sorted with an accuracy of 98.3%, an eightfold enrichment ratio, and a throughput of ∼260 cells/min. Of the RADS-sorted cells, 92.7% remained alive and able to proliferate, equivalent to the unsorted cells. Thus, RADS achieves a much higher throughput than existing RACS systems, preserves cell vitality, and facilitates seamless coupling with downstream manipulations such as single-cell sequencing and cultivation.

  17. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging

    Directory of Open Access Journals (Sweden)

    Piyush Pandey

    2017-08-01

    Full Text Available Image-based high-throughput plant phenotyping in the greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring, which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of the macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and the micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress, with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract a reflectance spectrum from each plant, and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy (R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8). All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best, followed by P, K, and S. The micronutrient group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient group. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2). This study suggested
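
    The modelling step described (partial least squares regression of a leaf property on reflectance spectra, reported as R2 and RPD) can be sketched as follows, with synthetic spectra standing in for hyperspectral images and RPD taken as the standard deviation of reference values divided by RMSE:

        # Sketch: PLS regression of a leaf property on reflectance spectra,
        # reporting R2 and RPD (SD of reference values / RMSE). Synthetic
        # spectra stand in for the hyperspectral images.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics import mean_squared_error, r2_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.normal(size=(120, 200))   # 120 plants x 200 wavelengths
        y = 0.8 * X[:, 50] + 0.5 * X[:, 120] + rng.normal(0.0, 0.1, 120)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        pred = PLSRegression(n_components=10).fit(X_tr, y_tr).predict(X_te).ravel()

        rmse = mean_squared_error(y_te, pred) ** 0.5
        print(f"R2 = {r2_score(y_te, pred):.2f}, RPD = {y_te.std() / rmse:.2f}")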

  18. Multiplexing a high-throughput liability assay to leverage efficiencies.

    Science.gov (United States)

    Herbst, John; Anthony, Monique; Stewart, Jeremy; Connors, David; Chen, Taosheng; Banks, Martyn; Petrillo, Edward W; Agler, Michele

    2009-06-01

    In order to identify potential cytochrome P-450 3A4 (drug-metabolizing enzyme) inducers at an early stage of the drug discovery process, a cell-based transactivation high-throughput luciferase reporter assay for the human pregnane X receptor (PXR) in HepG2 cells has been implemented and multiplexed with a viability end point for data interpretation, as part of a Lead Profiling portfolio of assays. As a routine part of Lead Profiling operations, assays are periodically evaluated for utility as well as for potential improvements in technology or process. We used a recent evaluation of our PXR-transactivation assay as a model for the application of Lean Thinking-based process analysis to lab-bench assay optimization and automation. This resulted in the development of a 384-well multiplexed homogeneous assay simultaneously detecting PXR transactivation and HepG2 cell cytotoxicity. In order to multiplex fluorescent and luminescent read-outs, modifications to each assay were necessary, which included optimization of multiple assay parameters such as cell density, plate type, and reagent concentrations. Subsequently, a set of compounds including known cytotoxic compounds and PXR inducers were used to validate the multiplexed assay. Results from the multiplexed assay correlate well with those from the singleplexed assay formats measuring PXR transactivation and viability separately. Implementation of the multiplexed assay for routine compound profiling provides improved data quality, sample conservation, cost savings, and resource efficiencies.

  19. High-throughput cloning and expression in recalcitrant bacteria

    NARCIS (Netherlands)

    Geertsma, Eric R.; Poolman, Bert

    We developed a generic method for high-throughput cloning in bacteria that are less amenable to conventional DNA manipulations. The method involves ligation-independent cloning in an intermediary Escherichia coli vector, which is rapidly converted via vector-backbone exchange (VBEx) into an

  20. The simple fool's guide to population genomics via RNA-Seq: An introduction to high-throughput sequencing data analysis

    DEFF Research Database (Denmark)

    De Wit, P.; Pespeni, M.H.; Ladner, J.T.

    2012-01-01

    …to Population Genomics via RNA-seq' (SFG), a document intended to serve as an easy-to-follow protocol, walking a user through one example of high-throughput sequencing data analysis of nonmodel organisms. It is by no means an exhaustive protocol, but rather serves as an introduction to the bioinformatic methods … used in population genomics, enabling a user to gain familiarity with basic analysis steps. The SFG consists of two parts. This document summarizes the steps needed and lays out the basic themes for each and a simple approach to follow. The second document is the full SFG, publicly available at http://sfg.stanford.edu, that includes detailed protocols for data processing and analysis, along with a repository of custom-made scripts and sample files. Steps included in the SFG range from tissue collection to de novo assembly, blast annotation, alignment, gene expression, functional enrichment, SNP detection, principal components

  1. Design and construction of a first-generation high-throughput integrated robotic molecular biology platform for bioenergy applications.

    Science.gov (United States)

    Hughes, Stephen R; Butt, Tauseef R; Bartolett, Scott; Riedmuller, Steven B; Farrelly, Philip

    2011-08-01

    The molecular biological techniques for plasmid-based assembly and cloning of gene open reading frames are essential for elucidating the function of the proteins encoded by the genes. High-throughput integrated robotic molecular biology platforms that have the capacity to rapidly clone and express heterologous gene open reading frames in bacteria and yeast and to screen large numbers of expressed proteins for optimized function are an important technology for improving microbial strains for biofuel production. The process involves the production of full-length complementary DNA libraries as a source of plasmid-based clones to express the desired proteins in active form for determination of their functions. Proteins that were identified by high-throughput screening as having desired characteristics are overexpressed in microbes to enable them to perform functions that will allow more cost-effective and sustainable production of biofuels. Because the plasmid libraries are composed of several thousand unique genes, automation of the process is essential. This review describes the design and implementation of an automated integrated programmable robotic workcell capable of producing complementary DNA libraries, colony picking, isolating plasmid DNA, transforming yeast and bacteria, expressing protein, and performing appropriate functional assays. These operations will allow tailoring microbial strains to use renewable feedstocks for production of biofuels, bioderived chemicals, fertilizers, and other coproducts for profitable and sustainable biorefineries. Published by Elsevier Inc.

  2. Cost and cost effectiveness of long-lasting insecticide-treated bed nets - a model-based analysis

    Directory of Open Access Journals (Sweden)

    Pulkki-Brännström Anni-Maria

    2012-04-01

    Full Text Available Abstract Background The World Health Organization recommends that national malaria programmes universally distribute long-lasting insecticide-treated bed nets (LLINs). LLINs provide effective insecticide protection for at least three years, while conventional nets must be retreated every 6-12 months. LLINs may also promise longer physical durability (lifespan), but at a higher unit price. No prospective data currently available are sufficient to calculate the comparative cost effectiveness of different net types. We thus constructed a model to explore the cost effectiveness of LLINs, asking how a longer lifespan affects the relative cost effectiveness of nets, and if, when and why LLINs might be preferred to conventional insecticide-treated nets. An innovation of our model is that we also considered the replenishment need, i.e. the loss of nets over time. Methods We modelled the choice of net over a 10-year period to facilitate the comparison of nets with different lifespans (and/or prices) and replenishment needs over time. Our base case represents a large-scale programme which achieves high coverage and usage throughout the population by distributing either LLINs or conventional nets through existing health services, and retreats a large proportion of conventional nets regularly at low cost. We identified the determinants of bed net programme cost effectiveness and parameter values for usage rate, delivery and retreatment cost from the literature. One-way sensitivity analysis was conducted to explicitly compare the differential effect of changing parameters such as price, lifespan, usage and replenishment need. Results If conventional and long-lasting bed nets have the same physical lifespan (3 years), LLINs are more cost effective unless they are priced at more than USD 1.5 above the price of conventional nets. Because a longer lifespan brings delivery cost savings, each one-year increase in lifespan can be accompanied by a USD 1 or more increase in price
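
    The core of such a model is an annualized cost comparison across net types; a deliberately simplified sketch, with hypothetical prices, loss rates and retreatment costs (the published model is richer, e.g. in its treatment of coverage and usage):

        # Sketch: cost per year of net protection over a 10-year programme,
        # accounting for lifespan, retreatment, and replenishment of lost
        # nets. All parameter values are hypothetical placeholders.
        def cost_per_protected_year(price, lifespan_years, horizon=10,
                                    annual_loss=0.08, retreat_cost_per_year=0.0):
            rounds = horizon / lifespan_years            # distribution rounds
            nets_per_round = 1.0 + annual_loss * lifespan_years  # replenishment
            total = rounds * nets_per_round * price + horizon * retreat_cost_per_year
            return total / horizon

        llin = cost_per_protected_year(price=7.0, lifespan_years=3.0)
        conventional = cost_per_protected_year(price=5.5, lifespan_years=3.0,
                                               retreat_cost_per_year=0.5)
        print(f"LLIN ${llin:.2f}/yr vs conventional ${conventional:.2f}/yr")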

  3. Analyzing the Cost-Effectiveness of Instruction Expenditures towards High School Completion among Oahu's Public School Districts

    Science.gov (United States)

    Ng, Larson S. W. M.

    2011-01-01

    The following study attempted to ascertain the instructional cost-effectiveness of public high school teachers towards high school completion through a financially based econometric analysis. Essentially, public high school instruction expenditures and completer data were collected from 2000 to 2007 and bivariate interaction analyzed through a…

  4. Effects of Perfluorooctanoic Acid on Metabolic Profiles in Brain and Liver of Mouse Revealed by a High-throughput Targeted Metabolomics Approach

    Science.gov (United States)

    Yu, Nanyang; Wei, Si; Li, Meiying; Yang, Jingping; Li, Kan; Jin, Ling; Xie, Yuwei; Giesy, John P.; Zhang, Xiaowei; Yu, Hongxia

    2016-04-01

    Perfluorooctanoic acid (PFOA), a perfluoroalkyl acid, can result in hepatotoxicity and neurobehavioral effects in animals. The metabolome, which serves as a connection among the transcriptome, the proteome and toxic effects, provides pathway-based insights into the effects of PFOA. Since understanding of changes in the metabolic profile during hepatotoxicity and neurotoxicity was still incomplete, a high-throughput targeted metabolomics approach (278 metabolites) was used to investigate the effects of 28 d of exposure to PFOA on the brain and liver of male Balb/c mice. Results of multivariate statistical analysis indicated that PFOA caused alterations in metabolic pathways in exposed individuals. Pathway analysis suggested that PFOA affected the metabolism of amino acids, lipids, carbohydrates and energetics. Ten and 18 metabolites were identified as potential unique biomarkers of exposure to PFOA in brain and liver, respectively. In brain, PFOA affected concentrations of neurotransmitters, including serotonin, dopamine, norepinephrine, and glutamate, which provides novel insights into mechanisms of PFOA-induced neurobehavioral effects. In liver, lipid profiles revealed involvement of β-oxidation and biosynthesis of saturated and unsaturated fatty acids in PFOA-induced hepatotoxicity, while alterations in arachidonic acid metabolism suggested the potential of PFOA to cause an inflammatory response in liver. These results provide insight into the mechanisms and biomarkers of PFOA-induced effects.

  5. A Reference Viral Database (RVDB) To Enhance Bioinformatics Analysis of High-Throughput Sequencing for Novel Virus Detection.

    Science.gov (United States)

    Goodacre, Norman; Aljanahi, Aisha; Nandakumar, Subhiksha; Mikailov, Mike; Khan, Arifa S

    2018-01-01

    Detection of distantly related viruses by high-throughput sequencing (HTS) is bioinformatically challenging because of the lack of a public database containing all viral sequences without abundant nonviral sequences, which can extend runtime and obscure viral hits. Our reference viral database (RVDB) includes all viral, virus-related, and virus-like nucleotide sequences (excluding bacterial viruses), regardless of length, and with overall reduced cellular sequences. Semantic selection criteria (SEM-I) were used to select viral sequences from GenBank, resulting in a first-generation viral database (VDB). This database was manually and computationally reviewed, resulting in refined semantic selection criteria (SEM-R), which were applied to a new download of updated GenBank sequences to create a second-generation VDB. Viral entries in the latter were clustered at 98% identity by CD-HIT-EST to reduce redundancy while retaining high viral sequence diversity. The viral identity of the clustered representative sequences (creps) was confirmed by BLAST searches in NCBI databases and HMMER searches in PFAM and DFAM databases. The resulting RVDB contained a broad representation of viral families, sequence diversity, and reduced cellular content; it includes full-length and partial sequences, endogenous nonretroviral elements, endogenous retroviruses, and retrotransposons. Testing of RVDBv10.2 with an in-house HTS transcriptomic data set indicated a significantly faster run for virus detection than interrogating the entirety of the NCBI nonredundant nucleotide database, which contains all viral sequences but also nonviral sequences. RVDB is publicly available to facilitate HTS analysis, particularly for novel virus detection. It is meant to be updated on a regular basis to include new viral sequences added to GenBank. IMPORTANCE To facilitate bioinformatics analysis of high-throughput sequencing (HTS) data for the detection of both known and novel viruses, we have

  6. Fluorescence-based high-throughput screening of dicer cleavage activity.

    Science.gov (United States)

    Podolska, Katerina; Sedlak, David; Bartunek, Petr; Svoboda, Petr

    2014-03-01

    Production of small RNAs by ribonuclease III Dicer is a key step in microRNA and RNA interference pathways, which employ Dicer-produced small RNAs as sequence-specific silencing guides. Further studies and manipulations of microRNA and RNA interference pathways would benefit from identification of small-molecule modulators. Here, we report a study of a fluorescence-based in vitro Dicer cleavage assay, which was adapted for high-throughput screening. The kinetic assay can be performed under single-turnover conditions (35 nM substrate and 70 nM Dicer) in a small volume (5 µL), which makes it suitable for high-throughput screening in a 1536-well format. As a proof of principle, a small library of bioactive compounds was analyzed, demonstrating potential of the assay.

  7. Cost-effectiveness Analysis of Antipsychotic Combination Therapy in Schizophrenia Inpatients

    Directory of Open Access Journals (Sweden)

    Rizky Abdulah

    2017-03-01

    Full Text Available Schizophrenia is a mental disorder with high cost and lifetime morbidity risk. Hence, it is necessary to analyze the cost-effectiveness of various combinations of antipsychotics. The aim of this study was to identify the most cost-effective antipsychotic combination for schizophrenia inpatients in West Java Psychiatric Hospital during 2012–2013. Data were collected retrospectively from the medical records of patients who used the clozapine-haloperidol or clozapine-risperidone combination. Direct medical costs were obtained from antipsychotic costs, costs of medical treatment, medical expenses, hospitalization costs, and administrative costs. The results showed that the average cost-effectiveness ratio was Rp126.898/day for the clozapine-haloperidol combination and Rp132.781/day for the clozapine-risperidone combination. Considering length of stay as the measure of therapy effectiveness, it can be concluded that the combination of clozapine-haloperidol is more cost-effective than clozapine-risperidone.

  8. Advances in High-Throughput Speed, Low-Latency Communication for Embedded Instrumentation (7th Annual SFAF Meeting, 2012)

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, Scott

    2012-06-01

    Scott Jordan on "Advances in high-throughput speed, low-latency communication for embedded instrumentation" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  9. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Due to its easy accessibility, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many nonsignificant species at concentrations orders of magnitude higher. The extreme requirements of measurement sensitivity, dynamic range and specificity make method development extremely challenging. Current clinical protein biomarker measurement relies primarily on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high-quality protein antibody is both expensive and time consuming. The limited capability of assay multiplexing also makes the measurement an extremely low-throughput one, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high-throughput and quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS-based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  10. Galaxy Workflows for Web-based Bioinformatics Analysis of Aptamer High-throughput Sequencing Data

    Directory of Open Access Journals (Sweden)

    William H Thiel

    2016-01-01

    Full Text Available Development of RNA and DNA aptamers for diagnostic and therapeutic applications is a rapidly growing field. Aptamers are identified through iterative rounds of selection in a process termed SELEX (Systematic Evolution of Ligands by EXponential enrichment). High-throughput sequencing (HTS) revolutionized the modern SELEX process by identifying millions of aptamer sequences across multiple rounds of aptamer selection. However, these vast aptamer HTS datasets necessitate bioinformatics techniques. Herein, we describe a semiautomated approach to analyzing aptamer HTS datasets using the Galaxy Project, a web-based open-source collection of bioinformatics tools originally developed to analyze genome, exome, and transcriptome HTS data. Using a series of Workflows created in the Galaxy webserver, we demonstrate efficient processing of aptamer HTS data and compilation of a database of unique aptamer sequences. Additional Workflows were created to characterize the abundance and persistence of aptamer sequences within a selection and to filter sequences based on these parameters. A key advantage of this approach is that the online nature of the Galaxy webserver and its graphical interface allow for the analysis of HTS data without the need to compile code or install multiple programs.
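
    The compilation and persistence steps described map naturally onto small scripts; a hedged Python sketch of counting unique aptamer sequences per round and intersecting rounds, with hypothetical file names (Galaxy wraps equivalent tools behind its web interface):

        # Sketch: unique-sequence abundance per selection round and
        # persistence across rounds. File names are hypothetical; Galaxy
        # Workflows wrap equivalent steps behind a web interface.
        from collections import Counter

        def read_fasta(path):
            with open(path) as fh:
                for line in fh:
                    if line and not line.startswith(">"):
                        yield line.strip()

        rounds = {"round4": "round4.fasta", "round6": "round6.fasta"}
        counts = {name: Counter(read_fasta(path)) for name, path in rounds.items()}

        # persistence: sequences observed in every sequenced round
        persistent = set(counts["round4"]) & set(counts["round6"])
        print(len(persistent), "sequences persist across both rounds")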

  11. Reconstruction versus conservative treatment after rupture of the anterior cruciate ligament: cost effectiveness analysis

    Directory of Open Access Journals (Sweden)

    Farshad Mazda

    2011-11-01

    Background: The decision whether to treat a torn anterior cruciate ligament (ACL) conservatively or to reconstruct it surgically is an ongoing subject of debate. The high prevalence and associated public health burden of ACL tears have led to continuous efforts to determine the best therapeutic approach. A critical evaluation of the benefits and expenditures of both treatment options, as in a cost-effectiveness analysis, seems well suited to provide valuable information for treating physicians and healthcare policymakers. Methods: A literature review identified four of 7410 searched articles providing sufficient outcome probabilities for the two treatment options for modeling. A transformation key based on the expert opinions of 25 orthopedic surgeons was used to derive utilities from the available evidence. The cost data for both treatment strategies were based on average figures compiled by Orthopaedic University Hospital Balgrist and reinforced by Swiss national statistics. A decision tree was constructed to derive the cost-effectiveness of each strategy, which was then tested for robustness using Monte Carlo simulation. Results: Decision tree analysis revealed a cost-effectiveness of 16,038 USD/0.78 QALY for ACL reconstruction and 15,466 USD/0.66 QALY for conservative treatment, implying an incremental cost-effectiveness of 4,890 USD/QALY for ACL reconstruction. Sensitivity analysis of the utilities did not change the trend. Conclusion: ACL reconstruction for reestablishment of knee stability appears cost effective in the Swiss setting based on currently available evidence. This should, however, be reinforced with randomized controlled trials comparing the two treatment strategies.
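
    For readers unfamiliar with the arithmetic, the sketch below shows the incremental cost-effectiveness calculation behind such decision-tree results, with a simple Monte Carlo robustness check in the spirit of the study's simulation. The noise distribution and willingness-to-pay threshold are illustrative assumptions, not the study's inputs; note too that plugging the rounded figures quoted in the abstract into the formula reproduces the reported ICER only approximately.

        import random

        def icer(cost_a, qaly_a, cost_b, qaly_b):
            """Incremental cost-effectiveness ratio of strategy A over strategy B."""
            return (cost_a - cost_b) / (qaly_a - qaly_b)

        # Point estimates from the abstract: reconstruction vs. conservative care.
        # The rounded inputs give ~4,767 USD/QALY, close to the reported 4,890.
        print(f"ICER: {icer(16038, 0.78, 15466, 0.66):,.0f} USD/QALY")

        # Illustrative robustness check: jitter the utilities by an assumed
        # +/-10% and count how often the net monetary benefit of reconstruction
        # stays positive at an assumed willingness to pay of 50,000 USD/QALY.
        random.seed(0)
        WTP = 50_000
        positive = sum(
            WTP * (0.78 * random.uniform(0.9, 1.1)
                   - 0.66 * random.uniform(0.9, 1.1))
            - (16038 - 15466) > 0
            for _ in range(10_000)
        )
        print(f"P(positive net benefit) ~ {positive / 10_000:.2f}")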

  12. Label-free detection of cellular drug responses by high-throughput bright-field imaging and machine learning.

    Science.gov (United States)

    Kobayashi, Hirofumi; Lei, Cheng; Wu, Yi; Mao, Ailin; Jiang, Yiyue; Guo, Baoshan; Ozeki, Yasuyuki; Goda, Keisuke

    2017-09-29

    In the last decade, high-content screening based on multivariate single-cell imaging has proven effective in drug discovery for evaluating drug-induced phenotypic variations. Unfortunately, this method inherently requires fluorescent labeling, which has several drawbacks. Here we present a label-free method for evaluating cellular drug responses using only high-throughput bright-field imaging with the aid of machine learning algorithms. Specifically, we performed high-throughput bright-field imaging of numerous drug-treated and -untreated cells (N = ~240,000) by optofluidic time-stretch microscopy at throughputs of up to 10,000 cells/s and applied machine learning to the cell images to identify morphological variations that are too subtle for the human eye to detect. Consequently, we achieved an accuracy of 92% in distinguishing drug-treated from untreated cells without the need for labeling. Furthermore, we demonstrated that dose-dependent, drug-induced morphological changes from different experiments can be inferred from the classification accuracy of a single classification model. Our work lays the groundwork for label-free drug screening in pharmaceutical science and industry.
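
    A minimal sketch of the classification step such a pipeline ends in, using a linear baseline on tabular morphological features: everything below, including the synthetic data and the choice of classifier, is an assumption for illustration; the paper's actual model and features differ.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Stand-in data: rows are cells, columns are morphological features
        # extracted from bright-field images (area, circularity, texture, ...).
        # Labels mark drug-treated (1) vs. untreated (0) cells. All synthetic.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 32))
        y = (X[:, :4].sum(axis=1)
             + rng.normal(scale=2.0, size=2000) > 0).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, random_state=0)

        # A regularised linear model is a reasonable baseline for subtle,
        # high-dimensional morphological differences.
        clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
        clf.fit(X_train, y_train)
        print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")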

  13. Cost-Effectiveness Analysis of Second-Line Chemotherapy Agents for Advanced Gastric Cancer.

    Science.gov (United States)

    Lam, Simon W; Wai, Maya; Lau, Jessica E; McNamara, Michael; Earl, Marc; Udeh, Belinda

    2017-01-01

    Gastric cancer is the fifth most common malignancy and the second leading cause of cancer-related mortality. Chemotherapy options for patients who fail first-line treatment are limited. Thus, the objective of this study was to assess the cost-effectiveness of second-line treatment options for patients with advanced or metastatic gastric cancer. A Markov model was used to compare the cost-effectiveness of six possible second-line options for patients with advanced gastric cancer who have failed previous chemotherapy: irinotecan, docetaxel, paclitaxel, ramucirumab, paclitaxel plus ramucirumab, and palliative care. The model was built from a third-party payer's perspective to compare lifetime costs and health benefits associated with the studied second-line therapies. Costs included only relevant direct medical costs. The model assumed a chemotherapy cycle length of 30 days and a maximum of 24 cycles. A systematic review of the literature was performed to identify clinical data sources as well as utility and cost data. Quality-adjusted life years (QALYs) and incremental cost-effectiveness ratios (ICERs) were calculated. The primary outcome measure for this analysis was the ICER between therapies, i.e., the incremental cost divided by the number of QALYs saved. The ICER was compared with a willingness-to-pay (WTP) threshold set at $50,000/QALY gained; an exploratory analysis using $160,000/QALY gained was also performed. The model's robustness was tested using 1-way sensitivity analyses and a 10,000-iteration Monte Carlo probabilistic sensitivity analysis (PSA). Irinotecan had the lowest lifetime cost and was associated with a QALY gain of 0.35 years. Docetaxel, ramucirumab alone, and palliative care were dominated strategies. Paclitaxel and the combination of paclitaxel plus ramucirumab led to higher QALY gains, at an incremental cost of $86,815 and $1,056,125 per QALY gained, respectively. Based on our prespecified
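
    A minimal sketch of the Markov cohort trace such models are built on, with 30-day cycles capped at 24 as in the study; the states, transition probabilities, costs, and utilities below are illustrative assumptions, not the study's inputs.

        import numpy as np

        # Illustrative three-state model: stable disease, progression, death.
        P = np.array([
            [0.75, 0.20, 0.05],  # from stable
            [0.00, 0.80, 0.20],  # from progression
            [0.00, 0.00, 1.00],  # death is absorbing
        ])
        cycle_cost = np.array([4000.0, 2500.0, 0.0])  # USD per cycle per state
        cycle_utility = np.array([0.65, 0.40, 0.0])   # QALY weight per state

        state = np.array([1.0, 0.0, 0.0])  # cohort starts in the stable state
        total_cost = total_qaly = 0.0
        for _ in range(24):                # maximum of 24 thirty-day cycles
            total_cost += state @ cycle_cost
            total_qaly += state @ cycle_utility * (30 / 365)
            state = state @ P              # advance the cohort one cycle

        print(f"Lifetime cost: ${total_cost:,.0f}, QALYs: {total_qaly:.2f}")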

  14. Noise and non-linearities in high-throughput data

    International Nuclear Information System (INIS)

    Nguyen, Viet-Anh; Lió, Pietro; Koukolíková-Nicola, Zdena; Bagnoli, Franco

    2009-01-01

    High-throughput data analyses are becoming common in biology, communications, economics, and sociology. The vast amounts of data are usually represented in the form of matrices and can be considered knowledge networks. Spectra-based approaches have proved useful in extracting hidden information within such networks and for estimating missing data, but these methods are essentially based on linear assumptions. The physical models of matching, when applicable, often suggest non-linear mechanisms that may sometimes be identified as noise. The use of non-linear models in data analysis, however, may require the introduction of many parameters, which lowers the statistical weight of the model. Depending on the quality of the data, a simpler linear analysis may therefore be preferable to more complex approaches. In this paper, we show how a simple non-parametric Bayesian model may be used to explore the role of non-linearities and noise in synthetic and experimental data sets.
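
    As a hedged illustration of the linear, spectra-based baseline the authors contrast with non-linear models, the sketch below estimates missing matrix entries by iterated truncated-SVD projection. The rank, the mean-imputation starting point, and the synthetic data are assumptions for the sketch.

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic low-rank "knowledge network" matrix with missing entries.
        M = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 40))
        mask = rng.random(M.shape) < 0.2   # 20% of entries held out as missing
        M_obs = M.copy()
        M_obs[mask] = np.nan

        # Start from column-mean imputation, then iterate rank-k projections.
        filled = np.where(mask, np.nanmean(M_obs, axis=0, keepdims=True), M_obs)
        k = 3
        for _ in range(50):
            U, s, Vt = np.linalg.svd(filled, full_matrices=False)
            low_rank = (U[:, :k] * s[:k]) @ Vt[:k]
            filled = np.where(mask, low_rank, M_obs)  # keep observed entries

        rmse = np.sqrt(np.mean((filled - M)[mask] ** 2))
        print(f"RMSE on missing entries: {rmse:.3f}")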

  15. Evaluation of Capacity on a High Throughput Vol-oxidizer for Operability

    International Nuclear Information System (INIS)

    Kim, Young Hwan; Park, Geun Il; Lee, Jung Won; Jung, Jae Hoo; Kim, Ki Ho; Lee, Yong Soon; Lee, Do Youn; Kim, Su Sung

    2010-01-01

    KAERI is developing a pyro-process. As a piece of process equipment, a high throughput vol-oxidizer that can handle several tens of kg HM/batch was developed to supply U3O8 powder to an electrolytic reduction (ER) reactor. To increase the reduction yield, UO2 pellets should be converted into uniform powders. In this paper, we evaluate the operability of the high throughput vol-oxidizer. The evaluation comprised three tests: a mechanical motion test, a heating test, and a hull separation test. Using a control system, mechanical motion tests of the vol-oxidizer were conducted and heating rates were analyzed. Hull separation tests were also conducted to determine the recovery rate. The test results will be applied to assess operability. The characteristics of the volatile gas produced during the vol-oxidation process are not addressed in this study.

  16. Building blocks for the development of an interface for high-throughput thin layer chromatography/ambient mass spectrometric analysis: a green methodology.

    Science.gov (United States)

    Cheng, Sy-Chyi; Huang, Min-Zong; Wu, Li-Chieh; Chou, Chih-Chiang; Cheng, Chu-Nian; Jhang, Siou-Sian; Shiea, Jentaie

    2012-07-17

    Interfacing thin layer chromatography (TLC) with ambient mass spectrometry (AMS) has been an important area of analytical chemistry because of its capability to rapidly separate and characterize chemical compounds. In this study, we developed a high-throughput TLC-AMS system using building blocks to deal, deliver, and collect TLC plates through an electrospray-assisted laser desorption ionization (ELDI) source. This is the first demonstration of the use of building blocks to construct and test a TLC-MS interfacing system. With the advantages of being readily available, cheap, reusable, and extremely easy to modify without consuming any material or reagent, the use of building blocks to develop the TLC-AMS interface is undoubtedly a green methodology. The TLC plate delivery system consists of a storage box, a plate-dealing component, a conveyer, a light sensor, and a plate-collecting box. During a TLC-AMS analysis, the TLC plate was sent to the conveyer from a stack of TLC plates placed in the storage box. As the TLC plate passed through the ELDI source, the chemical compounds separated on the plate were desorbed by laser desorption and subsequently post-ionized by electrospray ionization. Samples including a mixture of synthetic dyes and extracts of pharmaceutical drugs were analyzed to demonstrate the capability of this TLC-ELDI/MS system for high-throughput analysis.

  17. Fun with High Throughput Toxicokinetics (CalEPA webinar)

    Science.gov (United States)

    Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21. These chemicals are tested in part because there are limited or no data on hazard, exposure, or toxicokinetics (TK). TK models aid in predicting tissue concentrations ...

  18. Caveats and limitations of plate reader-based high-throughput kinetic measurements of intracellular calcium levels

    International Nuclear Information System (INIS)

    Heusinkveld, Harm J.; Westerink, Remco H.S.

    2011-01-01

    Calcium plays a crucial role in virtually all cellular processes, including neurotransmission. The intracellular Ca2+ concentration ([Ca2+]i) is therefore an important readout in neurotoxicological and neuropharmacological studies. Consequently, there is an increasing demand for high-throughput measurements of [Ca2+]i, e.g. using multi-well microplate readers, in hazard characterization, human risk assessment, and drug development. However, changes in [Ca2+]i are highly dynamic, which creates challenges for high-throughput measurements. Nonetheless, several protocols are now available for real-time kinetic measurement of [Ca2+]i in plate reader systems, though the results of such plate reader-based measurements have been questioned. In view of the increasing use of plate reader systems for measurements of [Ca2+]i, a careful evaluation of current technologies is warranted. We therefore performed an extensive set of experiments, using two cell lines (PC12 and B35) and two fluorescent calcium-sensitive dyes (Fluo-4 and Fura-2), to compare a linear plate reader system with single-cell fluorescence microscopy. Our data demonstrate that the use of plate reader systems for high-throughput real-time kinetic measurements of [Ca2+]i is associated with many pitfalls and limitations, including erroneous sustained increases in fluorescence, limited sensitivity, and a lack of single-cell resolution. Additionally, our data demonstrate that probenecid, which is often used to prevent dye leakage, effectively inhibits the depolarization-evoked increase in [Ca2+]i. Overall, the data indicate that the use of current plate reader-based strategies for high-throughput real-time kinetic measurements of [Ca2+]i is associated with caveats and limitations that require further investigation. - Research highlights: → The use of plate readers for high-throughput screening of intracellular Ca2+ is associated with many pitfalls and limitations. → Single cell
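
    As a rough sketch of the kind of kinetic trace analysis at stake here, the code below baseline-normalizes a fluorescence time series (dF/F0) and locates a depolarization-evoked peak. The synthetic trace, baseline window, and transient shape are entirely invented stand-ins for plate reader output.

        import numpy as np

        # Synthetic fluorescence trace: flat baseline plus a depolarization-
        # evoked transient; a real plate reader export would replace this.
        t = np.arange(0, 120, 0.5)                      # seconds
        f = 100 + 40 * np.exp(-((t - 60) / 8.0) ** 2)   # Gaussian transient
        f += np.random.default_rng(2).normal(scale=1.5, size=t.size)

        f0 = f[t < 30].mean()      # pre-stimulus baseline window (assumed)
        df_f0 = (f - f0) / f0      # baseline-normalized response

        print(f"Peak dF/F0: {df_f0.max():.2f} at t = {t[df_f0.argmax()]:.1f} s")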

  19. High-throughput DNA methylation analysis in anorexia nervosa confirms TNXB hypermethylation.

    Science.gov (United States)

    Kesselmeier, Miriam; Pütter, Carolin; Volckmar, Anna-Lena; Baurecht, Hansjörg; Grallert, Harald; Illig, Thomas; Ismail, Khadeeja; Ollikainen, Miina; Silén, Yasmina; Keski-Rahkonen, Anna; Bulik, Cynthia M; Collier, David A; Zeggini, Eleftheria; Hebebrand, Johannes; Scherag, André; Hinney, Anke

    2018-04-01

    Patients with anorexia nervosa (AN) are ideally suited for identifying genes that are differentially methylated in response to starvation. We examined high-throughput DNA methylation data derived from the whole blood of 47 females with AN, 47 lean females without AN, and 100 population-based females, comparing the AN group with both control groups. To account for different cell type compositions, we applied two reference-free methods (FastLMM-EWASher, RefFreeEWAS) and searched for consensus CpG sites identified by both methods. We used a validation sample of five monozygotic AN-discordant twin pairs. Fifty-one consensus sites were identified in the AN vs. lean comparison and 81 in the AN vs. population-based comparison. These sites have not previously been reported in AN methylation analyses, but for the latter comparison 54 of the 81 sites showed directionally consistent differential methylation effects in the AN-discordant twins. For the single nucleotide polymorphism rs923768 in CSGALNACT1, a nearby site was nominally associated with AN. At the gene level, we confirmed hypermethylated sites at TNXB. We found support for a locus at NR1H3 in the AN vs. lean control comparison, but the methylation direction was opposite to the one previously reported. We confirm that genes such as TNXB comprise differentially methylated sites, as previously described, and highlight further sites that might be specifically involved in starvation processes in AN.
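
    A minimal sketch of the consensus step described above: intersect the CpG sites flagged by two reference-free methods and check directional consistency in a validation set. The site names and effect sizes are invented placeholders, not results from the study.

        # Hypothetical per-method results: CpG site -> methylation difference
        # (AN minus control), as each reference-free method might report.
        fastlmm_hits = {"cg0001": 0.04, "cg0002": -0.03, "cg0003": 0.05}
        reffree_hits = {"cg0002": -0.02, "cg0003": 0.06, "cg0004": 0.01}

        # Consensus: sites flagged by both methods with the same sign.
        consensus = {
            site for site in fastlmm_hits.keys() & reffree_hits.keys()
            if fastlmm_hits[site] * reffree_hits[site] > 0
        }

        # Validation: direction in discordant twins must match discovery.
        twin_diffs = {"cg0002": -0.01, "cg0003": 0.04}
        consistent = [s for s in consensus
                      if s in twin_diffs and twin_diffs[s] * fastlmm_hits[s] > 0]
        print(sorted(consensus), sorted(consistent))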

  20. Pressure relieving support surfaces (PRESSURE) trial: cost effectiveness analysis.

    Science.gov (United States)

    Iglesias, Cynthia; Nixon, Jane; Cranny, Gillian; Nelson, E Andrea; Hawkins, Kim; Phillips, Angela; Torgerson, David; Mason, Su; Cullum, Nicky

    2006-06-17

    To assess the cost effectiveness of alternating pressure mattresses compared with alternating pressure overlays for the prevention of pressure ulcers in patients admitted to hospital. Cost effectiveness analysis carried out alongside the pressure relieving support surfaces (PRESSURE) trial, a multicentre, UK-based pragmatic randomised controlled trial conducted in 11 hospitals across six UK NHS trusts, with an intention to treat population comprising 1971 participants. Outcomes were Kaplan-Meier estimates of restricted mean time to development of pressure ulcers and total costs for treatment in hospital. Alternating pressure mattresses were associated with lower overall costs (283.6 pounds sterling per patient on average; 95% confidence interval -377.59 to 976.79 pounds sterling), mainly due to reduced length of stay in hospital, and greater benefits (a delay in time to ulceration of 10.64 days on average; -24.40 to 3.09). The differences in health benefits and total costs for hospital stay between alternating pressure mattresses and alternating pressure overlays were not statistically significant; however, a cost effectiveness acceptability curve indicated that, on average, alternating pressure mattresses compared with alternating pressure overlays were associated with an 80% probability of being cost saving. Alternating pressure mattresses for the prevention of pressure ulcers are more likely to be cost effective and are more acceptable to patients than alternating pressure overlays.
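
    A minimal sketch of how a cost-effectiveness acceptability curve like the one reported here is typically derived from bootstrap replicates of incremental costs and effects; at a willingness to pay of zero the curve reduces to the probability of being cost saving. The replicate distributions below are invented for illustration, not the trial's data.

        import numpy as np

        rng = np.random.default_rng(3)
        # Invented bootstrap replicates of incremental cost (negative = saving)
        # and incremental effect (days of delayed ulceration), mattress vs.
        # overlay, loosely scaled to the point estimates quoted above.
        d_cost = rng.normal(loc=-283.6, scale=340.0, size=5000)
        d_effect = rng.normal(loc=10.64, scale=8.0, size=5000)

        # CEAC: probability that net monetary benefit is positive across a
        # range of willingness-to-pay values per day of delayed ulceration.
        for wtp in [0, 50, 100, 200]:
            p = np.mean(wtp * d_effect - d_cost > 0)
            print(f"WTP {wtp} pounds/day: P(cost effective) = {p:.2f}")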