WorldWideScience

Sample records for cost-effective high-throughput analysis

  1. FIM imaging and FIMtrack: two new tools allowing high-throughput and cost-effective locomotion analysis.

    Science.gov (United States)

    Risse, Benjamin; Otto, Nils; Berh, Dimitri; Jiang, Xiaoyi; Klämbt, Christian

    2014-12-24

    The analysis of neuronal network function requires a reliable measurement of behavioral traits. Since the behavior of freely moving animals is variable to a certain degree, many animals have to be analyzed to obtain statistically significant data. This in turn requires a computer-assisted, automated quantification of locomotion patterns. To obtain high-contrast images of almost translucent and small moving objects, a novel imaging technique based on frustrated total internal reflection, called FIM, was developed. In this setup, animals are illuminated with infrared light only at the very specific positions of contact with the underlying crawling surface. This methodology results in very high contrast images, which are subsequently processed using established contour-tracking algorithms. Based on this, we developed the FIMTrack software, which serves to extract a number of features needed to quantitatively describe a large variety of locomotion characteristics. During the development of this software package, we focused our efforts on an open-source architecture allowing the easy addition of further modules. The program is platform-independent and is accompanied by an intuitive GUI that guides the user through data analysis. All locomotion parameter values are given in the form of CSV files, allowing further data analyses. In addition, a Results Viewer integrated into the tracking software provides the opportunity to interactively review and adjust the output, as might be needed during stimulus integration. The power of FIM and FIMTrack is demonstrated by studying the locomotion of Drosophila larvae.
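
    FIMTrack's CSV output lends itself to scripted summaries. Below is a minimal sketch of one such summary in Python; the file name and the column names ("larva_id", "time_s", "x_mm", "y_mm") are hypothetical placeholders and would need to match the headers a given FIMTrack version actually emits.

```python
# Minimal sketch: summarizing a FIMTrack-style CSV of larval tracks.
# Column names and file name are hypothetical; adapt to your export.
import csv
from collections import defaultdict
from math import hypot

tracks = defaultdict(list)
with open("fimtrack_output.csv", newline="") as fh:
    for row in csv.DictReader(fh):
        tracks[row["larva_id"]].append(
            (float(row["time_s"]), float(row["x_mm"]), float(row["y_mm"]))
        )

for larva, points in sorted(tracks.items()):
    points.sort()  # order samples by time
    dist = sum(hypot(x2 - x1, y2 - y1)
               for (_, x1, y1), (_, x2, y2) in zip(points, points[1:]))
    duration = points[-1][0] - points[0][0]
    speed = dist / duration if duration > 0 else float("nan")
    print(f"{larva}: mean crawling speed = {speed:.3f} mm/s")
```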

  2. Next-generation phage display: integrating and comparing available molecular tools to enable cost-effective high-throughput analysis.

    Directory of Open Access Journals (Sweden)

    Emmanuel Dias-Neto

    2009-12-01

    Combinatorial phage display has been used in the last 20 years in the identification of protein ligands and protein-protein interactions, uncovering relevant molecular recognition events. Rate-limiting steps of combinatorial phage display library selection are (i) the counting of transducing units and (ii) the sequencing of the encoded displayed ligands. Here, we adapted emerging genomic technologies to minimize such challenges. We gained efficiency by applying in tandem real-time PCR for rapid quantification to enable bacteria-free phage display library screening, and added phage DNA next-generation sequencing for large-scale ligand analysis, reporting a fully integrated set of high-throughput quantitative and analytical tools. The approach is far less labor-intensive and allows rigorous quantification; for medical applications, including selections in patients, it also represents an advance for quantitative distribution analysis and ligand identification of hundreds of thousands of targeted particles from patient-derived biopsy or autopsy in a longer timeframe after library administration. Additional advantages over current methods include increased sensitivity, less variability, enhanced linearity, scalability, and accuracy at much lower cost. Sequences obtained by qPhage plus pyrosequencing were similar to a dataset produced from conventional Sanger-sequenced transducing units (TU), with no biases due to GC content, codon usage, or amino acid or peptide frequency. These tools allow phage display selection and ligand analysis at a >1,000-fold faster rate and reduce costs approximately 250-fold for generating 10^6 ligand sequences. Our analyses demonstrate that while this approach correlates with traditional colony counting, it is also capable of a much larger sampling, allowing a faster, less expensive, more accurate and consistent analysis of phage enrichment. Overall, qPhage plus pyrosequencing is superior to TU-counting plus Sanger…
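
    The large-scale ligand analysis described above ultimately reduces to tallying peptide-encoding inserts across many reads. Below is a minimal sketch of that counting step, assuming reads are pre-trimmed so the insert occupies a fixed in-frame window; the file format and coordinates are assumptions, not the authors' pipeline.

```python
# Minimal sketch: tallying displayed-peptide frequencies from insert reads.
from collections import Counter
from Bio.Seq import Seq  # Biopython

INSERT_START, INSERT_LEN = 0, 36  # hypothetical in-frame insert window

counts = Counter()
with open("phage_insert_reads.txt") as fh:  # one DNA read per line (assumed)
    for line in fh:
        insert = line.strip()[INSERT_START:INSERT_START + INSERT_LEN]
        if len(insert) == INSERT_LEN and "N" not in insert:
            counts[str(Seq(insert).translate())] += 1  # DNA -> peptide

for peptide, n in counts.most_common(10):  # most enriched ligands
    print(peptide, n)
```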

  3. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)]

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of immobilized enzyme on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product need not have optical properties different from those of the substrate. UV absorption detection allows almost universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue or in screening multiple immobilized catalysts. A second high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of an enzyme microarray is focused onto a scientific CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the light intensity change over time on an enzyme spot gives the reaction rate. The same microarray can be used many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different…
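
    The kinetic readout described above amounts to fitting the early, approximately linear part of each spot's intensity time course. A minimal sketch with illustrative numbers (real series would come from the CCD images):

```python
# Minimal sketch: estimating an initial reaction rate from the intensity
# time course of one enzyme spot. The data points are illustrative only.
import numpy as np

time_s = np.array([0, 30, 60, 90, 120, 150], dtype=float)
intensity = np.array([100.0, 112.0, 125.5, 136.8, 149.9, 161.2])

# Product formation is ~linear early on, so the slope of a first-degree
# polynomial fit approximates the initial rate (intensity units / s).
slope, intercept = np.polyfit(time_s, intensity, 1)
print(f"initial rate ≈ {slope:.3f} intensity units/s")
```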

  4. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Sobral Romao, J.I.; Baiao Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for…
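
    Such microplate kinetic data are typically reduced to a pseudo-first-order rate constant, using the Beer-Lambert proportionality between absorbance and MO concentration. A minimal worked sketch with illustrative absorbance values (not the paper's data):

```python
# Minimal sketch: pseudo-first-order rate constant from well-plate UV-Vis
# readings of methyl orange (absorbance ∝ concentration by Beer-Lambert).
import numpy as np

time_min = np.array([0, 10, 20, 30, 40, 60], dtype=float)
absorbance = np.array([0.80, 0.64, 0.51, 0.41, 0.33, 0.21])

# First-order decay: ln(A0/A) = k*t, so the regression slope gives k.
y = np.log(absorbance[0] / absorbance)
k, _ = np.polyfit(time_min, y, 1)
print(f"k ≈ {k:.4f} 1/min, half-life ≈ {np.log(2)/k:.1f} min")
```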

  5. High-throughput determination of vancomycin in human plasma by a cost-effective system of two-dimensional liquid chromatography.

    Science.gov (United States)

    Sheng, Yanghao; Zhou, Boting

    2017-05-26

    Therapeutic drug monitoring (TDM) is one of the most important services of clinical laboratories. Two main techniques are commonly used: immunoassays and chromatography. We have developed a cost-effective system of two-dimensional liquid chromatography with ultraviolet detection (2D-LC-UV) for high-throughput determination of vancomycin in human plasma that combines the automation and low start-up costs of the immunoassay with the high selectivity and sensitivity of liquid chromatography coupled with mass spectrometric detection, without incurring their disadvantages. This 2D-LC system uses large-volume injection to provide sufficient sensitivity and simulated gradient peak compression technology to control peak broadening and improve peak shape. A middle column was added to reduce the analysis cycle time and make the system suitable for high-throughput routine clinical assays. The analysis cycle time was 4 min and the peak width was 0.8 min. Compared with other chromatographic methods that have been developed, the analysis cycle time and peak width for vancomycin were reduced significantly. The lower limit of quantification was 0.20 μg/mL for vancomycin, which is the same as certain LC-MS/MS methods that have been recently developed and validated. The method is rapid, automated, and low-cost and has high selectivity and sensitivity for the quantification of vancomycin in human plasma, thus making it well-suited for use in hospital clinical laboratories.

  6. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
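
    A minimal sketch of this style of focus quantitation (not the published FociQuant tool itself): segment bright foci and integrate their fluorescence. The file name and the Otsu-threshold segmentation strategy are assumptions.

```python
# Minimal sketch: segment fluorescent foci and integrate their intensity.
import numpy as np
from skimage import io, filters, measure

img = io.imread("kinetochore_gfp.tif").astype(float)

# Otsu thresholding separates bright foci from nucleoplasmic background.
mask = img > filters.threshold_otsu(img)
labels = measure.label(mask)

for region in measure.regionprops(labels, intensity_image=img):
    integrated = region.mean_intensity * region.area  # summed focus signal
    print(f"focus {region.label}: area={region.area}px, "
          f"integrated intensity={integrated:.0f}")
```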

  7. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing…

  8. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  9. High-Throughput Analysis and Automation for Glycomics Studies.

    Science.gov (United States)

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also an expanding interest in the use of glycomics, for example in Genome-Wide Association Studies, to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges with each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.

  10. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Directory of Open Access Journals (Sweden)

    Soichi Inagaki

    Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.

  11. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Science.gov (United States)

    Inagaki, Soichi; Henry, Isabelle M; Lieberman, Meric C; Comai, Luca

    2015-01-01

    Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.
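
    A minimal sketch of the junction-finding idea (not the authors' tool): if captured reads are aligned to a combined reference consisting of the genome plus the T-DNA sequence as an extra contig (named "T-DNA" here, an assumption), read pairs whose mates map to different references flag candidate insertion junctions.

```python
# Minimal sketch: report read pairs spanning a genome/T-DNA junction.
import pysam

with pysam.AlignmentFile("captured_reads.bam", "rb") as bam:
    for read in bam:  # note: each pair may be reported from both mates
        if (read.is_paired and not read.is_unmapped
                and not read.mate_is_unmapped
                and read.reference_name != read.next_reference_name
                and "T-DNA" in (read.reference_name,
                                read.next_reference_name)):
            genome_side = (read.next_reference_name
                           if read.reference_name == "T-DNA"
                           else read.reference_name)
            print(read.query_name, "candidate junction near",
                  genome_side, read.reference_start)
```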

  12. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer, as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher-throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high-energy particle detectors.

  13. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high-content analysis of high-throughput screens, in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis…

  14. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
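
    As a flavor of the library's usage, the sketch below computes a radial distribution function with the freud 2.x-style API (freud.box.Box, freud.density.RDF); the random points stand in for real trajectory frames, so treat the exact calls as assumptions to check against the freud documentation.

```python
# Minimal sketch: g(r) for random points in a cubic box via freud.
import numpy as np
import freud

box = freud.box.Box.cube(L=10.0)
points = np.random.uniform(-5.0, 5.0, size=(1000, 3)).astype(np.float32)

rdf = freud.density.RDF(bins=100, r_max=4.0)
rdf.compute(system=(box, points))

# rdf.bin_centers / rdf.rdf hold g(r); an ideal gas hovers around 1.
for r, g in zip(rdf.bin_centers[:5], rdf.rdf[:5]):
    print(f"r={r:.2f}  g(r)={g:.2f}")
```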

  15. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-01-01

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag-based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo…

  16. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag-based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo…

  17. High Throughput Analysis of Breast Cancer Specimens on the Grid

    OpenAIRE

    Yang, Lin; Chen, Wenjin; Meer, Peter; Salaru, Gratian; Feldman, Michael D.; Foran, David J.

    2007-01-01

    Breast cancer accounts for about 30% of all cancers and 15% of all cancer deaths in women in the United States. Advances in computer-assisted diagnosis (CAD) hold promise for early detection and staging of disease progression. In this paper we introduce a Grid-enabled CAD to perform automatic analysis of imaged histopathology breast tissue specimens. More than 100,000 digitized samples (1200 × 1200 pixels) have already been processed on the Grid. We have analyzed results for 3744 breast tissue ...

  18. Regulatory pathway analysis by high-throughput in situ hybridization.

    Directory of Open Access Journals (Sweden)

    Axel Visel

    2007-10-01

    Automated in situ hybridization enables the construction of comprehensive atlases of gene expression patterns in mammals. Such atlases can become Web-searchable digital expression maps of individual genes and thus offer an entryway to elucidate genetic interactions and signaling pathways. Towards this end, an atlas housing approximately 1,000 spatial gene expression patterns of the midgestation mouse embryo was generated. Patterns were textually annotated using a controlled vocabulary comprising >90 anatomical features. Hierarchical clustering of annotations was carried out using distance scores calculated from the similarity between pairs of patterns across all anatomical structures. This process ordered hundreds of complex expression patterns into a matrix that reflects the embryonic architecture and the relatedness of patterns of expression. Clustering yielded 12 distinct groups of expression patterns. Because of the similarity of expression patterns within a group, members of each group may be components of regulatory cascades. We focused on the group containing Pax6, an evolutionarily conserved transcriptional master mediator of development. Seventeen of the 82 genes in this group showed a change of expression in the developing neocortex of Pax6-deficient embryos. Electromobility shift assays were used to test for the presence of Pax6-paired domain binding sites. This led to the identification of 12 genes not previously known as potential targets of Pax6 regulation. These findings suggest that cluster analysis of annotated gene expression patterns obtained by automated in situ hybridization is a novel approach for identifying components of signaling cascades.
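
    The clustering step can be illustrated in miniature: genes are represented as binary vectors over anatomical terms and grouped by pattern similarity. In this sketch the toy annotation matrix, the Jaccard distance and the average linkage are our illustrative choices, not necessarily the authors' exact settings.

```python
# Minimal sketch: hierarchical clustering of binary annotation vectors.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

genes = ["Pax6", "geneA", "geneB", "geneC"]
# rows: genes; columns: anatomical features (1 = expressed there)
annotations = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 1],
    [0, 0, 1, 1, 0],
    [0, 1, 1, 1, 0],
])

# Jaccard distance suits presence/absence annotations.
dist = pdist(annotations, metric="jaccard")
tree = linkage(dist, method="average")
groups = fcluster(tree, t=2, criterion="maxclust")
for gene, grp in zip(genes, groups):
    print(gene, "-> cluster", grp)
```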

  19. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity, improved protocols for PTM enrichment, and recently established pipelines for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis…

  20. Genetic high throughput screening in Retinitis Pigmentosa based on high resolution melting (HRM) analysis.

    Science.gov (United States)

    Anasagasti, Ander; Barandika, Olatz; Irigoyen, Cristina; Benitez, Bruno A; Cooper, Breanna; Cruchaga, Carlos; López de Munain, Adolfo; Ruiz-Ederra, Javier

    2013-11-01

    Retinitis Pigmentosa (RP) involves a group of genetically determined retinal diseases caused by a large number of mutations that result in rod photoreceptor cell death followed by gradual death of cone cells. Most cases of RP are monogenic, with more than 80 associated genes identified so far. The high number of genes and variants involved in RP, among other factors, is making the molecular characterization of RP a real challenge for many patients. Although HRM has been used for the analysis of isolated variants or single RP genes, to the best of our knowledge, this is the first study that uses HRM analysis for a high-throughput screening of several RP genes. Our main goal was to test the suitability of HRM analysis as a genetic screening technique in RP, and to compare its performance with two of the most widely used NGS platforms, Illumina and PGM-Ion Torrent technologies. RP patients (n = 96) were clinically diagnosed at the Ophthalmology Department of Donostia University Hospital, Spain. We analyzed a total of 16 RP genes that meet the following inclusion criteria: 1) size: genes with transcripts of less than 4 kb; 2) number of exons: genes with up to 22 exons; and 3) prevalence: genes reported to account for at least 0.4% of total RP cases worldwide. For comparison purposes, the RHO gene was also sequenced with Illumina (GAII; Illumina), Ion semiconductor technology (PGM; Life Technologies) and Sanger sequencing (ABI 3130xl platform; Applied Biosystems). Detected variants were confirmed in all cases by Sanger sequencing and tested for co-segregation in the families of affected probands. We identified a total of 65 genetic variants, 15 of which (23%) were novel, in 49 out of 96 patients. Among them, 14 (4 novel) are probable disease-causing genetic variants in 7 RP genes, affecting 15 patients. Our HRM analysis-based study proved to be a cost-effective and rapid method that provides accurate identification of genetic RP variants. This approach is effective for…

  21. Targeted DNA Methylation Analysis by High Throughput Sequencing in Porcine Peri-attachment Embryos

    OpenAIRE

    MORRILL, Benson H.; COX, Lindsay; WARD, Anika; HEYWOOD, Sierra; PRATHER, Randall S.; ISOM, S. Clay

    2013-01-01

    The purpose of this experiment was to implement and evaluate the effectiveness of a next-generation sequencing-based method for DNA methylation analysis in porcine embryonic samples. Fourteen discrete genomic regions were amplified by PCR using bisulfite-converted genomic DNA derived from day 14 in vivo-derived (IVV) and parthenogenetic (PA) porcine embryos as template DNA. Resulting PCR products were subjected to high-throughput sequencing using the Illumina Genome Analyzer IIx platform…

  22. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot-detection tool. The spot-detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot-detection data are made available to facilitate implementation of the method.
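
    A minimal sketch of the downstream analysis the authors performed in Excel, here with pandas: combine per-image spot counts with nucleus counts to obtain colonies per cell. The column names of the exported tables are assumptions.

```python
# Minimal sketch: colonies per cell from spot-detection and nucleus tables.
import pandas as pd

spots = pd.DataFrame({                      # illustrative spot export
    "image": ["img1", "img1", "img2"],
    "colony_area_um2": [12.5, 30.1, 8.4],
})
nuclei = pd.DataFrame({"image": ["img1", "img2"], "n_nuclei": [150, 120]})

per_image = (spots.groupby("image")
                  .agg(n_colonies=("colony_area_um2", "size"),
                       median_area=("colony_area_um2", "median"))
                  .join(nuclei.set_index("image")))
per_image["colonies_per_cell"] = per_image["n_colonies"] / per_image["n_nuclei"]
print(per_image)
```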

  23. High-throughput metagenomic technologies for complex microbial community analysis: open and closed formats.

    Science.gov (United States)

    Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa

    2015-01-27

    Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions.

  24. Meta-Analysis of High-Throughput Datasets Reveals Cellular Responses Following Hemorrhagic Fever Virus Infection

    Directory of Open Access Journals (Sweden)

    Gavin C. Bowick

    2011-05-01

    The continuing use of high-throughput assays to investigate cellular responses to infection is providing a large repository of information. Due to the large number of differentially expressed transcripts, often running into the thousands, the majority of these data have not been thoroughly investigated. Advances in techniques for the downstream analysis of high-throughput datasets are providing new methods for generating hypotheses for further investigation. The large number of experimental observations, combined with databases that correlate particular genes and proteins with canonical pathways, functions and diseases, allows for the bioinformatic exploration of functional networks that may be implicated in replication or pathogenesis. Herein, we provide an example of how analysis of published high-throughput datasets of cellular responses to hemorrhagic fever virus infection can generate additional functional data. We describe enrichment of genes involved in metabolism, post-translational modification and cardiac damage; potential roles for specific transcription factors and a conserved involvement of a pathway based around cyclooxygenase-2. We believe that these types of analyses can provide virologists with additional hypotheses for continued investigation.

  25. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets.

  26. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  27. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
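
    The core operation, overlapping a user gene list with a published hit list and asking whether the overlap is surprising, can be sketched in a few lines; the gene sets and background size below are toy values, and the hypergeometric test is a standard choice rather than CrossCheck's documented method.

```python
# Minimal sketch: gene-list overlap with hypergeometric enrichment.
from scipy.stats import hypergeom

user_hits = {"TP53", "ATM", "CHEK2", "BRCA1", "MDM2"}
published = {"ATM", "CHEK2", "BRCA1", "PALB2", "RAD51"}
genome_size = 20000  # assumed background of protein-coding genes

overlap = user_hits & published
# P(overlap >= observed) under random draws of len(user_hits) genes
p = hypergeom.sf(len(overlap) - 1, genome_size, len(published),
                 len(user_hits))
print(f"overlap={sorted(overlap)}, p={p:.2e}")
```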

  28. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Proteomics for biomarker validation needs high-throughput instrumentation to analyze huge sets of clinical samples quantitatively and reproducibly, in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be accurate and rapid. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples, quantitatively and qualitatively, in a reproducible manner. It compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method, and the system was validated on a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and quantified using IDEALQ software. The study shows that the online system, which handles samples in a 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase of peptide quantities with increasing concentration, with better linearity than the off-line method. We therefore suggest that including this online system in a proteomic pipeline will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  29. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will want guidance on best practices before starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequence analysis.

  30. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce…

  31. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Background: In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results: We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter…
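
    The underlying idea, choosing transformation parameters by an objective criterion instead of defaults, can be sketched compactly. The example below scans arcsinh cofactors and scores the normality of the transformed values; this is a simplified stand-in for the maximum-likelihood criteria the authors implement in flowCore, and the data are simulated.

```python
# Minimal sketch: pick an arcsinh cofactor by a normality criterion.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# toy "fluorescence" values spanning several orders of magnitude
x = rng.lognormal(mean=4.0, sigma=1.5, size=10000)

def score(cofactor):
    y = np.arcsinh(x / cofactor)
    return stats.normaltest(y).statistic  # lower = closer to normal

cofactors = np.logspace(0, 4, 25)
best = min(cofactors, key=score)
print(f"selected arcsinh cofactor ≈ {best:.1f}")
```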

  32. Cost effectiveness analysis in radiopharmacy

    International Nuclear Information System (INIS)

    Carpentier, N.; Verbeke, S.; Ducloux, T.

    1999-01-01

    Objective: to evaluate the cost effectiveness of radiopharmaceuticals and their quality control. Materials and methods: this retrospective study was carried out in the Nuclear Medicine Department of the University Hospital of Limoges. Radiopharmaceutical costs were obtained by adding the prices of the radiotracer, the materials, the equipment, the labour, the running expenses and the radioisotope. The costs of quality control were obtained by adding the prices of labour, materials, equipment and running expenses, plus the cost of the quality control of the 99mTc eluate. Results: during 1998, 2106 radiopharmaceuticals were prepared in the Nuclear Medicine Department. The mean cost of a radiopharmaceutical was 1430 francs (range 846 to 4260). The mean cost of quality control was 163 francs (range 84 to 343). Quality control increased the radiopharmaceutical cost by 11%. Conclusion: the technical methodology of quality control must be mastered to optimize the cost of this operation. (author)
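
    The reported figures can be checked with one line of arithmetic: a 163-franc mean quality-control cost on a 1430-franc mean preparation cost is the stated 11% increase.

```python
# Worked arithmetic from the record's figures (1998 data, French francs).
preparation_cost = 1430.0  # mean cost per radiopharmaceutical (F)
qc_cost = 163.0            # mean cost of its quality control (F)

increase = qc_cost / preparation_cost
print(f"QC adds {increase:.0%} to the radiopharmaceutical cost")  # -> 11%
```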

  33. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry applications. Screening multiple genomic reagents targeting any given gene creates additional challenges, and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality-control monitoring and interpretation of the results, will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.

  34. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    Science.gov (United States)

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
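
    A minimal sketch of the core differential-expression step of such a pipeline: per-feature two-group tests followed by Benjamini-Hochberg correction. The expression matrix is simulated; real input would come from the preprocessing steps described above, and dedicated tools such as limma or DESeq2 implement more sophisticated models.

```python
# Minimal sketch: per-feature t-tests with FDR control.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
expr = rng.normal(size=(1000, 10))      # 1000 features x 10 samples
expr[:50, 5:] += 2.0                    # spike in 50 true differences
group = np.array([0] * 5 + [1] * 5)

pvals = np.array([stats.ttest_ind(row[group == 0], row[group == 1]).pvalue
                  for row in expr])
rejected, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{rejected.sum()} features significant at FDR 5%")
```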

  35. Improvements and impacts of GRCh38 human reference on high throughput sequencing data analysis.

    Science.gov (United States)

    Guo, Yan; Dai, Yulin; Yu, Hui; Zhao, Shilin; Samuels, David C; Shyr, Yu

    2017-03-01

    Analysis of high throughput sequencing data starts with alignment against a reference genome, which is the foundation for all re-sequencing data analyses. Each new release of the human reference genome has been augmented with improved accuracy and completeness. It is presumed that the latest release, GRCh38, will contribute more to high throughput sequencing data analysis by providing more accuracy, but the amount of improvement had not yet been quantified. We conducted a study to compare the genomic analysis results between the GRCh38 reference and its predecessor GRCh37. Through analyses of alignment, single nucleotide polymorphisms, small insertions/deletions, copy number and structural variants, we show that GRCh38 offers overall more accurate analysis of human sequencing data. More importantly, GRCh38 produced fewer false positive structural variants. In conclusion, GRCh38 is an improvement over GRCh37 not only from the genome assembly aspect, but also because it yields more reliable genomic analysis results.

  36. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce large amounts of data, typically presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step toward extracting biologically relevant information. Filtering can be improved by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application intended to bring proteomics closer to big-data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool allows researchers, regardless of programming experience, to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  37. High-throughput STR analysis for DNA database using direct PCR.

    Science.gov (United States)

    Sim, Jeong Eun; Park, Su Jeong; Lee, Han Chul; Kim, Se-Yong; Kim, Jong Yeol; Lee, Seung Hwan

    2013-07-01

    Since the Korean criminal DNA database was launched in 2010, we have focused on establishing an automated DNA database profiling system that analyzes short tandem repeat loci in a high-throughput and cost-effective manner. We established a DNA database profiling system without DNA purification, using a direct PCR buffer system. The quality of the direct PCR procedure was compared with that of the conventional PCR system under their respective optimized conditions. The results revealed not only perfect concordance but also an excellent PCR success rate, good electropherogram quality, and an optimal intra-/inter-locus peak height ratio. In particular, the proportion of samples requiring DNA extraction due to direct PCR failure could be minimized to <3%. In conclusion, the newly developed direct PCR system can be adopted for automated DNA database profiling to replace or supplement the conventional PCR system in a time- and cost-saving manner.

  38. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    Science.gov (United States)

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, thus the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.
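
    The shrinkage mechanism, and how historical data can supply the prior, can be sketched with a normal-normal conjugate update; the numbers are illustrative and this is not the adaptiveHM model itself.

```python
# Minimal sketch: precision-weighted shrinkage toward a historical prior.
import numpy as np

historical = np.array([4.8, 5.1, 5.0, 4.9, 5.2])   # many past measurements
prior_mean, prior_var = historical.mean(), historical.var(ddof=1)

obs = np.array([7.0, 6.4, 6.8])                    # few current observations
obs_var = 1.0                                      # assumed noise variance
n = len(obs)

# Normal-normal conjugate update: weights are inverse variances.
w_obs = n / obs_var
w_prior = 1 / prior_var
posterior_mean = (w_obs * obs.mean() + w_prior * prior_mean) / (w_obs + w_prior)
print(f"observed {obs.mean():.2f} shrunk toward {prior_mean:.2f} "
      f"-> {posterior_mean:.2f}")
```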

  39. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have been developed as well. The focus of this mini review is on recent advancements in the analysis of metabolomics data, especially those utilizing Gaussian graphical models and independent component analysis.
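
    A minimal sketch of the Gaussian-graphical-model step: partial correlations obtained from the inverse covariance (precision) matrix of a metabolite data matrix, where large entries suggest direct metabolite-metabolite associations. The data are simulated.

```python
# Minimal sketch: partial correlations from the precision matrix.
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(size=(200, 5))        # 200 samples x 5 metabolites
data[:, 1] += 0.8 * data[:, 0]          # induce one direct association

precision = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

# Large |partial correlation| suggests a direct edge in the GGM.
print(np.round(partial_corr, 2))
```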

  40. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

    The human genome is dynamic in structure, complicating researchers' attempts at fully understanding it. Time-series "fluorescent in situ hybridization" (FISH) imaging has increased our ability to observe genome structure, but due to cell-type and experimental variability these data are often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined, with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. HTSstation: a web application and open-access libraries for high-throughput sequencing data analysis.

    Science.gov (United States)

    David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of high-throughput sequencing, including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. In addition, our programming framework enables developers to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.

  2. High-throughput gender identification of penguin species using melting curve analysis.

    Science.gov (United States)

    Tseng, Chao-Neng; Chang, Yung-Ting; Chiu, Hui-Tzu; Chou, Yii-Cheng; Huang, Hurng-Wern; Cheng, Chien-Chung; Liao, Ming-Hui; Chang, Hsueh-Wei

    2014-04-03

    Most penguin species are sexually monomorphic, so it is difficult to identify their genders visually for monitoring population stability in terms of sex-ratio analysis. In this study, we evaluated the suitability of melting curve analysis (MCA) for high-throughput gender identification of penguins. Preliminary tests indicated that Griffiths's P2/P8 primers were not suitable for MCA. Based on sequence alignment of the chromo-helicase-DNA binding protein (CHD)-W and CHD-Z genes from four species of penguins (Pygoscelis papua, Aptenodytes patagonicus, Spheniscus magellanicus, and Eudyptes chrysocome), we redesigned forward primers for the CHD-W/CHD-Z-common region (PGU-ZW2) and the CHD-W-specific region (PGU-W2) to be used in combination with the reverse Griffiths's P2 primer. When tested with P. papua samples, PCR using the P2/PGU-ZW2 and P2/PGU-W2 primer sets generated two amplicons of 148 and 356 bp, respectively, which were easily resolved in 1.5% agarose gels. MCA indicated that the melting temperature (Tm) values for P2/PGU-ZW2 and P2/PGU-W2 amplicons of P. papua samples were 79.75°C-80.5°C and 81.0°C-81.5°C, respectively. Females displayed both ZW-common and W-specific Tm peaks, whereas males were positive only for the ZW-common peak. Taken together, our redesigned primers coupled with MCA allow precise, high-throughput gender identification for P. papua, and potentially for other penguin species such as A. patagonicus, S. magellanicus, and E. chrysocome as well.

  3. Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules.

    Directory of Open Access Journals (Sweden)

    Antonino Ingargiola

    Full Text Available We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions.

  4. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    Science.gov (United States)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
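    As a sketch of the skimming step described above, a PySpark job can reduce a large volume of raw monitoring records to a small, structured aggregate suitable for modelling. The paths, column names and JSON layout below are hypothetical, not the actual CERN schema:

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical skimming job: keep one data source, drop empty readings,
# and aggregate to hourly averages per host.
spark = SparkSession.builder.appName("metric-skim").getOrCreate()

metrics = spark.read.json("hdfs:///monitoring/raw/2017/*")  # assumed layout
relevant = (
    metrics
    .filter(F.col("service") == "eos")        # restrict to one service
    .filter(F.col("value") > 0)               # discard empty readings
    .groupBy("host", F.window("timestamp", "1 hour"))
    .agg(F.avg("value").alias("hourly_avg"))
)
relevant.write.parquet("hdfs:///monitoring/skimmed/eos_hourly")
```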

  5. Data reduction for a high-throughput neutron activation analysis system

    International Nuclear Information System (INIS)

    Bowman, W.W.

    1979-01-01

    To analyze samples collected as part of a geochemical survey for the National Uranium Resource Evaluation program, Savannah River Laboratory has installed a high-throughput neutron activation analysis system. As part of that system, computer programs have been developed to reduce raw data to elemental concentrations in two steps. Program RAGS reduces gamma-ray spectra to lists of photopeak energies, peak areas, and statistical errors. Program RICHES determines the elemental concentrations from photopeak and delayed-neutron data, detector efficiencies, analysis parameters (neutron flux and activation, decay, and counting times), and spectrometric and cross-section data from libraries. Both programs have been streamlined for on-line operation with a minicomputer, each requiring approx. 64 kbytes of core.

  6. HDAT: web-based high-throughput screening data analysis tools

    International Nuclear Information System (INIS)

    Liu, Rong; Hassan, Taimur; Rallo, Robert; Cohen, Yoram

    2013-01-01

    The increasing utilization of high-throughput screening (HTS) in toxicity studies of engineered nanomaterials (ENMs) requires tools for rapid and reliable processing and analysis of large HTS datasets. In order to meet this need, a web-based platform for HTS data analysis tools (HDAT) was developed that provides statistical methods suitable for ENM toxicity data. As a publicly available computational nanoinformatics infrastructure, HDAT provides different plate normalization methods, various HTS summarization statistics, self-organizing map (SOM)-based clustering analysis, and visualization of raw and processed data using both heat maps and SOMs. HDAT has been successfully used in a number of HTS studies of ENM toxicity, thereby enabling analysis of toxicity mechanisms and development of structure–activity relationships for ENM toxicity. The online approach afforded by HDAT should encourage standardization of and future advances in HTS as well as facilitate convenient inter-laboratory comparisons of HTS datasets.
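    One of the simpler building blocks mentioned, plate normalization, can be sketched as a robust z-score of raw well readouts against plate-wide statistics; the readout values below are invented:

```python
import numpy as np

def plate_zscore(plate):
    """Robust z-score normalization of one plate.

    Uses the plate median and MAD (scaled by 1.4826 for consistency with
    the standard deviation under normality) so outlier wells do not
    distort the normalization.
    """
    med = np.median(plate)
    mad = np.median(np.abs(plate - med)) * 1.4826
    return (plate - med) / mad

raw = np.array([[980.0, 1010.0, 450.0],
                [995.0, 1005.0, 1020.0],
                [1002.0, 470.0, 990.0]])
print(plate_zscore(raw).round(2))   # the two low wells stand out clearly
```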

  7. In-field High Throughput Phenotyping and Cotton Plant Growth Analysis Using LiDAR.

    Science.gov (United States)

    Sun, Shangpeng; Li, Changying; Paterson, Andrew H; Jiang, Yu; Xu, Rui; Robertson, Jon S; Snider, John L; Chee, Peng W

    2018-01-01

    Plant breeding programs and a wide range of plant science applications would greatly benefit from the development of in-field high-throughput phenotyping technologies. In this study, a terrestrial LiDAR-based high-throughput phenotyping system was developed. A 2D LiDAR was used to scan plants from overhead in the field, and an RTK-GPS was used to provide spatial coordinates. Precise 3D models of scanned plants were reconstructed from the LiDAR and RTK-GPS data. The ground plane of the 3D model was separated by the RANSAC algorithm, and a Euclidean clustering algorithm was applied to remove noise generated by weeds. After that, clean 3D surface models of cotton plants were obtained, from which three plot-level morphologic traits, including canopy height, projected canopy area, and plant volume, were derived. Canopy heights ranging from the 85th percentile to the maximum height were computed based on the histogram of the z coordinate of all measured points; projected canopy area was derived by projecting all points onto the ground plane; and a trapezoidal-rule-based algorithm was proposed to estimate plant volume. Results of validation experiments showed good agreement between LiDAR measurements and manual measurements for maximum canopy height, projected canopy area, and plant volume, with R² values of 0.97, 0.97, and 0.98, respectively. The developed system was used to scan the whole field repeatedly over the period from 43 to 109 days after planting. Growth trends and growth-rate curves for all three derived morphologic traits were established over the monitoring period for each cultivar. Overall, four different cultivars showed similar growth trends and growth-rate patterns. Each cultivar continued to grow until ~88 days after planting and from then on varied little. However, the actual values were cultivar-specific. Correlation analysis between morphologic traits and final yield was conducted over the monitoring period. When considering each cultivar individually
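    The three trait definitions translate directly into point-cloud arithmetic. A minimal numpy sketch, assuming a cleaned (x, y, z) point cloud in metres; the grid cell size and the 20-slice discretization are illustrative choices, not the published parameters:

```python
import numpy as np

def plot_traits(points, cell=0.01):
    """Plot-level traits from a cleaned 3-D point cloud."""
    z = points[:, 2]
    height_85th = np.percentile(z, 85)          # robust canopy height
    # Projected canopy area: count occupied grid cells on the ground plane.
    cells = {tuple(c) for c in np.floor(points[:, :2] / cell).astype(int)}
    area = len(cells) * cell ** 2
    # Trapezoidal-rule volume: integrate horizontal slice areas over height.
    edges = np.linspace(z.min(), z.max(), 21)
    mids, slice_areas = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sl = points[(z >= lo) & (z < hi), :2]
        occupied = {tuple(c) for c in np.floor(sl / cell).astype(int)}
        slice_areas.append(len(occupied) * cell ** 2)
        mids.append((lo + hi) / 2)
    volume = np.trapz(slice_areas, mids)
    return height_85th, area, volume

rng = np.random.default_rng(1)
pts = rng.uniform([0.0, 0.0, 0.0], [1.0, 1.0, 0.8], size=(5000, 3))
print(plot_traits(pts))
```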

  8. WormScan: a technique for high-throughput phenotypic analysis of Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mark D Mathew

    Full Text Available BACKGROUND: There are four main phenotypes that are assessed in whole-organism studies of Caenorhabditis elegans: mortality, movement, fecundity and size. Procedures have been developed that focus on the digital analysis of some, but not all, of these phenotypes and may be limited by expense and throughput. We have developed WormScan, an automated image acquisition system that allows quantitative analysis of each of these four phenotypes on standard NGM plates seeded with E. coli. This system is very easy to implement and has the capacity to be used in high-throughput analysis. METHODOLOGY/PRINCIPAL FINDINGS: Our system employs a readily available consumer-grade flatbed scanner. The method uses light stimulus from the scanner rather than physical stimulus to induce movement. With two sequential scans it is possible to quantify the induced phototactic response. To demonstrate the utility of the method, we measured the phenotypic response of C. elegans to phosphine gas exposure. We found that stimulation of movement by the light of the scanner was equivalent to physical stimulation for the determination of mortality. WormScan also provided a quantitative assessment of health for the survivors. Habituation from light stimulation of continuous scans was similar to habituation caused by physical stimulus. CONCLUSIONS/SIGNIFICANCE: There are existing systems for the automated phenotypic data collection of C. elegans. The specific advantages of our method over existing systems are high-throughput assessment of a greater range of phenotypic endpoints, including determination of mortality and quantification of the mobility of survivors. Our system is also inexpensive and very easy to implement. Even though we have focused on demonstrating the usefulness of WormScan in toxicology, it can be used in a wide range of additional C. elegans studies, including lifespan determination, development, pathology and behavior. Moreover, we have even adapted the

  9. High-throughput analysis of amino acids in plant materials by single quadrupole mass spectrometry

    DEFF Research Database (Denmark)

    Dahl-Lassen, Rasmus; van Hecke, Jan Julien Josef; Jørgensen, Henning

    2018-01-01

    …that it is very time consuming, with typical chromatographic run times of 70 min or more. Results: We have here developed a high-throughput method for analysis of amino acid profiles in plant materials. The method combines classical protein hydrolysis and derivatization with fast separation by UHPLC and detection by a single quadrupole (QDa) mass spectrometer. The chromatographic run time is reduced to 10 min and the precision, accuracy and sensitivity of the method are in line with other recent methods utilizing advanced and more expensive mass spectrometers. The sensitivity of the method is at least a factor 10 … reducing the overall analytical costs compared to methods based on more advanced mass spectrometers.

  10. GUItars: a GUI tool for analysis of high-throughput RNA interference screening data.

    Directory of Open Access Journals (Sweden)

    Asli N Goktug

    Full Text Available High-throughput RNA interference (RNAi) screening has become a widely used approach to elucidating gene functions. However, analysis and annotation of the large data sets generated from these screens have been a challenge for researchers without a programming background. Over the years, numerous data analysis methods were produced for plate quality control and hit selection and implemented by a few open-access software packages. Recently, the strictly standardized mean difference (SSMD) has become a widely used method for RNAi screening analysis, mainly due to its better control of false-negative and false-positive rates and its ability to quantify RNAi effects on a statistical basis. We have developed GUItars to enable researchers without a programming background to use SSMD as both a plate-quality and a hit-selection metric to analyze large data sets. The software is accompanied by an intuitive graphical user interface for an easy and rapid analysis workflow. SSMD analysis methods are provided to the users along with the traditionally used z-score, normalized percent activity, and t-test methods for hit selection. GUItars is capable of analyzing large-scale data sets from screens with or without replicates. The software is designed to automatically generate and save numerous graphical outputs known to be among the most informative high-throughput data visualization tools, capturing plate-wise and screen-wise performances. Graphical outputs are also written in HTML format for easy access, and a comprehensive summary of screening results is written into tab-delimited output files. With GUItars, we demonstrated a robust SSMD-based analysis workflow on a 3840-gene small interfering RNA (siRNA) library and identified 200 siRNAs that increased and 150 siRNAs that decreased the assay activities with moderate to stronger effects. GUItars enables rapid analysis and illustration of data from large- or small-scale RNAi screens using SSMD and other traditional analysis
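    For reference, the SSMD statistic at the heart of GUItars can be computed in a few lines; the well readouts below are invented, and independence between sample and control wells is assumed:

```python
import numpy as np

def ssmd(sample, control):
    """Strictly standardized mean difference between siRNA sample wells
    and negative-control wells: (mean difference) / (std of difference)."""
    return (np.mean(sample) - np.mean(control)) / np.sqrt(
        np.var(sample, ddof=1) + np.var(control, ddof=1))

control = np.array([100.0, 98.0, 103.0, 99.0])
hit = np.array([62.0, 58.0, 65.0])
print(round(ssmd(hit, control), 2))   # strongly negative => inhibition hit
```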

  11. Green Infrastructure Siting and Cost Effectiveness Analysis

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Parcel scale green infrastructure siting and cost effectiveness analysis. You can find more details at the project's website.

  12. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analysis…

  13. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection

    DEFF Research Database (Denmark)

    Ernstsen, Christina Lundgaard; Login, Frédéric H.; Jensen, Helene Halkjær

    2017-01-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy-method was established to quantify the number and size of intracellular bacteria...

  14. Cost benefit analysis / cost effectiveness analysis

    International Nuclear Information System (INIS)

    Lombard, J.

    1986-09-01

    The purpose of the ALARA procedure is to compare various protection options in order to determine the best compromise between the cost of protection and the residual risk. Decision-aiding techniques are valuable as an aid to such selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant to the large share of ALARA decisions that call for a quantitative approach. The study is based on a hypothetical case of 10 protection options, to which four methods are applied.
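    A minimal sketch of cost-effectiveness analysis in this setting: rank protection options by incremental cost per unit of collective dose averted (cost-benefit analysis would go further and monetize the dose with a reference value per person-sievert). The options and numbers below are purely hypothetical:

```python
# Hypothetical protection options: cost (k$) and residual collective
# dose (person-Sv). Option "A" is the do-nothing reference.
options = {"A": (0, 4.0), "B": (50, 2.5), "C": (120, 1.8), "D": (400, 1.6)}

base_cost, base_dose = options["A"]
for name, (cost, dose) in sorted(options.items()):
    if name == "A":
        continue
    ratio = (cost - base_cost) / (base_dose - dose)
    print(f"option {name}: {ratio:.0f} k$ per person-Sv averted")
```

    The option with the lowest ratio is the most cost-effective; whether it is worth adopting at all is the cost-benefit question.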

  15. Commentary: Roles for Pathologists in a High-throughput Image Analysis Team.

    Science.gov (United States)

    Aeffner, Famke; Wilson, Kristin; Bolon, Brad; Kanaly, Suzanne; Mahrt, Charles R; Rudmann, Dan; Charles, Elaine; Young, G David

    2016-08-01

    Historically, pathologists perform manual evaluation of H&E- or immunohistochemically stained slides, which can be subjective, inconsistent, and, at best, semiquantitative. As the complexity of staining and the demand for increased precision of evaluation grow, the pathologist's assessment will come to include automated analyses (i.e., "digital pathology") that increase the accuracy, efficiency, and speed of diagnosis and hypothesis testing and serve as an important biomedical research and diagnostic tool. This commentary introduces the many roles for pathologists in designing and conducting high-throughput digital image analysis. Pathology review is central to the entire course of a digital pathology study, including experimental design, sample quality verification, specimen annotation, analytical algorithm development, and report preparation. The pathologist performs these roles by reviewing work undertaken by technicians and scientists with training and expertise in image analysis instruments and software. These roles require regular, face-to-face interactions between team members and the lead pathologist. Traditional pathology training is suitable preparation for entry-level participation on image analysis teams. The future of pathology is very exciting: the expanding utilization of digital image analysis is set to broaden pathology roles in research and drug development, with increasing and new career opportunities for pathologists. © 2016 by The Author(s) 2016.

  16. Integrative Analysis of High-throughput Cancer Studies with Contrasted Penalization

    Science.gov (United States)

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Shia, BenChang; Ma, Shuangge

    2015-01-01

    In cancer studies with high-throughput genetic and genomic measurements, integrative analysis provides a way to effectively pool and analyze heterogeneous raw data from multiple independent studies, and it outperforms "classic" meta-analysis and single-dataset analysis. When marker selection is of interest, the genetic basis of multiple datasets can be described using the homogeneity model or the heterogeneity model. In this study, we consider marker selection under the heterogeneity model, which includes the homogeneity model as a special case and can be more flexible. Penalization methods have been developed in the literature for marker selection. This study advances beyond the published ones by introducing contrast penalties, which can accommodate the within- and across-dataset structures of covariates/regression coefficients and, by doing so, further improve marker selection performance. Specifically, we develop a penalization method that accommodates the across-dataset structures by smoothing over regression coefficients. An effective iterative algorithm, which calls an inner coordinate descent iteration, is developed. Simulation shows that the proposed method outperforms the benchmark with more accurate marker identification. The analysis of breast cancer and lung cancer prognosis studies with gene expression measurements shows that the proposed method identifies genes different from those found using the benchmark and has better prediction performance. PMID:24395534

  17. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    Science.gov (United States)

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies, in which the organisms are often tiny and background molecular information is often non-existent. We used a new high-throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable, even though it requires minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 ± 6.0% between clades and of 21.0 ± 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.
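    The clustering step can be reproduced in outline with scipy: score each marker as present/absent per individual, compute pairwise Jaccard distances, and cluster. The toy profiles below (8 markers, 6 individuals) stand in for the published 49-marker data:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Rows: individual beetles; columns: presence/absence of RAF markers.
profiles = np.array([
    [1, 1, 0, 0, 1, 0, 1, 0],   # strain 1
    [1, 1, 0, 1, 1, 0, 1, 0],   # strain 1
    [0, 0, 1, 1, 0, 1, 0, 1],   # strain 2
    [0, 1, 1, 1, 0, 1, 0, 1],   # strain 2
    [1, 0, 1, 0, 0, 0, 1, 1],   # strain 3
    [1, 0, 1, 0, 1, 0, 1, 1],   # strain 3
])
dist = pdist(profiles, metric="jaccard")   # fraction of discordant markers
tree = linkage(dist, method="average")      # UPGMA clustering
print(fcluster(tree, t=3, criterion="maxclust"))  # three clades recovered
```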

  18. High-Throughput Quantitative Proteomic Analysis of Dengue Virus Type 2 Infected A549 Cells

    Science.gov (United States)

    Chiu, Han-Chen; Hannemann, Holger; Heesom, Kate J.; Matthews, David A.; Davidson, Andrew D.

    2014-01-01

    Disease caused by dengue virus is a global health concern with up to 390 million individuals infected annually worldwide. There are no vaccines or antiviral compounds available to either prevent or treat dengue disease which may be fatal. To increase our understanding of the interaction of dengue virus with the host cell, we analyzed changes in the proteome of human A549 cells in response to dengue virus type 2 infection using stable isotope labelling in cell culture (SILAC) in combination with high-throughput mass spectrometry (MS). Mock and infected A549 cells were fractionated into nuclear and cytoplasmic extracts before analysis to identify proteins that redistribute between cellular compartments during infection and reduce the complexity of the analysis. We identified and quantified 3098 and 2115 proteins in the cytoplasmic and nuclear fractions respectively. Proteins that showed a significant alteration in amount during infection were examined using gene enrichment, pathway and network analysis tools. The analyses revealed that dengue virus infection modulated the amounts of proteins involved in the interferon and unfolded protein responses, lipid metabolism and the cell cycle. The SILAC-MS results were validated for a select number of proteins over a time course of infection by Western blotting and immunofluorescence microscopy. Our study demonstrates for the first time the power of SILAC-MS for identifying and quantifying novel changes in cellular protein amounts in response to dengue virus infection. PMID:24671231

  19. High-throughput quantitative proteomic analysis of dengue virus type 2 infected A549 cells.

    Directory of Open Access Journals (Sweden)

    Han-Chen Chiu

    Full Text Available Disease caused by dengue virus is a global health concern with up to 390 million individuals infected annually worldwide. There are no vaccines or antiviral compounds available to either prevent or treat dengue disease which may be fatal. To increase our understanding of the interaction of dengue virus with the host cell, we analyzed changes in the proteome of human A549 cells in response to dengue virus type 2 infection using stable isotope labelling in cell culture (SILAC) in combination with high-throughput mass spectrometry (MS). Mock and infected A549 cells were fractionated into nuclear and cytoplasmic extracts before analysis to identify proteins that redistribute between cellular compartments during infection and reduce the complexity of the analysis. We identified and quantified 3098 and 2115 proteins in the cytoplasmic and nuclear fractions respectively. Proteins that showed a significant alteration in amount during infection were examined using gene enrichment, pathway and network analysis tools. The analyses revealed that dengue virus infection modulated the amounts of proteins involved in the interferon and unfolded protein responses, lipid metabolism and the cell cycle. The SILAC-MS results were validated for a select number of proteins over a time course of infection by Western blotting and immunofluorescence microscopy. Our study demonstrates for the first time the power of SILAC-MS for identifying and quantifying novel changes in cellular protein amounts in response to dengue virus infection.

  20. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis.

    Science.gov (United States)

    Du, Yushen; Wu, Nicholas C; Jiang, Lin; Zhang, Tianhao; Gong, Danyang; Shu, Sara; Wu, Ting-Ting; Sun, Ren

    2016-11-01

    Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available. To fully comprehend the diverse functions of a protein, it is essential to understand the functionality of individual residues. Current methods are highly dependent on evolutionary sequence conservation, which is

  1. A high throughput mass spectrometry screening analysis based on two-dimensional carbon microfiber fractionation system.

    Science.gov (United States)

    Ma, Biao; Zou, Yilin; Xie, Xuan; Zhao, Jinhua; Piao, Xiangfan; Piao, Jingyi; Yao, Zhongping; Quinto, Maurizio; Wang, Gang; Li, Donghao

    2017-06-09

    A novel high-throughput, solvent-saving and versatile integrated two-dimensional microscale carbon fiber/active carbon fiber system (2DμCFs), which allows a simple and rapid separation of compounds into low-polarity, medium-polarity and high-polarity fractions, has been coupled with ambient ionization-mass spectrometry (ESI-Q-TOF-MS and ESI-QqQ-MS) for screening and quantitative analyses of real samples. 2DμCFs led to a substantial interference reduction and minimization of ionization-suppression effects, thus increasing the sensitivity and the screening capabilities of the subsequent MS analysis. The method was applied to the analysis of Schisandra chinensis extracts, obtaining with a single injection a simultaneous determination of 33 compounds of different polarities, such as organic acids, lignans, and flavonoids, in less than 7 min, at low pressures and using small solvent amounts. The method was also validated using 10 model compounds, giving limits of detection (LODs) ranging from 0.3 to 30 ng mL⁻¹, satisfactory recoveries (from 75.8 to 93.2%) and reproducibilities (relative standard deviations, RSDs, from 1.40 to 8.06%). Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data.

    Science.gov (United States)

    Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo

    2011-12-15

    High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With the increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, with either punctuated or broad signals, CLIP-Seq and RNA-Seq. We demonstrate the effectiveness of Pyicos in selecting significant signals and show that its accuracy is comparable, and sometimes superior, to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS data types through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Open-source software, with tutorials and protocol files, is available at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy. Contact: eduardo.eyras@upf.edu. Supplementary data are available at Bioinformatics online.

  3. High-throughput peptide mass fingerprinting and protein macroarray analysis using chemical printing strategies

    International Nuclear Information System (INIS)

    Sloane, A.J.; Duff, J.L.; Hopwood, F.G.; Wilson, N.L.; Smith, P.E.; Hill, C.J.; Packer, N.H.; Williams, K.L.; Gooley, A.A.; Cole, R.A.; Cooley, P.W.; Wallace, D.B.

    2001-01-01

    We describe a 'chemical printer' that uses piezoelectric pulsing for rapid and accurate microdispensing of picolitre volumes of fluid for proteomic analysis of 'protein macroarrays'. Unlike positive-transfer and pin-transfer systems, our printer dispenses fluid in a non-contact process that ensures that the fluid source cannot be contaminated by substrate during a printing event. We demonstrate automated delivery of enzyme and matrix solutions for on-membrane protein digestion and subsequent peptide mass fingerprinting (pmf) analysis directly from the membrane surface using matrix-assisted laser-desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS). This approach bypasses the more commonly used multi-step procedures, thereby permitting a more rapid procedure for protein identification. We also highlight the advantage of printing different chemistries onto an individual protein spot for multiple microscale analyses. This ability is particularly useful when detailed characterisation of rare and valuable samples is required. Using a combination of PNGase F and trypsin, we have mapped sites of N-glycosylation using on-membrane digestion strategies. We also demonstrate the ability to print multiple serum samples in a micro-ELISA format and rapidly screen a protein macroarray of human blood plasma for pathogen-derived antigens. We anticipate that the 'chemical printer' will be a major component of proteomic platforms for high-throughput protein identification and characterisation, with widespread applications in biomedical and diagnostic discovery.

  4. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays and writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992
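    One common way to partition a spatial index, consistent in spirit with (though not necessarily identical to) the design described, is a Morton (Z-order) key that interleaves coordinate bits so nearby voxels map to nearby key ranges; range-partitioning the key then spreads spatially coherent cuboids across cluster nodes. A sketch:

```python
def morton3(x, y, z, bits=10):
    """Interleave the bits of (x, y, z) into one Morton (Z-order) key.

    Spatial locality in (x, y, z) becomes locality in the 1-D key, so a
    simple range partition of keys yields a spatial partition of voxels.
    """
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

print(morton3(3, 5, 1), morton3(4, 5, 1))  # neighbors get nearby keys
```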

  5. MultiSense: A Multimodal Sensor Tool Enabling the High-Throughput Analysis of Respiration.

    Science.gov (United States)

    Keil, Peter; Liebsch, Gregor; Borisjuk, Ljudmilla; Rolletschek, Hardy

    2017-01-01

    The high-throughput analysis of respiratory activity has become an important component of many biological investigations. Here, a technological platform, denoted the "MultiSense tool," is described. The tool enables the parallel monitoring of respiration in 100 samples over an extended time period by dynamically tracking the concentrations of oxygen (O2) and/or carbon dioxide (CO2) and/or pH within an airtight vial. Its flexible design supports the quantification of respiration based on either oxygen consumption or carbon dioxide release, thereby allowing for the determination of the physiologically significant respiratory quotient (the ratio between the quantity of CO2 released and the quantity of O2 consumed). It requires an LED light source to be mounted above the sample, together with a CCD camera system adjusted to enable the capture of analyte-specific wavelengths, and fluorescent sensor spots inserted into the sample vial. Here, a demonstration is given of the use of the MultiSense tool to quantify respiration in imbibing plant seeds, for which an appropriate step-by-step protocol is provided. The technology can be easily adapted for a wide range of applications, including the monitoring of gas exchange in any kind of liquid culture system (algae, embryo and tissue culture, cell suspensions, microbial cultures).

  6. Galaxy Workflows for Web-based Bioinformatics Analysis of Aptamer High-throughput Sequencing Data

    Directory of Open Access Journals (Sweden)

    William H Thiel

    2016-01-01

    Full Text Available Development of RNA and DNA aptamers for diagnostic and therapeutic applications is a rapidly growing field. Aptamers are identified through iterative rounds of selection in a process termed SELEX (Systematic Evolution of Ligands by EXponential enrichment). High-throughput sequencing (HTS) revolutionized the modern SELEX process by identifying millions of aptamer sequences across multiple rounds of aptamer selection. However, these vast aptamer HTS datasets necessitated dedicated bioinformatics techniques. Herein, we describe a semiautomated approach to analyzing aptamer HTS datasets using the Galaxy Project, a web-based open-source collection of bioinformatics tools that were originally developed to analyze genome, exome, and transcriptome HTS data. Using a series of Workflows created on the Galaxy webserver, we demonstrate efficient processing of aptamer HTS data and compilation of a database of unique aptamer sequences. Additional Workflows were created to characterize the abundance and persistence of aptamer sequences within a selection and to filter sequences based on these parameters. A key advantage of this approach is that the online nature of the Galaxy webserver and its graphical interface allow for the analysis of HTS data without the need to compile code or install multiple programs.
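    The abundance and persistence computations can be sketched with the standard library alone; the per-round read lists below are invented stand-ins for real SELEX HTS data:

```python
from collections import Counter

# Hypothetical reads from three selection rounds of a SELEX experiment.
rounds = {
    3: ["ACGTAC", "ACGTAC", "GGCATT", "TTAGCC"],
    6: ["ACGTAC", "ACGTAC", "ACGTAC", "GGCATT"],
    9: ["ACGTAC", "ACGTAC", "ACGTAC", "ACGTAC"],
}

counts = {r: Counter(seqs) for r, seqs in rounds.items()}
all_seqs = set().union(*counts.values())
for seq in sorted(all_seqs):
    # Abundance: within-round frequency; persistence: rounds observed in.
    abundance = {r: counts[r][seq] / sum(counts[r].values()) for r in counts}
    persistence = sum(counts[r][seq] > 0 for r in counts)
    print(seq, abundance, f"persists in {persistence}/{len(counts)} rounds")
```

    A sequence whose abundance rises across rounds while persisting in all of them (here ACGTAC) is the classic signature of an enriched aptamer.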

  7. A Data Analysis Pipeline Accounting for Artifacts in Tox21 Quantitative High-Throughput Screening Assays.

    Science.gov (United States)

    Hsieh, Jui-Hua; Sedykh, Alexander; Huang, Ruili; Xia, Menghang; Tice, Raymond R

    2015-08-01

    A main goal of the U.S. Tox21 program is to profile a 10K-compound library for activity against a panel of stress-related and nuclear receptor signaling pathway assays using a quantitative high-throughput screening (qHTS) approach. However, assay artifacts, including nonreproducible signals and assay interference (e.g., autofluorescence), complicate compound activity interpretation. To address these issues, we have developed a data analysis pipeline that includes an updated signal noise-filtering/curation protocol and an assay interference flagging system. To better characterize various types of signals, we adopted a weighted version of the area under the curve (wAUC) to quantify the amount of activity across the tested concentration range, in combination with the assay-dependent point-of-departure (POD) concentration. Based on the 32 Tox21 qHTS assays analyzed, we demonstrate that signal profiling using wAUC affords the best reproducibility (Pearson's r = 0.91), in comparison with the POD alone (0.82) or the AC50 (i.e., the half-maximal activity concentration; 0.81). Among the activity artifacts characterized, cytotoxicity is the major confounding factor; on average, about 8% of Tox21 compounds are affected, whereas autofluorescence affects less than 0.5%. To facilitate data evaluation, we implemented two graphical user interface applications, allowing users to rapidly evaluate the in vitro activity of Tox21 compounds. © 2015 Society for Laboratory Automation and Screening.
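    The wAUC idea, area under the concentration-response curve with low-confidence regions down-weighted, can be sketched as follows. The step weighting used here (zero weight below the POD) is an illustrative choice, not the exact Tox21 definition, and the data points are invented:

```python
import numpy as np

def weighted_auc(conc, resp, pod):
    """Area under |response| vs. log10(concentration), giving zero
    weight to readings below the point of departure (POD)."""
    log_c = np.log10(conc)
    w = (log_c >= np.log10(pod)).astype(float)
    return np.trapz(np.abs(resp) * w, log_c)

conc = np.array([0.001, 0.01, 0.1, 1.0, 10.0, 100.0])  # uM, illustrative
resp = np.array([0.0, 1.0, 3.0, 25.0, 60.0, 80.0])     # % activity
print(round(weighted_auc(conc, resp, pod=0.1), 1))
```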

  8. Digital Biomass Accumulation Using High-Throughput Plant Phenotype Data Analysis.

    Science.gov (United States)

    Rahaman, Md Matiur; Ahsan, Md Asif; Gillani, Zeeshan; Chen, Ming

    2017-09-01

    Biomass is an important phenotypic trait in functional ecology and growth analysis. The typical methods for measuring biomass are destructive, and they require numerous individuals to be cultivated for repeated measurements. With the advent of image-based high-throughput plant phenotyping facilities, non-destructive biomass measuring methods have attempted to overcome this problem. Thus, the estimation of the biomass of individual plants from their digital images is becoming more important. In this paper, we propose an approach to biomass estimation based on image-derived phenotypic traits. Several image-based biomass studies treat plant biomass simply as a linear function of the projected plant area in images. However, we modeled plant volume as a function of plant area, plant compactness, and plant age to generalize the linear biomass model. The obtained results confirm the proposed model and can explain most of the observed variance during image-derived biomass estimation. Moreover, only a small difference was observed between actual and estimated digital biomass, which indicates that our proposed approach can be used to estimate digital biomass accurately.

  9. Combinatorial approach toward high-throughput analysis of direct methanol fuel cells.

    Science.gov (United States)

    Jiang, Rongzhong; Rong, Charles; Chu, Deryn

    2005-01-01

    A 40-member array of direct methanol fuel cells (with stationary fuel and convective air supplies) was generated by electrically connecting the fuel cells in series. High-throughput analysis of these fuel cells was realized by fast screening of the voltage between the two terminals of each fuel cell at constant-current discharge. A large number of voltage-current curves (200) were obtained by screening the voltages through multiple small current steps. A Gaussian distribution was used to statistically analyze the large number of experimental data. The standard deviation (σ) of the voltages of these fuel cells increased linearly with discharge current. The voltage-current curves at various fuel concentrations were simulated with an empirical equation of voltage versus current and a linear equation of σ versus current. The simulated voltage-current curves fitted the experimental data well. With increasing methanol concentration from 0.5 to 4.0 M, the Tafel slope of the voltage-current curves (at σ = 0.0) changed from 28 to 91 mV dec⁻¹, the cell resistance from 2.91 to 0.18 Ω, and the power output from 3 to 18 mW cm⁻².
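    The empirical voltage-current model described can be fit by ordinary least squares, since V = E0 - b·log10(I) - R·I is linear in its parameters (b is the Tafel slope, R the cell resistance). The data points below are made up for the demo:

```python
import numpy as np

current = np.array([0.01, 0.02, 0.05, 0.1, 0.2, 0.4])     # A cm^-2, invented
voltage = np.array([0.52, 0.50, 0.47, 0.44, 0.40, 0.33])  # V, invented

# Design matrix for V = e0 - b*log10(I) - R*I.
A = np.column_stack([np.ones_like(current), -np.log10(current), -current])
(e0, tafel, resistance), *_ = np.linalg.lstsq(A, voltage, rcond=None)
print(f"Tafel slope {tafel * 1000:.0f} mV/dec, R {resistance:.2f} ohm cm^2")
```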

  10. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience.

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays and writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.

  11. High-throughput genetic analysis in a cohort of patients with Ocular Developmental Anomalies

    Directory of Open Access Journals (Sweden)

    Suganya Kandeeban

    2017-10-01

    Full Text Available Anophthalmia and microphthalmia (A/M) are developmental ocular malformations in which the eye fails to form or is smaller than normal, with both genetic and environmental etiology. Microphthalmia is often associated with additional ocular anomalies, most commonly coloboma or cataract [1, 2]. A/M has a combined incidence of 1-3.2 cases per 10,000 live births in Caucasians [3, 4]. The spectrum of genetic abnormalities (chromosomal and molecular) associated with these ocular developmental defects is being investigated in the current study. A detailed pedigree analysis and ophthalmic examination were documented for the enrolled patients, followed by blood collection and DNA extraction. The strategies for genetic analysis included chromosomal analysis by conventional and array-based (Affymetrix CytoScan HD array) methods, targeted re-sequencing of candidate genes, and whole exome sequencing (WES) on an Illumina HiSeq 2500. WES was done in families excluded for mutations in candidate genes. Twenty-four samples (microphthalmia, 5; anophthalmia, 7; coloboma, 2; microphthalmia and anophthalmia, 1; microphthalmia with coloboma or other ocular features, 9) were initially analyzed using conventional Giemsa-trypsin-Giemsa banding, of which 4 samples revealed gross chromosomal aberrations (deletions in the 3q26.3-q28, 11p13 (n = 2) and 11q23 regions). Targeted re-sequencing of candidate genes showed mutations in the CHX10, PAX6, FOXE3, ABCB6 and SHH genes in 6 samples. High-throughput array-based chromosomal analysis revealed aberrations in 4 samples (17q21 duplication, n = 2; 8p11 deletion, n = 2). Overall, genetic alterations in known candidate genes were seen in 50% of the study subjects. Whole exome sequencing was performed in samples that were excluded for mutations in candidate genes, and the results are discussed.

  12. GiA Roots: software for the high throughput analysis of plant root system architecture

    Science.gov (United States)

    2012-01-01

    Background: Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. Results: We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically for the high-throughput analysis of root system images. GiA Roots includes user-assisted algorithms to distinguish root from background and a fully automated pipeline that extracts dozens of root system phenotypes. Quantitative information on each phenotype, along with intermediate steps for full reproducibility, is returned to the end-user for downstream analysis. GiA Roots has a GUI front end and a command-line interface for interweaving the software into large-scale workflows. GiA Roots can also be extended to estimate novel phenotypes specified by the end-user. Conclusions: We demonstrate the use of GiA Roots on a set of 2393 images of rice roots representing 12 genotypes from the species Oryza sativa. We validate trait measurements against prior analyses of this image set that demonstrated that RSA traits are likely heritable and associated with genotypic differences. Moreover, we demonstrate that GiA Roots is extensible and an end-user can add functionality so that GiA Roots can estimate novel RSA traits. In summary, we show that the software can function as an efficient tool as part of a workflow to move from large numbers of root images to downstream analysis. PMID:22834569

  13. Cyber-T web server: differential analysis of high-throughput data.

    Science.gov (United States)

    Kayala, Matthew A; Baldi, Pierre

    2012-07-01

    The Bayesian regularization method for high-throughput differential analysis, described in Baldi and Long (A Bayesian framework for the analysis of microarray expression data: regularized t-test and statistical inferences of gene changes. Bioinformatics 2001;17:509-519) and implemented in the Cyber-T web server, is one of the most widely validated. Cyber-T implements a t-test using a Bayesian framework to compute a regularized variance of the measurements associated with each probe under each condition. This regularized estimate is derived by flexibly combining the empirical measurements with a prior, or background, derived from pooling measurements associated with probes in the same neighborhood. This approach flexibly addresses problems associated with low replication levels and technology biases, not only for DNA microarrays but also for other technologies, such as protein arrays, quantitative mass spectrometry and next-generation sequencing (RNA-seq). Here we present an update to the Cyber-T web server, incorporating several useful new additions and improvements. Several preprocessing data normalization options, including logarithmic and variance-stabilizing normalization (VSN) transforms, are included. To augment two-sample t-tests, a one-way analysis of variance is implemented. Several methods for multiple-test correction, including standard frequentist methods and a probabilistic mixture model treatment, are available. Diagnostic plots allow visual assessment of the results. The web server provides comprehensive documentation and example data sets. The Cyber-T web server, with R source code and data sets, is publicly available at http://cybert.ics.uci.edu/.
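    The regularized variance at the core of Cyber-T blends each probe's sample variance with a background variance pooled from probes of similar intensity. A simplified sketch following the Baldi-Long formula; the degrees-of-freedom handling in the real server is more elaborate, and the numbers below are invented:

```python
import numpy as np
from scipy import stats

def regularized_ttest(a, b, bg_var_a, bg_var_b, nu0=10):
    """Regularized t-test in the spirit of Baldi and Long (2001).

    bg_var_*: background variance pooled from neighboring probes;
    nu0 sets how many pseudo-observations the prior is worth.
    """
    def reg_var(x, bg_var):
        n, s2 = len(x), np.var(x, ddof=1)
        return (nu0 * bg_var + (n - 1) * s2) / (nu0 + n - 2)

    va, vb = reg_var(a, bg_var_a), reg_var(b, bg_var_b)
    t = (np.mean(a) - np.mean(b)) / np.sqrt(va / len(a) + vb / len(b))
    p = 2 * stats.t.sf(abs(t), df=len(a) + len(b) - 2)
    return t, p

# Two replicates per condition: a plain t-test is unstable here, but the
# pooled background variance stabilizes the denominator.
print(regularized_ttest([8.1, 8.3], [6.9, 7.0], bg_var_a=0.05, bg_var_b=0.05))
```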

  14. WormSizer: high-throughput analysis of nematode size and shape.

    Directory of Open Access Journals (Sweden)

    Brad T Moore

    Full Text Available The fundamental phenotypes of growth rate, size and morphology are the result of complex interactions between genotype and environment. We developed a high-throughput software application, WormSizer, which computes size and shape of nematodes from brightfield images. Existing methods for estimating volume either coarsely model the nematode as a cylinder or assume the worm shape or opacity is invariant. Our estimate is more robust to changes in morphology or optical density, as it only assumes radial symmetry. This open-source software is written as a plugin for the well-known image-processing framework Fiji/ImageJ. It may therefore be extended easily. We evaluated the technical performance of this framework, and we used it to analyze growth and shape of several canonical Caenorhabditis elegans mutants in a developmental time series. We confirm quantitatively that a Dumpy (Dpy) mutant is short and fat and that a Long (Lon) mutant is long and thin. We show that daf-2 insulin-like receptor mutants are larger than wild-type upon hatching but grow slowly, and WormSizer can distinguish dauer larvae from normal larvae. We also show that a Small (Sma) mutant is actually smaller than wild-type at all stages of larval development. WormSizer works with Uncoordinated (Unc) and Roller (Rol) mutants as well, indicating that it can be used with mutants despite behavioral phenotypes. We used our complete data set to perform a power analysis, giving users a sense of how many images are needed to detect different effect sizes. Our analysis confirms and extends existing phenotypic characterization of well-characterized mutants, demonstrating the utility and robustness of WormSizer.
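    The volume estimate under radial symmetry amounts to stacking circular cross-sections along the worm's midline. A sketch of the geometric idea, not the plugin's actual code; the half-width profile below is invented:

```python
import numpy as np

def worm_volume(radii, spacing):
    """Volume of a radially symmetric body from half-widths measured at
    evenly spaced midline positions: integrate pi * r(s)^2 along s.

    radii: half-width (um) at each midline position.
    spacing: distance between positions (um).
    """
    return np.trapz(np.pi * np.asarray(radii, dtype=float) ** 2, dx=spacing)

# Crude adult-like profile: thin head, wide middle, tapered tail.
radii = [5, 15, 25, 30, 32, 30, 25, 15, 5]
print(f"{worm_volume(radii, spacing=120):.0f} cubic microns")
```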

  15. Ontology-based meta-analysis of global collections of high-throughput public data.

    Directory of Open Access Journals (Sweden)

    Ilya Kupershmidt

    2010-09-01

    Full Text Available The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.

  16. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis

    Directory of Open Access Journals (Sweden)

    Yushen Du

    2016-11-01

    Full Text Available Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available.
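    The fitness measurement underlying such profiles reduces to an enrichment ratio between sequencing counts before and after selection. A minimal sketch with invented counts; the published pipeline additionally corrects for sequencing error and predicted protein stability:

```python
import numpy as np

def relative_fitness(count_pre, count_post, wt_pre, wt_post):
    """Log2 enrichment of a mutant relative to wild type between the
    input library (pre-selection) and the post-selection pool."""
    freq_pre = count_pre / wt_pre
    freq_post = count_post / wt_post
    return np.log2(freq_post / freq_pre)

# A lethal mutation collapses after selection; a neutral one tracks WT.
print(relative_fitness(500, 4, 100_000, 90_000))    # strongly negative
print(relative_fitness(500, 470, 100_000, 90_000))  # near zero
```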

  17. ZebraZoom: an automated program for high-throughput behavioral analysis and categorization

    Science.gov (United States)

    Mirat, Olivier; Sternberg, Jenna R.; Severi, Kristen E.; Wyart, Claire

    2013-01-01

    The zebrafish larva stands out as an emergent model organism for translational studies involving gene or drug screening thanks to its size, genetics, and permeability. At the larval stage, locomotion occurs in short episodes punctuated by periods of rest. Although phenotyping behavior is a key component of large-scale screens, it has not yet been automated in this model system. We developed ZebraZoom, a program to automatically track larvae and identify maneuvers for many animals performing discrete movements. Our program detects each episodic movement and extracts large-scale statistics on motor patterns to produce a quantification of the locomotor repertoire. We used ZebraZoom to identify motor defects induced by a glycinergic receptor antagonist. The analysis of the blind mutant atoh7 revealed small locomotor defects associated with the mutation. Using multiclass supervised machine learning, ZebraZoom categorized all episodes of movement for each larva into one of three possible maneuvers: slow forward swim, routine turn, and escape. ZebraZoom reached 91% accuracy for categorization of stereotypical maneuvers that four independent experimenters unanimously identified. For all maneuvers in the data set, ZebraZoom agreed with four experimenters in 73.2–82.5% of cases. We modeled the series of maneuvers performed by larvae as Markov chains and observed that larvae often repeated the same maneuvers within a group. When analyzing subsequent maneuvers performed by different larvae, we found that larva–larva interactions occurred as series of escapes. Overall, ZebraZoom reached the level of precision found in manual analysis but accomplished tasks in a high-throughput format necessary for large screens. PMID:23781175
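    The Markov-chain modelling of maneuver sequences can be sketched by estimating a row-normalized transition matrix from categorized episodes; the example sequence below is invented:

```python
import numpy as np

MANEUVERS = ["slow_forward", "routine_turn", "escape"]

def transition_matrix(seq):
    """First-order Markov chain estimated from one larva's sequence of
    categorized maneuvers: counts of consecutive pairs, row-normalized."""
    idx = {m: i for i, m in enumerate(MANEUVERS)}
    counts = np.zeros((len(MANEUVERS), len(MANEUVERS)))
    for a, b in zip(seq, seq[1:]):
        counts[idx[a], idx[b]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

seq = ["slow_forward", "slow_forward", "routine_turn",
       "routine_turn", "escape", "slow_forward", "slow_forward"]
print(transition_matrix(seq).round(2))  # strong diagonal: repeated maneuvers
```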

  18. Ontology-based meta-analysis of global collections of high-throughput public data.

    Science.gov (United States)

    Kupershmidt, Ilya; Su, Qiaojuan Jane; Grewal, Anoop; Sundaresh, Suman; Halperin, Inbal; Flynn, James; Shekar, Mamatha; Wang, Helen; Park, Jenny; Cui, Wenwu; Wall, Gregory D; Wisotzkey, Robert; Alag, Satnam; Akhtari, Saeid; Ronaghi, Mostafa

    2010-09-29

    The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.
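
    The rank-based enrichment statistics mentioned above can be sketched with a simple rank-sum z-score: under the null, the ranks of a gene set are a random sample without replacement from the ranked dataset. This is a generic illustration of the idea, not the framework's exact statistic.

```python
def rank_enrichment_z(ranked_genes, gene_set):
    """Z-score for whether gene_set members sit unusually high in a list
    ordered from most up-regulated (rank 1) downward."""
    n = len(ranked_genes)
    ranks = [i + 1 for i, g in enumerate(ranked_genes) if g in gene_set]
    m = len(ranks)
    if m == 0:
        raise ValueError("no gene-set members found in the ranked list")
    mean = m * (n + 1) / 2                    # null mean of the rank sum
    var = m * (n + 1) * (n - m) / 12          # sampling without replacement
    return (mean - sum(ranks)) / var ** 0.5   # positive = enriched at the top

ranked = ["g%d" % i for i in range(1, 101)]  # placeholder ranking
print(rank_enrichment_z(ranked, {"g1", "g3", "g7", "g10"}))  # ~3.2
```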

  19. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  20. Improvement in the sensitivity of newborn screening for Fabry disease among females through the use of a high-throughput and cost-effective method, DNA mass spectrometry.

    Science.gov (United States)

    Lu, Yung-Hsiu; Huang, Po-Hsun; Wang, Li-Yun; Hsu, Ting-Rong; Li, Hsing-Yuan; Lee, Pi-Chang; Hsieh, Yu-Ping; Hung, Sheng-Che; Wang, Yu-Chen; Chang, Sheng-Kai; Lee, Ya-Ting; Ho, Ping-Hsun; Ho, Hui-Chen; Niu, Dau-Ming

    2018-01-01

    Many female carriers of Fabry disease are likely to suffer severe morbidity and mortality. However, by our own estimation, around 80% of female newborns are missed by our current enzyme-based screening approach. Our team's aim was to develop an improved, cost-effective screening method able to detect Fabry disease among female newborns. In Taiwan, based on a database of 916,000 newborns, ~98% of Fabry patients carry one of only 21 pathogenic mutations. An Agena iPLEX panel was designed to detect these 21 pathogenic mutations in a single assay. A total of 54,791 female infants were screened; 136 female newborns with the IVS4 + 919G > A mutation and one female newborn with the c.656T > C mutation were identified. Benchmarked against this result, the current enzyme-based newborn screening approach misses around 83% of affected female newborns. Through a family study of the IVS4 female newborns, 30 IVS4 adult family members were found to have left ventricular hypertrophy. Ten patients underwent endomyocardial biopsy, and all were found to have significant globotriaosylceramide (Gb3) accumulation in their cardiomyocytes. All of these individuals now receive enzyme replacement therapy. We have demonstrated that the Agena iPLEX assay is a powerful tool for detecting females with Fabry disease. Furthermore, through this screening, we have also been able to identify many disease-onset adult family members who had previously gone undiagnosed. This screening helps them to receive treatment before severe and irreversible cardiac damage occurs.

  1. High-throughput and automated diagnosis of antimicrobial resistance using a cost-effective cellphone-based micro-plate reader

    Science.gov (United States)

    Feng, Steve; Tseng, Derek; di Carlo, Dino; Garner, Omai B.; Ozcan, Aydogan

    2016-12-01

    Routine antimicrobial susceptibility testing (AST) can prevent deaths due to bacterial infections and reduce the spread of multi-drug resistance, but it cannot be regularly performed in resource-limited settings due to technological challenges, high costs, and a lack of trained professionals. We demonstrate an automated and cost-effective cellphone-based 96-well microtiter-plate (MTP) reader, capable of performing AST without the need for trained diagnosticians. Our system includes a 3D-printed smartphone attachment that holds and illuminates the MTP using a light-emitting diode array. An inexpensive optical fiber array enables the capture of the transmitted light of each well through the smartphone camera. A custom-designed application sends the captured image to a server to automatically determine well turbidity, with results returned to the smartphone in ~1 minute. We tested this mobile reader using MTPs prepared with 17 antibiotics targeting Gram-negative bacteria on clinical isolates of Klebsiella pneumoniae with highly resistant antimicrobial profiles. Using 78 patient isolate test plates, we demonstrated that our mobile reader meets the FDA-defined AST criteria, with a well-turbidity detection accuracy of 98.21%, a minimum-inhibitory-concentration accuracy of 95.12%, and a drug-susceptibility interpretation accuracy of 99.23%, with no very major errors. This mobile reader could eliminate the need for trained diagnosticians to perform AST, reduce the cost barrier for routine testing, and assist in the spatio-temporal tracking of bacterial resistance.
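
    The turbidity-to-susceptibility logic can be made concrete with a small sketch: call each well turbid or clear from its normalized transmitted intensity, then read the minimum inhibitory concentration (MIC) off a dilution row. The threshold and numbers are illustrative, not the authors' trained classifier.

```python
def is_turbid(mean_intensity, threshold=0.55):
    """Bacterial growth scatters light, so a growing well transmits less;
    intensities are assumed normalized against a blank well."""
    return mean_intensity < threshold

def read_mic(dilution_row):
    """dilution_row: (concentration in ug/ml, mean intensity) pairs."""
    for conc, intensity in sorted(dilution_row):
        if not is_turbid(intensity):
            return conc               # lowest concentration inhibiting growth
    return None                       # resistant across the tested range

row = [(0.5, 0.31), (1.0, 0.38), (2.0, 0.61), (4.0, 0.78)]
print(read_mic(row))                  # -> 2.0
```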

  2. Development of high-throughput analysis system using highly-functional organic polymer monoliths

    International Nuclear Information System (INIS)

    Umemura, Tomonari; Kojima, Norihisa; Ueki, Yuji

    2008-01-01

    The growing demand for high-throughput analysis in the current competitive life sciences and industries has promoted the development of high-speed HPLC techniques and tools. As one such tool, monolithic columns have attracted increasing attention and interest in the last decade due to their low flow resistance and excellent mass transfer, allowing for rapid separations and reactions at high flow rates with minimal loss of column efficiency. Monolithic materials are classified into two main groups: silica- and organic polymer-based monoliths, each with their own advantages and disadvantages. Organic polymer monoliths have several distinct advantages in life-science research, including wide pH stability, less irreversible adsorption, and facile preparation and modification. We have therefore been developing organic polymer monoliths for various chemical operations, such as separation, extraction, preconcentration, and reaction. In the present paper, recent progress in the development of organic polymer monoliths is discussed. In particular, the procedure for the preparation of methacrylate-based monoliths with various functional groups is described, and the influence of different compositional and processing parameters on the monolithic structure is addressed. Furthermore, the performance of the produced monoliths is demonstrated through the results for (1) rapid separations of alkylbenzenes at high flow rates, (2) flow-through enzymatic digestion of cytochrome c on a trypsin-immobilized monolithic column, and (3) separation of the tryptic digest on a reversed-phase monolithic column. The flexibility and versatility of organic polymer monoliths will be beneficial for further enhancing analytical performance, and will open the way for new applications and opportunities in both scientific and industrial research. (author)

  3. Identification of microRNAs from Eugenia uniflora by high-throughput sequencing and bioinformatics analysis.

    Science.gov (United States)

    Guzman, Frank; Almerão, Mauricio P; Körbes, Ana P; Loss-Morais, Guilherme; Margis, Rogerio

    2012-01-01

    microRNAs or miRNAs are small non-coding regulatory RNAs that play important functions in the regulation of gene expression at the post-transcriptional level by targeting mRNAs for degradation or inhibiting protein translation. Eugenia uniflora is a plant native to tropical America with pharmacological and ecological importance, and there have been no previous studies concerning its gene expression and regulation. To date, no miRNAs have been reported in Myrtaceae species. Small RNA and RNA-seq libraries were constructed to identify miRNAs and pre-miRNAs in Eugenia uniflora. Solexa technology was used to perform high-throughput sequencing of these libraries, and the data obtained were analyzed using bioinformatics tools. From 14,489,131 small RNA clean reads, we obtained 1,852,722 mature miRNA sequences representing 45 conserved families that have been identified in other plant species. Further analysis using contigs assembled from RNA-seq allowed the prediction of secondary structures of 25 known and 17 novel pre-miRNAs. The expression of twenty-seven identified miRNAs was also validated using RT-PCR assays. Potential targets were predicted for the most abundant mature miRNAs in the identified pre-miRNAs based on sequence homology. This study is the first large-scale identification of miRNAs and their potential targets from a species of the Myrtaceae family without genomic sequence resources. Our study provides more information about the evolutionary conservation of the regulatory network of miRNAs in plants and highlights species-specific miRNAs.
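
    The first step of such a small-RNA pipeline, collapsing reads to unique sequences and keeping the miRNA-sized fraction, is easy to sketch. The FASTQ layout and the 20-24 nt window below are illustrative assumptions.

```python
from collections import Counter

def collapse_small_rna(fastq_path, min_len=20, max_len=24):
    """Count unique miRNA-sized reads in a FASTQ file."""
    counts = Counter()
    with open(fastq_path) as fh:
        for i, line in enumerate(fh):
            if i % 4 == 1:                     # sequence line of each record
                seq = line.strip()
                if min_len <= len(seq) <= max_len:
                    counts[seq] += 1
    return counts.most_common()                # abundant candidates first
```

    The collapsed sequences would then be matched against known miRNA families and their genomic contexts folded (e.g., with a tool such as RNAfold) to check for hairpin precursors, as the abstract outlines.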

  4. High-throughput simultaneous analysis of RNA, protein, and lipid biomarkers in heterogeneous tissue samples.

    Science.gov (United States)

    Reiser, Vladimír; Smith, Ryan C; Xue, Jiyan; Kurtz, Marc M; Liu, Rong; Legrand, Cheryl; He, Xuanmin; Yu, Xiang; Wong, Peggy; Hinchcliffe, John S; Tanen, Michael R; Lazar, Gloria; Zieba, Renata; Ichetovkin, Marina; Chen, Zhu; O'Neill, Edward A; Tanaka, Wesley K; Marton, Matthew J; Liao, Jason; Morris, Mark; Hailman, Eric; Tokiwa, George Y; Plump, Andrew S

    2011-11-01

    With expanding biomarker discovery efforts and increasing costs of drug development, it is critical to maximize the value of mass-limited clinical samples. The main limitation of available methods is the inability to isolate and analyze, from a single sample, molecules requiring incompatible extraction methods. Thus, we developed a novel semiautomated method for tissue processing and tissue milling and division (TMAD). We used a SilverHawk atherectomy catheter to collect atherosclerotic plaques from patients requiring peripheral atherectomy. Tissue preservation by flash freezing was compared with immersion in RNAlater®, and tissue grinding by traditional mortar and pestle was compared with TMAD. Comparators were protein, RNA, and lipid yield and quality. Reproducibility of analyte yield from aliquots of the same tissue sample processed by TMAD was also measured. The quantity and quality of biomarkers extracted from tissue prepared by TMAD was at least as good as that extracted from tissue stored and prepared by traditional means. TMAD enabled parallel analysis of gene expression (quantitative reverse-transcription PCR, microarray), protein composition (ELISA), and lipid content (biochemical assay) from as little as 20 mg of tissue. The mean correlation was r = 0.97 in molecular composition (RNA, protein, or lipid) between aliquots of individual samples generated by TMAD. We also demonstrated that it is feasible to use TMAD in a large-scale clinical study setting. The TMAD methodology described here enables semiautomated, high-throughput sampling of small amounts of heterogeneous tissue specimens by multiple analytical techniques with generally improved quality of recovered biomolecules.
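
    The reproducibility figure quoted above is a plain correlation between aliquots; a minimal sketch of that check, with placeholder numbers, is:

```python
def pearson(x, y):
    """Pearson correlation between paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

aliquot1 = [3.2, 1.1, 0.7, 5.4, 2.2]   # analyte levels, aliquot A
aliquot2 = [3.0, 1.3, 0.8, 5.1, 2.4]   # same sample, aliquot B
print(round(pearson(aliquot1, aliquot2), 3))
```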

  5. Making choices in health: WHO guide to cost effectiveness analysis

    National Research Council Canada - National Science Library

    Tan Torres Edejer, Tessa

    2003-01-01

    Table of contents (excerpt): Part One: Methods for Generalized Cost-Effectiveness Analysis; 1. What is Generalized Cost-Effectiveness Analysis?; 2. Undertaking...

  6. Cost-effectiveness analysis and innovation.

    Science.gov (United States)

    Jena, Anupam B; Philipson, Tomas J

    2008-09-01

    While cost-effectiveness (CE) analysis has provided a guide to allocating often scarce resources spent on medical technologies, less emphasis has been placed on the effect of such criteria on the behavior of the innovators who make health care technologies available in the first place. A better understanding of the link between innovation and cost-effectiveness analysis is particularly important given the large role of technological change in the growth of health care spending and the growing interest in the explicit use of CE thresholds to guide technology adoption in several Westernized countries. We analyze CE analysis in a standard market context, and stress that a technology's cost-effectiveness is closely related to the consumer surplus it generates. Improved CE therefore often clashes with interventions to stimulate producer surplus, such as patents. We derive the inconsistency between technology adoption based on CE analysis and economic efficiency. Indeed, static efficiency, dynamic efficiency, and improved patient health may all be induced when the cost-effectiveness of the technology is at its worst level. As producer appropriation of the social surplus of an innovation is central to the dynamic efficiency that should guide CE adoption criteria, we exemplify how appropriation can be inferred from existing CE estimates. For an illustrative sample of technologies considered, we find that the median technology has an appropriation of about 15%. To the extent that such incentives are deemed either too low or too high compared to dynamically efficient levels, CE thresholds may be appropriately raised or lowered to improve dynamic efficiency.
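
    One stylized way to back appropriation out of a CE estimate, under strong simplifying assumptions that are mine rather than the authors' model: treat the incremental cost as producer revenue and price the health gain at the willingness-to-pay threshold, so the producer's share of social surplus is the ICER divided by lambda.

```python
def appropriation_share(delta_cost, delta_qaly, wtp_per_qaly=150_000):
    """Producer share of social surplus = ICER / lambda (stylized model)."""
    icer = delta_cost / delta_qaly
    return icer / wtp_per_qaly

# A therapy costing $22,500 more per QALY gained appropriates ~15% of the
# surplus at lambda = $150,000/QALY, matching the median share the abstract
# reports (all numbers illustrative).
print(f"{appropriation_share(22_500, 1.0):.0%}")
```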

  7. Fine grained compositional analysis of Port Everglades Inlet microbiome using high throughput DNA sequencing.

    Science.gov (United States)

    O'Connell, Lauren; Gao, Song; McCorquodale, Donald; Fleisher, Jay; Lopez, Jose V

    2018-01-01

    Similar to natural rivers, manmade inlets connect inland runoff to the ocean. Port Everglades Inlet (PEI) is a busy cargo and cruise ship port in South Florida, which can act as a source of pollution to surrounding beaches and offshore coral reefs. Understanding the composition and fluctuations of bacterioplankton communities ("microbiomes") in major port inlets is important due to potential impacts on surrounding environments. We hypothesized seasonal microbial fluctuations, which we profiled by high-throughput 16S rRNA amplicon sequencing and analysis. Surface water samples were collected every week for one year. A total of four samples per month, two from each sampling location, were used for statistical analysis, creating a higher sampling frequency and finer sampling scale than previous inlet microbiome studies. We observed significant differences in community alpha diversity between months and seasons. Analysis of composition of microbiomes (ANCOM) tests were run in QIIME 2 at genus-level taxonomic classification to determine which genera were differentially abundant between seasons and months. Beta diversity results yielded significant differences in PEI community composition with regard to month, season, water temperature, and salinity. Analysis of potentially pathogenic genera showed the presence of Staphylococcus and Streptococcus. However, statistical analysis indicated that these organisms were not present in significantly high abundances throughout the year or between seasons. Significant differences in alpha diversity were observed when comparing microbial communities with respect to time. This observation stems from the high community evenness and low community richness in August, indicating that only a few organisms dominated the community during this month. August had lower than average rainfall levels for a wet season, which may have contributed to less runoff and fewer bacterial groups introduced into the port surface waters. Bacterioplankton beta

  8. Fine grained compositional analysis of Port Everglades Inlet microbiome using high throughput DNA sequencing

    Directory of Open Access Journals (Sweden)

    Lauren O’Connell

    2018-05-01

    Full Text Available Background Similar to natural rivers, manmade inlets connect inland runoff to the ocean. Port Everglades Inlet (PEI) is a busy cargo and cruise ship port in South Florida, which can act as a source of pollution to surrounding beaches and offshore coral reefs. Understanding the composition and fluctuations of bacterioplankton communities (“microbiomes”) in major port inlets is important due to potential impacts on surrounding environments. We hypothesized seasonal microbial fluctuations, which we profiled by high-throughput 16S rRNA amplicon sequencing and analysis. Methods & Results Surface water samples were collected every week for one year. A total of four samples per month, two from each sampling location, were used for statistical analysis, creating a higher sampling frequency and finer sampling scale than previous inlet microbiome studies. We observed significant differences in community alpha diversity between months and seasons. Analysis of composition of microbiomes (ANCOM) tests were run in QIIME 2 at genus-level taxonomic classification to determine which genera were differentially abundant between seasons and months. Beta diversity results yielded significant differences in PEI community composition with regard to month, season, water temperature, and salinity. Analysis of potentially pathogenic genera showed the presence of Staphylococcus and Streptococcus. However, statistical analysis indicated that these organisms were not present in significantly high abundances throughout the year or between seasons. Discussion Significant differences in alpha diversity were observed when comparing microbial communities with respect to time. This observation stems from the high community evenness and low community richness in August, indicating that only a few organisms dominated the community during this month. August had lower than average rainfall levels for a wet season, which may have contributed to less runoff, and fewer bacterial groups
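
    The August pattern described in both versions of this record, high evenness with low richness, corresponds to standard alpha-diversity quantities that can be sketched directly (this is generic ecology code, not the study's QIIME 2 workflow):

```python
import math

def alpha_diversity(otu_counts):
    """Shannon index, observed richness, and Pielou's evenness."""
    total = sum(otu_counts)
    props = [c / total for c in otu_counts if c > 0]
    shannon = -sum(p * math.log(p) for p in props)
    richness = len(props)
    evenness = shannon / math.log(richness) if richness > 1 else 0.0
    return shannon, richness, evenness

# Few taxa, fairly evenly distributed: low richness but high evenness.
print(alpha_diversity([40, 35, 30, 2, 1]))
```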

  9. Micro-scaled high-throughput digestion of plant tissue samples for multi-elemental analysis

    Directory of Open Access Journals (Sweden)

    Husted Søren

    2009-09-01

    Full Text Available Abstract Background Quantitative multi-elemental analysis by inductively coupled plasma (ICP) spectrometry depends on a complete digestion of solid samples. However, fast and thorough sample digestion is a challenging analytical task which constitutes a bottleneck in modern multi-elemental analysis. Additional obstacles may be that sample quantities are limited and elemental concentrations low. In such cases, digestion in small volumes with minimum dilution and contamination is required in order to obtain high accuracy data. Results We have developed a micro-scaled microwave digestion procedure and optimized it for accurate elemental profiling of plant materials (1-20 mg dry weight). A commercially available 64-position rotor with 5 ml disposable glass vials, originally designed for microwave-based parallel organic synthesis, was used as a platform for the digestion. The novel micro-scaled method was successfully validated by the use of various certified reference materials (CRM) with matrices rich in starch, lipid or protein. When the micro-scaled digestion procedure was applied on single rice grains or small batches of Arabidopsis seeds (1 mg, corresponding to approximately 50 seeds), the obtained elemental profiles closely matched those obtained by conventional analysis using digestion in large volume vessels. Accumulated elemental contents derived from separate analyses of rice grain fractions (aleurone, embryo and endosperm) closely matched the total content obtained by analysis of the whole rice grain. Conclusion A high-throughput micro-scaled method has been developed which enables digestion of small quantities of plant samples for subsequent elemental profiling by ICP-spectrometry. The method constitutes a valuable tool for screening of mutants and transformants. In addition, the method facilitates studies of the distribution of essential trace elements between and within plant organs which is relevant for, e.g., breeding programmes aiming at

  10. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    Full Text Available BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis, without the limitations of sample space and reagents used. However, non-commercial and user-friendly software for the management and analysis of these data is not available. RESULTS: The recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is a freely available software that permits the automated analysis and visualization of high-throughput qPCR. A detailed manual and a demo-experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
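
    The standard-curve quantification that DAG Expression automates reduces to fitting Ct against log10 quantity and inverting the fit; the sketch below shows the arithmetic with made-up dilution data (it is not the program's own code).

```python
import math

def fit_standard_curve(dilutions, cts):
    """Least-squares fit of Ct = slope*log10(quantity) + intercept."""
    xs = [math.log10(d) for d in dilutions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    efficiency = 10 ** (-1 / slope) - 1        # ~1.0 for a perfect assay
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

slope, intercept, eff = fit_standard_curve([1, 0.1, 0.01, 0.001],
                                           [18.1, 21.4, 24.8, 28.2])
target = quantify(22.0, slope, intercept)
reference = quantify(19.5, slope, intercept)   # reference-gene normalization
print(round(eff, 2), round(target / reference, 3))
```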

  11. Gold-coated polydimethylsiloxane microwells for high-throughput electrochemiluminescence analysis of intracellular glucose at single cells.

    Science.gov (United States)

    Xia, Juan; Zhou, Junyu; Zhang, Ronggui; Jiang, Dechen; Jiang, Depeng

    2018-06-04

    In this communication, a gold-coated polydimethylsiloxane (PDMS) chip with cell-sized microwells was prepared through a stamping and spraying process and applied directly to high-throughput electrochemiluminescence (ECL) analysis of intracellular glucose at single cells. Compared with the previous multiple-step fabrication of photoresist-based microwells on the electrode, the preparation process is simple and offers a fresh electrode surface for higher luminescence intensity. Higher luminescence intensity was recorded from cell-retained microwells than from the planar regions between the microwells, and this intensity correlated with the content of intracellular glucose. The successful monitoring of intracellular glucose at single cells using this PDMS chip provides an alternative strategy for high-throughput single-cell analysis.

  12. A Cost-Effective High-Throughput Plasma and Serum Proteomics Workflow Enables Mapping of the Molecular Impact of Total Pancreatectomy with Islet Autotransplantation

    DEFF Research Database (Denmark)

    Bennike, Tue Bjerg; Bellin, Melena D.; Xuan, Yue

    2018-01-01

    Blood is an ideal body fluid for the discovery or monitoring of diagnostic and prognostic protein biomarkers. However, discovering robust biomarkers requires the analysis of large numbers of samples to appropriately represent interindividual variability. To address this analytical challenge, we es...

  13. High-Throughput Fabrication of Nanocone Substrates through Polymer Injection Moulding For SERS Analysis in Microfluidic Systems

    DEFF Research Database (Denmark)

    Viehrig, Marlitt; Matteucci, Marco; Thilsted, Anil H.

    analysis. Metal-capped silicon nanopillars, fabricated through a maskless ion etch, are state-of-the-art for on-chip SERS substrates. A dense cluster of high aspect ratio polymer nanocones was achieved by using high-throughput polymer injection moulding over a large area replicating a silicon nanopillar...... structure. Gold-capped polymer nanocones display similar SERS sensitivity as silicon nanopillars, while being easily integrable into a microfluidic chips....

  14. Cost-effectiveness Analysis for Technology Acquisition.

    Science.gov (United States)

    Chakravarty, A; Naware, S S

    2008-01-01

    In a developing country with limited resources, it is important to utilize the total cost visibility approach over the entire life cycle of a technology and then analyse alternative options for acquiring it. The present study analysed the cost-effectiveness of an "In-house" magnetic resonance imaging (MRI) scan facility of a large service hospital against outsourcing possibilities. Cost per unit scan was calculated by the operating costing method, and the break-even volume was calculated. Then life-cycle cost analysis was performed to enable total cost visibility of the MRI scan in both the "In-house" and "outsourcing of facility" configurations. Finally, cost-effectiveness analysis was performed to identify the more acceptable decision option. The total cost for performing a unit MRI scan was found to be Rs 3,875 for scans without contrast and Rs 4,129 with contrast. On life-cycle cost analysis, the net present value (NPV) of the "In-house" configuration was found to be Rs-(4,09,06,265) while that of the "outsourcing of facility" configuration was Rs-(5,70,23,315). Subsequently, cost-effectiveness analysis across eight Figures of Merit showed the "In-house" facility to be the more acceptable option for the system. Every decision for acquiring high-end technology must be subjected to life-cycle cost analysis.
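
    The life-cycle comparison behind the study's NPV figures follows the usual discounted cash-flow arithmetic; the sketch below uses invented placeholder numbers, not the study's data, though it qualitatively matches the study's finding that the in-house option came out less negative.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is year 0 (e.g., equipment capex)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

years, scans_per_year = 10, 3000
inhouse = [-20_000_000] + [-scans_per_year * 800] * years    # capex + opex
outsourced = [0] + [-scans_per_year * 2_500] * years         # fee per scan

print("in-house   NPV:", round(npv(0.08, inhouse)))
print("outsourced NPV:", round(npv(0.08, outsourced)))
# The less negative NPV marks the cheaper configuration over the life cycle.
```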

  15. Making choices in health: WHO guide to cost effectiveness analysis

    National Research Council Canada - National Science Library

    Tan Torres Edejer, Tessa

    2003-01-01

    Table of contents (excerpt): 6. Uncertainty in cost-effectiveness analysis; 7. …; 8. Policy uses of Generalized CEA...

  16. High-throughput analysis of endogenous fruit glycosyl hydrolases using a novel chromogenic hydrogel substrate assay

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Lausen, Thomas Frederik

    2017-01-01

    A broad range of enzyme activities can be found in a wide range of different fruits and fruiting bodies but there is a lack of methods where many samples can be handled in a high-throughput and efficient manner. In particular, plant polysaccharide degrading enzymes – glycosyl hydrolases (GHs) play...... led to a more profound understanding of the importance of GH activity and regulation, current methods for determining glycosyl hydrolase activity are lacking in throughput and fail to keep up with data output from transcriptome research. Here we present the use of a versatile, easy...

  17. Gene Expression Analysis of Escherichia Coli Grown in Miniaturized Bioreactor Platforms for High-Throughput Analysis of Growth and genomic Data

    DEFF Research Database (Denmark)

    Boccazzi, P.; Zanzotto, A.; Szita, Nicolas

    2005-01-01

    Combining high-throughput growth physiology and global gene expression data analysis is of significant value for integrating metabolism and genomics. We compared global gene expression using 500 ng of total RNA from Escherichia coli cultures grown in rich or defined minimal media in a miniaturize...... cultures using just 500 ng of total RNA indicate that high-throughput integration of growth physiology and genomics will be possible with novel biochemical platforms and improved detection technologies....

  18. High throughput proteomic analysis of the secretome in an explant model of articular cartilage inflammation

    Science.gov (United States)

    Clutterbuck, Abigail L.; Smith, Julia R.; Allaway, David; Harris, Pat; Liddell, Susan; Mobasheri, Ali

    2011-01-01

    This study employed a targeted high-throughput proteomic approach to identify the major proteins present in the secretome of articular cartilage. Explants from equine metacarpophalangeal joints were incubated alone or with interleukin-1beta (IL-1β, 10 ng/ml), with or without carprofen, a non-steroidal anti-inflammatory drug, for six days. After tryptic digestion of culture medium supernatants, resulting peptides were separated by HPLC and detected in a Bruker amaZon ion trap instrument. The five most abundant peptides in each MS scan were fragmented and the fragmentation patterns compared to mammalian entries in the Swiss-Prot database, using the Mascot search engine. Tryptic peptides originating from aggrecan core protein, cartilage oligomeric matrix protein (COMP), fibronectin, fibromodulin, thrombospondin-1 (TSP-1), clusterin (CLU), cartilage intermediate layer protein-1 (CILP-1), chondroadherin (CHAD) and matrix metalloproteinases MMP-1 and MMP-3 were detected. Quantitative western blotting confirmed the presence of CILP-1, CLU, MMP-1, MMP-3 and TSP-1. Treatment with IL-1β increased MMP-1, MMP-3 and TSP-1 and decreased the CLU precursor but did not affect CILP-1 and CLU levels. Many of the proteins identified have well-established extracellular matrix functions and are involved in early repair/stress responses in cartilage. This high throughput approach may be used to study the changes that occur in the early stages of osteoarthritis. PMID:21354348

  19. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    International Nuclear Information System (INIS)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert; Gagnon, David; Gjoerup, Ole; Archambault, Jacques; Bullock, Peter A.

    2014-01-01

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF-1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitors of JCV DNA replication.

  20. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    Directory of Open Access Journals (Sweden)

    Rok Gaber

    2013-11-01

    Full Text Available To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV-protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized-emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity.

  1. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    Science.gov (United States)

    Gaber, Rok; Majerle, Andreja; Jerala, Roman; Benčina, Mojca

    2013-01-01

    To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV-protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized-emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity. PMID:24287545
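
    The ratiometric readout both versions of this record describe can be sketched in a few lines: protease cleavage of the sensor abolishes FRET, so a low FRET/donor ratio marks a cell with active protease. The threshold and intensities below are illustrative only.

```python
def fraction_cleaved(cells, threshold=0.6):
    """cells: (FRET-channel intensity, mCerulean donor intensity) pairs;
    returns the fraction of cells whose sensor appears cleaved."""
    ratios = [f / d for f, d in cells if d > 0]
    return sum(r < threshold for r in ratios) / len(ratios)

cells = [(850, 1000), (420, 980), (510, 1100), (900, 950)]
print(f"{fraction_cleaved(cells):.0%} of cells show protease activity")
```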

  2. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States); Gagnon, David [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Gjoerup, Ole [Molecular Oncology Research Institute, Tufts Medical Center, Boston, MA 02111 (United States); Archambault, Jacques [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Bullock, Peter A., E-mail: Peter.Bullock@tufts.edu [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States)

    2014-11-15

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF-1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitors of JCV DNA replication.

  3. Cost-effectiveness Analysis with Influence Diagrams.

    Science.gov (United States)

    Arias, M; Díez, F J

    2015-01-01

    Cost-effectiveness analysis (CEA) is used increasingly in medicine to determine whether the health benefit of an intervention is worth the economic cost. Decision trees, the standard decision modeling technique for non-temporal domains, can only perform CEA for very small problems. Our objective was to develop a method for CEA in problems involving several dozen variables. We explain how to build influence diagrams (IDs) that explicitly represent cost and effectiveness. We propose an algorithm for evaluating cost-effectiveness IDs directly, i.e., without expanding an equivalent decision tree. The evaluation of an ID returns a set of intervals for the willingness to pay - separated by cost-effectiveness thresholds - and, for each interval, the cost, the effectiveness, and the optimal intervention. The algorithm that evaluates the ID directly is in general much more efficient than the brute-force method, which is in turn more efficient than the expansion of an equivalent decision tree. Using OpenMarkov, an open-source software tool that implements this algorithm, we have been able to perform CEAs on several IDs whose equivalent decision trees contain millions of branches. IDs can thus perform CEA on large problems that cannot be analyzed with decision trees.
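
    The output the abstract describes, willingness-to-pay intervals separated by thresholds with an optimal intervention in each, can be illustrated with net monetary benefit (NMB = WTP x effectiveness - cost) over a grid of WTP values. The interventions and numbers below are invented for the sketch.

```python
interventions = {                    # name: (cost, effectiveness in QALYs)
    "no treatment": (0, 5.0),
    "drug A":       (20_000, 5.8),
    "drug B":       (55_000, 6.1),
}

def optimal(wtp):
    """Intervention maximizing net monetary benefit at this WTP."""
    return max(interventions,
               key=lambda i: wtp * interventions[i][1] - interventions[i][0])

previous = None
for wtp in range(0, 200_001, 1_000):
    choice = optimal(wtp)
    if choice != previous:           # crossed a cost-effectiveness threshold
        print(f"WTP >= {wtp:>7}: {choice}")
        previous = choice
```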

  4. web cellHTS2: A web-application for the analysis of high-throughput screening data

    Directory of Open Access Journals (Sweden)

    Boutros Michael

    2010-04-01

    Full Text Available Abstract Background The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. Results The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. Conclusions The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
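
    A typical normalization offered by cellHTS2-style pipelines is plate-median centering with MAD scaling (robust z-scores), followed by hit ranking; the sketch below mimics that workflow generically rather than reproducing the package's internals.

```python
import statistics

def robust_z(plate_values):
    """Robust z-scores: center on the plate median, scale by 1.4826*MAD."""
    med = statistics.median(plate_values)
    mad = statistics.median(abs(v - med) for v in plate_values) * 1.4826
    return [(v - med) / mad for v in plate_values]

def ranked_hits(plates, labels, cutoff=-2.0):
    scores = [z for plate in plates for z in robust_z(plate)]
    hits = [(lbl, z) for lbl, z in zip(labels, scores) if z <= cutoff]
    return sorted(hits, key=lambda t: t[1])   # strongest suppressors first

plates = [[1.00, 0.90, 1.10, 0.20, 1.05, 0.95]]       # one plate, 6 wells
labels = ["w1", "w2", "w3", "w4", "w5", "w6"]
print(ranked_hits(plates, labels))                     # -> [('w4', ...)]
```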

  5. 3D-SURFER: software for high-throughput protein surface comparison and analysis.

    Science.gov (United States)

    La, David; Esquivel-Rodríguez, Juan; Venkatraman, Vishwesh; Li, Bin; Sael, Lee; Ueng, Stephen; Ahrendt, Steven; Kihara, Daisuke

    2009-11-01

    We present 3D-SURFER, a web-based tool designed to facilitate high-throughput comparison and characterization of proteins based on their surface shape. As each protein is effectively represented by a vector of 3D Zernike descriptors, comparison times for a query protein against the entire PDB take, on average, only a couple of seconds. The web interface has been designed to be as interactive as possible, with displays showing animated protein rotations, CATH codes, and structural alignments using the CE program. In addition, geometrically interesting local features of the protein surface, such as pockets that often correspond to ligand binding sites, as well as protrusions and flat regions, can also be identified and visualized. 3D-SURFER is a web application that can be freely accessed from: http://dragon.bio.purdue.edu/3d-surfer dkihara@purdue.edu Supplementary data are available at Bioinformatics online.
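
    Since each surface is reduced to a fixed-length descriptor vector, the comparison step is just a nearest-neighbor scan in descriptor space, which is why whole-PDB queries take only seconds. The sketch below is a generic illustration with toy vectors, not the server's code.

```python
def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def most_similar(query, database, k=3):
    """database: {pdb_id: descriptor vector}; returns the top-k matches."""
    return sorted(database, key=lambda pid: euclid(query, database[pid]))[:k]

db = {"1abc": [0.10, 0.80, 0.30],     # toy 3-component "descriptors";
      "2xyz": [0.90, 0.10, 0.40],     # real 3D Zernike vectors are longer
      "3pqr": [0.20, 0.70, 0.35]}
print(most_similar([0.15, 0.75, 0.30], db, k=2))   # -> ['1abc', '3pqr']
```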

  6. Cancer panomics: computational methods and infrastructure for integrative analysis of cancer high-throughput "omics" data

    DEFF Research Database (Denmark)

    Brunak, Søren; De La Vega, Francisco M.; Rätsch, Gunnar

    2014-01-01

    Targeted cancer treatment is becoming the goal of newly developed oncology medicines and has already shown promise in some spectacular cases such as the case of BRAF kinase inhibitors in BRAF-mutant (e.g. V600E) melanoma. These developments are driven by the advent of high-throughput sequencing......, which continues to drop in cost, and that has enabled the sequencing of the genome, transcriptome, and epigenome of the tumors of a large number of cancer patients in order to discover the molecular aberrations that drive the oncogenesis of several types of cancer. Applying these technologies...... in the clinic promises to transform cancer treatment by identifying therapeutic vulnerabilities of each patient's tumor. These approaches will need to address the panomics of cancer--the integration of the complex combination of patient-specific characteristics that drive the development of each person's tumor...

  7. High-throughput liquid chromatography for drug analysis in biological fluids: investigation of extraction column life.

    Science.gov (United States)

    Zeng, Wei; Fisher, Alison L; Musson, Donald G; Wang, Amy Qiu

    2004-07-05

    A novel method was developed and assessed to extend the lifetime of extraction columns in high-throughput liquid chromatography (HTLC) for the bioanalysis of human plasma samples. In this method, a 15% acetic acid solution and 90% THF were used as mobile phases to clean proteins from human plasma samples and residual lipids, respectively, from the extraction and analytical columns. The 15% acetic acid solution weakens the interactions between proteins and the stationary phase of the extraction column and increases protein solubility in the mobile phase. The 90% THF mobile phase prevents the accumulation of lipids and thus reduces potential damage to the columns. Using this novel method, the extraction column lifetime has been extended to about 2000 direct plasma injections; this is the first time that high-concentration acetic acid and THF have been used in HTLC for on-line cleanup and extraction column lifetime extension.

  8. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  9. The use of FTA cards for preserving unfixed cytological material for high-throughput molecular analysis.

    Science.gov (United States)

    Saieg, Mauro Ajaj; Geddie, William R; Boerner, Scott L; Liu, Ni; Tsao, Ming; Zhang, Tong; Kamel-Reid, Suzanne; da Cunha Santos, Gilda

    2012-06-25

    Novel high-throughput molecular technologies have made the collection and storage of cells and small tissue specimens a critical issue. The FTA card provides an alternative to cryopreservation for biobanking fresh unfixed cells. The current study compared the quality and integrity of the DNA obtained from 2 types of FTA cards (Classic and Elute) using 2 different extraction protocols ("Classic" and "Elute") and assessed the feasibility of performing multiplex mutational screening using fine-needle aspiration (FNA) biopsy samples. Residual material from 42 FNA biopsies was collected in the cards (21 Classic and 21 Elute cards). DNA was extracted using the Classic protocol for Classic cards and both protocols for Elute cards. Polymerase chain reaction for p53 (1.5 kilobase) and CARD11 (500 base pair) was performed to assess DNA integrity. Successful p53 amplification was achieved in 95.2% of the samples from the Classic cards and in 80.9% of the samples from the Elute cards using the Classic protocol and 28.5% using the Elute protocol (P = .001). All samples (both cards) could be amplified for CARD11. There was no significant difference in the DNA concentration or 260/280 purity ratio when the 2 types of cards were compared. Five samples were also successfully analyzed by multiplex MassARRAY spectrometry, with a mutation in KRAS found in 1 case. High molecular weight DNA was extracted from the cards in sufficient amounts and quality to perform high-throughput multiplex mutation assays. The results of the current study also suggest that FTA Classic cards preserve better DNA integrity for molecular applications compared with the FTA Elute cards. Copyright © 2012 American Cancer Society.

  10. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, helping avoid the data glut associated with rapid, high-throughput analysis.

  11. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, Jacqueline Alexandra [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, helping avoid the data glut associated with rapid, high-throughput analysis.

  12. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes.

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop

  13. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)—A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    Directory of Open Access Journals (Sweden)

    Karolina Chwialkowska

    2017-11-01

    Full Text Available Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation

  14. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)—A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop
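
    The per-site readout common to all three versions of this record can be sketched as follows. Conventional MSAP compares the isoschizomers HpaII (methylation-sensitive) and MspI at CCGG sites, and a sequencing-based variant can classify each site from read counts in the two digests. The enzyme pair is the conventional MSAP choice and the count thresholds are illustrative; the abstract itself does not spell out this exact scheme.

```python
def classify_ccgg(hpaii_reads, mspi_reads, min_reads=10):
    """Classify one CCGG site from read support in the two digests."""
    hp, ms = hpaii_reads >= min_reads, mspi_reads >= min_reads
    if hp and ms:
        return "unmethylated"
    if not hp and ms:
        return "internal CG methylation"     # MspI cuts, HpaII blocked
    if hp and not ms:
        return "hemimethylated external C"   # HpaII cuts, MspI blocked
    return "fully methylated or site absent"

sites = {"chr1H:10432": (2, 115), "chr1H:20977": (88, 92)}  # toy barley loci
for site, (hp, ms) in sites.items():
    print(site, "->", classify_ccgg(hp, ms))
```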

  15. elegantRingAnalysis: An Interface for High-Throughput Analysis of Storage Ring Lattices Using elegant

    CERN Document Server

    Borland, Michael

    2005-01-01

    The code {\\tt elegant} is widely used for simulation of linacs for drivers for free-electron lasers. Less well known is that elegant is also a very capable code for simulation of storage rings. In this paper, we show a newly-developed graphical user interface that allows the user to easily take advantage of these capabilities. The interface is designed for use on a Linux cluster, providing very high throughput. It can also be used on a single computer. Among the features it gives access to are basic calculations (Twiss parameters, radiation integrals), phase-space tracking, nonlinear dispersion, dynamic aperture (on- and off-momentum), frequency map analysis, and collective effects (IBS, bunch-lengthening). Using a cluster, it is easy to get highly detailed dynamic aperture and frequency map results in a surprisingly short time.

  16. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Directory of Open Access Journals (Sweden)

    Craig A Gedye

    Full Text Available Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are however currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell markers.

  17. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Science.gov (United States)

    Gedye, Craig A; Hussain, Ali; Paterson, Joshua; Smrke, Alannah; Saini, Harleen; Sirskyj, Danylo; Pereira, Keira; Lobo, Nazleen; Stewart, Jocelyn; Go, Christopher; Ho, Jenny; Medrano, Mauricio; Hyatt, Elzbieta; Yuan, Julie; Lauriault, Stevan; Meyer, Mona; Kondratyev, Maria; van den Beucken, Twan; Jewett, Michael; Dirks, Peter; Guidos, Cynthia J; Danska, Jayne; Wang, Jean; Wouters, Bradly; Neel, Benjamin; Rottapel, Robert; Ailles, Laurie E

    2014-01-01

    Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are however currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell markers.
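
    The unsupervised hierarchical clustering step described above can be outlined with standard scientific Python tooling. In the sketch below the antigen intensity matrix is simulated, and the choice of Ward linkage is an assumption rather than the authors' documented setting.

        # Sketch: cluster samples by their 363-antigen surface profiles.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(0)
        # 20 samples x 363 antigens of simulated log-transformed intensities
        profiles = rng.normal(size=(20, 363))
        profiles[10:] += 1.5  # pretend the last 10 samples are a distinct tissue type

        Z = linkage(profiles, method="ward")             # hierarchical clustering
        labels = fcluster(Z, t=2, criterion="maxclust")  # cut tree into 2 clusters
        print(labels)  # samples grouped by surface profile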

  18. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    Science.gov (United States)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
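
    The embedding step used to track cell-cycle progression can be sketched as follows; the 24-feature matrix is simulated, and the t-SNE parameters (e.g., perplexity) are assumptions, not the authors' settings.

        # Sketch: embed single-cell biophysical phenotypes with t-SNE.
        import numpy as np
        from sklearn.manifold import TSNE

        rng = np.random.default_rng(1)
        # 500 cells x 24 phenotypes (e.g., size, dry mass density, texture)
        phenotypes = rng.normal(size=(500, 24))

        embedding = TSNE(n_components=2, perplexity=30,
                         random_state=1).fit_transform(phenotypes)
        print(embedding.shape)  # (500, 2): a map for tracing cell-cycle progression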

  19. DAVID Knowledgebase: a gene-centered database integrating heterogeneous gene annotation resources to facilitate high-throughput gene functional analysis

    Directory of Open Access Journals (Sweden)

    Baseler Michael W

    2007-11-01

    Full Text Available Abstract Background Due to the complex and distributed nature of biological research, our current biological knowledge is spread over many redundant annotation databases maintained by many independent groups. Analysts usually need to visit many of these bioinformatics databases in order to integrate comprehensive annotation information for their genes, which becomes one of the bottlenecks, particularly for the analytic task associated with a large gene list. Thus, a highly centralized and ready-to-use gene-annotation knowledgebase is in demand for high throughput gene functional analysis. Description The DAVID Knowledgebase is built around the DAVID Gene Concept, a single-linkage method to agglomerate tens of millions of gene/protein identifiers from a variety of public genomic resources into DAVID gene clusters. The grouping of such identifiers improves the cross-reference capability, particularly across NCBI and UniProt systems, enabling more than 40 publicly available functional annotation sources to be comprehensively integrated and centralized by the DAVID gene clusters. The simple, pair-wise, text format files which make up the DAVID Knowledgebase are freely downloadable for various data analysis uses. In addition, a well-organized web interface allows users to query different types of heterogeneous annotations in a high-throughput manner. Conclusion The DAVID Knowledgebase is designed to facilitate high throughput gene functional analysis. For a given gene list, it not only provides quick access to a wide range of heterogeneous annotation data in a centralized location, but also enriches the level of biological information for an individual gene. Moreover, the entire DAVID Knowledgebase is freely downloadable or searchable at http://david.abcc.ncifcrf.gov/knowledgebase/.

  20. High Throughput Petrochronology and Sedimentary Provenance Analysis by Automated Phase Mapping and LAICPMS

    Science.gov (United States)

    Vermeesch, Pieter; Rittner, Martin; Petrou, Ethan; Omma, Jenny; Mattinson, Chris; Garzanti, Eduardo

    2017-11-01

    The first step in most geochronological studies is to extract dateable minerals from the host rock, which is time consuming, removes textural context, and increases the chance for sample cross contamination. We here present a new method to rapidly perform in situ analyses by coupling a fast scanning electron microscope (SEM) with Energy Dispersive X-ray Spectrometer (EDS) to a Laser Ablation Inductively Coupled Plasma Mass Spectrometer (LAICPMS) instrument. Given a polished hand specimen, a petrographic thin section, or a grain mount, Automated Phase Mapping (APM) by SEM/EDS produces chemical and mineralogical maps from which the X-Y coordinates of the datable minerals are extracted. These coordinates are subsequently passed on to the laser ablation system for isotopic analysis. We apply the APM + LAICPMS method to three igneous, metamorphic, and sedimentary case studies. In the first case study, a polished slab of granite from Guernsey was scanned for zircon, producing a 609 ± 8 Ma weighted mean age. The second case study investigates a paragneiss from an ultra high pressure terrane in the north Qaidam terrane (Qinghai, China). One hundred seven small (25 µm) metamorphic zircons were analyzed by LAICPMS to confirm a 419 ± 4 Ma age of peak metamorphism. The third and final case study uses APM + LAICPMS to generate a large provenance data set and trace the provenance of 25 modern sediments from Angola, documenting longshore drift of Orange River sediments over a distance of 1,500 km. These examples demonstrate that APM + LAICPMS is an efficient and cost effective way to improve the quantity and quality of geochronological data.
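
    Weighted mean ages such as the 609 ± 8 Ma result above conventionally use inverse-variance weighting, with the MSWD reported as a coherence check. A minimal sketch with hypothetical single-grain dates:

        # Sketch: inverse-variance weighted mean age and MSWD for zircon dates.
        import numpy as np

        ages = np.array([607.0, 612.0, 605.0, 611.0])  # Ma (hypothetical)
        sigmas = np.array([6.0, 7.0, 8.0, 6.5])        # 1-sigma uncertainties (Ma)

        w = 1.0 / sigmas**2
        mean = np.sum(w * ages) / np.sum(w)
        err = 1.0 / np.sqrt(np.sum(w))                 # 1-sigma error on the mean
        mswd = np.sum(w * (ages - mean)**2) / (len(ages) - 1)
        print(f"weighted mean = {mean:.1f} +/- {err:.1f} Ma, MSWD = {mswd:.2f}")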

  1. Quantitative digital image analysis of chromogenic assays for high throughput screening of alpha-amylase mutant libraries.

    Science.gov (United States)

    Shankar, Manoharan; Priyadharshini, Ramachandran; Gunasekaran, Paramasamy

    2009-08-01

    An image analysis-based method for high throughput screening of an alpha-amylase mutant library using chromogenic assays was developed. Assays were performed in microplates, and high-resolution images of the assay plates were read using the Virtual Microplate Reader (VMR) script to quantify the concentration of the chromogen. This method is fast and sensitive in quantifying 0.025-0.3 mg starch/ml as well as 0.05-0.75 mg glucose/ml. It was also an effective method for screening for improved alpha-amylase activity, with a coefficient of variation of 18%.
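
    Quantifying a chromogen from plate images reduces to reading unknowns off a standard curve. The sketch below fits a linear calibration over the reported starch range; the intensity values are invented for illustration, and the assay's true response need not be linear.

        # Sketch: linear standard curve for a chromogenic assay read from images.
        import numpy as np

        conc = np.array([0.025, 0.05, 0.1, 0.2, 0.3])      # standards (mg starch/ml)
        signal = np.array([0.11, 0.21, 0.43, 0.86, 1.27])  # hypothetical intensities

        slope, intercept = np.polyfit(conc, signal, 1)     # fit calibration line

        unknown_signal = 0.65
        estimate = (unknown_signal - intercept) / slope    # invert the curve
        print(f"estimated concentration: {estimate:.3f} mg starch/ml")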

  2. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging

    Directory of Open Access Journals (Sweden)

    Piyush Pandey

    2017-08-01

    Full Text Available Image-based high-throughput plant phenotyping in greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract reflectance spectrum from each plant and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best followed by P, K, and S. The micronutrients group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient groups. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2). This study suggested
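
    The modeling step described here, partial least squares regression of reflectance spectra against measured chemistry scored by R2 and RPD, can be sketched with simulated data as follows; the component count and the synthetic spectra are assumptions.

        # Sketch: PLS regression of spectra vs. a leaf trait, scored by R2 and RPD.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score, mean_squared_error

        rng = np.random.default_rng(2)
        X = rng.normal(size=(120, 200))  # 120 plants x 200 wavelengths (simulated)
        y = 0.8 * X[:, 50] + rng.normal(scale=0.2, size=120)  # trait tied to one band

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=2)
        model = PLSRegression(n_components=10).fit(X_tr, y_tr)
        pred = model.predict(X_te).ravel()

        rmse = mean_squared_error(y_te, pred) ** 0.5
        rpd = np.std(y_te) / rmse  # Ratio of Performance to Deviation
        print(f"R2 = {r2_score(y_te, pred):.2f}, RPD = {rpd:.2f}")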

  3. A high-throughput screening system for barley/powdery mildew interactions based on automated analysis of light micrographs.

    Science.gov (United States)

    Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo

    2008-01-23

    To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.

  4. Analysis of high-throughput biological data using their rank values.

    Science.gov (United States)

    Dembélé, Doulaye

    2018-01-01

    High-throughput biological technologies are routinely used to generate gene expression profiling or cytogenetics data. To achieve high performance, methods available in the literature become more specialized and often require high computational resources. Here, we propose a new versatile method based on the data-ordering rank values. We use linear algebra and the Perron-Frobenius theorem, and also extend a method presented earlier for searching differentially expressed genes to the detection of recurrent copy number aberration. A result derived from the proposed method is a one-sample Student's t-test based on rank values. The proposed method is, to our knowledge, the only one that applies to both gene expression profiling and cytogenetics data sets. This new method is fast, deterministic, and requires a low computational load. Probabilities are associated with genes to allow a statistically significant subset selection in the data set. Stability scores are also introduced as quality parameters. The performance and comparative analyses were carried out using real data sets. The proposed method can be accessed through an R package available from the CRAN (Comprehensive R Archive Network) website: https://cran.r-project.org/web/packages/fcros.
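
    The rank-based idea can be illustrated in a few lines: rank each gene's fold change among all genes within every pairwise comparison, then test whether a gene's average scaled rank departs from the null expectation of 0.5. This is a toy rendition of the concept, not the fcros implementation.

        # Sketch: toy rank-value statistic for differential expression.
        import numpy as np
        from scipy.stats import rankdata, ttest_1samp

        rng = np.random.default_rng(3)
        n_genes, n_comparisons = 1000, 8
        log_fc = rng.normal(size=(n_genes, n_comparisons))  # simulated log fold changes
        log_fc[0] += 2.0                                    # gene 0 is truly up-regulated

        # Scaled ranks in (0, 1) per comparison; ~0.5 expected under the null
        ranks = np.apply_along_axis(rankdata, 0, log_fc) / (n_genes + 1)

        stat, p = ttest_1samp(ranks[0], popmean=0.5)  # one-sample t-test on ranks
        print(f"gene 0: mean rank = {ranks[0].mean():.2f}, p = {p:.2e}")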

  5. Analysis of Active Methylotrophic Communities: When DNA-SIP Meets High-Throughput Technologies.

    Science.gov (United States)

    Taubert, Martin; Grob, Carolina; Howat, Alexandra M; Burns, Oliver J; Chen, Yin; Neufeld, Josh D; Murrell, J Colin

    2016-01-01

    Methylotrophs are microorganisms ubiquitous in the environment that can metabolize one-carbon (C1) compounds as carbon and/or energy sources. The activity of these prokaryotes impacts biogeochemical cycles within their respective habitats and can determine whether these habitats act as sources or sinks of C1 compounds. Due to the high importance of C1 compounds, not only in biogeochemical cycles, but also for climatic processes, it is vital to understand the contributions of these microorganisms to carbon cycling in different environments. One of the most challenging questions when investigating methylotrophs, but also in environmental microbiology in general, is which species contribute to the environmental processes of interest, or "who does what, where and when?" Metabolic labeling with C1 compounds substituted with (13)C, a technique called stable isotope probing, is a key method to trace carbon fluxes within methylotrophic communities. The incorporation of (13)C into the biomass of active methylotrophs leads to an increase in the molecular mass of their biomolecules. For DNA-based stable isotope probing (DNA-SIP), labeled and unlabeled DNA is separated by isopycnic ultracentrifugation. The ability to specifically analyze DNA of active methylotrophs from a complex background community by high-throughput sequencing techniques, i.e. targeted metagenomics, is the hallmark strength of DNA-SIP for elucidating ecosystem functioning, and a protocol is detailed in this chapter.

  6. Perchlorate reduction by hydrogen autotrophic bacteria and microbial community analysis using high-throughput sequencing.

    Science.gov (United States)

    Wan, Dongjin; Liu, Yongde; Niu, Zhenhua; Xiao, Shuhu; Li, Daorong

    2016-02-01

    Hydrogen autotrophic reduction of perchlorate has the advantages of high removal efficiency and harmlessness to drinking water. However, the reported information about the microbial community structure has so far been comparatively limited, and changes in biodiversity and in the dominant bacteria during the acclimation process require detailed study. In this study, perchlorate-reducing hydrogen autotrophic bacteria were acclimated by hydrogen aeration from activated sludge. For the first time, high-throughput sequencing was applied to analyze changes in biodiversity and the dominant bacteria during the acclimation process. The Michaelis-Menten model described the perchlorate reduction kinetics well. Model parameters q(max) and K(s) were 2.521-3.245 (mg ClO4(-)/gVSS h) and 5.44-8.23 (mg/l), respectively. Microbial perchlorate reduction occurred across the pH range 5.0-11.0; removal was highest at pH 9.0. The enriched mixed bacteria could use perchlorate, nitrate and sulfate as electron acceptors, and the sequence of preference was: NO3(-) > ClO4(-) > SO4(2-). Compared to the feed culture, biodiversity decreased greatly during the acclimation process, and the microbial community structure gradually stabilized after 9 acclimation cycles. The genus Thauera, related to Rhodocyclales, was the dominant perchlorate-reducing bacterium (PRB) in the mixed culture.
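
    The kinetic parameters quoted above come from fitting the Michaelis-Menten form q = q_max * S / (K_s + S) to measured reduction rates. A minimal fitting sketch with invented data points:

        # Sketch: fit Michaelis-Menten kinetics to perchlorate reduction rates.
        import numpy as np
        from scipy.optimize import curve_fit

        def michaelis_menten(s, q_max, k_s):
            return q_max * s / (k_s + s)

        # Hypothetical substrate concentrations (mg/l) and specific rates
        s = np.array([2.0, 5.0, 10.0, 20.0, 50.0])
        q = np.array([0.75, 1.35, 1.85, 2.30, 2.70])  # mg ClO4(-)/gVSS h

        (q_max, k_s), _ = curve_fit(michaelis_menten, s, q, p0=(3.0, 7.0))
        print(f"q_max = {q_max:.2f} mg ClO4(-)/gVSS h, K_s = {k_s:.2f} mg/l")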

  7. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    Science.gov (United States)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert; Gagnon, David; Gjoerup, Ole; Archambault, Jacques; Bullock, Peter A.

    2015-01-01

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. PMID:25155200

  8. High-throughput DNA methylation analysis in anorexia nervosa confirms TNXB hypermethylation.

    Science.gov (United States)

    Kesselmeier, Miriam; Pütter, Carolin; Volckmar, Anna-Lena; Baurecht, Hansjörg; Grallert, Harald; Illig, Thomas; Ismail, Khadeeja; Ollikainen, Miina; Silén, Yasmina; Keski-Rahkonen, Anna; Bulik, Cynthia M; Collier, David A; Zeggini, Eleftheria; Hebebrand, Johannes; Scherag, André; Hinney, Anke

    2018-04-01

    Patients with anorexia nervosa (AN) are ideally suited to identify differentially methylated genes in response to starvation. We examined high-throughput DNA methylation data derived from whole blood of 47 females with AN, 47 lean females without AN and 100 population-based females to compare AN with both control groups. To account for different cell type compositions, we applied two reference-free methods (FastLMM-EWASher, RefFreeEWAS) and searched for consensus CpG sites identified by both methods. We used a validation sample of five monozygotic AN-discordant twin pairs. Fifty-one consensus sites were identified in the AN vs. lean comparison and 81 in the AN vs. population-based comparison. These sites have not previously been reported in AN methylation analyses, but for the latter comparison 54/81 sites showed directionally consistent differential methylation effects in the AN-discordant twins. For the single nucleotide polymorphism rs923768 in CSGALNACT1, a nearby site was nominally associated with AN. At the gene level, we confirmed hypermethylated sites at TNXB. We found support for a locus at NR1H3 in the AN vs. lean control comparison, but the methylation direction was opposite to the one previously reported. We confirm genes like TNXB previously described to comprise differentially methylated sites, and highlight further sites that might be specifically involved in AN starvation processes.

  9. High-throughput molecular analysis in lung cancer: insights into biology and potential clinical applications.

    Science.gov (United States)

    Ocak, S; Sos, M L; Thomas, R K; Massion, P P

    2009-08-01

    During the last decade, high-throughput technologies, including genomic, epigenomic, transcriptomic and proteomic approaches, have been applied to further our understanding of the molecular pathogenesis of this heterogeneous disease, and to develop strategies that aim to improve the management of patients with lung cancer. Ultimately, these approaches should lead to sensitive, specific and noninvasive methods for early diagnosis, and facilitate the prediction of response to therapy and outcome, as well as the identification of potential novel therapeutic targets. Genomic studies were the first to move this field forward by providing novel insights into the molecular biology of lung cancer and by generating candidate biomarkers of disease progression. Lung carcinogenesis is driven by genetic and epigenetic alterations that cause aberrant gene function; however, the challenge remains to pinpoint the key regulatory control mechanisms and to distinguish driver from passenger alterations that may have a small but additive effect on cancer development. Epigenetic regulation by DNA methylation and histone modifications modulates chromatin structure and, in turn, either activates or silences gene expression. Proteomic approaches critically complement these molecular studies, as the phenotype of a cancer cell is determined by proteins and cannot be predicted by genomics or transcriptomics alone. The present article focuses on the technological platforms available and some proposed clinical applications. We illustrate herein how the "-omics" have revolutionised our approach to lung cancer biology and hold promise for personalised management of lung cancer.

  10. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    Science.gov (United States)

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
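
    The segmentation step described above labels superpixels as plant or background from per-superpixel features. The sketch below assumes such features (e.g., mean color channels and a texture score) have already been extracted; the data and feature choices are illustrative, not the authors' exact pipeline.

        # Sketch: superpixel plant/background classification with a Random Forest.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(4)
        # Each row: features of one superpixel (mean R, G, B, texture score)
        features = rng.uniform(size=(400, 4))
        labels = (features[:, 1] > 0.6).astype(int)  # toy truth: "green enough" = plant

        clf = RandomForestClassifier(n_estimators=100, random_state=4)
        clf.fit(features[:300], labels[:300])
        print(f"held-out accuracy: {clf.score(features[300:], labels[300:]):.2f}")
        # Plant area per image can then be the count of plant-labeled superpixels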

  11. High-throughput transcriptome analysis of barley (Hordeum vulgare) exposed to excessive boron.

    Science.gov (United States)

    Tombuloglu, Guzin; Tombuloglu, Huseyin; Sakcali, M Serdal; Unver, Turgay

    2015-02-15

    Boron (B) is an essential micronutrient for optimum plant growth. However, above a certain threshold B is toxic and causes yield loss in agricultural lands. While a number of studies have been conducted to understand the B tolerance mechanism, a transcriptome-wide approach for B-tolerant barley is performed here for the first time. A high-throughput RNA-Seq (cDNA) sequencing technology (Illumina) was used with barley (Hordeum vulgare), yielding 208 million clean reads. In total, 256,874 unigenes were generated and assigned to known peptide databases: Gene Ontology (GO) (99,043), Swiss-Prot (38,266), Clusters of Orthologous Groups (COG) (26,250), and the Kyoto Encyclopedia of Genes and Genomes (KEGG) (36,860), as determined by BLASTx search. According to the digital gene expression (DGE) analyses, 16% and 17% of the transcripts were found to be differentially regulated in root and leaf tissues, respectively. Most of them were involved in cell wall, stress response, membrane, protein kinase and transporter mechanisms. Some of the genes detected as highly expressed in root tissue are phospholipases, predicted divalent heavy-metal cation transporters, formin-like proteins and calmodulin/Ca(2+)-binding proteins. In addition, chitin-binding lectin precursor, ubiquitin carboxyl-terminal hydrolase, and serine/threonine-protein kinase AFC2 genes were indicated to be highly regulated in leaf tissue upon excess B treatment. Some pathways, such as the Ca(2+)-calmodulin system, are activated in response to B toxicity. The differential regulation of 10 transcripts was confirmed by qRT-PCR, revealing the tissue-specific responses against B toxicity and their putative function in B-tolerance mechanisms. Copyright © 2014. Published by Elsevier B.V.

  12. High-throughput mutational analysis of TOR1A in primary dystonia

    Directory of Open Access Journals (Sweden)

    Truong Daniel D

    2009-03-01

    Full Text Available Abstract Background Although the c.904_906delGAG mutation in Exon 5 of TOR1A typically manifests as early-onset generalized dystonia, DYT1 dystonia is genetically and clinically heterogeneous. Recently, another Exon 5 mutation (c.863G>A) has been associated with early-onset generalized dystonia, and some ΔGAG mutation carriers present with late-onset focal dystonia. The aim of this study was to identify TOR1A Exon 5 mutations in a large cohort of subjects with mainly non-generalized primary dystonia. Methods High resolution melting (HRM) was used to examine the entire TOR1A Exon 5 coding sequence in 1014 subjects with primary dystonia (422 spasmodic dysphonia, 285 cervical dystonia, 67 blepharospasm, 41 writer's cramp, 16 oromandibular dystonia, 38 other primary focal dystonia, 112 segmental dystonia, 16 multifocal dystonia, and 17 generalized dystonia) and 250 controls (150 neurologically normal and 100 with other movement disorders). Diagnostic sensitivity and specificity were evaluated in an additional 8 subjects with known ΔGAG DYT1 dystonia and 88 subjects with ΔGAG-negative dystonia. Results HRM of TOR1A Exon 5 showed high (100%) diagnostic sensitivity and specificity. HRM was rapid and economical. HRM reliably differentiated the TOR1A ΔGAG and c.863G>A mutations. Melting curves were normal in 250/250 controls and 1012/1014 subjects with primary dystonia. The two subjects with shifted melting curves were found to harbor the classic ΔGAG deletion: (1) a non-Jewish Caucasian female with childhood-onset multifocal dystonia and (2) an Ashkenazi Jewish female with adolescent-onset spasmodic dysphonia. Conclusion First, HRM is an inexpensive, diagnostically sensitive and specific, high-throughput method for mutation discovery. Second, Exon 5 mutations in TOR1A are rarely associated with non-generalized primary dystonia.

  13. Analysis of high-throughput sequencing and annotation strategies for phage genomes.

    Directory of Open Access Journals (Sweden)

    Matthew R Henn

    Full Text Available BACKGROUND: Bacterial viruses (phages) play a critical role in shaping microbial populations as they influence both host mortality and horizontal gene transfer. As such, they have a significant impact on local and global ecosystem function and human health. Despite their importance, little is known about the genomic diversity harbored in phages, as methods to capture complete phage genomes have been hampered by the lack of knowledge about the target genomes, and difficulties in generating sufficient quantities of genomic DNA for sequencing. Of the approximately 550 phage genomes currently available in the public domain, fewer than 5% are marine phage. METHODOLOGY/PRINCIPAL FINDINGS: To advance the study of phage biology through comparative genomic approaches we used marine cyanophage as a model system. We compared DNA preparation methodologies (DNA extraction directly from either phage lysates or CsCl purified phage particles), and sequencing strategies that utilize either Sanger sequencing of a linker amplification shotgun library (LASL) or of a whole genome shotgun library (WGSL), or 454 pyrosequencing methods. We demonstrate that genomic DNA sample preparation directly from a phage lysate, combined with 454 pyrosequencing, is best suited for phage genome sequencing at scale, as this method is capable of capturing complete continuous genomes with high accuracy. In addition, we describe an automated annotation informatics pipeline that delivers high-quality annotation and yields few false positives and negatives in ORF calling. CONCLUSIONS/SIGNIFICANCE: These DNA preparation, sequencing and annotation strategies enable a high-throughput approach to the burgeoning field of phage genomics.

  14. Transcriptomic analysis of Petunia hybrida in response to salt stress using high throughput RNA sequencing.

    Directory of Open Access Journals (Sweden)

    Gonzalo H Villarino

    Full Text Available Salinity and drought stress are the primary cause of crop losses worldwide. In sodic saline soils sodium chloride (NaCl) disrupts normal plant growth and development. The complex interactions of plant systems with abiotic stress have made RNA sequencing a more holistic and appealing approach to study transcriptome level responses in a single cell and/or tissue. In this work, we determined the Petunia transcriptome response to NaCl stress by sequencing leaf samples and assembling 196 million Illumina reads with Trinity software. Using our reference transcriptome we identified more than 7,000 genes that were differentially expressed within 24 h of acute NaCl stress. The proposed transcriptome can also be used as an excellent tool for biological and bioinformatics analyses in the absence of an available Petunia genome, and it is available at the SOL Genomics Network (SGN) http://solgenomics.net. Genes related to regulation of reactive oxygen species, transport, and signal transductions as well as novel and undescribed transcripts were among those differentially expressed in response to salt stress. The candidate genes identified in this study can be applied as markers for breeding or to genetically engineer plants to enhance salt tolerance. Gene Ontology analyses indicated that most of the NaCl damage happened at 24 h inducing genotoxicity, affecting transport and organelles due to the high concentration of Na+ ions. Finally, we report a modification to the library preparation protocol whereby cDNA samples were bar-coded with non-HPLC purified primers, without affecting the quality and quantity of the RNA-seq data. The methodological improvement presented here could substantially reduce the cost of sample preparation for future high-throughput RNA sequencing experiments.

  15. Transcriptomic analysis of Petunia hybrida in response to salt stress using high throughput RNA sequencing.

    Science.gov (United States)

    Villarino, Gonzalo H; Bombarely, Aureliano; Giovannoni, James J; Scanlon, Michael J; Mattson, Neil S

    2014-01-01

    Salinity and drought stress are the primary cause of crop losses worldwide. In sodic saline soils sodium chloride (NaCl) disrupts normal plant growth and development. The complex interactions of plant systems with abiotic stress have made RNA sequencing a more holistic and appealing approach to study transcriptome level responses in a single cell and/or tissue. In this work, we determined the Petunia transcriptome response to NaCl stress by sequencing leaf samples and assembling 196 million Illumina reads with Trinity software. Using our reference transcriptome we identified more than 7,000 genes that were differentially expressed within 24 h of acute NaCl stress. The proposed transcriptome can also be used as an excellent tool for biological and bioinformatics analyses in the absence of an available Petunia genome, and it is available at the SOL Genomics Network (SGN) http://solgenomics.net. Genes related to regulation of reactive oxygen species, transport, and signal transductions as well as novel and undescribed transcripts were among those differentially expressed in response to salt stress. The candidate genes identified in this study can be applied as markers for breeding or to genetically engineer plants to enhance salt tolerance. Gene Ontology analyses indicated that most of the NaCl damage happened at 24 h inducing genotoxicity, affecting transport and organelles due to the high concentration of Na+ ions. Finally, we report a modification to the library preparation protocol whereby cDNA samples were bar-coded with non-HPLC purified primers, without affecting the quality and quantity of the RNA-seq data. The methodological improvement presented here could substantially reduce the cost of sample preparation for future high-throughput RNA sequencing experiments.

  16. An RNA-Based Fluorescent Biosensor for High-Throughput Analysis of the cGAS-cGAMP-STING Pathway.

    Science.gov (United States)

    Bose, Debojit; Su, Yichi; Marcus, Assaf; Raulet, David H; Hammond, Ming C

    2016-12-22

    In mammalian cells, the second messenger (2'-5',3'-5') cyclic guanosine monophosphate-adenosine monophosphate (2',3'-cGAMP) is produced by the cytosolic DNA sensor cGAMP synthase (cGAS), and subsequently bound by the stimulator of interferon genes (STING) to trigger the interferon response. Thus, the cGAS-cGAMP-STING pathway plays a critical role in pathogen detection, as well as in pathophysiological conditions including cancer and autoimmune disorders. However, studying and targeting this immune signaling pathway has been challenging due to the absence of tools for high-throughput analysis. We have engineered an RNA-based fluorescent biosensor that responds to 2',3'-cGAMP. The resulting "mix-and-go" cGAS activity assay shows excellent statistical reliability as a high-throughput screening (HTS) assay and distinguishes between direct and indirect cGAS inhibitors. Furthermore, the biosensor enables quantitation of 2',3'-cGAMP in mammalian cell lysates. We envision this biosensor-based assay as a resource to study the cGAS-cGAMP-STING pathway in the context of infectious diseases, cancer immunotherapy, and autoimmune diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.
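
    Statistical reliability of an HTS assay of this kind is commonly summarized with the Z'-factor computed from positive and negative control wells; the record does not state which metric the authors used, so the sketch below is a generic illustration with simulated readings.

        # Sketch: Z'-factor from positive/negative control fluorescence readings.
        import numpy as np

        rng = np.random.default_rng(5)
        pos = rng.normal(loc=1000.0, scale=40.0, size=48)  # e.g., active cGAS wells
        neg = rng.normal(loc=200.0, scale=30.0, size=48)   # e.g., no-enzyme wells

        z_prime = 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())
        print(f"Z' = {z_prime:.2f}")  # > 0.5 is conventionally an excellent assay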

  17. High-throughput quantitative biochemical characterization of algal biomass by NIR spectroscopy; multiple linear regression and multivariate linear regression analysis.

    Science.gov (United States)

    Laurens, L M L; Wolfrum, E J

    2013-12-18

    One of the challenges associated with microalgal biomass characterization and the comparison of microalgal strains and conversion processes is the rapid determination of the composition of algae. We have developed and applied a high-throughput screening technology based on near-infrared (NIR) spectroscopy for the rapid and accurate determination of algal biomass composition. We show that NIR spectroscopy can accurately predict the full composition using multivariate linear regression analysis of varying lipid, protein, and carbohydrate content of algal biomass samples from three strains. We also demonstrate high-quality predictions for an independent validation set. A high-throughput 96-well configuration for spectroscopy gives equally good prediction relative to a ring-cup configuration, and thus spectra can be obtained from as little as 10-20 mg of material. We found that lipids exhibit a dominant, distinct, and unique fingerprint in the NIR spectrum that allows for the use of single and multiple linear regression of respective wavelengths for the prediction of the biomass lipid content. This is not the case for carbohydrate and protein content, and thus the use of multivariate statistical modeling approaches remains necessary.
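
    Because lipids show a distinct NIR fingerprint, a single well-chosen wavelength can already support a linear calibration. A sketch with simulated absorbances follows; the band position and all numbers are assumptions.

        # Sketch: single-wavelength linear regression for biomass lipid content.
        import numpy as np
        from scipy.stats import linregress

        rng = np.random.default_rng(6)
        lipid = rng.uniform(5, 40, size=30)  # % dry weight (simulated reference data)
        # Hypothetical absorbance at a lipid-associated band, with noise
        absorbance = 0.01 * lipid + 0.1 + rng.normal(scale=0.02, size=30)

        fit = linregress(absorbance, lipid)
        print(f"calibration R2 = {fit.rvalue**2:.2f}")
        print(f"predicted lipid at A=0.35: {fit.slope * 0.35 + fit.intercept:.1f} %")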

  18. Bacterial Pathogens and Community Composition in Advanced Sewage Treatment Systems Revealed by Metagenomics Analysis Based on High-Throughput Sequencing

    Science.gov (United States)

    Lu, Xin; Zhang, Xu-Xiang; Wang, Zhu; Huang, Kailong; Wang, Yuan; Liang, Weigang; Tan, Yunfei; Liu, Bo; Tang, Junying

    2015-01-01

    This study used 454 pyrosequencing, Illumina high-throughput sequencing and metagenomic analysis to investigate bacterial pathogens and their potential virulence in a sewage treatment plant (STP) applying both conventional and advanced treatment processes. Pyrosequencing and Illumina sequencing consistently demonstrated that the genus Arcobacter accounted for over 43.42% of the total abundance of potential pathogens in the STP. At the species level, the potential pathogens Arcobacter butzleri, Aeromonas hydrophila and Klebsiella pneumoniae dominated in raw sewage, which was also confirmed by quantitative real-time PCR. Illumina sequencing also revealed the prevalence of various types of pathogenicity islands and virulence proteins in the STP. Most of the potential pathogens and virulence factors were eliminated in the STP, and the removal efficiency mainly depended on the oxidation ditch. Compared with sand filtration, magnetic resin seemed to achieve higher removals of most of the potential pathogens and virulence factors. However, the presence of residual A. butzleri in the final effluent still deserves more concern. The findings indicate that sewage acts as an important source of environmental pathogens, but STPs can effectively control their spread in the environment. Joint use of the high-throughput sequencing technologies is considered a reliable method for a deep and comprehensive overview of environmental bacterial virulence. PMID:25938416

  19. Some Observations on Cost-Effectiveness Analysis in Education.

    Science.gov (United States)

    Geske, Terry G.

    1979-01-01

    The general nature of cost-effectiveness analysis is discussed, analytical frameworks for conducting cost-effectiveness studies are described, and some of the problems inherent in measuring educational costs and in assessing program effectiveness are addressed. (Author/IRT)

  20. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging.

    Science.gov (United States)

    Pandey, Piyush; Ge, Yufeng; Stoerger, Vincent; Schnable, James C

    2017-01-01

    Image-based high-throughput plant phenotyping in greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract reflectance spectrum from each plant and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best followed by P, K, and S. The micronutrients group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient groups. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2). This study suggested that hyperspectral imaging is a promising tool for quantifying plant chemical traits. Future

  1. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging

    Science.gov (United States)

    Pandey, Piyush; Ge, Yufeng; Stoerger, Vincent; Schnable, James C.

    2017-01-01

    Image-based high-throughput plant phenotyping in greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract reflectance spectrum from each plant and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best followed by P, K, and S. The micronutrients group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient groups. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2). Future work includes designing experiments to vary plant nutrients

  2. The simple fool's guide to population genomics via RNA-Seq: An introduction to high-throughput sequencing data analysis

    DEFF Research Database (Denmark)

    De Wit, P.; Pespeni, M.H.; Ladner, J.T.

    2012-01-01

    Here we introduce the 'Simple Fool's Guide to Population Genomics via RNA-seq' (SFG), a document intended to serve as an easy-to-follow protocol, walking a user through one example of high-throughput sequencing data analysis of nonmodel organisms. It is by no means an exhaustive protocol, but rather serves as an introduction to the bioinformatic methods used in population genomics, enabling a user to gain familiarity with basic analysis steps. The SFG consists of two parts. This document summarizes the steps needed and lays out the basic themes for each and a simple approach to follow. The second document is the full SFG, publicly available at http://sfg.stanford.edu, that includes detailed protocols for data processing and analysis, along with a repository of custom-made scripts and sample files. Steps included in the SFG range from tissue collection to de novo assembly, blast annotation, alignment, gene expression, functional enrichment, SNP detection, and principal components analysis.

  3. Pathway Processor 2.0: a web resource for pathway-based analysis of high-throughput data.

    Science.gov (United States)

    Beltrame, Luca; Bianco, Luca; Fontana, Paolo; Cavalieri, Duccio

    2013-07-15

    Pathway Processor 2.0 is a web application designed to analyze high-throughput datasets, including but not limited to microarray and next-generation sequencing, using a pathway-centric logic. In addition to well-established methods such as Fisher's test and impact analysis, Pathway Processor 2.0 offers innovative methods that convert gene expression into pathway expression, leading to the identification of differentially regulated pathways in a dataset of choice. Pathway Processor 2.0 is available as a web service at http://compbiotoolbox.fmach.it/pathwayProcessor/. Sample datasets to test the functionality can be used directly from the application. Contact: duccio.cavalieri@fmach.it. Supplementary data are available at Bioinformatics online.
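
    Fisher's test, mentioned among the well-established options, scores a pathway by asking whether differentially expressed genes are over-represented in it. A minimal sketch with made-up gene counts:

        # Sketch: pathway enrichment via Fisher's exact test on a 2x2 table.
        from scipy.stats import fisher_exact

        genome_size = 6000    # annotated genes (hypothetical)
        pathway_size = 120    # genes in the pathway of interest
        de_genes = 300        # differentially expressed (DE) genes overall
        de_in_pathway = 18    # DE genes that fall in the pathway

        # Rows: in pathway / not in pathway; columns: DE / not DE
        table = [[de_in_pathway, pathway_size - de_in_pathway],
                 [de_genes - de_in_pathway,
                  genome_size - pathway_size - de_genes + de_in_pathway]]
        _, p = fisher_exact(table, alternative="greater")
        print(f"enrichment p-value = {p:.3e}")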

  4. Validation of a Microscale Extraction and High Throughput UHPLC-QTOF-MS Analysis Method for Huperzine A in Huperzia

    Science.gov (United States)

    Cuthbertson, Daniel; Piljac-Žegarac, Jasenka; Lange, Bernd Markus

    2011-01-01

    Herein we report on an improved method for the microscale extraction of huperzine A (HupA), an acetylcholinesterase-inhibiting alkaloid, from as little as 3 mg of tissue homogenate from the clubmoss Huperzia squarrosa (G. Forst.) Trevis, with 99.95% recovery. We also validated a novel UHPLC-QTOF-MS method for the high-throughput analysis of H. squarrosa extracts in only 6 min, which, in combination with the very low limit of detection (20 pg on column) and the wide linear range for quantification (20 to 10,000 pg on column), allows for highly efficient screening of extracts containing varying amounts of HupA. Utilization of this methodology has the potential to conserve valuable plant resources. PMID:22275140

  5. High-throughput SHAPE analysis reveals structures in HIV-1 genomic RNA strongly conserved across distinct biological states.

    Directory of Open Access Journals (Sweden)

    Kevin A Wilkinson

    2008-04-01

    Full Text Available Replication and pathogenesis of the human immunodeficiency virus (HIV) is tightly linked to the structure of its RNA genome, but genome structure in infectious virions is poorly understood. We invent high-throughput SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension) technology, which uses many of the same tools as DNA sequencing, to quantify RNA backbone flexibility at single-nucleotide resolution and from which robust structural information can be immediately derived. We analyze the structure of HIV-1 genomic RNA in four biologically instructive states, including the authentic viral genome inside native particles. Remarkably, given the large number of plausible local structures, the first 10% of the HIV-1 genome exists in a single, predominant conformation in all four states. We also discover that noncoding regions functioning in a regulatory role have significantly lower (p-value < 0.0001) SHAPE reactivities, and hence more structure, than do viral coding regions that function as the template for protein synthesis. By directly monitoring protein binding inside virions, we identify the RNA recognition motif for the viral nucleocapsid protein. Seven structurally homologous binding sites occur in a well-defined domain in the genome, consistent with a role in directing specific packaging of genomic RNA into nascent virions. In addition, we identify two distinct motifs that are targets for the duplex destabilizing activity of this same protein. The nucleocapsid protein destabilizes local HIV-1 RNA structure in ways likely to facilitate initial movement both of the retroviral reverse transcriptase from its tRNA primer and of the ribosome in coding regions. Each of the three nucleocapsid interaction motifs falls in a specific genome domain, indicating that local protein interactions can be organized by the long-range architecture of an RNA. High-throughput SHAPE reveals a comprehensive view of HIV-1 RNA genome structure, and further
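
    The coding-versus-noncoding comparison above is, at bottom, a test on two sets of per-nucleotide SHAPE reactivities. The record does not specify the test behind the quoted p-value, so the sketch below uses a Mann-Whitney U test on simulated reactivities as a generic stand-in.

        # Sketch: compare SHAPE reactivities of noncoding vs. coding regions.
        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(7)
        # Lower reactivity = more structure; noncoding simulated as more structured
        noncoding = rng.gamma(shape=1.5, scale=0.3, size=400)
        coding = rng.gamma(shape=2.5, scale=0.4, size=2000)

        stat, p = mannwhitneyu(noncoding, coding, alternative="less")
        print(f"noncoding reactivities lower than coding: p = {p:.2e}")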

  6. Metagenomic analysis and functional characterization of the biogas microbiome using high throughput shotgun sequencing and a novel binning strategy.

    Science.gov (United States)

    Campanaro, Stefano; Treu, Laura; Kougias, Panagiotis G; De Francisci, Davide; Valle, Giorgio; Angelidaki, Irini

    2016-01-01

    Biogas production is an economically attractive technology that has gained momentum worldwide over the past years. Biogas is produced by a biologically mediated process, widely known as "anaerobic digestion." This process is performed by a specialized and complex microbial community, in which different members have distinct roles in the establishment of a collective organization. Deciphering the complex microbial community engaged in this process is interesting both for unraveling the network of bacterial interactions and for the applicability potential of the derived knowledge. In this study, we dissect the microbiome involved in anaerobic digestion by means of high throughput Illumina sequencing (~51 gigabases of sequence data), disclosing nearly one million genes and extracting 106 microbial genomes by a novel strategy combining two binning processes. Microbial phylogeny and putative taxonomy performed using >400 proteins revealed that the biogas community is a trove of new species. A new approach based on functional properties as per network representation was developed to assign roles to the microbial species. The organization of the anaerobic digestion microbiome resembles a funnel concept, in which the microbial consortium presents a progressive functional specialization while reaching the final step of the process (i.e., methanogenesis). Key microbial genomes encoding enzymes involved in specific metabolic pathways, such as carbohydrate utilization, fatty acid degradation, amino acid fermentation, and syntrophic acetate oxidation, were identified. Additionally, the analysis identified a new uncultured archaeon, putatively related to Methanomassiliicoccales but surprisingly having a methylotrophic methanogenic pathway. This study is pioneering research on the phylogenetic and functional characterization of the microbial community populating biogas reactors. By applying for the first time high-throughput sequencing and a novel binning strategy, the

  7. High throughput LC-MS/MS method for the simultaneous analysis of multiple vitamin D analytes in serum.

    Science.gov (United States)

    Jenkinson, Carl; Taylor, Angela E; Hassan-Smith, Zaki K; Adams, John S; Stewart, Paul M; Hewison, Martin; Keevil, Brian G

    2016-03-01

    Recent studies suggest that vitamin D deficiency is linked to increased risk of common human health problems. To define vitamin D 'status', most routine analytical methods quantify one particular vitamin D metabolite, 25-hydroxyvitamin D3 (25OHD3). However, vitamin D is characterized by complex metabolic pathways, and simultaneous measurement of multiple vitamin D metabolites may provide a more accurate interpretation of vitamin D status. To address this we developed a high-throughput liquid chromatography-tandem mass spectrometry (LC-MS/MS) method to analyse multiple vitamin D analytes, with particular emphasis on the separation of epimer metabolites. A supported liquid extraction (SLE) and LC-MS/MS method was developed to quantify 10 vitamin D metabolites as well as to separate an interfering 7α-hydroxy-4-cholesten-3-one (7αC4) isobar (a precursor of bile acids), and was validated by analysis of human serum samples. In a cohort of 116 healthy subjects, circulating concentrations of 25-hydroxyvitamin D3 (25OHD3), 3-epi-25-hydroxyvitamin D3 (3-epi-25OHD3), 24,25-dihydroxyvitamin D3 (24R,25(OH)2D3), 1,25-dihydroxyvitamin D3 (1α,25(OH)2D3), and 25-hydroxyvitamin D2 (25OHD2) were quantifiable using 220 μL of serum, with 25OHD3 and 24R,25(OH)2D3 showing significant seasonal variations. This high-throughput LC-MS/MS method provides a novel strategy for assessing the impact of vitamin D on human health and disease. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Bulk segregant analysis by high-throughput sequencing reveals a novel xylose utilization gene from Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Jared W Wenger

    2010-05-01

    Full Text Available Fermentation of xylose is a fundamental requirement for the efficient production of ethanol from lignocellulosic biomass sources. Although they aggressively ferment hexoses, it has long been thought that native Saccharomyces cerevisiae strains cannot grow fermentatively or non-fermentatively on xylose. Population surveys have uncovered a few naturally occurring strains that are weakly xylose-positive, and some S. cerevisiae have been genetically engineered to ferment xylose, but no strain, either natural or engineered, has yet been reported to ferment xylose as efficiently as glucose. Here, we used a medium-throughput screen to identify Saccharomyces strains that can increase in optical density when xylose is presented as the sole carbon source. We identified 38 strains that have this xylose utilization phenotype, including strains of S. cerevisiae, other sensu stricto members, and hybrids between them. All the S. cerevisiae xylose-utilizing strains we identified are wine yeasts, and for those that could produce meiotic progeny, the xylose phenotype segregates as a single gene trait. We mapped this gene by Bulk Segregant Analysis (BSA) using tiling microarrays and high-throughput sequencing. The gene is a putative xylitol dehydrogenase, which we name XDH1, and is located in the subtelomeric region of the right end of chromosome XV in a region not present in the S288c reference genome. We further characterized the xylose phenotype by performing gene expression microarrays and by genetically dissecting the endogenous Saccharomyces xylose pathway. We have demonstrated that natural S. cerevisiae yeasts are capable of utilizing xylose as the sole carbon source, characterized the genetic basis for this trait as well as the endogenous xylose utilization pathway, and demonstrated the feasibility of BSA using high-throughput sequencing.
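
    Bulk segregant analysis localizes a trait by scanning for genomic regions where the allele frequency in the phenotype-selected bulk diverges from the 0.5 expected for unlinked loci. The simulated frequencies and smoothing window below are illustrative assumptions.

        # Sketch: bulk segregant analysis scan over SNP allele frequencies.
        import numpy as np

        rng = np.random.default_rng(8)
        n_snps = 2000
        # Frequency of the reference allele in the xylose-positive bulk (simulated)
        freq = rng.normal(loc=0.5, scale=0.05, size=n_snps)
        freq[1500:1600] += 0.4  # region linked to the trait drifts toward fixation

        window = 50  # smooth with a rolling mean to suppress per-SNP noise
        smoothed = np.convolve(freq, np.ones(window) / window, mode="same")

        peak = int(np.argmax(smoothed))
        print(f"candidate locus near SNP index {peak} (freq ~ {smoothed[peak]:.2f})")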

  9. Laser desorption mass spectrometry for high-throughput DNA analysis and its applications

    Science.gov (United States)

    Chen, C. H. Winston; Golovlev, Valeri V.; Taranenko, N. I.; Allman, S. L.; Isola, Narayana R.; Potter, N. T.; Matteson, K. J.; Chang, Linus Y.

    1999-05-01

    Laser desorption mass spectrometry (LDMS) has been developed for DNA sequencing, disease diagnosis, and DNA fingerprinting for forensic applications. With LDMS, the speed of DNA analysis can be much faster than conventional gel electrophoresis. No dye or radioactive tagging of DNA segments is needed for detection. LDMS is emerging as a new alternative technology for DNA analysis.

  10. Analysis of high-throughput plant image data with the information system IAP

    Directory of Open Access Journals (Sweden)

    Klukas Christian

    2012-06-01

    Full Text Available This work presents a sophisticated information system, the Integrated Analysis Platform (IAP, an approach supporting large-scale image analysis for different species and imaging systems. In its current form, IAP supports the investigation of Maize, Barley and Arabidopsis plants based on images obtained in different spectra.

  11. A new high-throughput LC-MS method for the analysis of complex fructan mixtures

    DEFF Research Database (Denmark)

    Verspreet, Joran; Hansen, Anders Holmgaard; Dornez, Emmie

    2014-01-01

    In this paper, a new liquid chromatography-mass spectrometry (LC-MS) method for the analysis of complex fructan mixtures is presented. In this method, columns with a trifunctional C18 alkyl stationary phase (T3) were used and their performance compared with that of a porous graphitized carbon (PGC...

  12. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    Science.gov (United States)

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-11-01

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. High-throughput phenotyping allows for QTL analysis of defense, symbiosis and development-related traits

    DEFF Research Database (Denmark)

    Hansen, Nina Eberhardtsen

    -throughput phenotyping of whole plants. Additionally, a system for automated confocal microscopy aiming at automated detection of infection thread formation as well as detection of lateral root and nodule primordia is being developed. The objective was to use both systems in genome wide association studies and mutant...... the analysis. Additional phenotyping of defense mutants revealed that MLO, which confers susceptibility towards Blumeria graminis in barley, is also a prime candidate for a S. trifoliorum susceptibility gene in Lotus....

  14. A Reference Viral Database (RVDB) To Enhance Bioinformatics Analysis of High-Throughput Sequencing for Novel Virus Detection.

    Science.gov (United States)

    Goodacre, Norman; Aljanahi, Aisha; Nandakumar, Subhiksha; Mikailov, Mike; Khan, Arifa S

    2018-01-01

    Detection of distantly related viruses by high-throughput sequencing (HTS) is bioinformatically challenging because of the lack of a public database containing all viral sequences, without abundant nonviral sequences, which can extend runtime and obscure viral hits. Our reference viral database (RVDB) includes all viral, virus-related, and virus-like nucleotide sequences (excluding bacterial viruses), regardless of length, and with overall reduced cellular sequences. Semantic selection criteria (SEM-I) were used to select viral sequences from GenBank, resulting in a first-generation viral database (VDB). This database was manually and computationally reviewed, resulting in refined, semantic selection criteria (SEM-R), which were applied to a new download of updated GenBank sequences to create a second-generation VDB. Viral entries in the latter were clustered at 98% by CD-HIT-EST to reduce redundancy while retaining high viral sequence diversity. The viral identity of the clustered representative sequences (creps) was confirmed by BLAST searches in NCBI databases and HMMER searches in PFAM and DFAM databases. The resulting RVDB contained a broad representation of viral families, sequence diversity, and a reduced cellular content; it includes full-length and partial sequences and endogenous nonretroviral elements, endogenous retroviruses, and retrotransposons. Testing of RVDBv10.2 with an in-house HTS transcriptomic data set indicated a significantly faster run for virus detection than interrogating the entirety of the NCBI nonredundant nucleotide database, which contains all viral sequences but also nonviral sequences. RVDB is publicly available for facilitating HTS analysis, particularly for novel virus detection. It is meant to be updated on a regular basis to include new viral sequences added to GenBank. IMPORTANCE To facilitate bioinformatics analysis of high-throughput sequencing (HTS) data for the detection of both known and novel viruses, we have

  15. Big data scalability for high throughput processing and analysis of vehicle engineering data

    OpenAIRE

    Lu, Feng

    2017-01-01

    "Sympathy for Data" is a platform that is utilized for Big Data automation analytics. It is based on visual interface and workflow configurations. The main purpose of the platform is to reuse parts of code for structured analysis of vehicle engineering data. However, there are some performance issues on a single machine for processing a large amount of data in Sympathy for Data. There are also disk and CPU IO intensive issues when the data is oversized and the platform need fits comfortably i...

  16. Multichannel microscale system for high throughput preparative separation with comprehensive collection and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Karger, Barry L.; Kotler, Lev; Foret, Frantisek; Minarik, Marek; Kleparnik, Karel

    2003-12-09

    A modular multiple lane or capillary electrophoresis (chromatography) system that permits automated parallel separation and comprehensive collection of all fractions from samples in all lanes or columns, with the option of further on-line automated sample fraction analysis, is disclosed. Preferably, fractions are collected in a multi-well fraction collection unit, or plate (40). The multi-well collection plate (40) is preferably made of a solvent permeable gel, most preferably a hydrophilic, polymeric gel such as agarose or cross-linked polyacrylamide.

  17. Micropathogen Community Analysis in Hyalomma rufipes via High-Throughput Sequencing of Small RNAs

    Science.gov (United States)

    Luo, Jin; Liu, Min-Xuan; Ren, Qiao-Yun; Chen, Ze; Tian, Zhan-Cheng; Hao, Jia-Wei; Wu, Feng; Liu, Xiao-Cui; Luo, Jian-Xun; Yin, Hong; Wang, Hui; Liu, Guang-Yuan

    2017-01-01

    Ticks are important vectors in the transmission of a broad range of micropathogens to vertebrates, including humans. Because of the role of ticks in disease transmission, identifying and characterizing the micropathogen profiles of tick populations have become increasingly important. The objective of this study was to survey the micropathogens of Hyalomma rufipes ticks. Illumina HiSeq2000 technology was utilized to perform deep sequencing of small RNAs (sRNAs) extracted from field-collected H. rufipes ticks in Gansu Province, China. The resultant sRNA library data revealed that the surveyed tick populations produced reads that were homologous to St. Croix River Virus (SCRV) sequences. We also observed many reads that were homologous to microbial and/or pathogenic isolates, including bacteria, protozoa, and fungi. As part of this analysis, a phylogenetic tree was constructed to display the relationships among the homologous sequences that were identified. The study offered a unique opportunity to gain insight into the micropathogens of H. rufipes ticks. The effective control of arthropod vectors in the future will require knowledge of the micropathogen composition of vectors harboring infectious agents. Understanding the ecological factors that regulate vector propagation in association with the prevalence and persistence of micropathogen lineages is also imperative. These interactions may affect the evolution of micropathogen lineages, especially if the micropathogens rely on the vector or host for dispersal. The sRNA deep-sequencing approach used in this analysis provides an intuitive method to survey micropathogen prevalence in ticks and other vector species. PMID:28861401

  18. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    Directory of Open Access Journals (Sweden)

    Malia A. Gehan

    2017-12-01

    Full Text Available Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  19. High-throughput analysis of ammonia oxidiser community composition via a novel, amoA-based functional gene array.

    Directory of Open Access Journals (Sweden)

    Guy C J Abell

    Full Text Available Advances in microbial ecology research are more often than not limited by the capabilities of available methodologies. Aerobic autotrophic nitrification is one of the most important and well studied microbiological processes in terrestrial and aquatic ecosystems. We have developed and validated a microbial diagnostic microarray based on the ammonia-monooxygenase subunit A (amoA) gene, enabling the in-depth analysis of the community structure of bacterial and archaeal ammonia oxidisers. The amoA microarray has been successfully applied to analyse nitrifier diversity in marine, estuarine, soil and wastewater treatment plant environments. The microarray has moderate costs for labour and consumables and enables the analysis of hundreds of environmental DNA or RNA samples per week per person. The array has been thoroughly validated with a range of individual and complex targets (amoA clones and environmental samples, respectively), combined with parallel analysis using traditional sequencing methods. The moderate cost and high throughput of the microarray makes it possible to adequately address broader questions of the ecology of microbial ammonia oxidation requiring high sample numbers and high resolution of the community composition.

  20. μTAS (micro total analysis systems) for the high-throughput measurement of nanomaterial solubility

    International Nuclear Information System (INIS)

    Tantra, R; Jarman, J

    2013-01-01

    There is a consensus in the nanoecotoxicology community that better analytical tools, i.e. faster and more accurate ones, are needed for the physicochemical characterisation of nanomaterials in environmentally/biologically relevant media. In this study, we introduce the concept of μTAS (Micro Total Analysis Systems), a term coined to encapsulate the integration of laboratory processes on a single microchip. Our focus here is on the use of a capillary electrophoresis (CE) microchip with conductivity detection and how this may be used for the measurement of dissolution of metal oxide nanomaterials. Our preliminary results clearly show promise in that the device is able to: a) measure ionic zinc in various ecotox media with high selectivity, and b) track the dynamic dissolution events of zinc oxide (ZnO) nanomaterial when dispersed in fish medium.

  1. Peptide Pattern Recognition for high-throughput protein sequence analysis and clustering

    DEFF Research Database (Denmark)

    Busk, Peter Kamp

    2017-01-01

    Large collections of protein sequences with divergent sequences are tedious to analyze for understanding their phylogenetic or structure-function relation. Peptide Pattern Recognition is an algorithm that was developed to facilitate this task but the previous version does only allow a limited...... number of sequences as input. I implemented Peptide Pattern Recognition as a multithread software designed to handle large numbers of sequences and perform analysis in a reasonable time frame. Benchmarking showed that the new implementation of Peptide Pattern Recognition is twenty times faster than...... the previous implementation on a small protein collection with 673 MAP kinase sequences. In addition, the new implementation could analyze a large protein collection with 48,570 Glycosyl Transferase family 20 sequences without reaching its upper limit on a desktop computer. Peptide Pattern Recognition...

  2. Quantitative neuroanatomy of all Purkinje cells with light sheet microscopy and high-throughput image analysis

    Directory of Open Access Journals (Sweden)

    Ludovico eSilvestri

    2015-05-01

    Full Text Available Characterizing the cytoarchitecture of the mammalian central nervous system on a brain-wide scale is becoming a compelling need in neuroscience. For example, realistic modeling of brain activity requires the definition of quantitative features of large neuronal populations in the whole brain. Quantitative anatomical maps will also be crucial to classify the cytoarchitectonic abnormalities associated with neuronal pathologies in a highly reproducible and reliable manner. In this paper, we apply recent advances in optical microscopy and image analysis to characterize the spatial distribution of Purkinje cells across the whole cerebellum. Light sheet microscopy was used to image with micron-scale resolution a fixed and cleared cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are fluorescently labeled. A fast and scalable algorithm for fully automated cell identification was applied on the image to extract the position of all the fluorescent Purkinje cells. This vectorized representation of the cell population allows a thorough characterization of the complex three-dimensional distribution of the neurons, highlighting the presence of gaps inside the lamellar organization of Purkinje cells, whose density is believed to play a significant role in autism spectrum disorders. Furthermore, clustering analysis of the localized somata permits dividing the whole cerebellum into groups of Purkinje cells with high spatial correlation, suggesting new possibilities for anatomical partitioning. The quantitative approach presented here can be extended to study the distribution of different types of cells in many brain regions and across the whole encephalon, providing a robust base for building realistic computational models of the brain, and for unbiased morphological tissue screening in the presence of pathologies and/or drug treatments.

  3. Designing small universal k-mer hitting sets for improved analysis of high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Yaron Orenstein

    2017-10-01

    Full Text Available With the rapidly increasing volume of deep sequencing data, more efficient algorithms and data structures are needed. Minimizers are a central recent paradigm that has improved various sequence analysis tasks, including hashing for faster read overlap detection, sparse suffix arrays for creating smaller indexes, and Bloom filters for speeding up sequence search. Here, we propose an alternative paradigm that can lead to substantial further improvement in these and other tasks. For integers k and L > k, we say that a set of k-mers is a universal hitting set (UHS) if every possible L-long sequence must contain a k-mer from the set. We develop a heuristic called DOCKS to find a compact UHS, which works in two phases: the first phase is solved optimally, and for the second we propose several efficient heuristics, trading set size for speed and memory. The use of heuristics is motivated by showing the NP-hardness of a closely related problem. We show that DOCKS works well in practice and produces UHSs that are very close to a theoretical lower bound. We present results for various values of k and L and, by applying them to real genomes, show that UHSs indeed improve over minimizers. In particular, DOCKS uses less than 30% of the 10-mers needed to span the human genome compared to minimizers. The software and computed UHSs are freely available at github.com/Shamir-Lab/DOCKS/ and acgt.cs.tau.ac.il/docks/, respectively.
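
    As background for the minimizer scheme that DOCKS is compared against, the short Python sketch below (illustrative only, not the DOCKS implementation) computes the (w, k)-minimizers of a sequence: the lexicographically smallest k-mer in each window of w consecutive k-mers.

        def minimizers(seq, k, w):
            """Return the set of (position, k-mer) minimizers of seq.

            In each window of w consecutive k-mers, the lexicographically
            smallest k-mer is selected (ties broken by position).
            """
            chosen = set()
            last_start = len(seq) - (w + k - 1)
            for start in range(last_start + 1):
                window = [(seq[i:i + k], i) for i in range(start, start + w)]
                kmer, pos = min(window)
                chosen.add((pos, kmer))
            return chosen

        print(sorted(minimizers("ACGTACGTGACGT", k=3, w=4)))

    Adjacent windows usually share their minimizer, so few k-mers are selected overall; a UHS plays the same sparse-sampling role but with a guarantee that holds for every possible L-long sequence rather than only for the sequences at hand.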

  4. Quantitative proteomic analysis for high-throughput screening of differential glycoproteins in hepatocellular carcinoma serum

    International Nuclear Information System (INIS)

    Gao, Hua-Jun; Chen, Ya-Jing; Zuo, Duo; Xiao, Ming-Ming; Li, Ying; Guo, Hua; Zhang, Ning; Chen, Rui-Bing

    2015-01-01

    Hepatocellular carcinoma (HCC) is a leading cause of cancer-related deaths. Novel serum biomarkers are required to increase the sensitivity and specificity of serum screening for early HCC diagnosis. This study employed a quantitative proteomic strategy to analyze the differential expression of serum glycoproteins between HCC and normal control serum samples. Lectin affinity chromatography (LAC) was used to enrich glycoproteins from the serum samples. Quantitative mass spectrometric analysis combined with stable isotope dimethyl labeling and 2D liquid chromatography (LC) separations were performed to examine the differential levels of the detected proteins between HCC and control serum samples. Western blot was used to analyze the differential expression levels of the three serum proteins. A total of 2,280 protein groups were identified in the serum samples from HCC patients by using the 2D LC-MS/MS method. Up to 36 proteins were up-regulated in the HCC serum, whereas 19 proteins were down-regulated. Three differential glycoproteins, namely, fibrinogen gamma chain (FGG), FOS-like antigen 2 (FOSL2), and α-1,6-mannosylglycoprotein 6-β-N-acetylglucosaminyltransferase B (MGAT5B) were validated by Western blot. All these three proteins were up-regulated in the HCC serum samples. A quantitative glycoproteomic method was established and proven useful to determine potential novel biomarkers for HCC

  5. Region Templates: Data Representation and Management for High-Throughput Image Analysis.

    Science.gov (United States)

    Teodoro, George; Pan, Tony; Kurc, Tahsin; Kong, Jun; Cooper, Lee; Klasky, Scott; Saltz, Joel

    2014-12-01

    We introduce a region template abstraction and framework for the efficient storage, management and processing of common data types in analysis of large datasets of high resolution images on clusters of hybrid computing nodes. The region template abstraction provides a generic container template for common data structures, such as points, arrays, regions, and object sets, within a spatial and temporal bounding box. It allows for different data management strategies and I/O implementations, while providing a homogeneous, unified interface to applications for data storage and retrieval. A region template application is represented as a hierarchical dataflow in which each computing stage may be represented as another dataflow of finer-grain tasks. The execution of the application is coordinated by a runtime system that implements optimizations for hybrid machines, including performance-aware scheduling for maximizing the utilization of computing devices and techniques to reduce the impact of data transfers between CPUs and GPUs. An experimental evaluation on a state-of-the-art hybrid cluster using a microscopy imaging application shows that the abstraction adds negligible overhead (about 3%) and achieves good scalability and high data transfer rates. Optimizations in a high speed disk based storage implementation of the abstraction to support asynchronous data transfers and computation result in an application performance gain of about 1.13×. Finally, a processing rate of 11,730 4K×4K tiles per minute was achieved for the microscopy imaging application on a cluster with 100 nodes (300 GPUs and 1,200 CPU cores). This computation rate enables studies with very large datasets.

  6. High-Throughput Genetic Analysis and Combinatorial Chiral Separations Based on Capillary Electrophoresis

    Energy Technology Data Exchange (ETDEWEB)

    Zhong, Wenwan [Iowa State Univ., Ames, IA (United States)]

    2003-01-01

    Capillary electrophoresis (CE) offers many advantages over conventional analytical methods, such as speed, simplicity, high resolution, low cost, and small sample consumption, especially for the separation of enantiomers. However, chiral method developments still can be time consuming and tedious. They designed a comprehensive enantioseparation protocol employing neutral and sulfated cyclodextrins as chiral selectors for common basic, neutral, and acidic compounds with a 96-capillary array system. By using only four judiciously chosen separation buffers, successful enantioseparations were achieved for 49 out of 54 test compounds spanning a large variety of pKa values and structures. Therefore, unknown compounds can be screened in this manner to identify optimal enantioselective conditions in just one run. In addition to superior separation efficiency for small molecules, CE is also the most powerful technique for DNA separations. Using the same multiplexed capillary system with UV absorption detection, the sequence of a short DNA template can be acquired without any dye labels. Two internal standards were utilized to adjust the migration time variations among capillaries, so that the four electropherograms for the A, T, C, G Sanger reactions could be aligned and base calling could be completed with a high level of confidence. The CE separation of DNA can be applied to study differential gene expression as well. Combined with pattern recognition techniques, small variations among electropherograms obtained by the separation of cDNA fragments produced from the total RNA samples of different human tissues can be revealed. These variations reflect the differences in total RNA expression among tissues. Thus, this CE-based approach can serve as an alternative to DNA array techniques in gene expression analysis.

  7. Live imaging of muscles in Drosophila metamorphosis: Towards high-throughput gene identification and function analysis.

    Science.gov (United States)

    Puah, Wee Choo; Wasser, Martin

    2016-03-01

    Time-lapse microscopy in developmental biology is an emerging tool for functional genomics. Phenotypic effects of gene perturbations can be studied non-invasively at multiple time points in chronological order. During metamorphosis of Drosophila melanogaster, time-lapse microscopy using fluorescent reporters allows visualization of alternative fates of larval muscles, which are a model for the study of genes related to muscle wasting. While doomed muscles enter hormone-induced programmed cell death, a smaller population of persistent muscles survives to adulthood and undergoes morphological remodeling that involves atrophy in early, and hypertrophy in late pupation. We developed a method that combines in vivo imaging, targeted gene perturbation and image analysis to identify and characterize genes involved in muscle development. Macrozoom microscopy helps to screen for interesting muscle phenotypes, while confocal microscopy in multiple locations over 4-5 days produces time-lapse images that are used to quantify changes in cell morphology. Performing a similar investigation using fixed pupal tissues would be too time-consuming and therefore impractical. We describe three applications of our pipeline. First, we show how quantitative microscopy can track and measure morphological changes of muscle throughout metamorphosis and analyze genes involved in atrophy. Second, our assay can help to identify genes that either promote or prevent histolysis of abdominal muscles. Third, we apply our approach to test new fluorescent proteins as live markers for muscle development. We describe mKO2 tagged Cysteine proteinase 1 (Cp1) and Troponin-I (TnI) as examples of proteins showing developmental changes in subcellular localization. Finally, we discuss strategies to improve throughput of our pipeline to permit genome-wide screens in the future. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  8. ImmuneDB: a system for the analysis and exploration of high-throughput adaptive immune receptor sequencing data.

    Science.gov (United States)

    Rosenfeld, Aaron M; Meng, Wenzhao; Luning Prak, Eline T; Hershberg, Uri

    2017-01-15

    As high-throughput sequencing of B cells becomes more common, the need for tools to analyze the large quantity of data also increases. This article introduces ImmuneDB, a system for analyzing vast amounts of heavy chain variable region sequences and exploring the resulting data. It can take as input raw FASTA/FASTQ data, identify genes, determine clones, construct lineages, as well as provide information such as selection pressure and mutation analysis. It uses an industry-leading database, MySQL, to provide fast analysis and avoid the complexities of using error-prone flat files. ImmuneDB is freely available at http://immunedb.com. A demo of the ImmuneDB web interface is available at http://immunedb.com/demo. Contact: Uh25@drexel.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. PipeCraft: Flexible open-source toolkit for bioinformatics analysis of custom high-throughput amplicon sequencing data.

    Science.gov (United States)

    Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho

    2017-11-01

    High-throughput sequencing methods have become a routine analysis tool in environmental sciences as well as in the public and private sectors. These methods provide vast amounts of data, which need to be analysed in several steps. Although the bioinformatics may be applied using several public tools, many analytical pipelines allow too few options for the optimal analysis of more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from Illumina, Pacific Biosciences, Ion Torrent and Roche 454 sequencing platforms. We described the design and options of PipeCraft and evaluated its performance by analysing data sets from three different sequencing platforms. We demonstrated that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.

  10. MSP-HTPrimer: a high-throughput primer design tool to improve assay design for DNA methylation analysis in epigenetics.

    Science.gov (United States)

    Pandey, Ram Vinay; Pulverer, Walter; Kallmeyer, Rainer; Beikircher, Gabriel; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas

    2016-01-01

    Bisulfite (BS) conversion-based and methylation-sensitive restriction enzyme (MSRE)-based PCR methods have been the most commonly used techniques for locus-specific DNA methylation analysis. However, both methods have advantages and limitations. Thus, an integrated approach would be extremely useful to quantify DNA methylation status with great sensitivity and specificity. Designing specific and optimized primers for target regions is the most critical and challenging step in obtaining adequate DNA methylation results using PCR-based methods. Currently, no integrated, optimized, and high-throughput methylation-specific primer design software is available for both BS- and MSRE-based methods. Therefore an integrated, powerful, and easy-to-use methylation-specific primer design pipeline with great accuracy and a high success rate would be very useful. We have developed a new web-based pipeline, called MSP-HTPrimer, to design primer pairs for MSP, BSP, pyrosequencing, COBRA, and MSRE assays on both genomic strands. First, our pipeline converts all target sequences into bisulfite-treated templates for both the forward and reverse strand and designs all possible primer pairs, followed by filtering for single nucleotide polymorphisms (SNPs) and known repeat regions. Next, each primer pair is annotated with the upstream and downstream RefSeq genes, CpG island, and cut sites (for COBRA and MSRE). Finally, MSP-HTPrimer selects specific primers from both strands based on custom and user-defined hierarchical selection criteria. MSP-HTPrimer produces a primer pair summary output table in TXT and HTML format for display and UCSC custom tracks for the resulting primer pairs in GTF format. MSP-HTPrimer is an integrated, web-based, and high-throughput pipeline with no limitation on the number and size of target sequences, and it designs MSP, BSP, pyrosequencing, COBRA, and MSRE assays. It is the only pipeline which automatically designs primers on both genomic strands.
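
    The first step the pipeline performs, converting each target into its bisulfite-treated template, follows a fixed substitution rule that is easy to sketch. The Python snippet below is a simplified illustration (not MSP-HTPrimer code): unmethylated cytosines read as T after conversion and PCR, while CpG cytosines can optionally be treated as methylated and left unconverted.

        import re

        def bisulfite_convert(seq, methylated_cpg=True):
            """Simulate bisulfite conversion of the forward strand.

            Every cytosine deaminates to uracil (read as T after PCR)
            unless protected; with methylated_cpg=True, CpG cytosines
            are assumed methylated and therefore remain C.
            """
            seq = seq.upper()
            if methylated_cpg:
                # Convert every C that is NOT followed by G.
                return re.sub(r"C(?!G)", "T", seq)
            return seq.replace("C", "T")

        template = "ACGGATCCGTACCTTCG"
        print(bisulfite_convert(template, methylated_cpg=True))   # CpGs kept
        print(bisulfite_convert(template, methylated_cpg=False))  # fully converted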

  11. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles

    Science.gov (United States)

    Zhou, Hong; Malik, Malika Amattullah; Arab, Aarthi; Hill, Matthew Thomas; Shikanov, Ariella

    2015-01-01

    Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrated the usefulness of a hydrogel-based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable to measure fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased survival rate in

  12. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles.

    Directory of Open Access Journals (Sweden)

    Hong Zhou

    Full Text Available Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrated the usefulness of a hydrogel-based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable to measure fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased
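
    The image-analysis step described in both versions of this record, converting brightfield micrographs to binary images and measuring the cleared fibrin area, was implemented by the authors in custom MATLAB® code. A rough Python/NumPy analogue (illustrative only, using a simple global intensity threshold and synthetic data) conveys the idea:

        import numpy as np

        def degradation_fraction(image, threshold=None):
            """Fraction of pixels in the cleared (degraded-fibrin) region.

            image: 2D array of grayscale intensities; the cleared halo
            around a healthy follicle appears brighter than intact gel.
            """
            img = np.asarray(image, dtype=float)
            if threshold is None:
                threshold = img.mean() + img.std()  # crude global threshold
            binary = img > threshold
            return binary.mean()

        # Synthetic example: bright disc (cleared zone) on a darker background.
        yy, xx = np.mgrid[0:200, 0:200]
        img = 40 + 10 * np.random.rand(200, 200)
        img[(yy - 100) ** 2 + (xx - 100) ** 2 < 50 ** 2] += 120
        print(f"degraded fraction: {degradation_fraction(img):.3f}")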

  13. HTSSIP: An R package for analysis of high throughput sequencing data from nucleic acid stable isotope probing (SIP) experiments.

    Directory of Open Access Journals (Sweden)

    Nicholas D Youngblut

    Full Text Available Combining high throughput sequencing with stable isotope probing (HTS-SIP) is a powerful method for mapping in situ metabolic processes to thousands of microbial taxa. However, accurately mapping metabolic processes to taxa is complex and challenging. Multiple HTS-SIP data analysis methods have been developed, including high-resolution stable isotope probing (HR-SIP), multi-window high-resolution stable isotope probing (MW-HR-SIP), quantitative stable isotope probing (qSIP), and ΔBD. Currently, there is no publicly available software designed specifically for analyzing HTS-SIP data. To address this shortfall, we have developed the HTSSIP R package, an open-source, cross-platform toolset for conducting HTS-SIP analyses in a straightforward and easily reproducible manner. The HTSSIP package, along with full documentation and examples, is available from CRAN at https://cran.r-project.org/web/packages/HTSSIP/index.html and Github at https://github.com/buckleylab/HTSSIP.

  14. In situ analysis and structural elucidation of sainfoin (Onobrychis viciifolia) tannins for high-throughput germplasm screening.

    Science.gov (United States)

    Gea, An; Stringano, Elisabetta; Brown, Ron H; Mueller-Harvey, Irene

    2011-01-26

    A rapid thiolytic degradation and cleanup procedure was developed for analyzing tannins directly in chlorophyll-containing sainfoin (Onobrychis viciifolia) plants. The technique proved suitable for complex tannin mixtures containing catechin, epicatechin, gallocatechin, and epigallocatechin flavan-3-ol units. The reaction time was standardized at 60 min to minimize the loss of structural information as a result of epimerization and degradation of terminal flavan-3-ol units. The results were evaluated by separate analysis of extractable and unextractable tannins, which accounted for 63.6-113.7% of the in situ plant tannins. It is of note that 70% aqueous acetone extracted tannins with a lower mean degree of polymerization (mDP) than was found for tannins analyzed in situ. Extractable tannins had mDP values that were between 4 and 29 units lower. The method was validated by comparing results from individual and mixed sample sets. The tannin composition of different sainfoin accessions covered a range of mDP values from 16 to 83, procyanidin/prodelphinidin (PC/PD) ratios from 19.2/80.8 to 45.6/54.4, and cis/trans ratios from 74.1/25.9 to 88.0/12.0. This is the first high-throughput screening method that is suitable for analyzing condensed tannin contents and structural composition directly in green plant tissue.
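
    The structural parameters reported here follow from simple ratios of the flavan-3-ol units released by thiolysis: mDP is total units divided by terminal units, and PC/PD is the share of (epi)catechin-type versus (epi)gallocatechin-type units. A toy Python calculation with hypothetical unit amounts (not data from the study):

        def tannin_composition(units):
            """Summarize thiolysis unit data.

            units maps flavan-3-ol unit names to molar amounts; terminal
            units are counted separately from thioether (extension)
            adducts. mDP = (terminal + extension) / terminal.
            """
            terminal = sum(v for k, v in units.items() if k.endswith("_terminal"))
            extension = sum(v for k, v in units.items() if k.endswith("_extension"))
            total = terminal + extension
            mdp = total / terminal
            # Prodelphinidin (PD) units are the gallocatechin-type ones.
            pd = sum(v for k, v in units.items() if k.startswith(("GC", "EGC")))
            return mdp, 100 * (total - pd) / total, 100 * pd / total

        # Hypothetical unit amounts (arbitrary molar units).
        units = {"C_terminal": 0.7, "EC_terminal": 0.3,
                 "EC_extension": 20.0, "EGC_extension": 42.0}
        mdp, pc, pd = tannin_composition(units)
        print(f"mDP = {mdp:.1f}, PC/PD = {pc:.1f}/{pd:.1f}")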

  15. High-throughput metagenomic analysis of petroleum-contaminated soil microbiome reveals the versatility in xenobiotic aromatics metabolism.

    Science.gov (United States)

    Bao, Yun-Juan; Xu, Zixiang; Li, Yang; Yao, Zhi; Sun, Jibin; Song, Hui

    2017-06-01

    Petroleum-contaminated soil is one of the most studied soil ecosystems because it is rich in hydrocarbon-degrading microorganisms and has broad applications in bioremediation. However, our understanding of the genomic properties and functional traits of the soil microbiome is limited. In this study, we used high-throughput metagenomic sequencing to comprehensively study the microbial community from petroleum-contaminated soils near the Tianjin Dagang oilfield in eastern China. The analysis reveals that the soil metagenome is characterized by a high level of community diversity and metabolic versatility. The metagenome community is dominated by γ-Proteobacteria and α-Proteobacteria, which are key players in petroleum hydrocarbon degradation. The functional study demonstrates over-represented enzyme groups and pathways involved in the degradation of a broad set of xenobiotic aromatic compounds, including toluene, xylene, chlorobenzoate, aminobenzoate, DDT, methylnaphthalene, and bisphenol. A composite metabolic network is proposed for the identified pathways, thus consolidating our identification of the pathways. The overall data demonstrate the great potential of the studied soil microbiome for xenobiotic aromatics degradation. The results not only establish a rich reservoir for novel enzyme discovery but also provide putative applications in bioremediation. Copyright © 2016. Published by Elsevier B.V.

  16. GxGrare: gene-gene interaction analysis method for rare variants from high-throughput sequencing data.

    Science.gov (United States)

    Kwon, Minseok; Leem, Sangseob; Yoon, Joon; Park, Taesung

    2018-03-19

    With the rapid advancement of array-based genotyping techniques, genome-wide association studies (GWAS) have successfully identified common genetic variants associated with common complex diseases. However, it has been shown that only a small proportion of the genetic etiology of complex diseases can be explained by the genetic factors identified from GWAS. This missing heritability could possibly be explained by gene-gene interaction (epistasis) and rare variants. There has been an exponential growth of gene-gene interaction analysis for common variants in terms of methodological developments and practical applications. Also, the recent advancement of high-throughput sequencing technologies makes it possible to conduct rare variant analysis. However, little progress has been made in gene-gene interaction analysis for rare variants. Here, we propose GxGrare, a new gene-gene interaction method for rare variants in the framework of multifactor dimensionality reduction (MDR) analysis. The proposed method consists of three steps: 1) collapsing the rare variants; 2) MDR analysis of the collapsed rare variants; and 3) detection of top candidate interaction pairs. GxGrare can be used for the detection of not only gene-gene interactions, but also interactions within a single gene. The proposed method is illustrated with 1,080 whole-exome sequencing samples from the Korean population in order to identify causal gene-gene interactions among rare variants for type 2 diabetes. The proposed GxGrare performs well for gene-gene interaction detection with collapsing of rare variants. GxGrare is available at http://bibs.snu.ac.kr/software/gxgrare, which contains simulation data and documentation. Supported operating systems include Linux and OS X.
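
    Step 1 of the described workflow, collapsing rare variants within a gene into a single burden indicator, can be sketched as follows (an illustrative Python fragment, not the authors' implementation; the MAF threshold is an assumed parameter):

        import numpy as np

        def collapse_rare_variants(genotypes, maf_threshold=0.01):
            """Collapse a sample-by-variant genotype matrix (0/1/2
            minor-allele counts) into one 0/1 indicator per sample:
            1 if the sample carries a minor allele at any rare variant
            in the gene, else 0."""
            g = np.asarray(genotypes)
            maf = g.mean(axis=0) / 2.0         # minor allele frequency per variant
            rare = g[:, maf < maf_threshold]   # keep rare variants only
            return (rare.sum(axis=1) > 0).astype(int)

        # Toy example: 6 samples x 4 variants.
        geno = np.array([[0, 0, 1, 0],
                         [0, 0, 0, 0],
                         [2, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 0],
                         [0, 0, 0, 1]])
        print(collapse_rare_variants(geno, maf_threshold=0.2))

    The collapsed indicators can then enter a combinatorial search such as MDR exactly as common-variant genotypes would.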

  17. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yonghua [Iowa State Univ., Ames, IA (United States)]

    2000-01-01

    Sample preparation has been one of the major bottlenecks for many high-throughput analyses. The purpose of this research was to develop a new sample preparation and integration approach for DNA sequencing, PCR-based DNA analysis and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser-induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks to DNA capillary-array sequencers. Protocols for directly sequencing the plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or heating block. Upon heating, the plasmids were released while chromosomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR-based DNA analysis, they demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical samples without purification. After PCR reactions using cheek cells, blood or HIV-1 gag DNA, the reaction mixtures were injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection, and the elimination of purification of the DNA sample before or after the PCR reaction, will make this approach an

  18. CSReport: A New Computational Tool Designed for Automatic Analysis of Class Switch Recombination Junctions Sequenced by High-Throughput Sequencing.

    Science.gov (United States)

    Boyer, François; Boutouil, Hend; Dalloul, Iman; Dalloul, Zeinab; Cook-Moreau, Jeanne; Aldigier, Jean-Claude; Carrion, Claire; Herve, Bastien; Scaon, Erwan; Cogné, Michel; Péron, Sophie

    2017-05-15

    B cells ensure humoral immune responses due to the production of Ag-specific memory B cells and Ab-secreting plasma cells. In secondary lymphoid organs, Ag-driven B cell activation induces terminal maturation and Ig isotype class switch (class switch recombination [CSR]). CSR creates a virtually unique IgH locus in every B cell clone by intrachromosomal recombination between two switch (S) regions upstream of each C region gene. The amounts and structural features of CSR junctions reveal valuable information about the CSR mechanism, and analysis of CSR junctions is useful in basic and clinical research studies of B cell functions. To provide an automated tool able to analyze large data sets of CSR junction sequences produced by high-throughput sequencing (HTS), we designed CSReport, a software program dedicated to supporting analysis of CSR recombination junctions sequenced with an HTS-based protocol (Ion Torrent technology). CSReport was assessed using simulated data sets of CSR junctions and then used for analysis of Sμ-Sα and Sμ-Sγ1 junctions from CH12F3 cells and primary murine B cells, respectively. CSReport identifies junction segment breakpoints on reference sequences and junction structure (blunt-ended junctions or junctions with insertions or microhomology). Besides the ability to analyze unprecedentedly large libraries of junction sequences, CSReport will provide a unified framework for CSR junction studies. Our results show that CSReport is an accurate tool for analysis of sequences from our HTS-based protocol for CSR junctions, thereby facilitating and accelerating their study. Copyright © 2017 by The American Association of Immunologists, Inc.
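
    One structural feature reported here, junction microhomology, is the stretch of sequence at the breakpoint that matches both the donor and acceptor S regions. A simplified Python sketch of that classification logic (illustrative only; CSReport itself works from alignments of HTS reads to reference S-region sequences):

        def microhomology_length(upstream, downstream):
            """Length of the longest overlap where the end of the
            upstream (donor) segment equals the start of the downstream
            (acceptor) segment, i.e., sequence attributable to either side."""
            max_len = min(len(upstream), len(downstream))
            for n in range(max_len, 0, -1):
                if upstream[-n:] == downstream[:n]:
                    return n
            return 0

        def classify_junction(upstream, downstream, insertion=""):
            if insertion:
                return f"insertion ({len(insertion)} nt)"
            n = microhomology_length(upstream, downstream)
            return "blunt" if n == 0 else f"microhomology ({n} nt)"

        print(classify_junction("GAGCTGAGCTGA", "CTGATTGGGA"))  # 4-nt microhomology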

  19. Introduction of High Throughput Magnetic Resonance T2-Weighted Image Texture Analysis for WHO Grade 2 and 3 Gliomas.

    Science.gov (United States)

    Kinoshita, Manabu; Sakai, Mio; Arita, Hideyuki; Shofuda, Tomoko; Chiba, Yasuyoshi; Kagawa, Naoki; Watanabe, Yoshiyuki; Hashimoto, Naoya; Fujimoto, Yasunori; Yoshimine, Toshiki; Nakanishi, Katsuyuki; Kanemura, Yonehiro

    2016-01-01

    Reports have suggested that tumor textures presented on T2-weighted images correlate with the genetic status of glioma. Therefore, development of an image analysis framework that is capable of objective and high-throughput image texture analysis for large-scale image data collection is needed. The current study aimed to address the development of such a framework by introducing two novel parameters for image textures on T2-weighted images, i.e., Shannon entropy and Prewitt filtering. Twenty-two WHO grade 2 and 28 grade 3 glioma patients were included, for whom pre-surgical MRI and IDH1 mutation status were available. Heterogeneous lesions showed statistically higher Shannon entropy than homogeneous lesions (p = 0.006), and ROC curve analysis proved that Shannon entropy on T2WI was a reliable indicator for discrimination of homogeneous and heterogeneous lesions (p = 0.015, AUC = 0.73). Lesions with well-defined borders exhibited statistically higher Edge mean and Edge median values using Prewitt filtering than those with vague lesion borders (p = 0.0003 and p = 0.0005, respectively). ROC curve analysis also proved that both Edge mean and median values were promising indicators for discrimination of lesions with vague and well-defined borders, and both performed in a comparable manner (p = 0.0002, AUC = 0.81). These two image metrics reflect lesion texture described on T2WI and were validated by readings of a neuro-radiologist who was blinded to the results. This observation will facilitate further use of this technique in future large-scale image analysis of glioma.
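
    Both proposed metrics are standard image-processing operations. A hedged Python sketch using NumPy/SciPy (not the authors' code) of Shannon entropy over an ROI's intensity histogram and of Edge mean/median via Prewitt gradient magnitude:

        import numpy as np
        from scipy.ndimage import prewitt

        def shannon_entropy(roi, bins=64):
            """Shannon entropy (bits) of the ROI intensity histogram."""
            hist, _ = np.histogram(roi, bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        def edge_stats(roi):
            """Mean and median Prewitt gradient magnitude over the ROI."""
            gx = prewitt(np.asarray(roi, dtype=float), axis=0)
            gy = prewitt(np.asarray(roi, dtype=float), axis=1)
            mag = np.hypot(gx, gy)
            return float(mag.mean()), float(np.median(mag))

        rng = np.random.default_rng(0)
        homogeneous = rng.normal(100, 2, (64, 64))    # uniform-looking lesion
        heterogeneous = rng.normal(100, 25, (64, 64)) # mixed-intensity lesion
        print(shannon_entropy(homogeneous), shannon_entropy(heterogeneous))
        print(edge_stats(heterogeneous))

    On such synthetic patches the heterogeneous texture yields the higher entropy, mirroring the discrimination reported above.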

  20. High-throughput analysis for preparation, processing and analysis of TiO2 coatings on steel by chemical solution deposition

    International Nuclear Information System (INIS)

    Cuadrado Gil, Marcos; Van Driessche, Isabel; Van Gils, Sake; Lommens, Petra; Castelein, Pieter; De Buysser, Klaartje

    2012-01-01

    Highlights: ► High-throughput preparation of TiO2 aqueous precursors. ► Analysis of stability and surface tension. ► Deposition of TiO2 coatings. Abstract: A high-throughput preparation, processing and analysis of titania coatings prepared by chemical solution deposition from water-based precursors at low temperature (≈250 °C) on two different types of steel substrates (Aluzinc® and bright annealed) is presented. The use of the high-throughput equipment allows fast preparation of multiple samples, saving time, energy and material, and helps to test the scalability of the process. The process itself includes the use of IR curing for aqueous ceramic precursors and the possibility of using UV irradiation before the final sintering step. The IR curing method permits a much faster curing step compared to normal high-temperature treatments in traditional convection devices (i.e., tube furnaces). The formulations, also prepared by high-throughput equipment, are found to be stable in the operational pH range of the substrates (6.5–8.5). Titanium alkoxides themselves lack stability in purely water-based environments, but the presence of the different organic complexing agents protects them from hydrolysis and precipitation reactions. The wetting interaction between the substrates and the various formulations is studied by determining the surface free energy of the substrates and the polar and dispersive components of the surface tension of the solutions. The mild temperature program used for preparation of the coatings, however, does not lead to the formation of pure crystalline material, which is necessary for the desired photocatalytic and super-hydrophilic behavior of these coatings. Nevertheless, some activity can be reported for these amorphous coatings by monitoring the discoloration of methylene blue in water under UV irradiation.

  1. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    Science.gov (United States)

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  2. Genetic analysis and gene mapping of a low stigma exposed mutant gene by high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Xiao Ma

    Full Text Available Rice is one of the main food crops, and several studies have examined the molecular mechanism of rice stigma exposure. Improving stigma exposure in the female parents of hybrid combinations can enhance the efficiency of hybrid breeding. In the present study, a mutant plant with low exposed stigma (lesr) was discovered among the descendants of the indica thermo-sensitive sterile line 115S. The exposed stigma (ES) rate of the mutant decreased by 70.64% compared with the wild-type variety. For genetic analysis, an F2 population was established with the mutant as the female parent and the restorer line 93S as the male parent. The F1 population was normal, while the F2 population showed a clear division into high and low exposed stigma groups, with the low-ES group accounting for 25% of the population. This agreed with a 3:1 ratio, indicating that the mutant phenotype was controlled by a recessive main-effect QTL locus, temporarily named LESR. Genome-wide SNP profiles of the high- and low-ES bulks constructed from F2 plants were compared using bulked segregant analysis in combination with high-throughput sequencing technology. The results demonstrated that the candidate locus is located on chromosome 10 of rice. Following screening of recombinant rice plants with newly developed molecular markers, the genetic region was narrowed down to 0.25 Mb, flanked by InDel markers at the physical location from 13.69 to 13.94 Mb. Within this region, 7 genes showed base differences between the parents; of these, 2 genes exhibited differences in the coding region and upstream of the coding region, respectively. Further work will aim to clone the LESR gene, verify its function and characterize the stigma variation.
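
    The mapping logic in such sequencing-based bulked segregant analysis reduces to comparing allele frequencies (SNP-indices) between the two bulks at each SNP; large differences mark the candidate region. A toy Python illustration (positions and read counts are hypothetical, not from the study):

        def snp_index(alt_reads, total_reads):
            """SNP-index: fraction of reads carrying the mutant allele."""
            return alt_reads / total_reads if total_reads else float("nan")

        # (position, alt/total reads in low-ES bulk, alt/total in high-ES bulk)
        snps = [(13_690_000, (48, 50), (26, 52)),
                (13_800_000, (51, 54), (24, 50)),
                (20_000_000, (23, 47), (25, 49))]

        for pos, (alt_l, tot_l), (alt_h, tot_h) in snps:
            delta = snp_index(alt_l, tot_l) - snp_index(alt_h, tot_h)
            flag = "  <-- candidate region" if abs(delta) > 0.4 else ""
            print(f"chr10:{pos:>10,}  dSNP-index = {delta:+.2f}{flag}")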

  3. BioVLAB-MMIA-NGS: microRNA-mRNA integrated analysis using high-throughput sequencing data.

    Science.gov (United States)

    Chae, Heejoon; Rhee, Sungmin; Nephew, Kenneth P; Kim, Sun

    2015-01-15

    It is now well established that microRNAs (miRNAs) play a critical role in regulating gene expression in a sequence-specific manner, and genome-wide efforts are underway to predict known and novel miRNA targets. However, integrated miRNA-mRNA analysis remains a major computational challenge, requiring powerful informatics systems and bioinformatics expertise. The objective of this study was to modify our widely recognized Web server for integrated mRNA-miRNA analysis (MMIA), and its subsequent deployment on the Amazon cloud (BioVLAB-MMIA), to be compatible with high-throughput platforms, including next-generation sequencing (NGS) data (e.g. RNA-seq). We developed a new version called BioVLAB-MMIA-NGS, deployed on both the Amazon cloud and a high-performance publicly available server called MAHA. By using NGS data and integrating various bioinformatics tools and databases, BioVLAB-MMIA-NGS offers several advantages. First, sequencing data is more accurate than array-based methods for determining miRNA expression levels. Second, potential novel miRNAs can be detected by using various computational methods for characterizing miRNAs. Third, because miRNA-mediated gene regulation is due to hybridization of an miRNA to its target mRNA, sequencing data can be used to identify many-to-many relationships between miRNAs and target genes with high accuracy. http://epigenomics.snu.ac.kr/biovlab_mmia_ngs/. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
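
    At the core of any integrated miRNA-mRNA analysis is a test for inverse expression between a miRNA and each predicted target across matched samples. A minimal Python illustration (toy expression values, not BioVLAB-MMIA-NGS code):

        import numpy as np

        def pearson_r(x, y):
            """Pearson correlation between two expression vectors."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            xm, ym = x - x.mean(), y - y.mean()
            return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))

        # Expression across 8 matched samples (e.g., log2 CPM from RNA-seq).
        mirna  = [5.1, 6.0, 4.2, 7.1, 6.5, 3.9, 5.8, 6.9]
        target = [8.2, 7.0, 9.1, 5.9, 6.4, 9.5, 7.2, 6.1]

        r = pearson_r(mirna, target)
        print(f"Pearson r = {r:.2f}")  # strongly negative: consistent with repression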

  4. Nonlinear mixed effects dose response modeling in high throughput drug screens: application to melanoma cell line analysis.

    Science.gov (United States)

    Ding, Kuan-Fu; Petricoin, Emanuel F; Finlay, Darren; Yin, Hongwei; Hendricks, William P D; Sereduk, Chris; Kiefer, Jeffrey; Sekulic, Aleksandar; LoRusso, Patricia M; Vuori, Kristiina; Trent, Jeffrey M; Schork, Nicholas J

    2018-01-12

    Cancer cell lines are often used in high throughput drug screens (HTS) to explore the relationship between cell line characteristics and responsiveness to different therapies. Many current analysis methods infer relationships by focusing on one aspect of cell line drug-specific dose-response curves (DRCs), the concentration causing 50% inhibition of a phenotypic endpoint (IC50). Such methods may overlook DRC features and do not simultaneously leverage information about drug response patterns across cell lines, potentially increasing false positive and negative rates in drug response associations. We consider the application of two methods, each rooted in nonlinear mixed effects (NLME) models, that test the relationships between estimated cell line DRCs and factors that might mitigate response. Both methods leverage estimation and testing techniques that consider the simultaneous analysis of different cell lines to draw inferences about any one cell line. One of the methods is designed to provide an omnibus test of the differences between cell line DRCs that is not focused on any one aspect of the DRC (such as the IC50 value). We simulated different settings and compared the different methods on the simulated data. We also compared the proposed methods against traditional IC50-based methods using 40 melanoma cell lines whose transcriptomes, proteomes, and, importantly, BRAF and related mutation profiles were available. Ultimately, we find that the NLME-based methods are more robust, powerful and, for the omnibus test, more flexible than traditional methods. Their application to the melanoma cell lines reveals insights into factors that may be clinically useful.
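    Dose-response curves of the kind summarized by an IC50 are commonly modeled with a four-parameter logistic (Hill) function. The following Python sketch fits one cell line's curve with scipy; it illustrates the per-curve building block that the NLME methods extend across cell lines, and the viability data are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(conc, bottom, top, ic50, hill):
        """Four-parameter logistic dose-response curve."""
        return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

    # Hypothetical viability data for one cell line
    conc = np.array([0.01, 0.1, 1, 10, 100])           # drug concentration (uM)
    resp = np.array([0.98, 0.95, 0.70, 0.30, 0.08])    # fraction of control

    params, _ = curve_fit(four_pl, conc, resp,
                          p0=[0.01, 1.0, 1.0, 1.0], bounds=(0, np.inf))
    print(f"estimated IC50 = {params[2]:.2f} uM")
    ```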

  5. A community resource for high-throughput quantitative RT-PCR analysis of transcription factor gene expression in Medicago truncatula

    Directory of Open Access Journals (Sweden)

    Redman Julia C

    2008-07-01

    Full Text Available Abstract Background Medicago truncatula is a model legume species that is currently the focus of an international genome sequencing effort. Although several different oligonucleotide and cDNA arrays have been produced for genome-wide transcript analysis of this species, intrinsic limitations in the sensitivity of hybridization-based technologies mean that transcripts of genes expressed at low-levels cannot be measured accurately with these tools. Amongst such genes are many encoding transcription factors (TFs, which are arguably the most important class of regulatory proteins. Quantitative reverse transcription-polymerase chain reaction (qRT-PCR is the most sensitive method currently available for transcript quantification, and one that can be scaled up to analyze transcripts of thousands of genes in parallel. Thus, qRT-PCR is an ideal method to tackle the problem of TF transcript quantification in Medicago and other plants. Results We established a bioinformatics pipeline to identify putative TF genes in Medicago truncatula and to design gene-specific oligonucleotide primers for qRT-PCR analysis of TF transcripts. We validated the efficacy and gene-specificity of over 1000 TF primer pairs and utilized these to identify sets of organ-enhanced TF genes that may play important roles in organ development or differentiation in this species. This community resource will be developed further as more genome sequence becomes available, with the ultimate goal of producing validated, gene-specific primers for all Medicago TF genes. Conclusion High-throughput qRT-PCR using a 384-well plate format enables rapid, flexible, and sensitive quantification of all predicted Medicago transcription factor mRNAs. This resource has been utilized recently by several groups in Europe, Australia, and the USA, and we expect that it will become the 'gold-standard' for TF transcript profiling in Medicago truncatula.

  6. Automation in Cytomics: A Modern RDBMS Based Platform for Image Analysis and Management in High-Throughput Screening Experiments

    NARCIS (Netherlands)

    E. Larios (Enrique); Y. Zhang (Ying); K. Yan (Kuan); Z. Di; S. LeDévédec (Sylvia); F.E. Groffen (Fabian); F.J. Verbeek

    2012-01-01

    textabstractIn cytomics bookkeeping of the data generated during lab experiments is crucial. The current approach in cytomics is to conduct High-Throughput Screening (HTS) experiments so that cells can be tested under many different experimental conditions. Given the large amount of different

  7. Cost-effective analysis of PET application in NSCLC

    International Nuclear Information System (INIS)

    Gu Aichun; Liu Jianjun; Sun Xiaoguang; Shi Yiping; Huang Gang

    2006-01-01

    Objective: To evaluate the cost-effectiveness of PET and CT for the diagnosis of non-small cell lung cancer (NSCLC) in China. Methods: Using decision analysis, the diagnostic efficiency of PET and CT for the diagnosis of NSCLC in China was analysed, and the cost for accurate diagnosis (CAD), cost for accurate staging (CAS) and cost for effective therapy (CAT) were calculated. Results: (1) For accurate diagnosis, CT was much more cost-effective than PET. (2) For accurate staging, CT was still more cost-effective than PET. (3) For the overall diagnostic and therapeutic cost, PET was more cost-effective than CT. (4) PET was preferable to CT for the diagnosis of stage I NSCLC. Conclusion: For the management of NSCLC patients in China, CT is more cost-effective for screening, whereas PET is more cost-effective for clinical staging and monitoring therapeutic effect. (authors)

  8. Metabolomic and high-throughput sequencing analysis – modern approach for the assessment of biodeterioration of materials from historic buildings

    Directory of Open Access Journals (Sweden)

    Beata eGutarowska

    2015-09-01

    Full Text Available Preservation of cultural heritage is of paramount importance worldwide. Microbial colonization of construction materials, such as wood, brick, mortar and stone in historic buildings can lead to severe deterioration. The aim of the present study was to give modern insight into the phylogenetic diversity and activated metabolic pathways of microbial communities colonizing historic objects located in the former Auschwitz II-Birkenau concentration and extermination camp in Oświęcim, Poland. For this purpose we combined molecular, microscopic and chemical methods. Selected specimens were examined using Field Emission Scanning Electron Microscopy (FESEM), metabolomic analysis and high-throughput Illumina sequencing. FESEM imaging revealed the presence of complex microbial communities comprising diatoms, fungi and bacteria, mainly cyanobacteria and actinobacteria, on sample surfaces. Microbial diversity of brick specimens appeared higher than that of the wood and was dominated by algae and cyanobacteria, while wood was mainly colonized by fungi. DNA sequences documented the presence of 15 bacterial phyla representing 99 genera, including Halomonas, Halorhodospira, Salinisphaera, Salinibacterium, Rubrobacter, Streptomyces and Arthrobacter, and 9 fungal classes represented by 113 genera, including Cladosporium, Acremonium, Alternaria, Engyodontium, Penicillium, Rhizopus and Aureobasidium. Most of the identified sequences were characteristic of organisms implicated in deterioration of wood and brick. Metabolomic data indicated the activation of numerous metabolic pathways, including those regulating the production of primary and secondary metabolites, for example, metabolites associated with the production of antibiotics, organic acids and deterioration of organic compounds. The study demonstrated that a combination of electron microscopy imaging with metabolomic and genomic techniques makes it possible to link the phylogenetic information and metabolic profiles of

  9. Metabolomic and high-throughput sequencing analysis-modern approach for the assessment of biodeterioration of materials from historic buildings.

    Science.gov (United States)

    Gutarowska, Beata; Celikkol-Aydin, Sukriye; Bonifay, Vincent; Otlewska, Anna; Aydin, Egemen; Oldham, Athenia L; Brauer, Jonathan I; Duncan, Kathleen E; Adamiak, Justyna; Sunner, Jan A; Beech, Iwona B

    2015-01-01

    Preservation of cultural heritage is of paramount importance worldwide. Microbial colonization of construction materials, such as wood, brick, mortar, and stone in historic buildings can lead to severe deterioration. The aim of the present study was to give modern insight into the phylogenetic diversity and activated metabolic pathways of microbial communities colonizing historic objects located in the former Auschwitz II-Birkenau concentration and extermination camp in Oświecim, Poland. For this purpose we combined molecular, microscopic and chemical methods. Selected specimens were examined using Field Emission Scanning Electron Microscopy (FESEM), metabolomic analysis and high-throughput Illumina sequencing. FESEM imaging revealed the presence of complex microbial communities comprising diatoms, fungi and bacteria, mainly cyanobacteria and actinobacteria, on sample surfaces. Microbial diversity of brick specimens appeared higher than that of the wood and was dominated by algae and cyanobacteria, while wood was mainly colonized by fungi. DNA sequences documented the presence of 15 bacterial phyla representing 99 genera, including Halomonas, Halorhodospira, Salinisphaera, Salinibacterium, Rubrobacter, Streptomyces and Arthrobacter, and nine fungal classes represented by 113 genera, including Cladosporium, Acremonium, Alternaria, Engyodontium, Penicillium, Rhizopus, and Aureobasidium. Most of the identified sequences were characteristic of organisms implicated in deterioration of wood and brick. Metabolomic data indicated the activation of numerous metabolic pathways, including those regulating the production of primary and secondary metabolites, for example, metabolites associated with the production of antibiotics, organic acids and deterioration of organic compounds. The study demonstrated that a combination of electron microscopy imaging with metabolomic and genomic techniques makes it possible to link the phylogenetic information and metabolic profiles of microbial communities

  10. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    Science.gov (United States)

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
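    The idea behind gPCA can be summarized in a few lines of linear algebra: compare the variance explained when the first principal component is guided by the batch labels with the variance explained by ordinary PCA, and calibrate the ratio with a permutation test. The Python sketch below follows that idea under stated assumptions about the statistic's form; for real analyses the authors' gPCA R package is the reference implementation.

    ```python
    import numpy as np

    def gpca_delta(X, batch):
        """Ratio of batch-guided to unguided first-PC variance (gPCA-style statistic).

        X: (samples x features) matrix; batch: integer batch label per sample.
        """
        Xc = X - X.mean(axis=0)                    # column-center the data
        Y = np.eye(batch.max() + 1)[batch]         # one-hot batch indicator matrix
        v_unguided = np.linalg.svd(Xc, full_matrices=False)[2][0]
        v_guided = np.linalg.svd(Y.T @ Xc, full_matrices=False)[2][0]
        return np.var(Xc @ v_guided) / np.var(Xc @ v_unguided)

    def gpca_pvalue(X, batch, n_perm=1000, seed=0):
        """Permutation test: is delta larger than expected with shuffled batches?"""
        rng = np.random.default_rng(seed)
        delta = gpca_delta(X, batch)
        null = [gpca_delta(X, rng.permutation(batch)) for _ in range(n_perm)]
        return delta, float(np.mean(np.asarray(null) >= delta))
    ```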

  11. Intestinal microbiota in healthy U.S. young children and adults--a high throughput microarray analysis.

    Directory of Open Access Journals (Sweden)

    Tamar Ringel-Kulka

    Full Text Available It is generally believed that the infant's microbiota is established during the first 1-2 years of life. However, there are scarce data on its characterization and its comparison to the adult-like microbiota in subsequent years. To characterize and compare the intestinal microbiota in healthy young children (1-4 years) and healthy adults from the North Carolina region in the U.S., high-throughput bacterial phylogenetic microarray analysis was used. Detailed characterization and comparison of the intestinal microbiota of healthy children aged 1-4 years (n = 28) and healthy adults of 21-60 years (n = 23) was carried out using the Human Intestinal Tract Chip (HITChip) phylogenetic microarray targeting the V1 and V6 regions of 16S rRNA and quantitative PCR. The HITChip microarray data indicate that Actinobacteria, Bacilli, Clostridium cluster IV and Bacteroidetes are the predominant phylum-like groups that exhibit differences between young children and adults. The phylum-like group Clostridium cluster XIVa was equally predominant in young children and adults and is thus considered to be established at an early age. At the genus-like level, significant (3.6-fold or greater, higher or lower) differences in the abundance of 26 genera were observed between young children and adults. Young U.S. children have a significantly (3.5-fold) higher abundance of Bifidobacterium species than adults from the same location. However, the microbiota of young children is less diverse than that of adults. We show that the establishment of an adult-like intestinal microbiota occurs at a later age than previously reported. Characterizing the microbiota and its development in the early years of life may help identify 'windows of opportunity' for interventional strategies that may promote health and prevent or mitigate disease processes.

  12. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods, and for detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high-throughput analysis of protein expression levels, interactions, and functional states.

  13. Introduction of High Throughput Magnetic Resonance T2-Weighted Image Texture Analysis for WHO Grade 2 and 3 Gliomas.

    Directory of Open Access Journals (Sweden)

    Manabu Kinoshita

    Full Text Available Reports have suggested that tumor textures presented on T2-weighted images correlate with the genetic status of glioma. Therefore, an image-analysis framework capable of objective, high-throughput image texture analysis for large-scale image data collections is needed. The current study aimed to develop such a framework by introducing two novel parameters for image texture on T2-weighted images, i.e., Shannon entropy and Prewitt filtering. Twenty-two WHO grade 2 and 28 grade 3 glioma patients for whom pre-surgical MRI and IDH1 mutation status were available were included. Heterogeneous lesions showed statistically higher Shannon entropy than homogeneous lesions (p = 0.006), and ROC curve analysis showed that Shannon entropy on T2WI was a reliable indicator for discrimination of homogeneous and heterogeneous lesions (p = 0.015, AUC = 0.73). Lesions with well-defined borders exhibited statistically higher Edge mean and Edge median values under Prewitt filtering than those with vague borders (p = 0.0003 and p = 0.0005, respectively). ROC curve analysis also showed that both Edge mean and median values were promising indicators for discrimination of lesions with vague and well-defined borders, and the two performed comparably (p = 0.0002, AUC = 0.81 and p < 0.0001, AUC = 0.83, respectively). Finally, IDH1 wild-type gliomas showed statistically lower Shannon entropy on T2WI than IDH1-mutated gliomas (p = 0.007), but no difference was observed between IDH1 wild-type and mutated gliomas in Edge median values under Prewitt filtering. The current study introduced two image metrics that reflect lesion texture described on T2WI. These two metrics were validated against the readings of a neuro-radiologist who was blinded to the results. This observation will facilitate further use of this technique in future large-scale image analysis of glioma.
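    Both metrics are straightforward to compute from a grayscale region of interest: Shannon entropy from the intensity histogram, and edge statistics from the Prewitt gradient magnitude. A minimal Python sketch with numpy/scipy is shown below; the bin count and preprocessing are illustrative assumptions, not the study's exact protocol.

    ```python
    import numpy as np
    from scipy import ndimage

    def shannon_entropy(roi, bins=64):
        """Shannon entropy (bits) of the intensity histogram of a 2-D ROI."""
        hist, _ = np.histogram(roi, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def prewitt_edge_stats(roi):
        """Mean and median of the Prewitt gradient magnitude ('Edge mean/median')."""
        gx = ndimage.prewitt(roi.astype(float), axis=0)
        gy = ndimage.prewitt(roi.astype(float), axis=1)
        mag = np.hypot(gx, gy)
        return float(mag.mean()), float(np.median(mag))
    ```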

  14. Comparing the normalization methods for the differential analysis of Illumina high-throughput RNA-Seq data.

    Science.gov (United States)

    Li, Peipei; Piao, Yongjun; Shon, Ho Sun; Ryu, Keun Ho

    2015-10-28

    Recently, rapid improvements in technology and decreases in sequencing costs have made RNA-Seq a widely used technique to quantify gene expression levels. Various normalization approaches have been proposed, owing to the importance of normalization in the analysis of RNA-Seq data. A comparison of recently proposed normalization methods is required to generate suitable guidelines for the selection of the most appropriate approach for future experiments. In this paper, we compared eight non-abundance (RC, UQ, Med, TMM, DESeq, Q, RPKM, and ERPKM) and two abundance estimation normalization methods (RSEM and Sailfish). The experiments were based on real Illumina high-throughput RNA-Seq of 35- and 76-nucleotide sequences produced in the MAQC project and on simulated reads. Reads were mapped to the human genome obtained from the UCSC Genome Browser Database. For precise evaluation, we investigated the Spearman correlation between the normalization results from RNA-Seq and MAQC qRT-PCR values for 996 genes. Based on this work, we showed that out of the eight non-abundance estimation normalization methods, RC, UQ, Med, TMM, DESeq, and Q gave similar normalization results for all data sets. For RNA-Seq of a 35-nucleotide sequence, RPKM showed the highest correlation, but for RNA-Seq of a 76-nucleotide sequence it showed the lowest correlation of all the methods. ERPKM did not improve on RPKM. Between the two abundance estimation normalization methods, for RNA-Seq of a 35-nucleotide sequence, higher correlation was obtained with Sailfish than with RSEM, which was better than not using abundance estimation methods at all. However, for RNA-Seq of a 76-nucleotide sequence, the results achieved by RSEM were similar to those obtained without abundance estimation, and were much better than those with Sailfish. Furthermore, we found that adding a poly-A tail increased alignment numbers but did not improve normalization results. Spearman correlation analysis revealed that RC, UQ
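    Two of the compared methods are easy to state directly: RPKM scales counts by gene length and library size, while upper-quartile (UQ) normalization rescales each library by its 75th-percentile nonzero count. A small Python sketch of both, operating on a hypothetical count matrix, is given below.

    ```python
    import numpy as np

    def rpkm(counts, gene_lengths_bp):
        """RPKM: reads per kilobase of transcript per million mapped reads.

        counts: (genes x samples) array; gene_lengths_bp: array of gene lengths in bp.
        """
        per_million = counts.sum(axis=0) / 1e6       # library sizes in millions
        per_kb = gene_lengths_bp[:, None] / 1e3      # gene lengths in kilobases
        return counts / per_million[None, :] / per_kb

    def upper_quartile(counts):
        """Upper-quartile normalization: divide each library by the 75th
        percentile of its nonzero gene counts."""
        scaled = np.empty_like(counts, dtype=float)
        for j in range(counts.shape[1]):
            col = counts[:, j]
            uq = np.percentile(col[col > 0], 75)
            scaled[:, j] = col / uq
        return scaled
    ```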

  15. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection.

  16. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    International Nuclear Information System (INIS)

    Hui Su

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection.

  17. High-throughput tandem mass spectrometry multiplex analysis for newborn urinary screening of creatine synthesis and transport disorders, Triple H syndrome and OTC deficiency.

    Science.gov (United States)

    Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela

    2014-09-25

    Creatine synthesis and transport disorders, Triple H syndrome and ornithine transcarbamylase (OTC) deficiency are treatable inborn errors of metabolism, and early screening of patients has been found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. The methodology allows the simultaneous analysis of creatine, guanidinoacetic acid, orotic acid, uracil, creatinine and their respective internal standards, using positive or negative electrospray ionization mode depending on the compound. The precision and accuracy were suitable for screening for inherited disorders by biochemical laboratories.
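    Quantitation in multiplexed MS assays of this kind is typically done by stable-isotope dilution: the analyte's peak area is ratioed to that of its labeled internal standard and mapped to a concentration through a calibration line. The Python sketch below shows that arithmetic on hypothetical numbers; it illustrates the general principle, not this assay's validated calibration.

    ```python
    def concentration(area_analyte, area_internal_std, conc_internal_std,
                      slope=1.0, intercept=0.0):
        """Isotope-dilution quantitation: map the analyte/IS area ratio to a
        concentration via a linear calibration (slope, intercept)."""
        ratio = area_analyte / area_internal_std
        return slope * ratio * conc_internal_std + intercept

    # Hypothetical creatine measurement from a dried urine spot
    print(concentration(area_analyte=8.4e5, area_internal_std=4.2e5,
                        conc_internal_std=50.0))  # -> 100.0 (umol/L, illustrative)
    ```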

  18. High-throughput analysis of candidate imprinted genes and allele-specific gene expression in the human term placenta

    Directory of Open Access Journals (Sweden)

    Clark Taane G

    2010-04-01

    Full Text Available Abstract Background Imprinted genes show expression from one parental allele only and are important for development and behaviour. This extreme mode of allelic imbalance has been described for approximately 56 human genes. Imprinting status is often disrupted in cancer and dysmorphic syndromes. More subtle variation of gene expression that is not parent-of-origin specific, termed 'allele-specific gene expression' (ASE), is more common and may give rise to milder phenotypic differences. Using two allele-specific high-throughput technologies alongside bioinformatics predictions, normal term human placenta was screened to find new imprinted genes and to ascertain the extent of ASE in this tissue. Results Twenty-three family trios of placental cDNA, placental genomic DNA (gDNA) and gDNA from both parents were tested for 130 candidate genes with the Sequenom MassArray system. Six genes were found differentially expressed but none imprinted. The Illumina ASE BeadArray platform was then used to test 1536 SNPs in 932 genes. The array was enriched for the human orthologues of 124 mouse candidate genes from bioinformatics predictions and 10 human candidate imprinted genes from EST database mining. After quality-control pruning, a total of 261 informative SNPs (in 214 genes) remained for analysis. Imprinting with maternal expression was demonstrated for the lymphocyte imprinted gene ZNF331 in human placenta. Two potential differentially methylated regions (DMRs) were found in the vicinity of ZNF331. None of the bioinformatically predicted candidates tested showed imprinting, except for skewed allelic expression in a parent-specific manner observed for PHACTR2, a neighbour of the imprinted PLAGL1 gene. ASE was detected for two or more individuals in 39 candidate genes (18%). Conclusions Both Sequenom and Illumina assays were sensitive enough to study imprinting and strong allelic bias. Previous bioinformatics approaches were not predictive of new imprinted genes
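    A standard way to call allelic imbalance at a heterozygous SNP is a binomial test of the two allele counts against the 50:50 expectation for balanced biallelic expression. The Python sketch below shows this common approach on hypothetical counts; it is a generic illustration, not the Sequenom- or Illumina-specific calling used in the study.

    ```python
    from scipy.stats import binomtest

    def allelic_imbalance(ref_count, alt_count, alpha=0.01):
        """Two-sided binomial test of allele counts against balanced (p=0.5) expression."""
        result = binomtest(ref_count, ref_count + alt_count, p=0.5)
        ratio = ref_count / (ref_count + alt_count)
        return ratio, result.pvalue, result.pvalue < alpha

    # Hypothetical cDNA allele counts at one heterozygous SNP
    ratio, p, imbalanced = allelic_imbalance(180, 20)
    print(f"ref fraction = {ratio:.2f}, p = {p:.2e}, imbalanced = {imbalanced}")
    ```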

  19. Effectiveness of a high-throughput genetic analysis in the identification of responders/non-responders to CYP2D6-metabolized drugs.

    Science.gov (United States)

    Savino, Maria; Seripa, Davide; Gallo, Antonietta P; Garrubba, Maria; D'Onofrio, Grazia; Bizzarro, Alessandra; Paroni, Giulia; Paris, Francesco; Mecocci, Patrizia; Masullo, Carlo; Pilotto, Alberto; Santini, Stefano A

    2011-01-01

    Recent studies investigating the single cytochrome P450 (CYP) 2D6 allele *2A reported an association with the response to drug treatments. More genetic data can be obtained, however, by high-throughput-based technologies. The aim of this study was the high-throughput analysis of CYP2D6 polymorphisms to evaluate its effectiveness in the identification of patients who respond or do not respond to CYP2D6-metabolized drugs. An attempt to compare our results with those previously obtained with the standard analysis of CYP2D6 allele *2A was also made. Sixty blood samples from patients treated with CYP2D6-metabolized drugs, previously genotyped for the allele CYP2D6*2A, were analyzed for CYP2D6 polymorphisms with the AutoGenomics INFINITI CYP4502D6-I assay on the AutoGenomics INFINITI analyzer. A higher frequency of mutated alleles was observed in responder than in non-responder patients (75.38% vs 43.48%; p = 0.015). Thus, the presence of a mutated CYP2D6 allele was associated with a response to CYP2D6-metabolized drugs (OR = 4.044; 1.348-12.154). No difference was observed in the distribution of allele *2A (p = 0.320). The high-throughput genetic analysis of CYP2D6 polymorphisms discriminated responders/non-responders better than the standard analysis of the CYP2D6 allele *2A. A high-throughput genetic assay of CYP2D6 may be useful to identify patients with different clinical responses to CYP2D6-metabolized drugs.

  20. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    Science.gov (United States)

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  1. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.

  2. Cost-effectiveness analysis of sandhill crane habitat management

    Science.gov (United States)

    Kessler, Andrew C.; Merchant, James W.; Shultz, Steven D.; Allen, Craig R.

    2013-01-01

    Invasive species often threaten native wildlife populations and strain the budgets of agencies charged with wildlife management. We demonstrate the potential of cost-effectiveness analysis to improve the efficiency and value of efforts to enhance sandhill crane (Grus canadensis) roosting habitat. We focus on the central Platte River in Nebraska (USA), a region of international ecological importance for migrating avian species including sandhill cranes. Cost-effectiveness analysis is a valuation process designed to compare alternative actions based on the cost of achieving a pre-determined objective. We estimated costs for removal of invasive vegetation using geographic information system simulations and calculated benefits as the increase in area of sandhill crane roosting habitat. We generated cost effectiveness values for removing invasive vegetation on 7 land parcels and for the entire central Platte River to compare the cost-effectiveness of management at specific sites and for the central Platte River landscape. Median cost effectiveness values for the 7 land parcels evaluated suggest that costs for creating 1 additional hectare of sandhill crane roosting habitat totaled US $1,595. By contrast, we found that creating an additional hectare of sandhill crane roosting habitat could cost as much as US $12,010 for some areas in the central Platte River, indicating substantial cost savings can be achieved by using a cost effectiveness analysis to target specific land parcels for management. Cost-effectiveness analysis, used in conjunction with geographic information systems, can provide decision-makers with a new tool for identifying the most economically efficient allocation of resources to achieve habitat management goals.

  3. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, the author introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, they demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study the single cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, they have identified individual peaks in the fluorescence electropherograms as serotonin released from the granular core on contact with the surrounding fluid.

  4. Bayesian cost-effectiveness analysis with the R package BCEA

    CERN Document Server

    Baio, Gianluca; Heath, Anna

    2017-01-01

    The book provides a description of the process of health economic evaluation and modelling for cost-effectiveness analysis, particularly from the perspective of a Bayesian statistical approach. Some relevant theory and introductory concepts are presented using practical examples and two running case studies. The book also describes in detail how to perform health economic evaluations using the R package BCEA (Bayesian Cost-Effectiveness Analysis). BCEA can be used to post-process the results of a Bayesian cost-effectiveness model and perform advanced analyses producing standardised and highly customisable outputs. It presents all the features of the package, including its many functions and their practical application, as well as its user-friendly web interface. The book is a valuable resource for statisticians and practitioners working in the field of health economics wanting to simplify and standardise their workflow, for example in the preparation of dossiers in support of marketing authorisation, or acade...

  5. Product Chemistry and Process Efficiency of Biomass Torrefaction, Pyrolysis and Gasification Studied by High-Throughput Techniques and Multivariate Analysis

    Science.gov (United States)

    Xiao, Li

    Despite great passion and endless efforts on the development of renewable energy from biomass, the commercialization and scale-up of biofuel production is still under pressure and facing challenges. New ideas and facilities are being tested around the world, aimed at reducing cost and improving product value. Cutting-edge technologies involving analytical chemistry, statistical analysis, industrial engineering, computer simulation, and mathematical modeling keep integrating modern elements into this classic research. One of the challenges of commercializing biofuel production is the complexity of the chemical composition of biomass feedstocks and products. Because of this, feedstock selection and process optimization cannot be conducted efficiently. This dissertation attempts to further evaluate the biomass thermal decomposition process using both traditional methods and an advanced technique (Pyrolysis Molecular Beam Mass Spectrometry). Focus was placed on generating a database of thermal decomposition products from biomass at different temperatures, finding the relationship between traditional methods and advanced techniques, evaluating process efficiency and optimizing reaction conditions, comparing typically utilized biomass feedstocks, and searching for innovative species for economically viable feedstock preparation concepts. Lab-scale quartz tube reactors and 80-μL stainless steel sample cups coupled with an auto-sampling system were utilized to simulate the complicated reactions that occur in real fluidized or entrained-flow reactors. The two main high-throughput analytical techniques used are Near Infrared Spectroscopy (NIR) and Pyrolysis Molecular Beam Mass Spectrometry (Py-MBMS). Mass balance, carbon balance, and product distribution are presented in detail. Thermal decomposition temperatures ranged from 200°C to 950°C. Feedstocks used in the study involve typical hardwood and softwood (red oak, white oak, yellow poplar, loblolly pine

  6. A Cost-Effectiveness Analysis of Early Literacy Interventions

    Science.gov (United States)

    Simon, Jessica

    2011-01-01

    Success in early literacy activities is associated with improved educational outcomes, including reduced dropout risk, in-grade retention, and special education referrals. When considering programs that will work for a particular school and context, cost-effectiveness analysis may provide useful information for decision makers. The study…

  7. Metagenomic analysis and functional characterization of the biogas microbiome using high throughput shotgun sequencing and a novel binning strategy

    DEFF Research Database (Denmark)

    Campanaro, Stefano; Treu, Laura; Kougias, Panagiotis

    2016-01-01

    Biogas production is an economically attractive technology that has gained momentum worldwide over the past years. Biogas is produced by a biologically mediated process, widely known as "anaerobic digestion." This process is performed by a specialized and complex microbial community, in which...... performed using >400 proteins revealed that the biogas community is a trove of new species. A new approach based on functional properties as per network representation was developed to assign roles to the microbial species. The organization of the anaerobic digestion microbiome is resembled by a funnel...... on the phylogenetic and functional characterization of the microbial community populating biogas reactors. By applying for the first time high-throughput sequencing and a novel binning strategy, the identified genes were anchored to single genomes providing a clear understanding of their metabolic pathways...

  8. Human Leukocyte Antigen Typing Using a Knowledge Base Coupled with a High-Throughput Oligonucleotide Probe Array Analysis

    Science.gov (United States)

    Zhang, Guang Lan; Keskin, Derin B.; Lin, Hsin-Nan; Lin, Hong Huang; DeLuca, David S.; Leppanen, Scott; Milford, Edgar L.; Reinherz, Ellis L.; Brusic, Vladimir

    2014-01-01

    Human leukocyte antigens (HLA) are important biomarkers because multiple diseases, drug toxicity, and vaccine responses reveal strong HLA associations. Current clinical HLA typing is an elimination process requiring serial testing. We present an alternative in situ synthesized DNA-based microarray method that contains hundreds of thousands of probes representing a complete overlapping set covering 1,610 clinically relevant HLA class I alleles accompanied by computational tools for assigning HLA type to 4-digit resolution. Our proof-of-concept experiment included 21 blood samples, 18 cell lines, and multiple controls. The method is accurate, robust, and amenable to automation. Typing errors were restricted to homozygous samples or those with very closely related alleles from the same locus, but readily resolved by targeted DNA sequencing validation of flagged samples. High-throughput HLA typing technologies that are effective, yet inexpensive, can be used to analyze the world’s populations, benefiting both global public health and personalized health care. PMID:25505899

  9. Cost Effectiveness Analysis of Optimal Malaria Control Strategies in Kenya

    Directory of Open Access Journals (Sweden)

    Gabriel Otieno

    2016-03-01

    Full Text Available Malaria remains a leading cause of mortality and morbidity among children under five and pregnant women in sub-Saharan Africa, but it is preventable and controllable provided current recommended interventions are properly implemented. Better utilization of malaria intervention strategies will ensure value for money and produce health improvements in the most cost-effective way. The purpose of the value-for-money drive is to develop a better understanding (and better articulation) of costs and results so that more informed, evidence-based choices can be made. Cost-effectiveness analysis is carried out to inform decision makers on where to allocate resources for malaria interventions. This study carries out a cost-effectiveness analysis of one or all possible combinations of the optimal malaria control strategies (insecticide-treated bednets (ITNs), treatment, indoor residual spray (IRS) and intermittent preventive treatment for pregnant women (IPTp)) for four different transmission settings, in order to assess the extent to which the intervention strategies are beneficial and cost-effective. For the four different transmission settings in Kenya, the optimal solutions for the 15 strategies and their associated effectiveness were computed. Cost-effectiveness analysis using the incremental cost-effectiveness ratio (ICER) was done after ranking the strategies in order of increasing effectiveness (total infections averted). The findings show that for the endemic regions the combination of ITNs, IRS, and IPTp was the most cost-effective of all the combined strategies developed in this study for malaria disease control and prevention; for epidemic-prone areas it is the combination of treatment and IRS; for seasonal areas it is the use of ITNs plus treatment; and for low-risk areas it is the use of treatment only. Malaria transmission in Kenya can be minimized through tailor-made intervention strategies for malaria control
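    The ICER computation itself is simple once strategies are ranked by effectiveness: each surviving strategy's extra cost is divided by the extra infections it averts relative to the previous one, after strongly dominated options (more costly, no more effective) are removed. A minimal Python sketch with hypothetical cost/effect numbers is given below.

    ```python
    def icer_table(strategies):
        """Compute ICERs after removing strongly dominated strategies.

        strategies: list of (name, cost, effect) tuples, effect = infections averted.
        """
        ranked = sorted(strategies, key=lambda s: s[2])   # by increasing effectiveness
        frontier = []
        for s in ranked:
            # drop any previous option that costs at least as much but achieves less
            while frontier and frontier[-1][1] >= s[1]:
                frontier.pop()
            frontier.append(s)
        rows = [(frontier[0][0], None)]                   # baseline has no ICER
        for prev, cur in zip(frontier, frontier[1:]):
            icer = (cur[1] - prev[1]) / (cur[2] - prev[2])
            rows.append((cur[0], icer))
        return rows

    # Hypothetical (name, cost in $, infections averted)
    print(icer_table([("treatment", 1e6, 5e4),
                      ("treatment+IRS", 2.5e6, 9e4),
                      ("ITNs+IRS+IPTp", 4e6, 1.2e5)]))
    ```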

  10. Accurate CpG and non-CpG cytosine methylation analysis by high-throughput locus-specific pyrosequencing in plants.

    Science.gov (United States)

    How-Kit, Alexandre; Daunay, Antoine; Mazaleyrat, Nicolas; Busato, Florence; Daviaud, Christian; Teyssier, Emeline; Deleuze, Jean-François; Gallusci, Philippe; Tost, Jörg

    2015-07-01

    Pyrosequencing permits accurate quantification of DNA methylation of specific regions, where the proportions of the C/T polymorphism induced by sodium bisulfite treatment of DNA reflect the DNA methylation level. The commercially available high-throughput locus-specific pyrosequencing instruments allow for the simultaneous analysis of 96 samples, but restrict the DNA methylation analysis to CpG dinucleotide sites, which can be limiting in many biological systems. In contrast to mammals, where DNA methylation occurs nearly exclusively on CpG dinucleotides, plant genomes harbor DNA methylation also in other sequence contexts, including CHG and CHH motifs, which cannot be evaluated by these pyrosequencing instruments due to software limitations. Here, we present a complete pipeline for accurate CpG and non-CpG cytosine methylation analysis at single-base resolution using high-throughput locus-specific pyrosequencing. The devised approach includes the design and validation of PCR amplification on bisulfite-treated DNA and of pyrosequencing assays, as well as the quantification of the methylation level at every cytosine from the raw peak intensities of the Pyrograms by two newly developed Visual Basic applications. Our method gives accurate and reproducible results, as exemplified by the cytosine methylation analysis of the promoter regions of two tomato genes (NOR and CNR) encoding transcription regulators of fruit ripening during different stages of fruit development. Our results confirmed a significant and temporally coordinated loss of DNA methylation on specific cytosines during the early stages of fruit development in both promoters, as previously shown by WGBS. The manuscript thus describes the first high-throughput locus-specific DNA methylation analysis in plants using pyrosequencing.
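    At each interrogated cytosine, bisulfite pyrosequencing reduces methylation to a simple proportion: the C signal over the combined C + T signal. The sketch below expresses that calculation in Python on hypothetical peak intensities; the study's actual quantification is implemented as Visual Basic applications processing raw Pyrogram peaks.

    ```python
    def methylation_level(c_intensity, t_intensity):
        """Per-cytosine methylation from bisulfite pyrosequencing peak intensities.

        Unmethylated C reads as T after bisulfite conversion and PCR, so the
        methylated fraction is C / (C + T).
        """
        return c_intensity / (c_intensity + t_intensity)

    # Hypothetical raw peak heights at one CHH position
    print(f"{methylation_level(320.0, 1280.0):.1%}")  # -> 20.0%
    ```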

  11. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    Science.gov (United States)

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different forms in spreadsheets for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis engine adapted for large numbers of images, providing a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and the noisy background that often plagues automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and images of diverse quality. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model
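    Once the major and minor axial lengths are measured, spheroid volume follows from an ellipsoid approximation; V = (π/6)·L·W² is one common choice, used here as an illustrative assumption rather than SpheroidSizer's documented formula.

    ```python
    import math

    def spheroid_volume(length, width):
        """Prolate-ellipsoid volume estimate from major (length) and minor (width)
        axial lengths, assuming the third axis equals the minor one."""
        return math.pi / 6.0 * length * width ** 2

    # Hypothetical axes in micrometers
    print(f"{spheroid_volume(520.0, 480.0):.3e} um^3")
    ```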

  12. Pressure relieving support surfaces (PRESSURE) trial: cost effectiveness analysis.

    Science.gov (United States)

    Iglesias, Cynthia; Nixon, Jane; Cranny, Gillian; Nelson, E Andrea; Hawkins, Kim; Phillips, Angela; Torgerson, David; Mason, Su; Cullum, Nicky

    2006-06-17

    To assess the cost effectiveness of alternating pressure mattresses compared with alternating pressure overlays for the prevention of pressure ulcers in patients admitted to hospital. Cost effectiveness analysis carried out alongside the pressure relieving support surfaces (PRESSURE) trial; a multicentre UK based pragmatic randomised controlled trial. 11 hospitals in six UK NHS trusts. Intention to treat population comprising 1971 participants. Kaplan-Meier estimates of restricted mean time to development of pressure ulcers and total costs for treatment in hospital. Alternating pressure mattresses were associated with lower overall costs (283.6 pounds sterling per patient on average, 95% confidence interval −377.59 to 976.79 pounds sterling), mainly due to reduced length of stay in hospital, and greater benefits (a delay in time to ulceration of 10.64 days on average, −24.40 to 3.09). The differences in health benefits and total costs for hospital stay between alternating pressure mattresses and alternating pressure overlays were not statistically significant; however, a cost effectiveness acceptability curve indicated that on average alternating pressure mattresses compared with alternating pressure overlays were associated with an 80% probability of being cost saving. Alternating pressure mattresses for the prevention of pressure ulcers are more likely to be cost effective and are more acceptable to patients than alternating pressure overlays.

  13. Rapid evaporative ionization mass spectrometry for high-throughput screening in food analysis: The case of boar taint.

    Science.gov (United States)

    Verplanken, Kaat; Stead, Sara; Jandova, Renata; Poucke, Christof Van; Claereboudt, Jan; Bussche, Julie Vanden; Saeger, Sarah De; Takats, Zoltan; Wauters, Jella; Vanhaecke, Lynn

    2017-07-01

    Boar taint is a contemporary off-odor present in meat of uncastrated male pigs. As European Member States intend to abandon surgical castration of pigs by 2018, this off-odor has gained considerable research interest. In this study, rapid evaporative ionization mass spectrometry (REIMS) was explored for the rapid detection of boar taint in neck fat. Untargeted screening of samples (n=150) enabled discrimination between sow, tainted and untainted boar samples. The obtained OPLS-DA models showed excellent classification accuracy, i.e. 99% and 100% for sow and boar samples or solely boar samples, respectively. Furthermore, the obtained models demonstrated excellent validation characteristics (R²(Y) = 0.872-0.969; Q²(Y) = 0.756-0.917), which were confirmed by CV-ANOVA. Highly accurate and high-throughput (<10 s) classification of tainted and untainted boar samples was achieved, rendering REIMS a promising technique for predictive modelling in food safety and quality applications.

  14. miRanalyzer: an update on the detection and analysis of microRNAs in high-throughput sequencing experiments

    Science.gov (United States)

    Hackenberg, Michael; Rodríguez-Ezpeleta, Naiara; Aransay, Ana M.

    2011-01-01

    We present a new version of miRanalyzer, a web server and stand-alone tool for the detection of known and prediction of new microRNAs in high-throughput sequencing experiments. The new version has been notably improved regarding speed, scope and available features. Alignments are now based on the ultrafast short-read aligner Bowtie (also granting colour-space support, allowing mismatches and improving speed) and 31 genomes, including 6 plant genomes, can now be analysed (the previous version contained only 7). Differences between plant and animal microRNAs have been taken into account for the prediction models, and differential expression of both known and predicted microRNAs between two conditions can be calculated. Additionally, consensus sequences of predicted mature and precursor microRNAs can be obtained from multiple samples, which increases the reliability of the predicted microRNAs. Finally, a stand-alone version of miRanalyzer that is based on a local and easily customized database is also available; this allows the user to have more control over certain parameters as well as to use specific data such as unpublished assemblies or other libraries that are not available in the web server. miRanalyzer is available at http://bioinfo2.ugr.es/miRanalyzer/miRanalyzer.php. PMID:21515631

  15. Comparative analysis and validation of the malachite green assay for the high throughput biochemical characterization of terpene synthases.

    Science.gov (United States)

    Vardakou, Maria; Salmon, Melissa; Faraldos, Juan A; O'Maille, Paul E

    2014-01-01

    Terpenes are the largest group of natural products, with important and diverse biological roles and tremendous economic value as fragrances, flavours and pharmaceutical agents. Class-I terpene synthases (TPSs), the dominant type of TPS enzymes, catalyze the conversion of prenyl diphosphates to often structurally diverse bioactive terpene hydrocarbons and inorganic pyrophosphate (PPi). To measure their kinetic properties, current bio-analytical methods typically rely on the direct detection of hydrocarbon products by radioactivity measurements or gas chromatography-mass spectrometry (GC-MS). In this study we employed an established, rapid colorimetric assay, the pyrophosphate/malachite green assay (MG), as an alternative means for the biochemical characterization of class I TPS activity. • We describe the adaptation of the MG assay for turnover and catalytic efficiency measurements of TPSs. • We validate the method by direct comparison with established assays; the agreement of kcat/KM among methods makes this adaptation optimal for rapid evaluation of TPSs. • We demonstrate the application of the MG assay for the high-throughput screening of TPS gene libraries.
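    Catalytic efficiency (kcat/KM) from such an endpoint assay comes from fitting initial rates at several substrate concentrations to the Michaelis-Menten equation. A short Python sketch with scipy is below; the rate data and enzyme concentration are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def michaelis_menten(s, vmax, km):
        """Michaelis-Menten initial rate as a function of substrate concentration."""
        return vmax * s / (km + s)

    s = np.array([1, 2, 5, 10, 25, 50.0])         # substrate (uM)
    v = np.array([0.8, 1.4, 2.5, 3.3, 4.1, 4.5])  # initial rate (uM PPi / min)

    (vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=[5.0, 5.0])
    e_total = 0.1                                 # enzyme concentration (uM), assumed
    kcat = vmax / e_total                         # per-minute turnover number
    print(f"kcat/KM = {kcat / km:.2f} uM^-1 min^-1")
    ```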

  16. High-throughput analysis of sulfatides in cerebrospinal fluid using automated extraction and UPLC-MS/MS.

    Science.gov (United States)

    Blomqvist, Maria; Borén, Jan; Zetterberg, Henrik; Blennow, Kaj; Månsson, Jan-Eric; Ståhlman, Marcus

    2017-07-01

    Sulfatides (STs) are a group of glycosphingolipids that are highly expressed in brain. Due to their importance for normal brain function and their potential involvement in neurological diseases, development of accurate and sensitive methods for their determination is needed. Here we describe a high-throughput oriented and quantitative method for the determination of STs in cerebrospinal fluid (CSF). The STs were extracted using a fully automated liquid/liquid extraction method and quantified using ultra-performance liquid chromatography coupled to tandem mass spectrometry. With the high sensitivity of the developed method, quantification of 20 ST species from only 100 μl of CSF was performed. Validation of the method showed that the STs were extracted with high recovery (90%) and could be determined with low inter- and intra-day variation. Our method was applied to a patient cohort of subjects with an Alzheimer's disease biomarker profile. Although the total ST levels were unaltered compared with an age-matched control group, we show that the ratio of hydroxylated/nonhydroxylated STs was increased in the patient cohort. In conclusion, we believe that the fast, sensitive, and accurate method described in this study is a powerful new tool for the determination of STs in clinical as well as preclinical settings.

  17. Multiplex High-Throughput Targeted Proteomic Assay To Identify Induced Pluripotent Stem Cells.

    Science.gov (United States)

    Baud, Anna; Wessely, Frank; Mazzacuva, Francesca; McCormick, James; Camuzeaux, Stephane; Heywood, Wendy E; Little, Daniel; Vowles, Jane; Tuefferd, Marianne; Mosaku, Olukunbi; Lako, Majlinda; Armstrong, Lyle; Webber, Caleb; Cader, M Zameel; Peeters, Pieter; Gissen, Paul; Cowley, Sally A; Mills, Kevin

    2017-02-21

    Induced pluripotent stem cells have great potential as a human model system in regenerative medicine, disease modeling, and drug screening. However, their use in medical research is hampered by laborious reprogramming procedures that yield low numbers of induced pluripotent stem cells. For further applications in research, only the best, competent clones should be used. The standard assays for pluripotency are based on genomic approaches, which take up to 1 week to perform and incur significant cost. Therefore, there is a need for a rapid and cost-effective assay able to distinguish between pluripotent and nonpluripotent cells. Here, we describe a novel multiplexed, high-throughput, and sensitive peptide-based multiple reaction monitoring mass spectrometry assay, allowing for the identification and absolute quantitation of multiple core transcription factors and pluripotency markers. This assay provides simpler, high-throughput classification of cells as either pluripotent or nonpluripotent in a 7 min analysis, while being more cost-effective than conventional genomic tests.

  18. Building blocks for the development of an interface for high-throughput thin layer chromatography/ambient mass spectrometric analysis: a green methodology.

    Science.gov (United States)

    Cheng, Sy-Chyi; Huang, Min-Zong; Wu, Li-Chieh; Chou, Chih-Chiang; Cheng, Chu-Nian; Jhang, Siou-Sian; Shiea, Jentaie

    2012-07-17

    Interfacing thin layer chromatography (TLC) with ambient mass spectrometry (AMS) has been an important area of analytical chemistry because of its capability to rapidly separate and characterize chemical compounds. In this study, we developed a high-throughput TLC-AMS system that uses building blocks to deal out, deliver, and collect TLC plates through an electrospray-assisted laser desorption ionization (ELDI) source. This is the first demonstration of the use of building blocks to construct and test a TLC-MS interfacing system. With the advantages of being readily available, cheap, reusable, and extremely easy to modify without consuming any material or reagent, the use of building blocks to develop the TLC-AMS interface is undoubtedly a green methodology. The TLC plate delivery system consists of a storage box, plate dealing component, conveyer, light sensor, and plate collecting box. During a TLC-AMS analysis, the TLC plate was sent to the conveyer from a stack of TLC plates placed in the storage box. As the TLC plate passed through the ELDI source, the chemical compounds separated on the plate were desorbed by laser desorption and subsequently postionized by electrospray ionization. The samples, including a mixture of synthetic dyes and extracts of pharmaceutical drugs, were analyzed to demonstrate the capability of this TLC-ELDI/MS system for high-throughput analysis.

  19. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
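
    The core idea, efficiency-corrected template amounts scaled by input cell number and normalized to a shared reference sample, can be sketched as follows. The formula is the standard efficiency-corrected model and the numbers are invented; this is an illustration of the concept, not the paper's exact algorithm.

    ```python
    def template_amount(ct, efficiency):
        """Efficiency-corrected relative template amount from a Ct value.

        With amplification efficiency E (E = 1.0 means perfect doubling),
        the starting amount is proportional to (1 + E) ** (-Ct).
        """
        return (1.0 + efficiency) ** (-ct)

    def expression_per_cell(ct, efficiency, n_cells, ref_ct, ref_efficiency):
        """Per-cell expression normalized to a universal reference cDNA sample.

        Dividing by the reference sample, run alongside every batch, makes
        results comparable across runs and instruments.
        """
        per_cell = template_amount(ct, efficiency) / n_cells
        return per_cell / template_amount(ref_ct, ref_efficiency)

    # Invented values: a target measured in 10,000 sorted leukocytes against
    # a commercial reference cDNA run on the same plate.
    print(expression_per_cell(ct=24.8, efficiency=0.93, n_cells=1e4,
                              ref_ct=21.3, ref_efficiency=0.93))
    ```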

  20. High-Throughput Image Analysis of Fibrillar Materials: A Case Study on Polymer Nanofiber Packing, Alignment, and Defects in Organic Field Effect Transistors.

    Science.gov (United States)

    Persson, Nils E; Rafshoon, Joshua; Naghshpour, Kaylie; Fast, Tony; Chu, Ping-Hsun; McBride, Michael; Risteen, Bailey; Grover, Martha; Reichmanis, Elsa

    2017-10-18

    High-throughput discovery of process-structure-property relationships in materials through an informatics-enabled empirical approach is an increasingly utilized technique in materials research due to the rapidly expanding availability of data. Here, process-structure-property relationships are extracted for the nucleation, growth, and deposition of semiconducting poly(3-hexylthiophene) (P3HT) nanofibers used in organic field effect transistors, via high-throughput image analysis. This study is performed using an automated image analysis pipeline combining existing open-source software and new algorithms, enabling the rapid evaluation of structural metrics for images of fibrillar materials, including local orientational order, fiber length density, and fiber length distributions. We observe that microfluidic processing leads to fibers that pack with unusually high density, while sonication yields fibers that pack sparsely with low alignment. This is attributed to differences in their crystallization mechanisms. P3HT nanofiber packing during thin film deposition exhibits behavior suggesting that fibers are confined to packing in two-dimensional layers. We find that fiber alignment, a feature correlated with charge carrier mobility, is driven by increasing fiber length, and that shorter fibers tend to segregate to the buried dielectric interface during deposition, creating potentially performance-limiting defects in alignment. Another barrier to perfect alignment is the curvature of P3HT fibers; we propose a mechanistic simulation of fiber growth that reconciles both this curvature and the log-normal distribution of fiber lengths inherent to the fiber populations under consideration.
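
    One of the structural metrics mentioned, local orientational order, is commonly computed as a 2D nematic order parameter over the fiber angles in an image region. A minimal, self-contained sketch with invented angles (illustrative of the metric, not the paper's pipeline):

    ```python
    import numpy as np

    def orientational_order(angles_deg):
        """2D nematic order parameter for fiber orientations.

        Angles are doubled so a fiber at theta and one at theta + 180 degrees
        count as the same orientation; S = 1 is perfect alignment, S = 0 isotropy.
        """
        theta = np.deg2rad(np.asarray(angles_deg, dtype=float))
        return float(np.hypot(np.mean(np.cos(2 * theta)),
                              np.mean(np.sin(2 * theta))))

    # Invented fiber angles, e.g. from a segmented image of a P3HT film.
    print(orientational_order([88, 92, 90, 87, 91, 93]))   # near 1: well aligned
    print(orientational_order([5, 60, 120, 175, 33, 99]))  # near 0: isotropic
    ```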

  1. Temporal dynamics of soil microbial communities under different moisture regimes: high-throughput sequencing and bioinformatics analysis

    Science.gov (United States)

    Semenov, Mikhail; Zhuravleva, Anna; Semenov, Vyacheslav; Yevdokimov, Ilya; Larionova, Alla

    2017-04-01

    Recent climate scenarios predict not only continued global warming but also an increased frequency and intensity of extreme climatic events such as strong changes in temperature and precipitation regimes. Microorganisms are well known to be more sensitive indicators of changing environmental conditions than soil chemical and physical parameters. In this study, we determined the shifts in soil microbial community structure as well as indicative taxa in soils under three moisture regimes, using high-throughput Illumina sequencing and a range of bioinformatics approaches for the assessment of sequence data. Incubation experiments were performed in soil-filled (Greyic Phaeozems Albic) rhizoboxes with maize and without plants. Three contrasting moisture regimes were simulated: 1) optimal wetting (OW), watering 2-3 times per week to maintain soil moisture at 20-25% by weight; 2) periodic wetting (PW), with alternating periods of wetting and drought; and 3) constant insufficient wetting (IW), in which soil moisture of 12% by weight was permanently maintained. Sampled fresh soils were homogenized, and the total DNA of three replicates was extracted using the FastDNA® SPIN kit for Soil. DNA replicates were combined into a pooled sample and the DNA was used for PCR with specific primers for the 16S V3 and V4 regions. In order to compare variability between different samples and replicates within a single sample, some DNA replicates were treated separately. The products were purified and submitted to Illumina MiSeq sequencing. Sequence data were evaluated by alpha-diversity (Chao1 and Shannon H' diversity indexes), beta-diversity (UniFrac and Bray-Curtis dissimilarity), heatmap, tagcloud, and plot-bar analyses using the MiSeq Reporter Metagenomics Workflow and R packages (phyloseq, vegan, tagcloud). Shannon index varied in a rather narrow range (4.4-4.9) with the lowest values for microbial communities under PW treatment. Chao1 index varied from 385 to 480, being a more flexible
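
    For reference, the two alpha-diversity indices reported here are simple functions of the OTU count table. A self-contained sketch with made-up counts (the standard textbook formulas, not the R packages used in the study):

    ```python
    import math
    from collections import Counter

    def shannon(counts):
        """Shannon diversity H' = -sum(p_i * ln p_i) over OTU counts."""
        total = sum(counts)
        return -sum((c / total) * math.log(c / total) for c in counts if c)

    def chao1(counts):
        """Chao1 richness: S_obs + F1^2 / (2 * F2), where F1 and F2 are the
        numbers of singleton and doubleton OTUs (fallback form if F2 = 0)."""
        freq = Counter(counts)
        s_obs = sum(1 for c in counts if c)
        f1, f2 = freq.get(1, 0), freq.get(2, 0)
        if f2 == 0:
            return s_obs + f1 * (f1 - 1) / 2.0
        return s_obs + f1 ** 2 / (2.0 * f2)

    # Hypothetical OTU table for one soil sample (reads per OTU).
    otus = [120, 87, 43, 19, 8, 5, 2, 2, 1, 1, 1]
    print(f"H' = {shannon(otus):.2f}, Chao1 = {chao1(otus):.1f}")
    ```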

  2. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor-intensive and would benefit greatly from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  3. High-throughput pseudovirion-based neutralization assay for analysis of natural and vaccine-induced antibodies against human papillomaviruses.

    Directory of Open Access Journals (Sweden)

    Peter Sehr

    A highly sensitive, automated, purely add-on, high-throughput pseudovirion-based neutralization assay (HT-PBNA) with excellent repeatability and run-to-run reproducibility was developed for human papillomavirus (HPV) types 16, 18, 31, 45, 52, 58 and bovine papillomavirus type 1. Preparation of 384-well assay plates with serially diluted sera and the actual cell-based assay are separated in time, therefore batches of up to one hundred assay plates can be processed sequentially. A mean coefficient of variation (CV) of 13% was obtained for anti-HPV 16 and HPV 18 titers for a standard serum tested in a total of 58 repeats on individual plates in seven independent runs. Natural antibody response was analyzed in 35 sera from patients with HPV 16 DNA-positive cervical intraepithelial neoplasia grade 2+ lesions. The new HT-PBNA is based on Gaussia luciferase, with increased sensitivity compared to the previously described manual PBNA (manPBNA) based on secreted alkaline phosphatase as reporter. Titers obtained with HT-PBNA were generally higher than titers obtained with the manPBNA. A good linear correlation (R² = 0.7) was found between HT-PBNA titers and anti-HPV 16 L1 antibody levels determined by a Luminex bead-based GST-capture assay for these 35 sera, with a Kappa value of 0.72 and only 3 discordant sera in the low titer range. In addition to natural low-titer antibody responses, the high sensitivity of the HT-PBNA also allows detection of cross-neutralizing antibodies induced by commercial HPV L1 vaccines and experimental L2 vaccines. When analyzing the WHO international standards for HPV 16 and 18 we determined an analytical sensitivity of 0.864 and 1.105 mIU, respectively.

  4. An UPLC-MS/MS method for highly sensitive high-throughput analysis of phytohormones in plant tissues

    Directory of Open Access Journals (Sweden)

    Balcke Gerd Ulrich

    2012-11-01

    Background: Phytohormones are the key metabolites participating in the regulation of multiple functions of the plant organism. Among them, jasmonates, as well as abscisic and salicylic acids, are responsible for triggering and modulating plant reactions targeted against pathogens and herbivores, as well as resistance to abiotic stress (drought, UV irradiation and mechanical wounding). These factors induce dramatic changes in phytohormone biosynthesis and transport, leading to rapid local and systemic stress responses. Understanding the underlying mechanisms is of principal interest for scientists working in various areas of plant biology. However, highly sensitive, precise and high-throughput methods for quantification of these phytohormones in small samples of plant tissues are still missing. Results: Here we present an LC-MS/MS method for fast and highly sensitive determination of jasmonates, abscisic and salicylic acids. A single-step sample preparation procedure based on mixed-mode solid phase extraction was efficiently combined with essential improvements in mobile phase composition, yielding higher efficiency of chromatographic separation and MS sensitivity. This strategy resulted in a dramatic increase in overall sensitivity, allowing successful determination of phytohormones in small (less than 50 mg of fresh weight) tissue samples. The method was completely validated in terms of analyte recovery, sensitivity, linearity and precision. Additionally, it was cross-validated with a well-established GC-MS-based procedure and its applicability to a variety of plant species and organs was verified. Conclusion: The method can be applied for the analysis of target phytohormones in small tissue samples obtained from any plant species and/or plant part, relying on any commercially available (even less sensitive) tandem mass spectrometry instrumentation.

  5. Fluorescent-magnetic dual-encoded nanospheres: a promising tool for fast-simultaneous-addressable high-throughput analysis

    Science.gov (United States)

    Xie, Min; Hu, Jun; Wen, Cong-Ying; Zhang, Zhi-Ling; Xie, Hai-Yan; Pang, Dai-Wen

    2012-01-01

    Bead-based optical encoding or magnetic encoding techniques are promising in high-throughput multiplexed detection and separation of numerous species under complicated conditions. Therefore, a self-assembly strategy implemented in an organic solvent is put forward to fabricate fluorescent-magnetic dual-encoded nanospheres. Briefly, hydrophobic trioctylphosphine oxide-capped CdSe/ZnS quantum dots (QDs) and oleic acid-capped nano-γ-Fe2O3 magnetic particles are directly, selectively and controllably assembled on branched poly(ethylene imine)-coated nanospheres without any pretreatment, which is crucial to keep the high quantum yield of the QDs and the good dispersibility of the γ-Fe2O3. Owing to the tunability of the coating amounts of QDs and γ-Fe2O3 as well as the controllable fluorescent emissions of the deposited QDs, dual-encoded nanospheres with different photoluminescent emissions and gradient magnetic susceptibility are constructed. Using this improved layer-by-layer self-assembly approach, deposition of hydrophobic nanoparticles onto hydrophilic carriers in organic media can be easily realized; meanwhile, fluorescent-magnetic dual-functional nanospheres can be further equipped with readable optical and magnetic addresses. The resultant fluorescent-magnetic dual-encoded nanospheres possess both the unique optical properties of QDs and the superparamagnetic properties of γ-Fe2O3, exhibiting good monodispersibility, huge encoding capacity and nanoscale particle size. Compared with the encoded microbeads reported by others, the nanometre scale of the dual-encoded nanospheres gives them minimal steric hindrance and higher flexibility.

  6. Above Bonneville passage and propagation cost effectiveness analysis

    International Nuclear Information System (INIS)

    Paulsen, C.M.; Hyman, J.B.; Wernstedt, K.

    1993-05-01

    We have developed several models to evaluate the cost-effectiveness of alternative strategies to mitigate hydrosystem impacts on salmon and steelhead, and applied these models to areas of the Columbia River Basin. Our latest application evaluates the cost-effectiveness of proposed strategies that target mainstem survival (e.g., predator control, increases in water velocity) and subbasin propagation (e.g., habitat improvements, screening, hatchery production increases) for chinook salmon and steelhead stocks, in the portion of the Columbia Basin bounded by Bonneville, Chief Joseph, Dworshak, and Hells Canyon dams. At its core the analysis primarily considers financial cost and biological effectiveness, but we have included other attributes which may be of concern to the region.

  7. Above Bonneville Passage and Propagation Cost Effectiveness Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Paulsen, C.M.; Hyman, J.B.; Wernstedt, K.

    1993-05-01

    We have developed several models to evaluate the cost-effectiveness of alternative strategies to mitigate hydrosystem impacts on salmon and steelhead, and applied these models to areas of the Columbia River Basin. Our latest application evaluates the cost-effectiveness of proposed strategies that target mainstem survival (e.g., predator control, increases in water velocity) and subbasin propagation (e.g., habitat improvements, screening, hatchery production increases) for chinook salmon and steelhead stocks, in the portion of the Columbia Basin bounded by Bonneville, Chief Joseph, Dworshak, and Hells Canyon dams. At its core the analysis primarily considers financial cost and biological effectiveness, but we have included other attributes which may be of concern to the region.

  8. A Cost-effectiveness Analysis of Early vs Late Tracheostomy.

    Science.gov (United States)

    Liu, C Carrie; Rudmik, Luke

    2016-10-01

    The timing of tracheostomy in critically ill patients requiring mechanical ventilation is controversial. An important consideration that is currently missing in the literature is an evaluation of the economic impact of an early tracheostomy strategy vs a late tracheostomy strategy. To evaluate the cost-effectiveness of the early tracheostomy strategy vs the late tracheostomy strategy. This economic analysis was performed using a decision tree model with a 90-day time horizon. The economic perspective was that of the US health care third-party payer. The primary outcome was the incremental cost per tracheostomy avoided. Probabilities were obtained from meta-analyses of randomized clinical trials. Costs were obtained from the published literature and the Healthcare Cost and Utilization Project database. A multivariate probabilistic sensitivity analysis was performed to account for uncertainty surrounding mean values used in the reference case. The reference case demonstrated that the cost of the late tracheostomy strategy was $45 943.81 for 0.36 of effectiveness. The cost of the early tracheostomy strategy was $31 979.12 for 0.19 of effectiveness. The incremental cost-effectiveness ratio for the late tracheostomy strategy compared with the early tracheostomy strategy was $82 145.24 per tracheostomy avoided. With a willingness-to-pay threshold of $50 000, the early tracheostomy strategy is cost-effective with 56% certainty. The adoption of an early vs a late tracheostomy strategy depends on the priorities of the decision-maker. Up to a willingness-to-pay threshold of $80 000 per tracheostomy avoided, the early tracheostomy strategy has a higher probability of being the more cost-effective intervention.
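
    The incremental cost-effectiveness ratio (ICER) reported here follows directly from the reference-case numbers in the record; a minimal sketch of the arithmetic:

    ```python
    def icer(cost_a, effect_a, cost_b, effect_b):
        """Incremental cost-effectiveness ratio of strategy A relative to B."""
        return (cost_a - cost_b) / (effect_a - effect_b)

    # Reference-case values from the record; effectiveness is expressed as
    # tracheostomies avoided, so the ICER is cost per tracheostomy avoided.
    late_cost, late_effect = 45943.81, 0.36
    early_cost, early_effect = 31979.12, 0.19
    print(f"${icer(late_cost, late_effect, early_cost, early_effect):,.2f} "
          "per tracheostomy avoided")   # $82,145.24, matching the reported value
    ```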

  9. High throughput screening of phenoxy carboxylic acids with dispersive solid phase extraction followed by direct analysis in real time mass spectrometry.

    Science.gov (United States)

    Wang, Jiaqin; Zhu, Jun; Si, Ling; Du, Qi; Li, Hongli; Bi, Wentao; Chen, David Da Yong

    2017-12-15

    A high-throughput, low-environmental-impact methodology for rapid determination of phenoxy carboxylic acids (PCAs) in water samples was developed by combining dispersive solid phase extraction (DSPE) using velvet-like graphitic carbon nitride (V-g-C3N4) and direct analysis in real time mass spectrometry (DART-MS). Due to the large surface area and good dispersity of V-g-C3N4, the DSPE of PCAs in water was completed within 20 s, and the elution of PCAs was accomplished in 20 s as well using methanol. The eluents were then analyzed and quantified using a DART ionization source coupled to a high-resolution mass spectrometer, with an internal standard added to the samples. The limit of detection ranged from 0.5 ng L-1 to 2 ng L-1 on the basis of a 50 mL water sample; the recovery was 79.9-119.1%; and the relative standard deviation 0.23-9.82% (≥5 replicates). With the ease of use and speed of DART-MS, the whole protocol can be completed within mere minutes, including sample preparation, extraction, elution, detection and quantitation. The methodology developed here is simple, fast, sensitive and quantitative, requires little sample preparation, and consumes significantly less toxic organic solvent; it can be used for high-throughput screening of PCAs and potentially other contaminants in water.

  10. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    Science.gov (United States)

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in a rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  11. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High-throughput modifications have been developed during recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High-throughput methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high-throughput modifications now available vary greatly in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  12. Cost-effectiveness analysis of treatments for vertebral compression fractures.

    Science.gov (United States)

    Edidin, Avram A; Ong, Kevin L; Lau, Edmund; Schmier, Jordana K; Kemner, Jason E; Kurtz, Steven M

    2012-07-01

    Vertebral compression fractures (VCFs) can be treated by nonsurgical management or by minimally invasive surgical treatment including vertebroplasty and balloon kyphoplasty. The purpose of the present study was to characterize the cost to Medicare for treating VCF-diagnosed patients by nonsurgical management, vertebroplasty, or kyphoplasty. We hypothesized that surgical treatments for VCFs using vertebroplasty or kyphoplasty would be a cost-effective alternative to nonsurgical management for the Medicare patient population. Cost per life-year gained for VCF patients in the US Medicare population was compared between operated (kyphoplasty and vertebroplasty) and non-operated patients and between kyphoplasty and vertebroplasty patients, all as a function of patient age and gender. Life expectancy was estimated using a parametric Weibull survival model (adjusted for comorbidities) for 858 978 VCF patients in the 100% Medicare dataset (2005-2008). Median payer costs were identified for each treatment group for up to 3 years following VCF diagnosis, based on 67 018 VCF patients in the 5% Medicare dataset (2005-2008). A discount rate of 3% was used for the base case in the cost-effectiveness analysis, with 0% and 5% discount rates used in sensitivity analyses. After accounting for the differences in median costs and using a discount rate of 3%, the cost per life-year gained for kyphoplasty and vertebroplasty patients ranged from $US1863 to $US6687 and from $US2452 to $US13 543, respectively, compared with non-operated patients. The cost per life-year gained for kyphoplasty compared with vertebroplasty ranged from -$US4878 (cost saving) to $US2763. Among patients for whom surgical treatment was indicated, kyphoplasty was found to be cost effective, and perhaps even cost saving, compared with vertebroplasty. Even for the oldest patients (85 years of age and older), both interventions would be considered cost effective in terms of cost per life-year gained.
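
    The study discounts both costs and life years (base case 3%, sensitivity analyses at 0% and 5%) before forming the cost per life-year gained. A minimal sketch of that calculation with invented 3-year streams (the real analysis uses survival models and Medicare cost data, not these numbers):

    ```python
    def present_value(stream, rate):
        """Discount a yearly stream (year 0 first) to present value."""
        return sum(v / (1.0 + rate) ** t for t, v in enumerate(stream))

    def cost_per_life_year(incremental_costs, incremental_life_years, rate):
        """Cost per life-year gained, with both streams discounted at `rate`."""
        return (present_value(incremental_costs, rate) /
                present_value(incremental_life_years, rate))

    # Invented 3-year incremental streams for an operated vs non-operated cohort.
    costs = [9000.0, 1500.0, 1200.0]      # extra payer costs per year (US$)
    life_years = [0.9, 0.85, 0.8]         # extra survival per year
    for rate in (0.0, 0.03, 0.05):        # base case 3%, sensitivity 0% and 5%
        print(f"{rate:.0%}: ${cost_per_life_year(costs, life_years, rate):,.0f} per LYG")
    ```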

  13. A Cost-Effectiveness Analysis Model for Evaluating and Planning Secondary Vocational Programs

    Science.gov (United States)

    Kim, Jin Eun

    1977-01-01

    This paper conceptualizes cost-effectiveness analysis and describes a cost-effectiveness analysis model for secondary vocational programs. The model generates three kinds of cost-effectiveness measures: program effectiveness, cost efficiency, and a cost-effectiveness and/or performance ratio. (Author)

  14. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its application to high-throughput time series data analysis, e.g., data from next-generation sequencing-based studies. By extending the theory for the tail probability of the range of a sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with a delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
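
    The slow permutation procedure that the Markov-chain approximation replaces can be sketched as below, using a simplified local trend score (best run of agreeing or opposing trends). This conveys the flavor of the statistic under stated simplifications; it is not the eLSA implementation.

    ```python
    import random

    def trend(series):
        """Reduce a series to +1/0/-1 steps between consecutive time points."""
        return [(b > a) - (b < a) for a, b in zip(series, series[1:])]

    def local_trend_score(x, y):
        """Best run of consistently agreeing (or opposing) trends between x and y."""
        best = pos = neg = 0
        for u, v in zip(trend(x), trend(y)):
            s = u * v
            pos = max(0, pos + s)   # run of agreeing trends
            neg = max(0, neg - s)   # run of opposing trends
            best = max(best, pos, neg)
        return best

    def permutation_pvalue(x, y, n_perm=2000, seed=1):
        """Empirical p-value by shuffling one series: the slow step that the
        paper's analytical approximation is designed to avoid."""
        rng = random.Random(seed)
        observed = local_trend_score(x, y)
        shuffled = list(y)
        hits = 0
        for _ in range(n_perm):
            rng.shuffle(shuffled)
            if local_trend_score(x, shuffled) >= observed:
                hits += 1
        return (hits + 1) / (n_perm + 1)

    # Two short co-trending series: permutations rarely match the observed
    # score, so the p-value is small.
    x = [1, 2, 3, 2, 4, 5, 4, 6, 7, 6]
    y = [2, 3, 4, 3, 5, 6, 5, 7, 8, 7]
    print(permutation_pvalue(x, y))
    ```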

  15. Cost-effectiveness analysis of rotavirus vaccination in Argentina.

    Science.gov (United States)

    Urueña, Analía; Pippo, Tomás; Betelu, María Sol; Virgilio, Federico; Hernández, Laura; Giglio, Norberto; Gentile, Ángela; Diosque, Máximo; Vizzotti, Carla

    2015-05-07

    Rotavirus is a leading cause of severe diarrhea in children under 5. In Argentina, the most affected regions are the Northeast and Northwest, where hospitalizations and deaths are more frequent. This study estimated the cost-effectiveness of adding either of the two licensed rotavirus vaccines to the routine immunization schedule. The integrated TRIVAC vaccine cost-effectiveness model from the Pan American Health Organization's ProVac Initiative (Version 2.0) was used to assess health benefits, costs savings, life-years gained (LYGs), DALYs averted, and cost/DALY averted of vaccinating 10 successive cohorts, from the health care system and societal perspectives. Two doses of monovalent (RV1) rotavirus vaccine and three doses of pentavalent (RV5) rotavirus vaccine were each compared to a scenario assuming no vaccination. The price/dose was US$ 7.50 and US$ 5.15 for RV1 and RV5, respectively. We ran both a national and sub-national analysis, discounting all costs and benefits 3% annually. Our base case results were compared to a range of alternative univariate and multivariate scenarios. The number of LYGs was 5962 and 6440 for RV1 and RV5, respectively. The cost/DALY averted when compared to no vaccination from the health care system and societal perspective was: US$ 3870 and US$ 1802 for RV1, and US$ 2414 and US$ 358 for RV5, respectively. Equivalent figures for the Northeast were US$ 1470 and US$ 636 for RV1, and US$ 913 and US$ 80 for RV5. Therefore, rotavirus vaccination was more cost-effective in the Northeast compared to the whole country; and, in the Northwest, health service's costs saved outweighed the cost of introducing the vaccine. Vaccination with either vaccine compared to no vaccination was highly cost-effective based on WHO guidelines and Argentina's 2011 per capita GDP of US$ 9090. Key variables influencing results were vaccine efficacy, annual loss of efficacy, relative coverage of deaths, vaccine price, and discount rate. Compared to no

  16. High-throughput, 384-well, LC-MS/MS CYP inhibition assay using automation, cassette-analysis technique, and streamlined data analysis.

    Science.gov (United States)

    Halladay, Jason S; Delarosa, Erlie Marie; Tran, Daniel; Wang, Leslie; Wong, Susan; Khojasteh, S Cyrus

    2011-08-01

    Here we describe a high-capacity, high-throughput, automated, 384-well CYP inhibition assay using well-known HLM-based MS probes. We provide consistently robust IC50 values at the lead optimization stage of the drug discovery process. Our method uses the Agilent Technologies/Velocity11 BioCel 1200 system, timesaving techniques for sample analysis, and streamlined data processing steps. For each experiment, we generate IC50 values for up to 344 compounds and positive controls for five major CYP isoforms (probe substrate): CYP1A2 (phenacetin), CYP2C9 ((S)-warfarin), CYP2C19 ((S)-mephenytoin), CYP2D6 (dextromethorphan), and CYP3A4/5 (testosterone and midazolam). Each compound is incubated separately at four concentrations with each CYP probe substrate under the optimized incubation condition. Each incubation is quenched with acetonitrile containing the deuterated internal standard of the respective metabolite for each probe substrate. To minimize the number of samples to be analyzed by LC-MS/MS and reduce the amount of valuable MS runtime, we utilize the timesaving techniques of cassette analysis (pooling the incubation samples at the end of each CYP probe incubation into one) and column switching (reducing the amount of MS runtime). Here we also report on the comparison of IC50 results for five major CYP isoforms using our method compared to values reported in the literature.
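
    With four concentrations per compound, an IC50 comes from fitting a concentration-response curve to percent-of-control activity. A minimal sketch with invented data, fixing the curve top at 100% and bottom at 0% (a simplification the actual assay may not make):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def inhibition_curve(conc, ic50, hill):
        """Percent-of-control CYP activity vs inhibitor concentration,
        with the top fixed at 100% and the bottom at 0%."""
        return 100.0 / (1.0 + (conc / ic50) ** hill)

    # Invented percent-of-control activities at four inhibitor
    # concentrations (uM), mirroring the four-point design in the record.
    conc = np.array([0.3, 1.0, 3.0, 10.0])
    activity = np.array([92.0, 71.0, 38.0, 12.0])

    (ic50, hill), _ = curve_fit(inhibition_curve, conc, activity, p0=(3.0, 1.0))
    print(f"IC50 = {ic50:.2f} uM (Hill slope = {hill:.2f})")
    ```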

  17. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  18. On the censored cost-effectiveness analysis using copula information

    Directory of Open Access Journals (Sweden)

    Charles Fontaine

    2017-02-01

    Background: Information and theory behind copula concepts are essential to understand the dependence relationship between several marginal covariate distributions. In a therapeutic trial data scheme, censoring occurs most of the time. That can lead to a biased interpretation of the dependence relationship between marginal distributions and, furthermore, to biased inference of the joint probability distribution function. A particular case is cost-effectiveness analysis (CEA), which has shown its utility in many medico-economic studies and where censoring often occurs. Methods: This paper discusses copula-based modeling of the joint density and an estimation method for costs and quality-adjusted life years (QALYs) in a cost-effectiveness analysis in the case of censoring. This method is not based on any linearity assumption on the inferred variables, but on a point estimate obtained from the marginal distributions together with their dependence link. Results: Our results show that the proposed methodology retains only the bias arising from statistical inference and no longer carries a bias based on an unverified linearity assumption. An acupuncture study for chronic headache in primary care was used to show the applicability of the method, and the obtained ICER remains within the confidence interval of the standard regression methodology. Conclusion: For the cost-effectiveness literature, such a technique without any linearity assumption represents progress, since it does not need the specification of a global linear regression model. Hence, estimation of the marginal distributions for each therapeutic arm, the concordance measures between these populations and the right copula family is now sufficient to carry out the whole CEA.
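
    To illustrate the copula idea itself, independent of the paper's censoring treatment, the sketch below draws cost/QALY pairs whose dependence is set by a Gaussian copula while the marginals are chosen freely. All distributions and parameters are invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    def gaussian_copula_sample(n, rho, cost_dist, qaly_dist, seed=0):
        """Draw (cost, QALY) pairs whose dependence comes from a Gaussian
        copula with correlation rho, independently of the chosen marginals."""
        rng = np.random.default_rng(seed)
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
        u = stats.norm.cdf(z)          # uniform margins carrying the dependence
        return cost_dist.ppf(u[:, 0]), qaly_dist.ppf(u[:, 1])

    # Invented marginals: right-skewed costs, roughly normal QALYs, with a
    # mild negative dependence (costlier patients tend to gain fewer QALYs).
    costs, qalys = gaussian_copula_sample(
        n=10_000, rho=-0.3,
        cost_dist=stats.lognorm(s=0.5, scale=2000.0),
        qaly_dist=stats.norm(loc=0.7, scale=0.1))
    print(f"mean cost = {costs.mean():,.0f}, mean QALY = {qalys.mean():.3f}")
    ```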

  19. Gold nanoparticle-mediated (GNOME) laser perforation: a new method for a high-throughput analysis of gap junction intercellular coupling.

    Science.gov (United States)

    Begandt, Daniela; Bader, Almke; Antonopoulos, Georgios C; Schomaker, Markus; Kalies, Stefan; Meyer, Heiko; Ripken, Tammo; Ngezahayo, Anaclet

    2015-10-01

    The present report evaluates the advantages of using the gold nanoparticle-mediated laser perforation (GNOME LP) technique as a computer-controlled cell optoperforation to introduce Lucifer yellow (LY) into cells in order to analyze gap junction coupling in cell monolayers. To permeabilize GM-7373 endothelial cells grown in a 24-multiwell plate with GNOME LP, a laser beam of 88 μm in diameter was applied in the presence of gold nanoparticles and LY. After 10 min to allow dye uptake and diffusion through gap junctions, we observed a LY-positive cell band of 179 ± 8 μm width. The presence of the gap junction channel blocker carbenoxolone during the optoperforation reduced the LY-positive band to 95 ± 6 μm. Additionally, a forskolin-related enhancement of gap junction coupling, recently found using the scrape loading technique, was also observed using GNOME LP. Further, automatic cell imaging and a subsequent semi-automatic quantification of the images using a Java-based ImageJ plugin were performed in a high-throughput sequence. Moreover, GNOME LP was used on cells such as RBE4 rat brain endothelial cells, which cannot be mechanically scraped, as well as on three-dimensionally cultivated cells, opening the possibility to implement the GNOME LP technique for analysis of gap junction coupling in tissues. We conclude that the GNOME LP technique allows a high-throughput automated analysis of gap junction coupling in cells. Moreover, this non-invasive technique could be used on monolayers that do not support mechanical scraping as well as on cells in tissue, allowing an in vivo/ex vivo analysis of gap junction coupling.

  20. Synthetic Biomaterials to Rival Nature's Complexity-a Path Forward with Combinatorics, High-Throughput Discovery, and High-Content Analysis.

    Science.gov (United States)

    Zhang, Douglas; Lee, Junmin; Kilian, Kristopher A

    2017-10-01

    Cells in tissue receive a host of soluble and insoluble signals in a context-dependent fashion, where integration of these cues through a complex network of signal transduction cascades will define a particular outcome. Biomaterials scientists and engineers are tasked with designing materials that can at least partially recreate this complex signaling milieu towards new materials for biomedical applications. In this progress report, recent advances in high throughput techniques and high content imaging approaches that are facilitating the discovery of efficacious biomaterials are described. From microarrays of synthetic polymers, peptides and full-length proteins, to designer cell culture systems that present multiple biophysical and biochemical cues in tandem, it is discussed how the integration of combinatorics with high content imaging and analysis is essential to extracting biologically meaningful information from large scale cellular screens to inform the design of next generation biomaterials.

  1. High-throughput and sensitive analysis of 3-monochloropropane-1,2-diol fatty acid esters in edible oils by supercritical fluid chromatography/tandem mass spectrometry.

    Science.gov (United States)

    Hori, Katsuhito; Matsubara, Atsuki; Uchikata, Takato; Tsumura, Kazunobu; Fukusaki, Eiichiro; Bamba, Takeshi

    2012-08-10

    We have established a high-throughput and sensitive analytical method based on supercritical fluid chromatography (SFC) coupled with triple quadrupole mass spectrometry (QqQ MS) for 3-monochloropropane-1,2-diol (3-MCPD) fatty acid esters in edible oils. All analytes were successfully separated within 9 min without sample purification. The system was precise and sensitive, with a limit of detection less than 0.063 mg/kg. The recovery rate of 3-MCPD fatty acid esters spiked into oil samples was in the range of 62.68-115.23%. Furthermore, several edible oils were tested for analyzing 3-MCPD fatty acid ester profiles. This is the first report on the analysis of 3-MCPD fatty acid esters by SFC/QqQ MS. The developed method will be a powerful tool for investigating 3-MCPD fatty acid esters in edible oils.

  2. Uncovering leaf rust responsive miRNAs in wheat (Triticum aestivum L.) using high-throughput sequencing and prediction of their targets through degradome analysis.

    Science.gov (United States)

    Kumar, Dhananjay; Dutta, Summi; Singh, Dharmendra; Prabhu, Kumble Vinod; Kumar, Manish; Mukhopadhyay, Kunal

    2017-01-01

    Deep sequencing identified 497 conserved and 559 novel miRNAs in wheat, while degradome analysis revealed 701 target genes. qRT-PCR demonstrated differential expression of miRNAs during stages of leaf rust progression. Bread wheat (Triticum aestivum L.) is an important cereal food crop feeding 30% of the world population. The major threat to wheat production is rust epidemics. This study was targeted towards identification and functional characterization of micro(mi)RNAs and their target genes in wheat in response to leaf rust ingression. High-throughput sequencing was used for transcriptome-wide identification of miRNAs and their expression profiling in response to leaf rust, using mock- and pathogen-inoculated resistant and susceptible near-isogenic wheat plants. A total of 1056 mature miRNAs were identified, of which 497 miRNAs were conserved and 559 miRNAs were novel. The pathogen-inoculated resistant plants manifested more miRNAs compared with the pathogen-infected susceptible plants. The miRNA counts increased in the susceptible isoline due to leaf rust; conversely, the counts decreased in the resistant isoline in response to pathogenesis, illustrating precise spatial tuning of miRNAs during compatible and incompatible interactions. Stem-loop quantitative real-time PCR was used to profile 10 highly differentially expressed miRNAs obtained from the high-throughput sequencing data. The spatio-temporal profiling validated the differential expression of miRNAs between the isolines as well as in response to pathogen infection. Degradome analysis provided 701 predicted target genes associated with defense response, signal transduction, development, metabolism, and transcriptional regulation. The obtained results indicate that wheat isolines employ diverse arrays of miRNAs that modulate their target genes during compatible and incompatible interactions. Our findings contribute to increased knowledge of the roles of microRNAs in wheat-leaf rust interactions and could help in rust

  3. High-Throughput Analysis With 96-Capillary Array Electrophoresis and Integrated Sample Preparation for DNA Sequencing Based on Laser Induced Fluorescence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Gang [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    The purpose of this research was to improve the fluorescence detection for the multiplexed capillary array electrophoresis, extend its use beyond genomic analysis, and develop an integrated micro-sample preparation system for high-throughput DNA sequencing. The authors first demonstrated multiplexed capillary zone electrophoresis (CZE) and micellar electrokinetic chromatography (MEKC) separations in a 96-capillary array system with laser-induced fluorescence detection. Migration times of four kinds of fluoresceins and six polyaromatic hydrocarbons (PAHs) are normalized to one of the capillaries using two internal standards. The relative standard deviations (RSD) after normalization are 0.6-1.4% for the fluoresceins and 0.1-1.5% for the PAHs. Quantitative calibration of the separations based on peak areas is also performed, again with substantial improvement over the raw data. This opens up the possibility of performing massively parallel separations for high-throughput chemical analysis for process monitoring, combinatorial synthesis, and clinical diagnosis. The authors further improved the fluorescence detection by step laser scanning. A computer-controlled galvanometer scanner is adapted for scanning a focused laser beam across a 96-capillary array for laser-induced fluorescence detection. The signal at a single photomultiplier tube is temporally sorted to distinguish among the capillaries. The limit of detection for fluorescein is 3 × 10^-11 M (S/N = 3) for 5 mW of total laser power scanned at 4 Hz. The observed cross-talk among capillaries is 0.2%. Advantages include the efficient utilization of light due to the high duty-cycle of step scan, good detection performance due to the reduction of stray light, ruggedness due to the small mass of the galvanometer mirror, low cost due to the simplicity of components, and flexibility due to the independent paths for excitation and emission.
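
    Normalizing migration times with two internal standards is, in its simplest form, a linear rescaling of each capillary's time axis onto a reference capillary. A sketch with invented times (an illustration of the idea, not the authors' exact procedure):

    ```python
    def normalize_migration_time(t, std1, std2, ref_std1, ref_std2):
        """Map a migration time onto the reference capillary's time scale,
        using the two internal standards observed in both capillaries."""
        scale = (ref_std2 - ref_std1) / (std2 - std1)
        return ref_std1 + (t - std1) * scale

    # Invented times (s): an analyte and both internal standards measured in
    # one capillary, normalized to the designated reference capillary.
    print(normalize_migration_time(t=412.0, std1=180.0, std2=540.0,
                                   ref_std1=175.0, ref_std2=520.0))  # ~397.3
    ```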

  4. Biphasic Study to Characterize Agricultural Biogas Plants by High-Throughput 16S rRNA Gene Amplicon Sequencing and Microscopic Analysis.

    Science.gov (United States)

    Maus, Irena; Kim, Yong Sung; Wibberg, Daniel; Stolze, Yvonne; Off, Sandra; Antonczyk, Sebastian; Pühler, Alfred; Scherer, Paul; Schlüter, Andreas

    2017-02-28

    Process surveillance within agricultural biogas plants (BGPs) was concurrently studied by high-throughput 16S rRNA gene amplicon sequencing and an optimized quantitative microscopic fingerprinting (QMF) technique. In contrast to 16S rRNA gene amplicon sequencing, digitalized microscopy is a rapid and cost-effective method that facilitates enumeration and morphological differentiation of the most significant groups of methanogens regarding their shape and the characteristic autofluorescence of cofactor F420. Moreover, the fluorescence signal mirrors cell vitality. In this study, four different BGPs were investigated. The results indicated stable process performance in the mesophilic BGPs and in the thermophilic reactor. Bacterial subcommunity characterization revealed significant differences between the four BGPs. Most remarkably, the genera Defluviitoga and Halocella dominated the thermophilic bacterial subcommunity, whereas members of another taxon, Syntrophaceticus, were found to be abundant in the mesophilic BGP. The domain Archaea was dominated by the genus Methanoculleus in all four BGPs, followed by Methanosaeta in BGP1 and BGP3. In contrast, Methanothermobacter members were highly abundant in the thermophilic BGP4. Furthermore, a high consistency between the sequencing approach and the QMF method was shown, especially for the thermophilic BGP. Where the two methods differed, the biphasic approach provided novel insights for the mesophilic BGPs regarding disaggregated single cells of Methanosarcina and Methanosaeta species. Both dominated the archaeal subcommunity and replaced coccoid Methanoculleus members belonging to the same group of Methanomicrobiales that have been frequently observed in similar BGPs. This work demonstrates that combining QMF and 16S rRNA gene amplicon sequencing is a complementary strategy to describe archaeal community structures within biogas processes.

  5. A conifer-friendly high-throughput α-cellulose extraction method for δ13C and δ18O stable isotope ratio analysis

    Science.gov (United States)

    Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.

    2012-12-01

    Wood stable isotope ratios (δ13C and δ18O) offer insight into water source and plant water use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while minimizing labor requirements to 2-3 days per batch of samples.

  6. A simple, high throughput method to locate single copy sequences from Bacterial Artificial Chromosome (BAC libraries using High Resolution Melt analysis

    Directory of Open Access Journals (Sweden)

    Caligari Peter DS

    2010-05-01

    Background: The high-throughput anchoring of genetic markers into contigs is required for many ongoing physical mapping projects. Multidimensional BAC pooling strategies for PCR-based screening of large-insert libraries are a widely used alternative to high-density filter hybridisation of bacterial colonies. To date, concerns over reliability have led most if not all groups engaged in high-throughput physical mapping projects to favour BAC DNA isolation prior to amplification by conventional PCR. Results: Here, we report the first combined use of Multiplex Tandem PCR (MT-PCR) and High Resolution Melt (HRM) analysis on bacterial stocks of BAC library superpools as a means of rapidly anchoring markers to BAC colonies and thereby integrating genetic and physical maps. We exemplify the approach using a BAC library of the model plant Arabidopsis thaliana. Superpools of twenty-five 384-well plates and two-dimensional matrix pools of the BAC library were prepared for marker screening. The entire procedure requires only around 3 h to anchor one marker. Conclusions: A pre-amplification step during MT-PCR allows high multiplexing and increases the sensitivity and reliability of subsequent HRM discrimination. This simple gel-free protocol is more reliable, faster and far less costly than conventional PCR screening. The option to screen 3 genetic markers in parallel in one MT-PCR-HRM reaction, using templates from directly pooled bacterial stocks of BAC-containing bacteria, further reduces the time for anchoring markers in physical maps of species with large genomes.
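
    The deconvolution logic behind two-dimensional matrix pooling is simple set intersection: a marker positive in row pool r and column pool c implicates well (r, c). A minimal sketch (generic illustration of the pooling scheme, not the authors' code):

    ```python
    def candidate_wells(row_hits, col_hits):
        """Intersect positive row and column pools from a 2D BAC matrix screen.

        Each BAC clone sits in one well of the matrix; a marker detected in
        row pool r and column pool c implicates well (r, c). With several
        positive rows AND columns, the intersections are only candidates and
        the individual clones need confirmation.
        """
        return [(r, c) for r in row_hits for c in col_hits]

    # One HRM-positive row pool and one positive column pool pin the marker
    # to a single well; multiple hits enumerate all candidate wells instead.
    print(candidate_wells(row_hits=["C"], col_hits=[7]))           # [('C', 7)]
    print(candidate_wells(row_hits=["C", "F"], col_hits=[7, 11]))  # 4 candidates
    ```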

  7. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    Directory of Open Access Journals (Sweden)

    Guangbo Liu

    Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long-established method. However, a reliable and optimized high-throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid-handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high-throughput molecular and/or genetic analysis of yeast.

  8. Cost-effectiveness analysis in minimally invasive spine surgery.

    Science.gov (United States)

    Al-Khouja, Lutfi T; Baron, Eli M; Johnson, J Patrick; Kim, Terrence T; Drazin, Doniel

    2014-06-01

    Medical care has been evolving with the increased influence of a value-based health care system. As a result, more emphasis is being placed on ensuring cost-effectiveness and utility in the services provided to patients. This study looks at this development with respect to minimally invasive spine surgery (MISS) costs. A literature review using PubMed, the Cost-Effectiveness Analysis (CEA) Registry, and the National Health Service Economic Evaluation Database (NHS EED) was performed. Papers were included in the study if they reported costs associated with minimally invasive spine surgery (MISS). If there was no mention of cost, CEA, cost-utility analysis (CUA), quality-adjusted life years (QALY), quality, or outcomes, then the article was excluded. Fourteen studies reporting costs associated with MISS in 12,425 patients (3675 undergoing minimally invasive procedures and 8750 undergoing open procedures) were identified through PubMed, the CEA Registry, and NHS EED. The percent cost difference between minimally invasive and open approaches ranged from 2.54% to 33.68%, all indicating cost saving with a minimally invasive surgical approach. Average length of stay (LOS) for minimally invasive surgery ranged from 0.93 days to 5.1 days, compared with 1.53 days to 12 days for an open approach. All studies reporting estimated blood loss (EBL) reported lower volume loss with an MISS approach (range 10-392.5 ml) than with an open approach (range 55-535.5 ml). There are currently an insufficient number of published studies reporting the costs of MISS. Of the studies published, none have followed a standardized method of reporting and analyzing cost data. Preliminary findings from the 14 studies showed both cost saving and better outcomes in MISS compared with an open approach. However, more Level I CEA/CUA studies, including cost/QALY evaluations with specifics of the techniques utilized, need to be reported in a standardized manner to make more accurate conclusions on the cost effectiveness of

  9. Cost-effectiveness analysis of radon remediation in schools

    International Nuclear Information System (INIS)

    Kennedy, C.A.; Gray, A.M.

    2000-01-01

    Indoor radon is an important source of radiation dosage in the general population and has been recognised as a world-wide environmental and public health challenge. Governments in many Western and Eastern European and North American countries are undertaking active radon-risk reduction policies, including the remediation of existing residential and workplace building stocks (1). These endeavours include a priority of remediating school buildings. Epidemiological and technical radon research has produced information which has enabled attention to be turned to specific effectiveness and optimisation questions regarding radon identification and remediation programmes in buildings, including schools. Decision making about policy implementation has been an integral part of these programmes, and questions have been raised about the economic implications of the regulations and optimisation strategies for workplace action level policy (2,3) (the action level applied to schools is 400 Bq m^-3). No previous study has estimated the cost-effectiveness of a radon remediation programme for schools using the methodological framework now considered appropriate in the economic evaluation of health interventions. It is imperative that this should be done, in order that the resources required to obtain health gain from radon remediation in schools can be systematically compared with equivalent data for other health interventions and radon remediation programmes. In this study a cost-effectiveness analysis of radon remediation in schools was undertaken, using the best available national data and information from Northamptonshire on the costs and effectiveness of radon identification and remediation in schools, and the costs and health impact of lung cancer cases. A model based on data from Northamptonshire is presented (where 6.3% of residential stock is over 200 Bq m^-3). The resultant cost-effectiveness ratio was £7,550 per life-year gained at 1997 prices. Results from the

  10. 77 FR 1743 - Discount Rates for Cost-Effectiveness Analysis of Federal Programs

    Science.gov (United States)

    2012-01-11

    ... OFFICE OF MANAGEMENT AND BUDGET Discount Rates for Cost-Effectiveness Analysis of Federal Programs... Appendix C are to be used for cost-effectiveness analysis, including lease-purchase analysis, as specified... (Revised December 2011) Discount Rates for Cost-Effectiveness, Lease Purchase, and Related Analyses...

  11. 76 FR 7881 - Discount Rates for Cost-Effectiveness Analysis of Federal Programs

    Science.gov (United States)

    2011-02-11

    ... OFFICE OF MANAGEMENT AND BUDGET Discount Rates for Cost-Effectiveness Analysis of Federal Programs... Appendix C are to be used for cost-effectiveness analysis, including lease-purchase analysis, as specified... (Revised December 2010) DISCOUNT RATES FOR COST-EFFECTIVENESS, LEASE PURCHASE, AND RELATED ANALYSES...

  12. 78 FR 6140 - Discount Rates for Cost-Effectiveness Analysis of Federal Programs

    Science.gov (United States)

    2013-01-29

    ... OFFICE OF MANAGEMENT AND BUDGET Discount Rates for Cost-Effectiveness Analysis of Federal Programs... in Appendix C are to be used for cost-effectiveness analysis, including lease-purchase analysis, as...) Discount Rates for Cost-Effectiveness, Lease Purchase, and Related Analyses Effective Dates. This appendix...

  13. Transcriptome-Wide Analysis of Botrytis elliptica Responsive microRNAs and Their Targets in Lilium Regale Wilson by High-Throughput Sequencing and Degradome Analysis

    Directory of Open Access Journals (Sweden)

    Xue Gao

    2017-05-01

    Full Text Available MicroRNAs, as master regulators of gene expression, have been widely identified and play crucial roles in plant-pathogen interactions. A fatal pathogen, Botrytis elliptica, causes a serious foliar disease of lily, which reduces production because of the high susceptibility of most cultivated species. However, the miRNAs related to Botrytis infection of lily, and the miRNA-mediated gene regulatory networks providing resistance to B. elliptica in lily, remain largely unexplored. To systematically dissect B. elliptica-responsive miRNAs and their target genes, three small RNA libraries were constructed from the leaves of Lilium regale, a promising Chinese wild Lilium species, which had been subjected to mock B. elliptica treatment or B. elliptica infection for 6 and 24 h. By high-throughput sequencing, 71 known miRNAs belonging to 47 conserved families and 24 novel miRNAs were identified, of which 18 miRNAs were downregulated and 13 were upregulated in response to B. elliptica. Moreover, based on the lily mRNA transcriptome, 22 targets for 9 known miRNAs and 1 novel miRNA were identified by the degradome sequencing approach. Most target genes of the B. elliptica-responsive miRNAs were involved in metabolic processes, with a few encoding transcription factors, including ELONGATION FACTOR 1 ALPHA (EF1a) and TEOSINTE BRANCHED1/CYCLOIDEA/PROLIFERATING CELL FACTOR 2 (TCP2). Furthermore, the expression patterns of a set of B. elliptica-responsive miRNAs and their targets were validated by quantitative real-time PCR. This study represents the first transcriptome-based analysis of miRNAs responsive to B. elliptica and their targets in lily. The results reveal the possible regulatory roles of miRNAs and their targets in the lily-B. elliptica interaction, which will extend our understanding of the mechanisms of this disease in lily.

  14. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    Directory of Open Access Journals (Sweden)

    Krithika Bhuvaneshwar

    2015-01-01

    Full Text Available Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte-scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel, and it also helps meet the scale-out analysis needs of modern translational genomics research.

  15. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into high dimensional neuroinformatic representations containing O(10^3-10^4) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science, including genomics as well as social networks. Such high throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  16. ORIGINAL ARTICLES Cost-effectiveness analysis for priority-setting ...

    African Journals Online (AJOL)

    health outcomes and wasted resources.4-5 It was found that the cost-effectiveness of South ... Priorities for Developing Countries Project was that emergency (and even some elective) ... to control air pollutants found that in South Africa the most cost-effective ... outdoor air pollution in South Africa in 2000. S Afr Med J ...

  17. NIR and Py-mbms coupled with multivariate data analysis as a high-throughput biomass characterization technique : a review

    Directory of Open Access Journals (Sweden)

    Li Xiao

    2014-08-01

    Full Text Available The use of lignocellulosic biomass as a feedstock for renewable energy production is being optimized globally. Biomass is a complex mixture of cellulose, hemicelluloses, lignins, extractives, and proteins, as well as inorganic salts. Cell wall compositional analysis for biomass characterization is laborious and time consuming. In order to characterize biomass quickly and efficiently, several high-throughput technologies have been successfully developed. Among them, near infrared spectroscopy (NIR) and pyrolysis-molecular beam mass spectrometry (Py-mbms) are complementary tools capable of evaluating a large number of raw or modified biomass samples in a short period of time. NIR shows vibrations associated with specific chemical structures, whereas Py-mbms depicts the full range of fragments from the decomposition of biomass. Both NIR vibrations and Py-mbms peaks are assigned to possible chemical functional groups and molecular structures, providing complementary chemical insight into biomaterials. However, it is challenging to interpret the informative results because of the large number of overlapping bands or decomposition fragments contained in the spectra. In order to improve the efficiency of data analysis, multivariate analysis tools have been adapted to define the significant correlations among data variables, so that the large number of bands/peaks can be replaced by a small number of reconstructed variables representing the original variation. Reconstructed data variables are used for sample comparison (principal component analysis) and for building regression models (partial least squares regression) between biomass chemical structures and properties of interest. In this review, the important biomass chemical structures measured by NIR and Py-mbms are summarized, and the advantages and disadvantages of conventional data analysis methods and multivariate data analysis methods are introduced, compared and evaluated.
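
    As a rough illustration of the multivariate workflow the review describes (PCA for sample comparison, PLS regression for property prediction), the following Python sketch uses scikit-learn on synthetic spectra; the array shapes and the lignin-content response are invented for illustration.

        # PCA for sample comparison and PLS regression linking spectra to a
        # property of interest, as described above. Data are synthetic.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        spectra = rng.normal(size=(60, 500))   # 60 samples x 500 NIR/Py-mbms variables
        lignin = rng.uniform(15, 30, size=60)  # hypothetical lignin content (%)

        scores = PCA(n_components=2).fit_transform(spectra)  # reconstructed variables
        pls = PLSRegression(n_components=5).fit(spectra, lignin)
        print(scores[:3])                        # coordinates for sample comparison
        print(pls.predict(spectra[:3]).ravel())  # predicted lignin content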

  18. Systematic Analysis of the Association between Gut Flora and Obesity through High-Throughput Sequencing and Bioinformatics Approaches

    Directory of Open Access Journals (Sweden)

    Chih-Min Chiu

    2014-01-01

    Full Text Available Eighty-one stool samples from Taiwanese individuals were collected for analysis of the association between the gut flora and obesity. The supervised analysis showed that the most abundant genera of bacteria in normal samples (from people with a body mass index (BMI) ≤ 24) were Bacteroides (27.7%), Prevotella (19.4%), Escherichia (12%), Phascolarctobacterium (3.9%), and Eubacterium (3.5%). The most abundant genera of bacteria in case samples (BMI ≥ 27) were Bacteroides (29%), Prevotella (21%), Escherichia (7.4%), Megamonas (5.1%), and Phascolarctobacterium (3.8%). A principal coordinate analysis (PCoA) demonstrated that normal samples were clustered more compactly than case samples. An unsupervised analysis demonstrated that bacterial communities in the gut were clustered into two main groups: N-like and OB-like groups. Remarkably, most normal samples (78%) were clustered in the N-like group, and most case samples (81%) were clustered in the OB-like group (Fisher's exact test, P = 1.61E-07). The results showed that bacterial communities in the gut were highly associated with obesity. This is the first study in Taiwan to investigate the association between human gut flora and obesity, and the results provide new insights into the correlation of bacteria with the rising trend in obesity.
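
    The association test quoted above is a 2x2 Fisher's exact test of cluster membership against BMI group; a minimal sketch follows, with hypothetical counts chosen only to resemble the percentages quoted, not the study's actual table.

        # 2x2 Fisher's exact test of cluster membership (N-like vs OB-like)
        # against BMI group. Counts are hypothetical placeholders.
        from scipy.stats import fisher_exact

        #        N-like  OB-like
        table = [[32, 9],    # normal samples (BMI <= 24)
                 [8, 32]]    # case samples (BMI >= 27)
        odds_ratio, p_value = fisher_exact(table)
        print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.2e}")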

  19. Utility of lab-on-a-chip technology for high-throughput nucleic acid and protein analysis

    DEFF Research Database (Denmark)

    Hawtin, Paul; Hardern, Ian; Wittig, Rainer

    2005-01-01

    On-chip electrophoresis can provide size separations of nucleic acids and proteins similar to more traditional slab gel electrophoresis. Lab-on-a-chip (LoaC) systems utilize on-chip electrophoresis in conjunction with sizing calibration, sensitive detection schemes, and sophisticated data analysis.

  20. High-throughput continuous cryopump

    International Nuclear Information System (INIS)

    Foster, C.A.

    1986-01-01

    A cryopump with a unique method of regeneration which allows continuous operation at high throughput has been constructed and tested. Deuterium was pumped continuously at a throughput of 30 Torr·L/s at a speed of 2000 L/s and a compression ratio of 200. Argon was pumped at a throughput of 60 Torr·L/s at a speed of 1275 L/s. To produce continuous operation of the pump, a method of regeneration that does not thermally cycle the pump is employed. A small chamber (the "snail") passes over the pumping surface and removes the frost from it, either by mechanical action with a scraper or by local heating. The material removed is topologically in a secondary vacuum system with low conductance into the primary vacuum; thus, the exhaust can be pumped at pressures up to an effective compression ratio determined by the ratio of the pumping speed to the leakage conductance of the snail. The pump, which is all-metal-sealed and dry and which regenerates every 60 s, would be an ideal system for pumping tritium. Potential fusion applications are for pump limiters, for repeating pneumatic pellet injection lines, and for the centrifuge pellet injector spin tank, all of which will require pumping tritium at high throughput. Industrial applications requiring ultraclean pumping of corrosive gases at high throughput, such as the reactive ion etch semiconductor process, may also be feasible
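
    The quoted operating points follow the standard vacuum relation p = Q/S (inlet pressure equals throughput divided by pumping speed); a quick check of the deuterium figures, offered as a sketch rather than anything taken from the paper:

        # Sanity check of the deuterium figures above via p = Q / S.
        Q = 30.0    # throughput, Torr·L/s
        S = 2000.0  # pumping speed, L/s
        p_inlet = Q / S
        print(f"inlet pressure ~ {p_inlet:.1e} Torr")          # 1.5e-02 Torr
        print(f"exhaust pressure ~ {p_inlet * 200:.1f} Torr")  # with compression ratio 200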

  1. KUJIRA, a package of integrated modules for systematic and interactive analysis of NMR data directed to high-throughput NMR structure studies

    International Nuclear Information System (INIS)

    Kobayashi, Naohiro; Iwahara, Junji; Koshiba, Seizo; Tomizawa, Tadashi; Tochio, Naoya; Guentert, Peter; Kigawa, Takanori; Yokoyama, Shigeyuki

    2007-01-01

    The recent expansion of structural genomics has increased the demands for quick and accurate protein structure determination by NMR spectroscopy. The conventional strategy without an automated protocol can no longer satisfy the needs of high-throughput application to a large number of proteins, with each data set including many NMR spectra, chemical shifts, NOE assignments, and calculated structures. We have developed the new software KUJIRA, a package of integrated modules for the systematic and interactive analysis of NMR data, which is designed to reduce the tediousness of organizing and manipulating a large number of NMR data sets. In combination with CYANA, the program for automated NOE assignment and structure determination, we have established a robust and highly optimized strategy for comprehensive protein structure analysis. An application of KUJIRA in accordance with our new strategy was carried out by a non-expert in NMR structure analysis, demonstrating that the accurate assignment of the chemical shifts and a high-quality structure of a small protein can be completed in a few weeks. The high completeness of the chemical shift assignment and the NOE assignment achieved by the systematic analysis using KUJIRA and CYANA led, in practice, to increased reliability of the determined structure.

  2. A flow cytometry-based method for a high-throughput analysis of drug-stabilized topoisomerase II cleavage complexes in human cells.

    Science.gov (United States)

    de Campos-Nebel, Marcelo; Palmitelli, Micaela; González-Cid, Marcela

    2016-09-01

    Topoisomerase II (Top2) is an important target for anticancer therapy. A variety of drugs that poison Top2, including several epipodophyllotoxins, anthracyclines, and anthracenediones, are widely used in the clinic for both hematologic and solid tumors. The poisoning of Top2 involves the formation of a reaction intermediate Top2-DNA, termed the Top2 cleavage complex (Top2cc), which persists in the presence of the drug and involves a 5' end of DNA covalently bound to a tyrosine of the enzyme through a phosphodiester group. Drug-induced Top2cc leads to Top2-linked DNA breaks, which are chiefly responsible for the drugs' cytotoxicity. While biochemical detection is very laborious, quantification of drug-induced Top2cc by immunofluorescence-based microscopy techniques is time consuming and requires extensive image segmentation for the analysis of a small population of cells. Here, we developed a flow cytometry-based method for the analysis of drug-induced Top2cc. This method allows rapid analysis of a high number of cells in their cell cycle phase context. Moreover, it can be applied to almost any human cell type, including clinical samples. The methodology is useful for high-throughput analysis of drugs that poison Top2, allowing not only discrimination of the targeted Top2 isoform but also tracking of its removal. © 2016 International Society for Advancement of Cytometry.

  3. GLINT: a user-friendly toolset for the analysis of high-throughput DNA-methylation array data.

    Science.gov (United States)

    Rahmani, Elior; Yedidim, Reut; Shenhav, Liat; Schweiger, Regev; Weissbrod, Omer; Zaitlen, Noah; Halperin, Eran

    2017-06-15

    GLINT is a user-friendly command-line toolset for fast analysis of genome-wide DNA methylation data generated using the Illumina human methylation arrays. GLINT, which does not require any programming proficiency, allows easy execution of an epigenome-wide association study analysis pipeline under different models while accounting for known confounders in methylation data. GLINT is command-line software, freely available at https://github.com/cozygene/glint/releases . It requires Python 2.7 and several freely available Python packages. Further information and documentation, as well as a quick start tutorial, are available at http://glint-epigenetics.readthedocs.io . Contact: elior.rahmani@gmail.com or ehalperin@cs.ucla.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  4. High-Throughput Live-Cell Microscopy Analysis of Association Between Chromosome Domains and the Nucleolus in S. cerevisiae.

    Science.gov (United States)

    Wang, Renjie; Normand, Christophe; Gadal, Olivier

    2016-01-01

    Spatial organization of the genome has important impacts on all aspects of chromosome biology, including transcription, replication, and DNA repair. Frequent interactions of some chromosome domains with specific nuclear compartments, such as the nucleolus, are now well documented using genome-scale methods. However, direct measurement of the distance and interaction frequency between loci requires microscopic observation of specific genomic domains and the nucleolus, followed by image analysis to allow quantification. The fluorescent repressor operator system (FROS) is an invaluable method to fluorescently tag DNA sequences and investigate chromosome position and dynamics in living cells. This chapter describes a combination of methods to define the motion and region of confinement of a locus relative to the nucleolus in the cell's nucleus, from fluorescence acquisition to automated image analysis using two dedicated pipelines.
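
    Analyses of locus motion and confinement of this kind typically reduce to a mean squared displacement (MSD) computation over the tracked positions. The sketch below assumes (x, y) coordinates have already been extracted by image-analysis pipelines such as those the chapter describes; the trajectory here is synthetic.

        # Mean squared displacement (MSD) from a tracked locus trajectory;
        # a plateau at large lags suggests confined motion. Synthetic data.
        import numpy as np

        rng = np.random.default_rng(1)
        traj = np.cumsum(rng.normal(scale=0.05, size=(300, 2)), axis=0)  # (x, y), microns

        def msd(traj: np.ndarray, max_lag: int) -> np.ndarray:
            """MSD(lag) averaged over all start times, for lags 1..max_lag."""
            return np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                             for lag in range(1, max_lag + 1)])

        print(msd(traj, 50)[:5])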

  5. High-Throughput Analysis of Sucrose Fatty Acid Esters by Supercritical Fluid Chromatography/Tandem Mass Spectrometry

    Science.gov (United States)

    Hori, Katsuhito; Tsumura, Kazunobu; Fukusaki, Eiichiro; Bamba, Takeshi

    2014-01-01

    Supercritical fluid chromatography (SFC) coupled with triple quadrupole mass spectrometry was applied to the profiling of sucrose fatty acid esters (SEs). The SFC conditions (column and modifier gradient) were optimized for the effective separation of SEs; in the column test, a silica gel reversed-phase column was selected. The method was then used for the detailed characterization of commercial SEs and the successful analysis of SEs containing different fatty acids. The present method allowed fast, high-resolution separation of monoesters to tetra-esters in a shorter time (15 min) than conventional high-performance liquid chromatography. The applicability of our method for the analysis of SEs was thus demonstrated. PMID:26819875

  6. High-throughput sequencing and pathway analysis reveal alteration of the pituitary transcriptome by 17α-ethynylestradiol (EE2) in female coho salmon, Oncorhynchus kisutch

    Energy Technology Data Exchange (ETDEWEB)

    Harding, Louisa B. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Schultz, Irvin R. [Battelle, Marine Sciences Laboratory – Pacific Northwest National Laboratory, 1529 West Sequim Bay Road, Sequim, WA 98382 (United States); Goetz, Giles W. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Luckenbach, J. Adam [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Young, Graham [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Goetz, Frederick W. [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, Manchester Research Station, P.O. Box 130, Manchester, WA 98353 (United States); Swanson, Penny, E-mail: penny.swanson@noaa.gov [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States)

    2013-10-15

    Highlights: •Studied impacts of ethynylestradiol (EE2) exposure on salmon pituitary transcriptome. •High-throughput sequencing, RNAseq, and pathway analysis were performed. •EE2 altered mRNAs for genes in circadian rhythm, GnRH, and TGFβ signaling pathways. •LH and FSH beta subunit mRNAs were most highly up- and down-regulated by EE2, respectively. •Estrogens may alter processes associated with reproductive timing in salmon. -- Abstract: Considerable research has been done on the effects of endocrine disrupting chemicals (EDCs) on reproduction and gene expression in the brain, liver and gonads of teleost fish, but information on impacts to the pituitary gland is still limited despite its central role in regulating reproduction. The aim of this study was to further our understanding of the potential effects of natural and synthetic estrogens on the brain–pituitary–gonad axis in fish by determining the effects of 17α-ethynylestradiol (EE2) on the pituitary transcriptome. We exposed sub-adult coho salmon (Oncorhynchus kisutch) to 0 or 12 ng EE2/L for up to 6 weeks, and effects on the pituitary transcriptome of females were assessed using high-throughput Illumina® sequencing, RNA-Seq and pathway analysis. After 1 or 6 weeks, 218 and 670 contiguous sequences (contigs), respectively, were differentially expressed in pituitaries of EE2-exposed fish relative to control. Two of the most highly up- and down-regulated contigs were luteinizing hormone β subunit (241-fold and 395-fold at 1 and 6 weeks, respectively) and follicle-stimulating hormone β subunit (−3.4-fold at 6 weeks). Additional contigs related to gonadotropin synthesis and release were differentially expressed in EE2-exposed fish relative to controls. These included contigs involved in gonadotropin releasing hormone (GNRH) and transforming growth factor-β signaling. There was an over-representation of significantly affected contigs in 33 and 18 canonical pathways at 1 and 6 weeks

  7. High-throughput sequencing and pathway analysis reveal alteration of the pituitary transcriptome by 17α-ethynylestradiol (EE2) in female coho salmon, Oncorhynchus kisutch

    International Nuclear Information System (INIS)

    Harding, Louisa B.; Schultz, Irvin R.; Goetz, Giles W.; Luckenbach, J. Adam; Young, Graham; Goetz, Frederick W.; Swanson, Penny

    2013-01-01

    Highlights: •Studied impacts of ethynylestradiol (EE2) exposure on salmon pituitary transcriptome. •High-throughput sequencing, RNAseq, and pathway analysis were performed. •EE2 altered mRNAs for genes in circadian rhythm, GnRH, and TGFβ signaling pathways. •LH and FSH beta subunit mRNAs were most highly up- and down-regulated by EE2, respectively. •Estrogens may alter processes associated with reproductive timing in salmon. -- Abstract: Considerable research has been done on the effects of endocrine disrupting chemicals (EDCs) on reproduction and gene expression in the brain, liver and gonads of teleost fish, but information on impacts to the pituitary gland is still limited despite its central role in regulating reproduction. The aim of this study was to further our understanding of the potential effects of natural and synthetic estrogens on the brain–pituitary–gonad axis in fish by determining the effects of 17α-ethynylestradiol (EE2) on the pituitary transcriptome. We exposed sub-adult coho salmon (Oncorhynchus kisutch) to 0 or 12 ng EE2/L for up to 6 weeks, and effects on the pituitary transcriptome of females were assessed using high-throughput Illumina® sequencing, RNA-Seq and pathway analysis. After 1 or 6 weeks, 218 and 670 contiguous sequences (contigs), respectively, were differentially expressed in pituitaries of EE2-exposed fish relative to control. Two of the most highly up- and down-regulated contigs were luteinizing hormone β subunit (241-fold and 395-fold at 1 and 6 weeks, respectively) and follicle-stimulating hormone β subunit (−3.4-fold at 6 weeks). Additional contigs related to gonadotropin synthesis and release were differentially expressed in EE2-exposed fish relative to controls. These included contigs involved in gonadotropin releasing hormone (GNRH) and transforming growth factor-β signaling. There was an over-representation of significantly affected contigs in 33 and 18 canonical pathways at 1 and 6 weeks

  8. DNA-, RNA-, and Protein-Based Stable-Isotope Probing for High-Throughput Biomarker Analysis of Active Microorganisms.

    Science.gov (United States)

    Jameson, Eleanor; Taubert, Martin; Coyotzi, Sara; Chen, Yin; Eyice, Özge; Schäfer, Hendrik; Murrell, J Colin; Neufeld, Josh D; Dumont, Marc G

    2017-01-01

    Stable-isotope probing (SIP) enables researchers to target active populations within complex microbial communities, which is achieved by providing growth substrates enriched in heavy isotopes, usually in the form of 13C, 18O, or 15N. After growth on the substrate and subsequent extraction of microbial biomarkers, typically nucleic acids or proteins, the SIP technique is used for the recovery and analysis of isotope-labeled biomarkers from active microbial populations. In the years following the initial development of DNA- and RNA-based SIP, it was common practice to characterize labeled populations by targeted gene analysis. Such approaches usually involved fingerprint-based analyses or sequencing of clone libraries containing 16S rRNA genes or functional marker gene amplicons. Although molecular fingerprinting remains a valuable approach for rapid confirmation of isotope labeling, recent advances in sequencing technology mean that it is possible to obtain affordable and comprehensive amplicon profiles, metagenomes, or metatranscriptomes from SIP experiments. Not only can the abundance of microbial groups be inferred from metagenomes, but researchers can bin, assemble, and explore individual genomes to build hypotheses about the metabolic capabilities of labeled microorganisms. Analysis of labeled mRNA is a more recent advance that can provide independent metatranscriptome-based analysis of active microorganisms. The power of metatranscriptomics is that mRNA abundance often correlates closely with the corresponding activity of encoded enzymes, thus providing insight into microbial metabolism at the time of sampling. Together, these advances have improved the sensitivity of SIP methods and allow the use of labeled substrates at ecologically relevant concentrations. Particularly as methods improve and costs continue to drop, we expect that the integration of SIP with multiple omics-based methods will become a prevalent component of microbial ecology studies.

  9. Helios: History and Anatomy of a Successful In-House Enterprise High-Throughput Screening and Profiling Data Analysis System.

    Science.gov (United States)

    Gubler, Hanspeter; Clare, Nicholas; Galafassi, Laurent; Geissler, Uwe; Girod, Michel; Herr, Guy

    2018-06-01

    We describe the main characteristics of the Novartis Helios data analysis software system (Novartis, Basel, Switzerland) for plate-based screening and profiling assays, which was designed and built about 11 years ago. It has been in productive use for more than 10 years and is one of the important standard software applications running for a large user community at all Novartis Institutes for BioMedical Research sites globally. A high degree of automation is reached by embedding the data analysis capabilities into a software ecosystem that deals with the management of samples, plates, and result data files, including automated data loading. The application provides a series of analytical procedures, ranging from very simple to advanced, which can easily be assembled by users in very flexible ways. This also includes the automatic derivation of a large set of quality control (QC) characteristics at every step. Any of the raw, intermediate, and final results and QC-relevant quantities can be easily explored through linked visualizations. Links to global assay metadata management, data warehouses, and an electronic lab notebook system are in place. Automated transfer of relevant data to data warehouses and electronic lab notebook systems are also implemented.

  10. High throughput, cell type-specific analysis of key proteins in human endometrial biopsies of women from fertile and infertile couples

    Science.gov (United States)

    Leach, Richard E.; Jessmon, Philip; Coutifaris, Christos; Kruger, Michael; Myers, Evan R.; Ali-Fehmi, Rouba; Carson, Sandra A.; Legro, Richard S.; Schlaff, William D.; Carr, Bruce R.; Steinkampf, Michael P.; Silva, Susan; Leppert, Phyllis C.; Giudice, Linda; Diamond, Michael P.; Armant, D. Randall

    2012-01-01

    BACKGROUND Although histological dating of endometrial biopsies provides little help for prediction or diagnosis of infertility, analysis of individual endometrial proteins, proteomic profiling and transcriptome analysis have suggested several biomarkers with altered expression arising from intrinsic abnormalities, inadequate stimulation by or in response to gonadal steroids, or altered function due to systemic disorders. The objective of this study was to delineate the developmental dynamics of potentially important proteins in the secretory phase of the menstrual cycle, utilizing a collection of endometrial biopsies from women of fertile (n = 89) and infertile (n = 89) couples. METHODS AND RESULTS Progesterone receptor-B (PGR-B), leukemia inhibitory factor, glycodelin/progestagen-associated endometrial protein (PAEP), homeobox A10, heparin-binding EGF-like growth factor, calcitonin and chemokine ligand 14 (CXCL14) were measured using a high-throughput, quantitative immunohistochemical method. Significant cyclic and tissue-specific regulation was documented for each protein, as well as their dysregulation in women of infertile couples. Infertile patients demonstrated a delay early in the secretory phase in the decline of PGR-B (P ...). Cell type-specific localization provided important insights into the potential roles of these proteins in normal and pathological development of the endometrium that are not attainable from transcriptome analysis, establishing a basis for biomarker, diagnostic and targeted drug development for women with infertility.

  11. Foundations of data-intensive science: Technology and practice for high throughput, widely distributed, data management and analysis systems

    Science.gov (United States)

    Johnston, William; Ernst, M.; Dart, E.; Tierney, B.

    2014-04-01

    Today's large-scale science projects involve world-wide collaborations that depend on moving massive amounts of data from an instrument to potentially thousands of computing and storage systems at hundreds of collaborating institutions to accomplish their science. This is true for ATLAS and CMS at the LHC, and it is true for the climate sciences, Belle-II at the KEK collider, genome sciences, the SKA radio telescope, and ITER, the international fusion energy experiment. DOE's Office of Science has been collecting science discipline and instrument requirements for network based data management and analysis for more than a decade. As a result, certain key issues are seen across essentially all science disciplines that rely on the network for significant data transfer, even if the data quantities are modest compared to projects like the LHC experiments. These issues are what this talk addresses; to wit: 1. Optical signal transport advances enabling 100 Gb/s circuits that span the globe on optical fiber, with each fiber carrying 100 such channels; 2. Network router and switch requirements to support high-speed international data transfer; 3. Data transport (TCP is still the norm) requirements to support high-speed international data transfer (e.g. error-free transmission; see the sketch below); 4. Network monitoring and testing techniques and infrastructure to maintain the required error-free operation of the many R&E networks involved in international collaborations; 5. Operating system evolution to support very high-speed network I/O; 6. New network architectures and services in the LAN (campus) and WAN networks to support data-intensive science; 7. Data movement and management techniques and software that can maximize the throughput on the network connections between distributed data handling systems; and 8. New approaches to widely distributed workflow systems that can support the data movement and analysis required by the science. All of these areas must be addressed to enable large
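
    The stress in item 3 on error-free transmission reflects how sharply steady-state TCP throughput degrades with packet loss on long round-trip paths. A minimal sketch using the well-known Mathis approximation (rate ~ MSS / (RTT * sqrt(loss))); the path parameters are hypothetical.

        # Mathis approximation for steady-state TCP throughput,
        # rate ~ MSS / (RTT * sqrt(loss)). Path parameters are hypothetical.
        from math import sqrt

        def tcp_throughput_bps(mss_bytes: float, rtt_s: float, loss: float) -> float:
            return (mss_bytes * 8) / (rtt_s * sqrt(loss))

        # Trans-Atlantic path: 1460-byte MSS, 100 ms RTT.
        for loss in (1e-3, 1e-5, 1e-7):
            mbps = tcp_throughput_bps(1460, 0.1, loss) / 1e6
            print(f"loss={loss:.0e}: ~{mbps:,.1f} Mb/s")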

  12. Network analysis of the microorganism in 25 Danish wastewater treatment plants over 7 years using high-throughput amplicon sequencing

    DEFF Research Database (Denmark)

    Albertsen, Mads; Larsen, Poul; Saunders, Aaron Marc

    Wastewater treatment is the world's largest biotechnological process and a perfect model system for microbial ecology, as the habitat is well defined and replicated all over the world. Extensive investigations on Danish wastewater treatment plants using fluorescent in situ hybridization have ... a year, totaling over 400 samples. All samples were subjected to 16S rDNA amplicon sequencing using V13 primers on the Illumina MiSeq platform (2x300 bp) to a depth of at least 20,000 quality-filtered reads per sample. The OTUs were assigned taxonomy based on a manually curated version of the Greengenes ... to link sludge and floc properties to the microbial communities. All data were subjected to extensive network analysis and multivariate statistics through R. The 16S amplicon results confirmed the findings of relatively few core groups of organisms shared by all the wastewater treatment plants ...

  13. High throughput analysis reveals dissociable gene expression profiles in two independent neural systems involved in the regulation of social behavior

    Directory of Open Access Journals (Sweden)

    Stevenson Tyler J

    2012-10-01

    Full Text Available Abstract Background Production of contextually appropriate social behaviors involves integrated activity across many brain regions. Many songbird species produce complex vocalizations called ‘songs’ that serve to attract potential mates, defend territories, and/or maintain flock cohesion. A series of discrete interconnected brain regions is essential for the successful production of song. The probability and intensity of singing behavior are influenced by the reproductive state. The objectives of this study were to compare the broad changes in gene expression in brain regions that control song production with those in a brain region that governs the reproductive state. Results We show using microarray cDNA analysis that two discrete brain systems that are both involved in governing singing behavior show markedly different gene expression profiles. We found that cortical and basal ganglia-like brain regions that control the socio-motor production of song in birds exhibit a categorical switch in gene expression that was dependent on their reproductive state. This pattern is in stark contrast to the pattern of expression observed in a hypothalamic brain region that governs the neuroendocrine control of reproduction. Subsequent gene ontology analysis revealed marked variation in the functional categories of active genes dependent on reproductive state and anatomical localization. HVC, one cortical-like structure, displayed significant gene expression changes associated with microtubule and neurofilament cytoskeleton organization, MAP kinase activity, and steroid hormone receptor complex activity. The transitions observed in the preoptic area, a nucleus that governs the motivation to engage in singing, exhibited variation in functional categories that included thyroid hormone receptor activity, and epigenetic and angiogenic processes. Conclusions These findings highlight the importance of considering the temporal patterns of gene expression

  14. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation, and its applications in high-throughput screening have progressed rapidly in recent years. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), and EESI (extractive electrospray ionization), in high-throughput screening (HTS).

  15. Towards understanding of magnetization reversal in Nd-Fe-B nanocomposites: analysis by high-throughput micromagnetic simulations

    Science.gov (United States)

    Erokhin, Sergey; Berkov, Dmitry; Ito, Masaaki; Kato, Akira; Yano, Masao; Michels, Andreas

    2018-03-01

    We demonstrate how micromagnetic simulations can be employed to characterize and analyze the magnetic microstructure of nanocomposites. For the example of nanocrystalline Nd-Fe-B, which is a potential material for future permanent-magnet applications, we have compared three different models for the micromagnetic analysis of this material class: (i) a description of the nanocomposite microstructure in terms of Stoner-Wohlfarth particles with and without the magnetodipolar interaction; (ii) a model based on the core-shell representation of the nanograins; (iii) the latter model including a contribution of superparamagnetic clusters. The relevant parameter spaces have been systematically scanned with the aim of establishing which micromagnetic approach can most adequately describe experimental data for this material. According to our results, only the last, most sophisticated model is able to provide an excellent agreement with the measured hysteresis loop. The presented methodology is generally applicable to multiphase magnetic nanocomposites and it highlights the complex interrelationship between the microstructure, magnetic interactions, and the macroscopic magnetic properties.
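
    For readers unfamiliar with model (i), a Stoner-Wohlfarth particle is a single-domain grain whose moment angle minimizes the sum of uniaxial anisotropy and Zeeman energies; sweeping the field and following the local minimum yields a hysteresis loop. The sketch below implements that textbook model in reduced units; it is an illustration, not the authors' simulation code.

        # One Stoner-Wohlfarth particle: the moment angle theta minimizes
        # e(theta) = sin^2(theta - psi)/2 - h*cos(theta), with easy-axis
        # angle psi and reduced field h. Hysteresis comes from following
        # the local minimum as h is swept.
        import numpy as np

        def equilibrium(h: float, psi: float, theta0: float) -> float:
            thetas = np.linspace(0.0, 2.0 * np.pi, 4001)
            e = 0.5 * np.sin(thetas - psi) ** 2 - h * np.cos(thetas)
            minima = [thetas[i] for i in range(1, len(thetas) - 1)
                      if e[i] < e[i - 1] and e[i] < e[i + 1]]
            # pick the local minimum nearest the previous orientation
            return min(minima, key=lambda t: abs(np.angle(np.exp(1j * (t - theta0)))))

        theta, psi = np.pi, np.pi / 4        # start magnetized against the field
        for h in np.linspace(-1.0, 1.0, 9):  # sweep the reduced field upward
            theta = equilibrium(h, psi, theta)
            print(f"h={h:+.2f}  m={np.cos(theta):+.3f}")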

  16. High-throughput analysis by SP-LDI-MS for fast identification of adulterations in commercial balsamic vinegars

    Energy Technology Data Exchange (ETDEWEB)

    Guerreiro, Tatiane Melina; Oliveira, Diogo Noin de; Ferreira, Mônica Siqueira; Catharino, Rodrigo Ramos, E-mail: rrc@fcm.unicamp.br

    2014-08-01

    Highlights: • Rapid identification of adulteration in balsamic vinegars. • Minimal sample preparation. • No matrix required for assisting laser desorption/ionization. • Fast sample discrimination by multivariate data analysis. - Abstract: Balsamic vinegar (BV) is a typical and valuable Italian product, appreciated worldwide for its characteristic flavors and potential health benefits. Several studies have been conducted to assess the physicochemical and microbial composition of BV, as well as its beneficial properties. Due to highly disseminated claims of antioxidant, antihypertensive and antiglycemic properties, BV is a known target for fraud and adulteration. Product authentication, certifying its origin (region or country) and thus the processing conditions, is therefore becoming a growing concern. Striving for fraud reduction as well as quality and safety assurance, reliable analytical strategies to rapidly evaluate BV quality are of great interest, also from an economic point of view. This work employs silica plate laser desorption/ionization mass spectrometry (SP-LDI-MS) for fast chemical profiling of commercial BV samples with protected geographical indication (PGI) and identification of samples adulterated with low-priced vinegars, namely apple, alcohol and red/white wines.

  17. High-throughput analysis by SP-LDI-MS for fast identification of adulterations in commercial balsamic vinegars

    International Nuclear Information System (INIS)

    Guerreiro, Tatiane Melina; Oliveira, Diogo Noin de; Ferreira, Mônica Siqueira; Catharino, Rodrigo Ramos

    2014-01-01

    Highlights: • Rapid identification of adulteration in balsamic vinegars. • Minimal sample preparation. • No matrix required for assisting laser desorption/ionization. • Fast sample discrimination by multivariate data analysis. - Abstract: Balsamic vinegar (BV) is a typical and valuable Italian product, appreciated worldwide for its characteristic flavors and potential health benefits. Several studies have been conducted to assess the physicochemical and microbial composition of BV, as well as its beneficial properties. Due to highly disseminated claims of antioxidant, antihypertensive and antiglycemic properties, BV is a known target for fraud and adulteration. Product authentication, certifying its origin (region or country) and thus the processing conditions, is therefore becoming a growing concern. Striving for fraud reduction as well as quality and safety assurance, reliable analytical strategies to rapidly evaluate BV quality are of great interest, also from an economic point of view. This work employs silica plate laser desorption/ionization mass spectrometry (SP-LDI-MS) for fast chemical profiling of commercial BV samples with protected geographical indication (PGI) and identification of samples adulterated with low-priced vinegars, namely apple, alcohol and red/white wines.

  18. Cost effectiveness analysis of indoor radon control measures

    International Nuclear Information System (INIS)

    Fujimoto, Kenzo

    1989-01-01

    The problem of radon-222 in buildings as a contributor to radiation exposure is described. Five different control methods and the dose reductions that would result from each are analysed. The annualized cost of each control measure was evaluated, and the cost-effectiveness of each was calculated on the basis of dollars per person-sievert of dose reduction. The use of unipolar ion generators for particle removal appears to be the most cost-effective measure, and the use of ceiling fans to increase air circulation the least cost-effective. 3 figs., 1 tab
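
    The ranking metric described is simply annualized cost divided by the collective dose averted per year; a minimal sketch with hypothetical inputs (not the paper's figures):

        # Annualized cost per unit of collective dose averted
        # ($ per person-Sv). Inputs are hypothetical placeholders.
        measures = {
            "unipolar ion generator": (120.0, 0.040),  # ($/year, person-Sv averted/year)
            "ceiling fan": (60.0, 0.002),
        }
        for name, (annual_cost, dose_averted) in measures.items():
            print(f"{name}: ${annual_cost / dose_averted:,.0f} per person-Sv")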

  19. Highly Sensitive and High-Throughput Method for the Analysis of Bisphenol Analogues and Their Halogenated Derivatives in Breast Milk.

    Science.gov (United States)

    Niu, Yumin; Wang, Bin; Zhao, Yunfeng; Zhang, Jing; Shao, Bing

    2017-12-06

    The structural analogs of bisphenol A (BPA) and their halogenated derivatives (together termed BPs) have been found in the environment, food, and even the human body. Limited research has shown that some of them exhibit toxicities similar to or even greater than that of BPA. Therefore, adverse health effects from low-dose exposure to BPs in early life are expected in humans. Breast milk is an excellent matrix that can reflect fetuses' and babies' exposure to contaminants. Some of the emerging BPs may be present at trace or ultratrace levels in humans. However, existing analytical methods for breast milk cannot quantify these BPs simultaneously with high sensitivity using a small sample weight, which is important for human biomonitoring studies. In this paper, a method based on Bond Elut Enhanced Matrix Removal-Lipid purification, pyridine-3-sulfonyl chloride derivatization, and liquid chromatography electrospray tandem mass spectrometry was developed. The method requires only a small quantity of sample (200 μL) and allowed the simultaneous determination of 24 BPs in breast milk with ultrahigh sensitivity. The limits of quantitation of the proposed method were 0.001-0.200 μg L⁻¹, which were 1-6.7 times lower than the only previous study of the simultaneous analysis of bisphenol analogs in breast milk, based on a 3 g sample weight. The mean recoveries ranged from 86.11% to 119.05% with relative standard deviation (RSD) ≤ 19.5% (n = 6). Matrix effects were within 20% (RSD ...). Besides BPA, bisphenol F (BPF), bisphenol S (BPS), and bisphenol AF (BPAF) were detected. BPA was still the dominant BP, followed by BPF. This is the first report describing the occurrence of BPF and BPAF in breast milk.

  20. MG-RAST version 4-lessons learned from a decade of low-budget ultra-high-throughput metagenome analysis.

    Science.gov (United States)

    Meyer, Folker; Bagchi, Saurabh; Chaterji, Somali; Gerlach, Wolfgang; Grama, Ananth; Harrison, Travis; Paczian, Tobias; Trimble, William L; Wilke, Andreas

    2017-09-26

    As technologies change, MG-RAST is adapting: newly available software is being included to improve accuracy and performance. As a computational service constantly running large-volume scientific workflows, MG-RAST is the right location to perform benchmarking and implement algorithmic or platform improvements, in many cases involving trade-offs between specificity, sensitivity and run-time cost. The work in [Glass EM, Dribinsky Y, Yilmaz P, et al. ISME J 2014;8:1-3] is an example; we use existing well-studied data sets as gold standards representing different environments and different technologies to evaluate any changes to the pipeline. Currently, we use well-understood data sets in MG-RAST as a platform for benchmarking. The use of artificial data sets for pipeline performance optimization has not added value, as these data sets do not present the same challenges as real-world data sets. In addition, the MG-RAST team welcomes suggestions for improvements of the workflow. We are currently working on versions 4.02 and 4.1, both of which contain significant input from the community and our partners; these versions will enable double barcoding, support stronger inferences from longer-read technologies, and increase throughput while maintaining sensitivity by using Diamond and SortMeRNA. On the technical platform side, the MG-RAST team intends to support the Common Workflow Language as a standard to specify bioinformatics workflows, both to facilitate development and to enable efficient high-performance implementation of the community's data analysis tasks. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  1. Automated processing of label-free Raman microscope images of macrophage cells with standardized regression for high-throughput analysis.

    Science.gov (United States)

    Milewski, Robert J; Kumagai, Yutaro; Fujita, Katsumasa; Standley, Daron M; Smith, Nicholas I

    2010-11-19

    Macrophages represent the front lines of our immune system; they recognize and engulf pathogens or foreign particles, thus initiating the immune response. Imaging macrophages presents unique challenges, as most optical techniques require labeling or staining of the cellular compartments in order to resolve organelles, and such stains or labels have the potential to perturb the cell, particularly in cases where incomplete information exists regarding the precise cellular reaction under observation. Label-free imaging techniques such as Raman microscopy are thus valuable tools for studying the transformations that occur in immune cells upon activation, both on the molecular and organelle levels. Due to extremely low signal levels, however, Raman microscopy requires sophisticated image processing techniques for noise reduction and signal extraction. To date, efficient, automated algorithms for resolving sub-cellular features in noisy, multi-dimensional image sets have not been explored extensively. We show that hybrid z-score normalization and standard regression (Z-LSR) can highlight the spectral differences within the cell and provide image contrast dependent on spectral content. In contrast to typical Raman imaging processing methods using multivariate analysis, such as singular value decomposition (SVD), our implementation of the Z-LSR method can operate in near real-time. In spite of its computational simplicity, Z-LSR can automatically remove background and bias in the signal, improve the resolution of spatially distributed spectral differences and enable sub-cellular features to be resolved in Raman microscopy images of mouse macrophage cells. Significantly, the Z-LSR processed images automatically exhibited subcellular architectures, whereas SVD, in general, requires human assistance in selecting the components of interest. The computational efficiency of Z-LSR enables automated resolution of sub-cellular features in large Raman microscopy data sets without human intervention.
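
    As a rough sketch of the two ingredients the method's name refers to (per-spectrum z-score normalization followed by least-squares regression against a reference spectrum), the following fragment shows one way to produce a contrast map; the published Z-LSR algorithm may differ in detail, and the data here are synthetic.

        # Z-score each pixel's spectrum, then take the per-pixel
        # least-squares slope against a z-scored reference spectrum;
        # the slope map gives contrast dependent on spectral content.
        import numpy as np

        rng = np.random.default_rng(2)
        image = rng.normal(size=(64 * 64, 200))  # pixels x Raman bands (synthetic)
        reference = rng.normal(size=200)         # e.g., a mean or component spectrum

        z = (image - image.mean(axis=1, keepdims=True)) / image.std(axis=1, keepdims=True)
        z_ref = (reference - reference.mean()) / reference.std()

        slope = z @ z_ref / (z_ref @ z_ref)      # per-pixel regression coefficient
        contrast_map = slope.reshape(64, 64)
        print(contrast_map.shape)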

  2. Comparative analysis of transcriptomes in aerial stems and roots of Ephedra sinica based on high-throughput mRNA sequencing

    Directory of Open Access Journals (Sweden)

    Taketo Okada

    2016-12-01

    Full Text Available Ephedra plants are taxonomically classified as gymnosperms, and are medicinally important as the botanical origin of crude drugs and as bioresources that contain pharmacologically active chemicals. Here we show a comparative analysis of the transcriptomes of aerial stems and roots of Ephedra sinica based on high-throughput mRNA sequencing by RNA-Seq. De novo assembly of short cDNA sequence reads generated 23,358, 13,373, and 28,579 contigs longer than 200 bases from aerial stems, roots, or both aerial stems and roots, respectively. The presumed functions encoded by these contig sequences were annotated by BLAST (blastx). Subsequently, these contigs were classified based on gene ontology slims, Enzyme Commission numbers, and the InterPro database. Furthermore, comparative gene expression analysis was performed between aerial stems and roots. These transcriptome analyses revealed differences and similarities between the transcriptomes of aerial stems and roots in E. sinica. Deep transcriptome sequencing of Ephedra should open the door to molecular biological studies based on the entire transcriptome, tissue- or organ-specific transcriptomes, or targeted genes of interest.

  3. Isolation of Exosome-Like Nanoparticles and Analysis of MicroRNAs Derived from Coconut Water Based on Small RNA High-Throughput Sequencing.

    Science.gov (United States)

    Zhao, Zhehao; Yu, Siran; Li, Min; Gui, Xin; Li, Ping

    2018-03-21

    In this study, the presence of microRNAs in coconut water was identified by real-time polymerase chain reaction (PCR) based on the results of high-throughput small RNA sequencing. In addition, the differences in microRNA content between immature and mature coconut water were compared. A total of 47 known microRNAs belonging to 25 families and 14 new microRNAs were identified in coconut endosperm. Using target gene prediction software, potential microRNA target genes were identified in the human genome. Real-time PCR showed that the level of most microRNAs was higher in mature coconut water than in immature coconut water. Exosome-like nanoparticles were then isolated from coconut water. After ultracentrifugation, some particle structures were seen in coconut water samples using 1,1'-dioctadecyl-3,3,3',3'-tetramethylindocarbocyanine perchlorate fluorescence staining. Subsequent scanning electron microscopy observation and dynamic light scattering analysis also revealed some exosome-like nanoparticles in coconut water, and the mean diameters of the particles detected by the two methods were 13.16 and 59.72 nm, respectively. In conclusion, there are extracellular microRNAs in coconut water, and their levels are higher in mature coconut water than in immature coconut water. Some exosome-like nanoparticles were isolated from coconut water, and the diameter of these particles was smaller than that of animal-derived exosomes.

  4. Identification and characterization of microRNAs related to salt stress in broccoli, using high-throughput sequencing and bioinformatics analysis.

    Science.gov (United States)

    Tian, Yunhong; Tian, Yunming; Luo, Xiaojun; Zhou, Tao; Huang, Zuoping; Liu, Ying; Qiu, Yihan; Hou, Bing; Sun, Dan; Deng, Hongyu; Qian, Shen; Yao, Kaitai

    2014-09-03

    MicroRNAs (miRNAs) are a new class of endogenous regulators of a broad range of physiological processes, which act by regulating gene expression post-transcriptionally. The brassica vegetable broccoli (Brassica oleracea var. italica) is very popular with a wide range of consumers, but environmental stresses such as salinity restrict its growth and yield worldwide. Little is known about the role of miRNAs in the response of broccoli to salt stress. In this study, broccoli subjected to salt stress and broccoli grown under control conditions were analyzed by high-throughput sequencing. Differential miRNA expression was confirmed by real-time reverse transcription polymerase chain reaction (RT-PCR). The prediction of miRNA targets was undertaken using the Kyoto Encyclopedia of Genes and Genomes (KEGG) Orthology (KO) database and Gene Ontology (GO)-enrichment analyses. Two libraries of small (or short) RNAs (sRNAs) were constructed and sequenced by high-throughput Solexa sequencing. A total of 24,511,963 and 21,034,728 clean reads, representing 9,861,236 (40.23%) and 8,574,665 (40.76%) unique reads, were obtained for control and salt-stressed broccoli, respectively. Furthermore, 42 putative known and 39 putative candidate miRNAs that were differentially expressed between control and salt-stressed broccoli were revealed by their read counts and confirmed by the use of stem-loop real-time RT-PCR. Amongst these, the putative conserved miRNAs, miR393 and miR855, and two putative candidate miRNAs, miR3 and miR34, were the most strongly down-regulated when broccoli was salt-stressed, whereas the putative conserved miRNA, miR396a, and the putative candidate miRNA, miR37, were the most up-regulated. Finally, analysis of the predicted gene targets of miRNAs using the GO and KO databases indicated that a range of metabolic and other cellular functions known to be associated with salt stress were up-regulated in broccoli treated with salt. A comprehensive

  5. Parallel workflow for high-throughput (>1,000 samples/day) quantitative analysis of human insulin-like growth factor 1 using mass spectrometric immunoassay.

    Directory of Open Access Journals (Sweden)

    Paul E Oran

    Full Text Available Insulin-like growth factor 1 (IGF1) is an important biomarker for the management of growth hormone disorders. Recently there has been rising interest in deploying mass spectrometric (MS) methods of detection for measuring IGF1. However, widespread clinical adoption of any MS-based IGF1 assay will require increased throughput and speed to justify the costs of analyses, and robust industrial platforms that are reproducible across laboratories. Presented here is an MS-based quantitative IGF1 assay with a performance rating of >1,000 samples/day and the capability of quantifying IGF1 point mutations and posttranslational modifications. The throughput of the IGF1 mass spectrometric immunoassay (MSIA) benefited from a simplified sample preparation step, IGF1 immunocapture in a tip format, and high-throughput MALDI-TOF MS analysis. The limit of detection and limit of quantification of the resulting assay were 1.5 μg/L and 5 μg/L, respectively, with intra- and inter-assay precision CVs of less than 10%, and good linearity and recovery characteristics. The IGF1 MSIA was benchmarked against a commercially available IGF1 ELISA via a Bland-Altman method comparison test, resulting in a slight positive bias of 16%. The IGF1 MSIA was employed in an optimized parallel workflow utilizing two pipetting robots and MALDI-TOF MS instruments synced into one-hour phases of sample preparation, extraction and MSIA pipette tip elution, MS data collection, and data processing. Using this workflow, high-throughput IGF1 quantification of 1,054 human samples was achieved in approximately 9 hours. This rate of assaying is a significant improvement over existing MS-based IGF1 assays, and is on par with that of enzyme-based immunoassays. Furthermore, a mutation was detected in ∼1% of the samples (SNP: rs17884626, creating an A→T substitution at position 67 of IGF1), demonstrating the capability of the IGF1 MSIA to detect point mutations and posttranslational modifications.
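
    The benchmarking step mentioned, a Bland-Altman comparison against ELISA, amounts to computing per-sample differences between the two assays and reading off the mean bias and limits of agreement. A minimal sketch with synthetic paired measurements (constructed, for illustration, with a roughly 16% positive bias):

        # Bland-Altman comparison of paired MSIA and ELISA values:
        # mean bias and 95% limits of agreement. Data are synthetic.
        import numpy as np

        rng = np.random.default_rng(3)
        elisa = rng.uniform(50, 400, size=100)                 # IGF1, ug/L
        msia = elisa * 1.16 + rng.normal(scale=10, size=100)   # ~16% positive bias

        diff = msia - elisa
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)
        print(f"bias = {bias:.1f} ug/L, limits of agreement = +/- {loa:.1f}")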

  6. Optimization of a Differential Ion Mobility Spectrometry-Tandem Mass Spectrometry Method for High-Throughput Analysis of Nicotine and Related Compounds: Application to Electronic Cigarette Refill Liquids.

    Science.gov (United States)

    Regueiro, Jorge; Giri, Anupam; Wenzl, Thomas

    2016-06-21

    Fast market penetration of electronic cigarettes is leading to an exponentially growing number of electronic refill liquids with different nicotine contents and an endless list of flavors. Therefore, rapid and simple methods allowing a fast screening of these products are necessary to detect harmful substances which can negatively impact the health of consumers. In this regard, the present work explores the capabilities of differential ion mobility spectrometry coupled to tandem mass spectrometry for high-throughput analysis of nicotine and 11 related compounds in commercial refill liquids for electronic cigarettes. The influence of main factors affecting the ion mobility separation, such as modifier types and concentration, separation voltage, and temperature, was systematically investigated. Despite small molecular weight differences among the studied compounds, a good separation was achieved in the ion mobility cell under the optimized conditions, which involved the use of ethanol as a polar gas-phase chemical modifier. Indeed, differential ion mobility was able to resolve (resolution >4) nicotine from its structural isomer anabasine without the use of any chromatographic separation. The quantitative performance of the proposed method was then evaluated, showing satisfactory precision (RSD ≤ 16%) and recoveries ranging from 85 to 100% for nicotine, and from 84 to 126% for the rest of the target analytes. Several commercial electronic cigarette refill liquids were analyzed to demonstrate the applicability of the method. In some cases, significant differences were found between labeled and measured levels of nicotine. Anatabine, cotinine, myosmine, and nornicotine were also found in some of the analyzed samples.

  7. High-throughput flow injection analysis mass spectroscopy with networked delivery of color-rendered results. 2. Three-dimensional spectral mapping of 96-well combinatorial chemistry racks.

    Science.gov (United States)

    Görlach, E; Richmond, R; Lewis, I

    1998-08-01

    For the last two years, the mass spectroscopy section of the Novartis Pharma Research Core Technology group has analyzed tens of thousands of multiple parallel synthesis samples from the Novartis Pharma Combinatorial Chemistry program, using an in-house developed automated high-throughput flow injection analysis electrospray ionization mass spectroscopy system. The electrospray spectra of these samples reflect the many structures present after the cleavage step from the solid support. The overall success of the sequential synthesis is mirrored in the purity of the expected end product, but the partial success of individual synthesis steps is evident in the impurities in the mass spectrum. However, this latter reaction information, which is of considerable utility to the combinatorial chemist, is effectively hidden from view by the very large number of analyzed samples. This information is now revealed at the workbench of the combinatorial chemist by a novel three-dimensional display of each rack's complete mass spectral ion current using the in-house RackViewer Visual Basic application. Colorization of "forbidden loss" and "forbidden gas-adduct" zones, normalization to expected monoisotopic molecular weight, colorization of ionization intensity, and sorting by row or column were used in combination to highlight systematic patterns in the mass spectroscopy data.

  8. Genome-wide identification and comparative analysis of grafting-responsive mRNA in watermelon grafted onto bottle gourd and squash rootstocks by high-throughput sequencing.

    Science.gov (United States)

    Liu, Na; Yang, Jinghua; Fu, Xinxing; Zhang, Li; Tang, Kai; Guy, Kateta Malangisha; Hu, Zhongyuan; Guo, Shaogui; Xu, Yong; Zhang, Mingfang

    2016-04-01

    Grafting is an important agricultural technique widely used to improve plant growth, yield, and adaptation to either biotic or abiotic stresses. However, the molecular mechanisms underlying grafting-induced physiological processes remain unclear. Watermelon (Citrullus lanatus L.) is an important horticultural crop worldwide. Grafting is commonly used in watermelon production to improve its tolerance to stresses, especially to the soil-borne fusarium wilt disease. In the present study, we used high-throughput sequencing to perform a genome-wide transcript analysis of scions from watermelon grafted onto bottle gourd and squash rootstocks. Our transcriptome and digital gene expression (DGE) profiling data provided insights into the molecular aspects of gene regulation in grafted watermelon. Compared with self-grafted watermelon, 787 and 3485 genes were differentially expressed in watermelon grafted onto bottle gourd and squash rootstocks, respectively. These genes were associated with primary and secondary metabolism, hormone signaling, transcription factors, transporters, and response to stimuli. Grafting led to changes in expression of these genes, suggesting that they may play important roles in mediating the physiological processes of grafted seedlings. The potential roles of the grafting-responsive mRNAs in diverse biological and metabolic processes are discussed. The data obtained in this study provide an excellent resource for unraveling the mechanisms of candidate gene function in diverse biological processes and in environmental adaptation in a graft system.

  9. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  10. Cost-Effectiveness Analysis in Practice: Interventions to Improve High School Completion

    Science.gov (United States)

    Hollands, Fiona; Bowden, A. Brooks; Belfield, Clive; Levin, Henry M.; Cheng, Henan; Shand, Robert; Pan, Yilin; Hanisch-Cerda, Barbara

    2014-01-01

    In this article, we perform cost-effectiveness analysis on interventions that improve the rate of high school completion. Using the What Works Clearinghouse to select effective interventions, we calculate cost-effectiveness ratios for five youth interventions. We document wide variation in cost-effectiveness ratios between programs and between…

  11. Cost-effectiveness analysis of treatments for premenstrual dysphoric disorder.

    Science.gov (United States)

    Rendas-Baum, Regina; Yang, Min; Gricar, Joseph; Wallenstein, Gene V

    2010-01-01

    Premenstrual syndrome (PMS) is reported to affect between 13% and 31% of women. Between 3% and 8% of women are reported to meet criteria for the more severe form of PMS, premenstrual dysphoric disorder (PMDD). Although PMDD has received increased attention in recent years, the cost effectiveness of treatments for PMDD remains unknown. To evaluate the cost effectiveness of the four medications with a US FDA-approved indication for PMDD: fluoxetine, sertraline, paroxetine and drospirenone plus ethinyl estradiol (DRSP/EE). A decision-analytic model was used to evaluate both direct costs (medication and physician visits) and clinical outcomes (treatment success, failure and discontinuation). Medication costs were based on average wholesale prices of branded products; physician visit costs were obtained from a claims database study of PMDD patients and the Agency for Healthcare Research and Quality. Clinical outcome probabilities were derived from published clinical trials in PMDD. The incremental cost-effectiveness ratio (ICER) was calculated using the difference in costs and percentage of successfully treated patients at 6 months. Deterministic and probabilistic sensitivity analyses were used to assess the impact of uncertainty in parameter estimates. Threshold values where a change in the cost-effective strategy occurred were identified using a net benefit framework. Starting therapy with DRSP/EE dominated both sertraline and paroxetine, but not fluoxetine. The estimated ICER of initiating treatment with fluoxetine relative to DRSP/EE was $US4385 per treatment success (year 2007 values). Cost-effectiveness acceptability curves revealed that for ceiling ratios ≥ $US3450 per treatment success, fluoxetine had the highest probability (≥ 0.37) of being the most cost-effective treatment, relative to the other options. The cost-effectiveness acceptability frontier further indicated that DRSP/EE remained the option with the highest expected net monetary benefit for
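
    For readers unfamiliar with the two quantities this abstract leans on, here is a minimal sketch of the ICER and of the net monetary benefit used to locate threshold ceiling ratios. The costs and success probabilities are hypothetical, picked only so the ICER reproduces the reported $US4385; they are not the study's inputs.

```python
def icer(c1, e1, c0, e0):
    """Incremental cost-effectiveness ratio: extra cost per extra
    treatment success when moving from strategy 0 to strategy 1."""
    return (c1 - c0) / (e1 - e0)

def nmb(cost, eff, ceiling):
    """Net monetary benefit at a given ceiling ratio (willingness to
    pay per treatment success)."""
    return ceiling * eff - cost

# Hypothetical 6-month cost and success probability per strategy,
# chosen only so the ICER reproduces the reported $US4385.
drsp_ee = (400.00, 0.40)
fluoxetine = (487.70, 0.42)

print(icer(*fluoxetine, *drsp_ee))   # -> 4385.0 per treatment success
for ceiling in (3000, 4385, 6000):   # the preferred strategy flips at the ICER
    print(ceiling, nmb(*drsp_ee, ceiling), nmb(*fluoxetine, ceiling))
```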

  12. Close-range hyperspectral image analysis for the early detection of stress responses in individual plants in a high-throughput phenotyping platform

    Science.gov (United States)

    Mohd Asaari, Mohd Shahrimie; Mishra, Puneet; Mertens, Stien; Dhondt, Stijn; Inzé, Dirk; Wuyts, Nathalie; Scheunders, Paul

    2018-04-01

    The potential of close-range hyperspectral imaging (HSI) as a tool for detecting early drought stress responses in plants grown in a high-throughput plant phenotyping platform (HTPPP) was explored. Reflectance spectra from leaves in close-range imaging are highly influenced by plant geometry and its specific alignment towards the imaging system. This induces high uninformative variability in the recorded signals, whereas the spectral signature informing on plant biological traits remains hidden. A linear reflectance model that describes the effect of the distance and orientation of each pixel of a plant with respect to the imaging system was applied. By solving this model for the linear coefficients, the spectra were corrected for the uninformative illumination effects. This approach, however, was constrained by the requirement of a reference spectrum, which was difficult to obtain. As an alternative, the standard normal variate (SNV) normalisation method was applied to reduce this uninformative variability. Once the envisioned illumination effects were eliminated, the remaining differences in plant spectra were assumed to be related to changes in plant traits. To distinguish stress-related phenomena from regular growth dynamics, a spectral analysis procedure was developed based on clustering, a supervised band selection, and a direct calculation of a spectral similarity measure against a reference. To test the significance of the discrimination between healthy and stressed plants, a statistical test was conducted using a one-way analysis of variance (ANOVA) technique. The proposed analysis technique was validated with HSI data of maize plants (Zea mays L.) acquired in a HTPPP for the early detection of drought stress. Results showed that the pre-processing of reflectance spectra with the SNV effectively reduces the variability due to the expected illumination effects. The proposed spectral analysis method on the normalized spectra successfully
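
    The SNV step credited above with suppressing illumination effects is a simple per-spectrum standardization. A minimal numpy sketch (array shapes are illustrative):

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row)
    by its own mean and standard deviation, suppressing additive and
    multiplicative illumination/geometry effects."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

# e.g. 10,000 plant pixels x 200 spectral bands of reflectance
cube = np.random.rand(10_000, 200)
normalized = snv(cube)
print(normalized.mean(axis=1)[:3])  # per-spectrum means are now ~0
```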

  13. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  14. Immunoglobulin G (IgG) Fab glycosylation analysis using a new mass spectrometric high-throughput profiling method reveals pregnancy-associated changes.

    Science.gov (United States)

    Bondt, Albert; Rombouts, Yoann; Selman, Maurice H J; Hensbergen, Paul J; Reiding, Karli R; Hazes, Johanna M W; Dolhain, Radboud J E M; Wuhrer, Manfred

    2014-11-01

    The N-linked glycosylation of the constant fragment (Fc) of immunoglobulin G has been shown to change during pathological and physiological events and to strongly influence antibody inflammatory properties. In contrast, little is known about Fab-linked N-glycosylation, carried by ∼ 20% of IgG. Here we present a high-throughput workflow to analyze Fab and Fc glycosylation of polyclonal IgG purified from 5 μl of serum. We were able to detect and quantify 37 different N-glycans by means of MALDI-TOF-MS analysis in reflectron positive mode using a novel linkage-specific derivatization of sialic acid. This method was applied to 174 samples of a pregnancy cohort to reveal Fab glycosylation features and their change with pregnancy. Data analysis revealed marked differences between Fab and Fc glycosylation, especially in the levels of galactosylation and sialylation, incidence of bisecting GlcNAc, and presence of high mannose structures, which were all higher in the Fab portion than the Fc, whereas Fc showed higher levels of fucosylation. Additionally, we observed several changes during pregnancy and after delivery. Fab N-glycan sialylation was increased and bisection was decreased relative to postpartum time points, and nearly complete galactosylation of Fab glycans was observed throughout. Fc glycosylation changes were similar to results described before, with increased galactosylation and sialylation and decreased bisection during pregnancy. We expect that the parallel analysis of IgG Fab and Fc, as set up in this paper, will be important for unraveling roles of these glycans in (auto)immunity, which may be mediated via recognition by human lectins or modulation of antigen binding. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  16. Comprehensive processing of high-throughput small RNA sequencing data including quality checking, normalization, and differential expression analysis using the UEA sRNA Workbench.

    Science.gov (United States)

    Beckers, Matthew; Mohorianu, Irina; Stocks, Matthew; Applegate, Christopher; Dalmay, Tamas; Moulton, Vincent

    2017-06-01

    Recently, high-throughput sequencing (HTS) has revealed compelling details about the small RNA (sRNA) population in eukaryotes. These 20 to 25 nt noncoding RNAs can influence gene expression by acting as guides for the sequence-specific regulatory mechanism known as RNA silencing. The increase in sequencing depth and number of samples per project enables a better understanding of the role sRNAs play by facilitating the study of expression patterns. However, the intricacy of the biological hypotheses coupled with a lack of appropriate tools often leads to inadequate mining of the available data and thus, an incomplete description of the biological mechanisms involved. To enable a comprehensive study of differential expression in sRNA data sets, we present a new interactive pipeline that guides researchers through the various stages of data preprocessing and analysis. This includes various tools, some of which we specifically developed for sRNA analysis, for quality checking and normalization of sRNA samples as well as tools for the detection of differentially expressed sRNAs and identification of the resulting expression patterns. The pipeline is available within the UEA sRNA Workbench, a user-friendly software package for the processing of sRNA data sets. We demonstrate the use of the pipeline on a H. sapiens data set; additional examples on a B. terrestris data set and on an A. thaliana data set are described in the Supplemental Information. A comparison with existing approaches is also included, which exemplifies some of the issues that need to be addressed for sRNA analysis and how the new pipeline may be used to do this. © 2017 Beckers et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
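
    For intuition, the depth normalization and fold-change call that such pipelines automate can be sketched in a few lines. This is a generic illustration, not the Workbench's own code, and the offset value is an assumption:

```python
import numpy as np

def reads_per_million(counts):
    """Scale each sample (column) to reads per million, correcting for
    differences in sequencing depth between libraries."""
    return counts / counts.sum(axis=0, keepdims=True) * 1e6

def offset_log2_fc(a, b, offset=20.0):
    """Offset fold change: the additive offset damps ratios for
    low-abundance sRNAs, a common guard in sRNA comparisons."""
    return np.log2((a + offset) / (b + offset))

counts = np.array([[150, 900],    # rows: sRNAs, columns: samples
                   [30,  25],
                   [0,   60]], dtype=float)
rpm = reads_per_million(counts)
print(offset_log2_fc(rpm[:, 0], rpm[:, 1]))
```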

  17. Improvement of High-throughput Genotype Analysis After Implementation of a Dual-curve Sybr Green I-based Quantification and Normalization Procedure

    Science.gov (United States)

    The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...

  18. Cost-Effectiveness Analysis of Regorafenib for Metastatic Colorectal Cancer.

    Science.gov (United States)

    Goldstein, Daniel A; Ahmad, Bilal B; Chen, Qiushi; Ayer, Turgay; Howard, David H; Lipscomb, Joseph; El-Rayes, Bassel F; Flowers, Christopher R

    2015-11-10

    Regorafenib is a standard-care option for treatment-refractory metastatic colorectal cancer that increases median overall survival by 6 weeks compared with placebo. Given this small incremental clinical benefit, we evaluated the cost-effectiveness of regorafenib in the third-line setting for patients with metastatic colorectal cancer from the US payer perspective. We developed a Markov model to compare the cost and effectiveness of regorafenib with those of placebo in the third-line treatment of metastatic colorectal cancer. Health outcomes were measured in life-years and quality-adjusted life-years (QALYs). Drug costs were based on Medicare reimbursement rates in 2014. Model robustness was addressed in univariable and probabilistic sensitivity analyses. Regorafenib provided an additional 0.04 QALYs (0.13 life-years) at a cost of $40,000, resulting in an incremental cost-effectiveness ratio of $900,000 per QALY. The incremental cost-effectiveness ratio for regorafenib was > $550,000 per QALY in all of our univariable and probabilistic sensitivity analyses. Regorafenib provides minimal incremental benefit at high incremental cost per QALY in the third-line management of metastatic colorectal cancer. The cost-effectiveness of regorafenib could be improved by the use of value-based pricing. © 2015 by American Society of Clinical Oncology.
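
    A Markov cohort model of the kind described can be sketched compactly. The transition probabilities, utilities and horizon below are hypothetical, so the output will not reproduce the paper's 0.04 QALYs or $900,000/QALY figure; the point is the mechanics of accumulating QALYs under a transition matrix and dividing incremental cost by incremental effect.

```python
import numpy as np

def markov_qalys(P, utilities, start, cycles=60, cycle_years=1 / 12):
    """Accumulate (undiscounted) QALYs for a cohort evolving under the
    monthly transition matrix P, crediting each state's utility."""
    state = np.asarray(start, dtype=float)
    total = 0.0
    for _ in range(cycles):
        total += state @ utilities * cycle_years
        state = state @ P
    return total

# States: progression-free, progressed, dead (hypothetical probabilities)
P_rego = np.array([[0.75, 0.20, 0.05],
                   [0.00, 0.85, 0.15],
                   [0.00, 0.00, 1.00]])
P_plac = np.array([[0.60, 0.32, 0.08],
                   [0.00, 0.85, 0.15],
                   [0.00, 0.00, 1.00]])
u = np.array([0.70, 0.50, 0.00])        # hypothetical state utilities
start = [1.0, 0.0, 0.0]

dq = markov_qalys(P_rego, u, start) - markov_qalys(P_plac, u, start)
print(dq, 40_000 / dq)                  # incremental QALYs and the ICER
```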

  19. Cost-effectiveness analysis of infant feeding strategies to prevent ...

    African Journals Online (AJOL)

    Changing feeding practices is beneficial, depending on context. Breastfeeding is dominant (less costly, more effective) in rural settings, whilst formula feeding is a dominant strategy in urban settings. Cost-effectiveness was most sensitive to proportion of women on lifelong antiretroviral therapy (ART) and infant mortality rate ...

  20. Cost-Effectiveness Analysis of Unsafe Abortion and Alternative First ...

    African Journals Online (AJOL)

    To explore the policy implications of increasing access to safe abortion in Nigeria and Ghana, we developed a computer-based decision analytic model which simulates induced abortion and its potential complications in a cohort of women, and comparatively assessed the cost-effectiveness of unsafe abortion and three ...

  1. Cost-Effectiveness Analysis of Family Planning Services Offered by ...

    African Journals Online (AJOL)

    Keywords: Mobile clinics; Static clinics; Family planning; Cost-effectiveness. Abstract: Analyses ... offering more long-acting methods may influence a policy choice between these options. Clinics ... nurse and a driver.

  2. Systemic cost-effectiveness analysis of food hazard reduction

    DEFF Research Database (Denmark)

    Jensen, Jørgen Dejgård; Lawson, Lartey Godwin; Lund, Mogens

    2015-01-01

    stage are considered. Cost analyses are conducted for different risk reduction targets and for three alternative scenarios concerning the acceptable range of interventions. Results demonstrate that using a system-wide policy approach to risk reduction can be more cost-effective than a policy focusing...

  3. iMir: an integrated pipeline for high-throughput analysis of small non-coding RNA data obtained by smallRNA-Seq.

    Science.gov (United States)

    Giurato, Giorgio; De Filippo, Maria Rosaria; Rinaldi, Antonio; Hashim, Adnan; Nassa, Giovanni; Ravo, Maria; Rizzo, Francesca; Tarallo, Roberta; Weisz, Alessandro

    2013-12-13

    RNAs. In addition, iMir also allowed the identification of ~70 piRNAs (piwi-interacting RNAs), some of which were differentially expressed in proliferating vs growth-arrested cells. The integrated data analysis pipeline described here is based on a reliable, flexible and fully automated workflow, useful for rapidly and efficiently analyzing high-throughput smallRNA-Seq data, such as those produced by the most recent high-performance next-generation sequencers. iMir is available at http://www.labmedmolge.unisa.it/inglese/research/imir.

  4. Accounting for Cured Patients in Cost-Effectiveness Analysis.

    Science.gov (United States)

    Othus, Megan; Bansal, Aasthaa; Koepl, Lisel; Wagner, Samuel; Ramsey, Scott

    2017-04-01

    Economic evaluations often measure an intervention effect with mean overall survival (OS). Emerging types of cancer treatments offer the possibility of being "cured" in that patients can become long-term survivors whose risk of death is the same as that of a disease-free person. Describing cured and noncured patients with one shared mean value may provide a biased assessment of a therapy with a cured proportion. The purpose of this article is to explain how to incorporate the heterogeneity from cured patients into health economic evaluation. We analyzed clinical trial data from patients with advanced melanoma treated with ipilimumab (Ipi; n = 137) versus glycoprotein 100 (gp100; n = 136) with statistical methodology for mixture cure models. Both cured and noncured patients were subject to background mortality not related to cancer. When ignoring cured proportions, we found that patients treated with Ipi had an estimated mean OS that was 8 months longer than that of patients treated with gp100. Cure model analysis showed that the cured proportion drove this difference, with 21% cured on Ipi versus 6% cured on gp100. The mean OS among the noncured cohort patients was 10 and 9 months with Ipi and gp100, respectively. The mean OS among cured patients was 26 years on both arms. When ignoring cured proportions, we found that the incremental cost-effectiveness ratio (ICER) when comparing Ipi with gp100 was $324,000/quality-adjusted life-year (QALY) (95% confidence interval $254,000-$600,000). With a mixture cure model, the ICER when comparing Ipi with gp100 was $113,000/QALY (95% confidence interval $101,000-$154,000). This analysis supports using cure modeling in health economic evaluation in advanced melanoma. When a proportion of patients may be long-term survivors, using cure models may reduce bias in OS estimates and provide more accurate estimates of health economic measures, including QALYs and ICERs. Copyright © 2017 International Society for Pharmacoeconomics
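
    The mixture cure structure is easy to state: overall survival blends a cured fraction, subject only to background mortality, with a noncured fraction whose survival is disease-driven. A numerical sketch with illustrative parameters (exponential noncured survival; background hazard set near 1/26 per year so the cured mean OS lands roughly at the reported 26 years):

```python
import numpy as np

def mean_os_years(cure_frac, median_noncured_months,
                  bg_hazard=0.0385, horizon_years=80.0):
    """Mean OS under a mixture cure model:
    S(t) = S_bg(t) * [pi + (1 - pi) * S_u(t)].
    Cured patients face only background mortality; noncured survival
    is exponential with the given median. Parameters are illustrative."""
    t, dt = np.linspace(0.0, horizon_years, 80_000, retstep=True)
    s_bg = np.exp(-bg_hazard * t)                       # background survival
    lam = np.log(2.0) / (median_noncured_months / 12.0) # exponential rate
    s = s_bg * (cure_frac + (1.0 - cure_frac) * np.exp(-lam * t))
    return float(np.sum(s) * dt)                        # area under S(t)

ipi, gp100 = mean_os_years(0.21, 10), mean_os_years(0.06, 9)
print(ipi, gp100, ipi - gp100)   # mean OS gap driven by the cured fraction
```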

  5. Analysis of the effects of five factors relevant to in vitro chondrogenesis of human mesenchymal stem cells using factorial design and high throughput mRNA-profiling.

    Science.gov (United States)

    Jakobsen, Rune B; Østrup, Esben; Zhang, Xiaolan; Mikkelsen, Tarjei S; Brinchmann, Jan E

    2014-01-01

    The in vitro process of chondrogenic differentiation of mesenchymal stem cells for tissue engineering has been shown to require three-dimensional culture along with the addition of differentiation factors to the culture medium. In general, this leads to a phenotype lacking some of the cardinal features of native articular chondrocytes and their extracellular matrix. The factors used vary, but regularly include members of the transforming growth factor β superfamily and dexamethasone, sometimes in conjunction with fibroblast growth factor 2 and insulin-like growth factor 1; however, the use of soluble factors to induce chondrogenesis has largely been studied on a single-factor basis. In the present study we combined a factorial quality-by-design experiment with high-throughput mRNA profiling of a customized chondrogenesis-related gene set as a tool to study in vitro chondrogenesis of human bone marrow derived mesenchymal stem cells in alginate. 48 different conditions of transforming growth factor β 1, 2 and 3, bone morphogenetic protein 2, 4 and 6, dexamethasone, insulin-like growth factor 1, fibroblast growth factor 2 and cell seeding density were included in the experiment. The analysis revealed that the best of the tested differentiation cocktails included transforming growth factor β 1 and dexamethasone. Dexamethasone acted in synergy with transforming growth factor β 1 by increasing many chondrogenic markers while directly downregulating expression of the pro-osteogenic gene osteocalcin. However, all factors beneficial to the expression of desirable hyaline cartilage markers also induced undesirable molecules, indicating that perfect chondrogenic differentiation is not achievable with the current differentiation protocols.

  6. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data [version 3; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Damien Correia

    2016-12-01

    Full Text Available The detection and characterization of emerging infectious agents have been a continuing public health concern. High Throughput Sequencing (HTS) or Next-Generation Sequencing (NGS) technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Most usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS), solely to provide perspective regarding the available information. We developed an easily deployable web interface, facilitating management and bioinformatics analysis of metagenomic data samples. It was engineered to run associated and dedicated Galaxy workflows for the detection and eventual classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows, facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allowing users to determine their relative abundance, and associate them to the most closely related organism or pathogen. The user-friendly Django-based interface associates the users' input data and its metadata through a bio-IT provided set of resources (a Galaxy instance, and both sufficient storage and grid computing power). Galaxy is used to handle and analyze the user's input data from loading, indexing, mapping, assembly and DB-searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy's main features. Metadata about samples, runs, as well as the workflow results are stored in the LIMS. For metagenomic classification and exploration purposes, we show, as a proof of concept, that integration
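
    The abstract names BioBlend as the bridge between the Django application and Galaxy. A minimal sketch of that style of API-based interaction follows; the server URL, API key, workflow name and input mapping are hypothetical placeholders, not values from MetaGenSense.

```python
from bioblend.galaxy import GalaxyInstance

# Hypothetical Galaxy server and API key
gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

# Stage the sample's reads in a fresh history
history = gi.histories.create_history(name="metagenome-sample-001")
upload = gi.tools.upload_file("sample_reads.fastq", history["id"])
dataset_id = upload["outputs"][0]["id"]

# Locate a deployed classification workflow by name (hypothetical)
wf = next(w for w in gi.workflows.get_workflows()
          if w["name"] == "pathogen-detection")

# Map the uploaded dataset to the workflow's first input and run it
gi.workflows.invoke_workflow(
    wf["id"],
    inputs={"0": {"id": dataset_id, "src": "hda"}},
    history_id=history["id"],
)
```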

  7. AlphaScreen-based homogeneous assay using a pair of 25-residue artificial proteins for high-throughput analysis of non-native IgG.

    Science.gov (United States)

    Senga, Yukako; Imamura, Hiroshi; Miyafusa, Takamitsu; Watanabe, Hideki; Honda, Shinya

    2017-09-29

    Therapeutic IgG becomes unstable under various stresses in the manufacturing process. The resulting non-native IgG molecules tend to associate with each other and form aggregates. Because such aggregates not only decrease the pharmacological effect but also become a potential risk factor for immunogenicity, rapid analysis of aggregation is required for quality control of therapeutic IgG. In this study, we developed a homogeneous assay using AlphaScreen and AF.2A1. AF.2A1 is a 25-residue artificial protein that binds specifically to non-native IgG generated under chemical and physical stresses. This assay is performed in a short period of time. Our results show that AF.2A1-AlphaScreen may be used to evaluate the various types of IgG, as AF.2A1 recognizes the non-native structure in the constant region (Fc region) of IgG. The assay was effective for detection of non-native IgG, with particle size up to ca. 500 nm, generated under acid, heat, and stirring conditions. In addition, this technique is suitable for analyzing non-native IgG in CHO cell culture supernatant and mixed with large amounts of native IgG. These results indicate the potential of AF.2A1-AlphaScreen to be used as a high-throughput evaluation method for process monitoring as well as quality testing in the manufacturing of therapeutic IgG.

  8. Mapping whole-brain activity with cellular resolution by light-sheet microscopy and high-throughput image analysis (Conference Presentation)

    Science.gov (United States)

    Silvestri, Ludovico; Rudinskiy, Nikita; Paciscopi, Marco; Müllenbroich, Marie Caroline; Costantini, Irene; Sacconi, Leonardo; Frasconi, Paolo; Hyman, Bradley T.; Pavone, Francesco S.

    2016-03-01

    Mapping neuronal activity patterns across the whole brain with cellular resolution is a challenging task for state-of-the-art imaging methods. Indeed, despite a number of technological efforts, quantitative cellular-resolution activation maps of the whole brain have not yet been obtained. Many techniques are limited by coarse resolution or by a narrow field of view. High-throughput imaging methods, such as light sheet microscopy, can be used to image large specimens with high resolution and in reasonable times. However, the bottleneck is then moved from image acquisition to image analysis, since many TeraBytes of data have to be processed to extract meaningful information. Here, we present a full experimental pipeline to quantify neuronal activity in the entire mouse brain with cellular resolution, based on a combination of genetics, optics and computer science. We used a transgenic mouse strain (Arc-dVenus mouse) in which neurons that have been active in the last hours before brain fixation are fluorescently labelled. Samples were cleared with CLARITY and imaged with a custom-made confocal light sheet microscope. To perform an automatic localization of fluorescent cells on the large images produced, we used a novel computational approach called semantic deconvolution. The combined approach presented here allows quantifying the number of Arc-expressing neurons throughout the whole mouse brain. When applied to cohorts of mice subjected to different stimuli and/or environmental conditions, this method helps find correlations in activity between different neuronal populations, opening the possibility to infer a sort of brain-wide 'functional connectivity' with cellular resolution.

  11. A comparison of sorptive extraction techniques coupled to a new quantitative, sensitive, high throughput GC-MS/MS method for methoxypyrazine analysis in wine.

    Science.gov (United States)

    Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E

    2016-02-01

    Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng L(-1) range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines, IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine), have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng L(-1) in wine for IBMP), highly sensitive analytical methods to quantify methoxypyrazines at trace levels are necessary. Here we were able to achieve resolution of IBMP as well as IPMP, EMP, and SBMP from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques, HS-SPME (headspace solid-phase microextraction), SBSE (stir bar sorptive extraction), and HSSE (headspace sorptive extraction), were validated and compared. A 30 min extraction time was used for the HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQ) at or below 1 ng L(-1) for all four methoxypyrazines analyzed, i.e., LOQs at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program. Copyright © 2015

  12. Cost-effectiveness analysis of optimal strategy for tumor treatment

    International Nuclear Information System (INIS)

    Pang, Liuyong; Zhao, Zhong; Song, Xinyu

    2016-01-01

    We propose and analyze an antitumor model with combined immunotherapy and chemotherapy. Firstly, we explore the treatment effects of single immunotherapy and single chemotherapy, respectively. Results indicate that neither immunotherapy nor chemotherapy alone is adequate to cure a tumor. Hence, we apply optimal control theory to investigate how the combination of immunotherapy and chemotherapy should be implemented, for a certain time period, in order to reduce the number of tumor cells, while minimizing the implementation cost of the treatment strategy. Secondly, we establish the existence of the optimality system and use Pontryagin's Maximum Principle to characterize the optimal levels of the two treatment measures. Furthermore, we calculate the incremental cost-effectiveness ratios to analyze the cost-effectiveness of all possible combinations of the two treatment measures. Finally, numerical results show that the combination of immunotherapy and chemotherapy is the most cost-effective strategy for tumor treatment, and is able to eliminate the entire tumor of size 4.470 × 10^8 in a year.

  13. A cost-effectiveness analysis of shipboard telemedicine.

    Science.gov (United States)

    Stoloff, P H; Garcia, F E; Thomason, J E; Shia, D S

    1998-01-01

    The U.S. Navy is considering the installation of telemedicine equipment on more than 300 ships. Besides improving the quality of care, benefits would arise from avoiding medical evacuations (MEDEVACs) and returning patients to work more quickly. Because telemedicine has not yet been fully implemented by the Navy, we relied on projections of anticipated savings and costs, rather than actual expenditures, to determine cost-effectiveness. To determine the demand for telemedicine and the cost-effectiveness of various technologies (telephone and fax, e-mail and Internet, video teleconferencing (VTC), teleradiology, and diagnostic instruments), as well as their bandwidth requirements. A panel of Navy medical experts with telemedicine experience reviewed a representative sample of patient visits collected over a 1-year period and estimated the man-day savings and quality-of-care enhancements that might have occurred had telemedicine technologies been available. The savings from potentially avoiding MEDEVACs was estimated from a survey of ships' medical staff. These sample estimates were then projected to the medical workload of the entire fleet. Off-the-shelf telemedicine equipment prices were combined with installation, maintenance, training, and communication costs to obtain the lifecycle costs of the technology. If telemedicine were available to the fleet, ship medical staffs would initiate nearly 19,000 consults in a year (7% of all patient visits). Telemedicine would enhance quality of care in two-thirds of these consults. Seventeen percent of the MEDEVACs would be preventable with telemedicine (representing 155,000 travel miles), with a savings of $4400 per MEDEVAC. If the ship's communication capabilities were available, e-mail and Internet and telephone and fax would be cost-effective on all ships (including small ships and submarines). Video teleconferencing would be cost-effective on large ships (aircraft carriers and amphibious ships) only. Teleradiology would be cost-effective

  14. Rhizoslides: paper-based growth system for non-destructive, high throughput phenotyping of root development by means of image analysis.

    Science.gov (United States)

    Le Marié, Chantal; Kirchgessner, Norbert; Marschall, Daniela; Walter, Achim; Hund, Andreas

    2014-01-01

    and precise evaluation of root lengths in diameter classes, but had weaknesses with respect to image segmentation and analysis of root system architecture. A new technique has been established for non-destructive root growth studies and quantification of architectural traits beyond seedling stages. However, automation of the scanning process and appropriate software remain the bottleneck for high-throughput analysis.

  15. Cost-effectiveness of cardiotocography plus ST analysis of the fetal electrocardiogram compared with cardiotocography only

    NARCIS (Netherlands)

    Vijgen, Sylvia M. C.; Westerhuis, Michelle E. M. H.; Opmeer, Brent C.; Visser, Gerard H. A.; Moons, Karl G. M.; Porath, Martina M.; Oei, Guid S.; van Geijn, Herman P.; Bolte, Antoinette C.; Willekes, Christine; Nijhuis, Jan G.; van Beek, Erik; Graziosi, Giuseppe C. M.; Schuitemaker, Nico W. E.; van Lith, Jan M. M.; van den Akker, Eline S. A.; Drogtrop, Addy P.; van Dessel, Hendrikus J. H. M.; Rijnders, Robbert J. P.; Oosterbaan, Herman P.; Mol, Ben Willem J.; Kwee, Anneke

    2011-01-01

    To assess the cost-effectiveness of addition of ST analysis of the fetal electrocardiogram (ECG; STAN) to cardiotocography (CTG) for fetal surveillance during labor compared with CTG only. Cost-effectiveness analysis based on a randomized clinical trial on ST analysis of the fetal ECG. Obstetric

  16. High-Throughput Scoring of Seed Germination.

    Science.gov (United States)

    Ligterink, Wilco; Hilhorst, Henk W M

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very informative as it lacks information about start, rate, and uniformity of germination, which are highly indicative of such traits as dormancy, stress tolerance, and seed longevity. The calculation of cumulative germination curves requires information about germination percentage at various time points. We developed the GERMINATOR package: a simple, highly cost-efficient, and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The GERMINATOR package contains three modules: (I) design of experimental setup with various options to replicate and randomize samples; (II) automatic scoring of germination based on the color contrast between the protruding radicle and seed coat on a single image; and (III) curve fitting of cumulative germination data and the extraction, recap, and visualization of the various germination parameters. GERMINATOR is a freely available package that allows the monitoring and analysis of several thousands of germination tests, several times a day by a single person.
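
    Module III's curve fitting can be illustrated with a four-parameter Hill-type sigmoid fitted to cumulative germination counts, from which onset (lag), rate (t50) and maximum germination fall out. The data points, starting values and parameter bounds below are invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(t, gmax, t50, h, lag):
    """Four-parameter Hill curve for cumulative germination (%):
    maximum germination, time to half-maximum (after the lag),
    steepness (a proxy for uniformity), and onset lag."""
    t = np.maximum(t - lag, 0.0)
    return gmax * t**h / (t50**h + t**h)

# Hypothetical scoring of one seed batch: hours vs cumulative germination (%)
hours = np.array([0, 12, 24, 36, 48, 60, 72, 96], dtype=float)
germ = np.array([0, 2, 15, 48, 72, 84, 88, 90], dtype=float)

popt, _ = curve_fit(hill, hours, germ, p0=[90, 40, 4, 10],
                    bounds=([0, 1e-6, 0.1, 0], [100, 200, 20, 48]))
gmax, t50, h, lag = popt
print(f"Gmax={gmax:.1f}%  t50={t50:.1f} h  steepness={h:.2f}  lag={lag:.1f} h")
```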

  17. High throughput and accurate serum proteome profiling by integrated sample preparation technology and single-run data independent mass spectrometry analysis.

    Science.gov (United States)

    Lin, Lin; Zheng, Jiaxin; Yu, Quan; Chen, Wendong; Xing, Jinchun; Chen, Chenxi; Tian, Ruijun

    2018-03-01

    Mass spectrometry (MS)-based serum proteome analysis is extremely challenging due to its high complexity and dynamic range of protein abundances. Developing a high-throughput and accurate serum proteomic profiling approach capable of analyzing large cohorts is urgently needed for biomarker discovery. Herein, we report a streamlined workflow for fast and accurate proteomic profiling from 1 μL of blood serum. The workflow combined an integrated technique for highly sensitive and reproducible sample preparation and a new data-independent acquisition (DIA)-based MS method. Compared with the standard data-dependent acquisition (DDA) approach, the optimized DIA method doubled the number of detected peptides and proteins with better reproducibility. Without protein immunodepletion and prefractionation, the single-run DIA analysis enables quantitative profiling of over 300 proteins with a 50 min gradient time. The quantified proteins span more than five orders of magnitude of abundance range and contain over 50 FDA-approved disease markers. The workflow allowed us to analyze 20 serum samples per day, with about 358 protein groups identified per sample. A proof-of-concept study on renal cell carcinoma (RCC) serum samples confirmed the feasibility of the workflow for large-scale serum proteomic profiling and disease-related biomarker discovery. Blood serum or plasma is the predominant specimen for clinical proteomic studies, while the analysis is extremely challenging due to its high complexity. Many past efforts in serum proteomics have aimed at maximizing protein identifications, whereas few have been concerned with throughput and reproducibility. Here, we establish a rapid, robust and highly reproducible DIA-based workflow for streamlined serum proteomic profiling from 1 μL of serum. The workflow does not need protein depletion or pre-fractionation, while still being able to detect disease-relevant proteins accurately. The workflow is promising in clinical application

  18. Cost-effective analysis of unilateral vestibular weakness investigation.

    Science.gov (United States)

    Gandolfi, Michele M; Reilly, Erin K; Galatioto, Jessica; Judson, Randy B; Kim, Ana H

    2015-02-01

    To evaluate the cost-effectiveness of obtaining a magnetic resonance imaging (MRI) in patients with abnormal electronystagmography (ENG) or videonystagmography (VNG) results. Retrospective chart review. Academic specialty center. Patients presenting with vertigo between January 1, 2010, and August 30, 2013. Patients who fit the following abnormal criteria were included in the study: unilateral caloric weakness (≥20%), abnormal ocular motor testing, and nystagmus on positional testing. Patients with abnormal findings who then underwent MRI with gadolinium were evaluated. Of the 1,996 charts reviewed, there were 1,358 patients who met the inclusion criteria. The average age of these patients was 62 years (12-94 yr). The male:female ratio was approximately 1:2. Of the 1,358 patients, 253 received an MRI with the following pathologies: four vestibular schwannomas, three subcortical/periventricular white matter changes suspicious for demyelinating disease, four acute cerebellar/posterior circulation infarct, two vertebral artery narrowing, one pseudomeningocele of internal auditory canal, and two white matter changes indicative of migraines. The positive detection rate on MRI was 5.5% based on MRI findings of treatable pathologies causing vertigo. Average cost of an MRI is $1,200, thereby making the average cost of identifying a patient with a positive MRI finding $15,180. In our study, those patients with a positive MRI had a constellation of symptoms and findings (asymmetric sensorineural hearing loss, tinnitus, vertigo, and abnormal ENG/VNG). Cost-effectiveness can be improved by ordering an MRI only when clinical examination and VNG point toward a central pathology. Clinical examination and appropriate testing should be factored when considering the cost-effectiveness of obtaining an MRI in patients with abnormal ENG/VNG findings.
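
    The cost-effectiveness arithmetic here is a single ratio, sketched below. Note that the stated $15,180 implies roughly 20 MRI-positive patients (1,200 × 253 / 20); that count is an inference from the quoted figures, not a number given in the record.

```python
def cost_per_positive(unit_cost, n_scans, n_positive):
    """Average spend needed to identify one patient with a
    treatable finding on imaging."""
    return unit_cost * n_scans / n_positive

# $1,200/MRI and 253 scans are from the abstract; 20 positives is
# inferred, since 1200 * 253 / 20 = 15,180 matches the stated figure.
print(cost_per_positive(1200, 253, 20))  # -> 15180.0
```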

  19. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  20. Identification of miRNAs and their targets through high-throughput sequencing and degradome analysis in male and female Asparagus officinalis.

    Science.gov (United States)

    Chen, Jingli; Zheng, Yi; Qin, Li; Wang, Yan; Chen, Lifei; He, Yanjun; Fei, Zhangjun; Lu, Gang

    2016-04-12

    MicroRNAs (miRNAs), a class of non-coding small RNAs (sRNAs), regulate various biological processes. Although miRNAs have been identified and characterized in several plant species, miRNAs in Asparagus officinalis have not been reported. As a dioecious plant with homomorphic sex chromosomes, asparagus is regarded as an important model system for studying mechanisms of plant sex determination. Two independent sRNA libraries from male and female asparagus plants were sequenced with Illumina sequencing, thereby generating 4.13 and 5.88 million final clean reads, respectively. Both libraries predominantly contained 24-nt sRNAs, followed by 21-nt sRNAs. Further analysis identified 154 conserved miRNAs, which belong to 26 families, and 39 novel miRNA candidates seemed to be specific to asparagus. Comparative profiling revealed that 63 miRNAs exhibited significant differential expression between male and female plants, which was confirmed by real-time quantitative PCR analysis. Among them, 37 miRNAs were significantly up-regulated in the female library, whereas the others were preferentially expressed in the male library. Furthermore, 40 target mRNAs representing 44 conserved and seven novel miRNAs were identified in asparagus through high-throughput degradome sequencing. Functional annotation showed that these target mRNAs were involved in a wide range of developmental and metabolic processes. We identified a large set of conserved and specific miRNAs and compared their expression levels between male and female asparagus plants. Several asparagus miRNAs, which belong to the miR159, miR167, and miR172 families involved in reproductive organ development, were differentially expressed between male and female plants, as well as during flower development. Consistently, several predicted targets of asparagus miRNAs were associated with floral organ development. These findings suggest the potential roles of miRNAs in sex determination and reproductive developmental processes in

  1. Comparative analysis of miRNAs of two rapeseed genotypes in response to acetohydroxyacid synthase-inhibiting herbicides by high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Maolong Hu

    Full Text Available Acetohydroxyacid synthase (AHAS), also called acetolactate synthase, is a key enzyme involved in the first step of the biosynthesis of the branched-chain amino acids valine, isoleucine and leucine. Acetohydroxyacid synthase-inhibiting herbicides (AHAS herbicides) comprise five chemical families of herbicides that inhibit AHAS enzymes: imidazolinones (IMI), sulfonylureas (SU), pyrimidinylthiobenzoates, triazolinones and triazolopyrimidines. Five AHAS genes have been identified in rapeseed, but little information is available regarding the role of miRNAs in the response to AHAS herbicides. In this study, an AHAS herbicide-tolerant genotype and a sensitive genotype were used for comparative miRNA analysis. A total of 20 small RNA libraries were obtained from these two genotypes at three time points (0 h, 24 h and 48 h) after spraying SU and IMI herbicides, with two replicates. We identified 940 conserved miRNAs and 1515 novel candidate miRNAs in Brassica napus using high-throughput sequencing methods combined with computational analysis. A total of 3284 genes were predicted to be targets of these miRNAs, and their functions were annotated using GO, KOG and KEGG. The differential expression results showed that almost twice as many differentially expressed miRNAs were found in the tolerant genotype M342 (309 miRNAs) as in the sensitive genotype N131 (164 miRNAs) after SU herbicide application. In addition, 177 and 296 miRNAs were defined as differentially expressed in the sensitive genotype and the tolerant genotype, respectively, in response to SU herbicides. The miR398 family was observed to be associated with AHAS herbicide tolerance because its expression increased in the tolerant genotype but decreased in the sensitive genotype. Moreover, 50 novel miRNAs from 39 precursors were predicted. Eight conserved miRNAs, four novel miRNAs and three target genes were validated by quantitative real-time PCR. This study not only provides novel insights into the miRNA content of AHAS herbicides

  2. Development of a fast isocratic LC-MS/MS method for the high-throughput analysis of pyrrolizidine alkaloids in Australian honey.

    Science.gov (United States)

    Griffin, Caroline T; Mitrovic, Simon M; Danaher, Martin; Furey, Ambrose

    2015-01-01

    Honey samples originating from Australia were purchased and analysed for targeted pyrrolizidine alkaloids (PAs) using a new and rapid isocratic LC-MS/MS method. This isocratic method was developed from, and is comparable with, a gradient elution method and resulted in no loss of sensitivity or reduction in chromatographic peak shape. Isocratic elution allows for significantly shorter run times (6 min), eliminates the requirement for column equilibration periods and thus has the advantage of facilitating a high-throughput analysis, which is particularly important for regulatory testing laboratories. More than two hundred injections are possible with this new isocratic methodology within a 24-h period, a more than 50% improvement on all previously published methodologies. Good linear calibrations were obtained for all 10 PAs and four PA N-oxides (PANOs) in spiked honey samples (3.57-357.14 µg/l; R² ≥ 0.9987). Acceptable inter-day repeatability was achieved for the target analytes in honey, with % RSD values (n = 4) less than 7.4%. Limits of detection (LOD) and limits of quantitation (LOQ) were determined with spiked PA and PANO samples, giving an average LOD of 1.6 µg/kg and LOQ of 5.4 µg/kg. This method was successfully applied to Australian and New Zealand honey samples sourced from supermarkets in Australia. Analysis showed that 41 of the 59 honey samples were contaminated by PAs, with a mean total PA content of 153 µg/kg. Echimidine and lycopsamine were predominant, found in 76% and 88%, respectively, of the positive samples. The average daily exposure, based on the results presented in this study, was 0.051 µg/kg bw/day for adults and 0.204 µg/kg bw/day for children. These results are a cause for concern when compared with the proposed European Food Safety Authority (EFSA), Committee on Toxicity (COT) and Bundesinstitut für Risikobewertung (BfR - Federal Institute of Risk Assessment, Germany) maximum
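
    The exposure estimates quoted above follow from one line of arithmetic: exposure = mean PA concentration × daily honey intake / body weight. The sketch below reproduces figures of the same order; the intake and body-weight values are assumptions chosen for illustration and are not stated in this abstract.

        def daily_exposure(conc_ug_per_kg, intake_kg_per_day, body_weight_kg):
            """Estimated daily PA exposure in µg per kg body weight per day."""
            return conc_ug_per_kg * intake_kg_per_day / body_weight_kg

        MEAN_PA = 153.0   # µg/kg, mean total PA content reported above
        INTAKE = 0.023    # kg honey per day -- assumed, not from the abstract
        for label, bw in (("adult", 70.0), ("child", 17.3)):  # body weights assumed
            print(label, round(daily_exposure(MEAN_PA, INTAKE, bw), 3), "µg/kg bw/day")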

  3. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought-stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale-up. In this work, we investigate a potentially scalable, high-throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  4. Cost-Effectiveness Analysis of Early Reading Programs: A Demonstration with Recommendations for Future Research

    Science.gov (United States)

    Hollands, Fiona M.; Kieffer, Michael J.; Shand, Robert; Pan, Yilin; Cheng, Henan; Levin, Henry M.

    2016-01-01

    We review the value of cost-effectiveness analysis for evaluation and decision making with respect to educational programs and discuss its application to early reading interventions. We describe the conditions for a rigorous cost-effectiveness analysis and illustrate the challenges of applying the method in practice, providing examples of programs…

  5. Identification of microRNAs in Caragana intermedia by high-throughput sequencing and expression analysis of 12 microRNAs and their targets under salt stress.

    Science.gov (United States)

    Zhu, Jianfeng; Li, Wanfeng; Yang, Wenhua; Qi, Liwang; Han, Suying

    2013-09-01

    142 miRNAs were identified and 38 miRNA targets were predicted, 4 of which were validated, in C. intermedia. The expression of 12 miRNAs in salt-stressed leaves was assessed by qRT-PCR. MicroRNAs (miRNAs) are endogenous small RNAs that play important roles in various biological and metabolic processes in plants. Caragana intermedia is an important ecological and economic tree species prominent in the desert environment of west and northwest China. To date, no investigation into C. intermedia miRNAs has been reported. In this study, high-throughput sequencing of small RNAs and analysis of transcriptome data were performed to identify both conserved and novel miRNAs, as well as their target mRNA genes, in C. intermedia. Based on sequence similarity and hairpin structure prediction, 132 putative conserved miRNAs (12 of which were confirmed to form hairpin precursors) belonging to 31 known miRNA families were identified. Ten novel miRNAs (including the miRNA* sequences of three novel miRNAs) were also discovered. Furthermore, 36 potential target genes of 17 known miRNA families and 2 potential target genes of 1 novel miRNA were predicted; 4 of these were validated by 5' RACE. The expression of 12 miRNAs was examined in different tissues, and these miRNAs and five target mRNAs were assessed by qRT-PCR after salt treatment. The expression levels of seven miRNAs (cin-miR157a, cin-miR159a, cin-miR165a, cin-miR167b, cin-miR172b, cin-miR390a and cin-miR396a) were upregulated, while cin-miR398a expression was downregulated after salt treatment. The targets of cin-miR157a, cin-miR165a, cin-miR172b and cin-miR396a were downregulated and showed an approximately negative correlation with their corresponding miRNAs under salt treatment. These results will help further our understanding of miRNA regulation in response to abiotic stress in C. intermedia.
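
    qRT-PCR fold changes like those reported here are conventionally computed with the 2^-ΔΔCt method, normalizing the target miRNA to a reference gene in both treated and control samples. A minimal sketch follows; the Ct values are invented purely for illustration.

        def relative_expression(ct_target_treated, ct_ref_treated,
                                ct_target_control, ct_ref_control):
            """Fold change of a target by the 2^-ddCt method."""
            d_ct_treated = ct_target_treated - ct_ref_treated
            d_ct_control = ct_target_control - ct_ref_control
            dd_ct = d_ct_treated - d_ct_control
            return 2.0 ** (-dd_ct)

        # Invented Ct values for a miRNA upregulated under salt stress:
        print(relative_expression(22.1, 18.0, 24.3, 18.1))  # ~4.3-fold up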

  6. Third Generation (3G) Site Characterization: Cryogenic Core Collection and High Throughput Core Analysis - An Addendum to Basic Research Addressing Contaminants in Low Permeability Zones - A State of the Science Review

    Science.gov (United States)

    2016-07-29

    Only front-matter fragments are recoverable from this record: figure captions ("Styrofoam insulation for keeping the core frozen during MRI"; Figure 5-2, "Schematic of reference and core setting in..."), an acronym list (Hollow-Stem Auger; HTCA, High-Throughput Core Analysis; IC, Ion Chromatograph; ID, Inner Diameter; k, Permeability; LN, Liquid Nitrogen; LNAPL, Light Non-Aqueous Phase Liquid), and a fragment describing core collection by sonic vibration or "over drilling" using a hollow-stem auger.

  7. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    International Nuclear Information System (INIS)

    Chen, Jinyang; Ji, Xinghu; He, Zhike

    2015-01-01

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy was realized by connecting the components of a positive pressure input device, sample container and microfluidic chip through tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so the sample was delivered into the microchip from the sample container under the driving of positive pressure. This sample-introduction technique is so robust and compatible that it could be integrated with T-junction, flow-focus or valve-assisted droplet microchips. By choosing a PDMS adaptor with the proper dimensions, the microchip could be flexibly equipped with various types of familiar sample containers, making the sampling more straightforward without tedious sample transfer or loading. Convenient sample changing was achieved simply by moving the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied for quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through the investigation of the quenching efficiency of a ruthenium complex on the fluorescence of QDs. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, which demonstrates its practicability and potential in high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy of concentration gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • A multiplex DNA assay was successfully carried out in the droplet platform
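
    The quenching screen mentioned above comes down to comparing droplet fluorescence with and without the quencher: E = 1 - F/F0. A sketch of that readout calculation, with purely illustrative intensities across the ruthenium-complex gradient:

        def quenching_efficiency(f0, f):
            """Fraction of QD fluorescence quenched: E = 1 - F/F0."""
            return 1.0 - f / f0

        f0 = 1000.0  # mean droplet fluorescence without quencher (illustrative)
        for conc_nM, f in ((0.0, 1000.0), (5.0, 840.0), (20.0, 520.0), (50.0, 260.0)):
            print(conc_nM, "nM ->", round(quenching_efficiency(f0, f), 2))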

  8. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jinyang; Ji, Xinghu [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); He, Zhike, E-mail: zhkhe@whu.edu.cn [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); Suzhou Institute of Wuhan University, Suzhou 215123 (China)

    2015-08-12

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy was realized by connecting the components of a positive pressure input device, sample container and microfluidic chip through tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so the sample was delivered into the microchip from the sample container under the driving of positive pressure. This sample-introduction technique is so robust and compatible that it could be integrated with T-junction, flow-focus or valve-assisted droplet microchips. By choosing a PDMS adaptor with the proper dimensions, the microchip could be flexibly equipped with various types of familiar sample containers, making the sampling more straightforward without tedious sample transfer or loading. Convenient sample changing was achieved simply by moving the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied for quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through the investigation of the quenching efficiency of a ruthenium complex on the fluorescence of QDs. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, which demonstrates its practicability and potential in high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy of concentration gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • A multiplex DNA assay was successfully carried out in the droplet platform.

  9. High-throughput sequencing and analysis of the gill tissue transcriptome from the deep-sea hydrothermal vent mussel Bathymodiolus azoricus

    Directory of Open Access Journals (Sweden)

    Gomes Paula

    2010-10-01

    Full Text Available Abstract Background Bathymodiolus azoricus is a deep-sea hydrothermal vent mussel found in association with large faunal communities living in chemosynthetic environments at the bottom of the sea floor near the Azores Islands. Investigation of the exceptional physiological reactions that vent mussels have adopted in their habitat, including responses to environmental microbes, remains a difficult challenge for deep-sea biologists. In an attempt to reveal genes potentially involved in the deep-sea mussel innate immunity we carried out a high-throughput sequence analysis of the freshly collected B. azoricus transcriptome using gill tissue as the primary source of immune transcripts, given its strategic role in filtering the surrounding waterborne, potentially infectious microorganisms. Additionally, a substantial EST data set was produced, from which a comprehensive collection of genes coding for putative proteins was organized in a dedicated database, "DeepSeaVent", the first deep-sea vent animal transcriptome database based on the 454 pyrosequencing technology. Results A normalized cDNA library from gill tissue was sequenced in a full 454 GS-FLX run, producing 778,996 sequencing reads. Assembly of the high quality reads resulted in 75,407 contigs of which 3,071 were singletons. A total of 39,425 transcripts were conceptually translated into amino-acid sequences, of which 22,023 matched known proteins in the NCBI non-redundant protein database, 15,839 revealed conserved protein domains through InterPro functional classification and 9,584 were assigned Gene Ontology terms. Queries conducted within the database enabled the identification of genes putatively involved in immune and inflammatory reactions which had not been previously evidenced in the vent mussel. Their physical counterpart was confirmed by semi-quantitative Reverse Transcription-Polymerase Chain Reaction (RT-PCR) and their RNA transcription levels by quantitative PCR (qPCR).

  10. RNAi High-Throughput Screening of Single- and Multi-Cell-Type Tumor Spheroids: A Comprehensive Analysis in Two and Three Dimensions.

    Science.gov (United States)

    Fu, Jiaqi; Fernandez, Daniel; Ferrer, Marc; Titus, Steven A; Buehler, Eugen; Lal-Nag, Madhu A

    2017-06-01

    The widespread use of two-dimensional (2D) monolayer cultures for high-throughput screening (HTS) to identify targets in drug discovery has led to attrition in the number of drug targets being validated. Solid tumors are complex, aberrantly growing microenvironments that harness structural components from stroma, nutrients fed through vasculature, and immunosuppressive factors. Increasing evidence of stromally-derived signaling broadens the complexity of our understanding of the tumor microenvironment while stressing the importance of developing better models that reflect these interactions. Three-dimensional (3D) models may be more sensitive to certain gene-silencing events than 2D models because of their components of hypoxia, nutrient gradients, and increased dependence on cell-cell interactions and therefore are more representative of in vivo interactions. Colorectal cancer (CRC) and breast cancer (BC) models were developed for RNAi HTS in 384-well microplates, composed either of epithelial cells only, termed single-cell-type tumor spheroids (SCTS), or additionally containing fibroblasts, termed multi-cell-type tumor spheroids (MCTS), with flat-bottom wells for 2D screening and round-bottom, ultra-low-attachment wells for 3D screening. We describe the development of a high-throughput assay platform that can assess physiologically relevant phenotypic differences between screening 2D versus 3D SCTS, 3D SCTS, and MCTS in the context of different cancer subtypes. This assay platform represents a paradigm shift in how we approach drug discovery that can reduce the attrition rate of drugs that enter the clinic.

  11. Reduced dimensionality (3,2)D NMR experiments and their automated analysis: implications to high-throughput structural studies on proteins.

    Science.gov (United States)

    Reddy, Jithender G; Kumar, Dinesh; Hosur, Ramakrishna V

    2015-02-01

    Protein NMR spectroscopy has expanded dramatically over the last decade into a powerful tool for the study of their structure, dynamics, and interactions. The primary requirement for all such investigations is sequence-specific resonance assignment. The demand now is to obtain this information as rapidly as possible and in all types of protein systems, stable/unstable, soluble/insoluble, small/big, structured/unstructured, and so on. In this context, we introduce here two reduced dimensionality experiments – (3,2)D-hNCOcanH and (3,2)D-hNcoCAnH – which enhance the previously described 2D NMR-based assignment methods quite significantly. Both the experiments can be recorded in just about 2-3 h each and hence would be of immense value for high-throughput structural proteomics and drug discovery research. The applicability of the method has been demonstrated using alpha-helical bovine apo calbindin-D9k P43M mutant (75 aa) protein. Automated assignment of this data using AUTOBA has been presented, which enhances the utility of these experiments. The backbone resonance assignments so derived are utilized to estimate secondary structures and the backbone fold using Web-based algorithms. Taken together, we believe that the method and the protocol proposed here can be used for routine high-throughput structural studies of proteins. Copyright © 2014 John Wiley & Sons, Ltd.

  12. Diagnostic staging laparoscopy in gastric cancer treatment: A cost-effectiveness analysis.

    Science.gov (United States)

    Li, Kevin; Cannon, John G D; Jiang, Sam Y; Sambare, Tanmaya D; Owens, Douglas K; Bendavid, Eran; Poultsides, George A

    2018-05-01

    Accurate preoperative staging helps avert morbidity, mortality, and cost associated with non-therapeutic laparotomy in gastric cancer (GC) patients. Diagnostic staging laparoscopy (DSL) can detect metastases with high sensitivity, but its cost-effectiveness has not been previously studied. We developed a decision analysis model to assess the cost-effectiveness of preoperative DSL in GC workup. Analysis was based on a hypothetical cohort of GC patients in the U.S. for whom initial imaging shows no metastases. The cost-effectiveness of DSL was measured as cost per quality-adjusted life-year (QALY) gained. Drivers of cost-effectiveness were assessed in sensitivity analysis. Preoperative DSL required an investment of $107,012 per QALY. In sensitivity analysis, DSL became cost-effective at a threshold of $100,000/QALY when the probability of occult metastases exceeded 31.5% or when test sensitivity for metastases exceeded 86.3%. The likelihood of cost-effectiveness increased from 46% to 93% when both parameters were set at maximum reported values. The cost-effectiveness of DSL for GC patients is highly dependent on patient and test characteristics, and is more likely when DSL is used selectively where procedure yield is high, such as for locally advanced disease or in detecting peritoneal and superficial versus deep liver lesions. © 2017 Wiley Periodicals, Inc.
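
    The headline figure in an analysis like this is a ratio of cost and health differences between strategies, judged against a willingness-to-pay threshold. A minimal sketch of that calculation follows; the strategy names and input numbers are illustrative stand-ins, not the study's model parameters (they are merely chosen to land near the reported $107,012/QALY).

        def icer(cost_new, qaly_new, cost_old, qaly_old):
            """Incremental cost-effectiveness ratio in $ per QALY gained."""
            return (cost_new - cost_old) / (qaly_new - qaly_old)

        def cost_effective(icer_value, willingness_to_pay):
            return icer_value <= willingness_to_pay

        # Illustrative comparison: DSL-first workup vs. surgery without DSL.
        ratio = icer(cost_new=31500.0, qaly_new=1.32, cost_old=28290.0, qaly_old=1.29)
        print(round(ratio), cost_effective(ratio, 100000.0))  # 107000 False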

  13. Correction of Microplate Data from High-Throughput Screening.

    Science.gov (United States)

    Wang, Yuhong; Huang, Ruili

    2016-01-01

    High-throughput screening (HTS) makes it possible to collect cellular response data from a large number of cell lines and small molecules in a timely and cost-effective manner. The errors and noises in the microplate-formatted data from HTS have unique characteristics, and they can be generally grouped into three categories: run-wise (temporal, multiple plates), plate-wise (background pattern, single plate), and well-wise (single well). In this chapter, we describe a systematic solution for identifying and correcting such errors and noises, based mainly on pattern recognition and digital signal processing technologies.
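
    For the plate-wise (background pattern) category, one widely used correction -- not necessarily the exact algorithm described in this chapter -- is the B-score: a two-way median polish strips row and column effects from the plate, and the residuals are scaled by their median absolute deviation. A compact sketch:

        import numpy as np

        def median_polish(plate, n_iter=10):
            """Remove row and column effects by alternately subtracting medians."""
            residuals = plate.astype(float).copy()
            for _ in range(n_iter):
                residuals -= np.median(residuals, axis=1, keepdims=True)
                residuals -= np.median(residuals, axis=0, keepdims=True)
            return residuals

        def b_score(plate):
            """Median-polish residuals scaled by the median absolute deviation."""
            residuals = median_polish(plate)
            mad = np.median(np.abs(residuals - np.median(residuals)))
            return residuals / (1.4826 * mad)

        # Toy 4x6 "plate" with an artificial row gradient to be removed:
        rng = np.random.default_rng(0)
        plate = rng.normal(100.0, 5.0, size=(4, 6)) + np.arange(4)[:, None] * 20.0
        print(np.round(b_score(plate), 1))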

  14. A cost-effectiveness analysis of two different antimicrobial stewardship programs.

    Science.gov (United States)

    Okumura, Lucas Miyake; Riveros, Bruno Salgado; Gomes-da-Silva, Monica Maria; Veroneze, Izelandia

    2016-01-01

    There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of Antimicrobial Stewardship Programs. A 30-day Markov model was developed to analyze how cost-effective a Bundled Antimicrobial Stewardship Program implemented in a university hospital in Brazil was. Clinical data were derived from a historical cohort that compared two different strategies of antimicrobial stewardship programs and had 30-day mortality as the main outcome. Selected costs included workload, cost of defined daily doses, length of stay, and laboratory and imaging resources used to diagnose infections. Data were analyzed by deterministic and probabilistic sensitivity analysis to assess the model's robustness, a tornado diagram and a Cost-Effectiveness Acceptability Curve. The Bundled Strategy was more expensive (cost difference US$ 2119.70) but more efficient (US$ 27,549.15 vs US$ 29,011.46). Deterministic and probabilistic sensitivity analyses suggested that critical variables did not alter the final Incremental Cost-Effectiveness Ratio. The Bundled Strategy had a higher probability of being cost-effective, which was endorsed by the cost-effectiveness acceptability curve. As health systems call for efficient technologies, this study concludes that the Bundled Antimicrobial Stewardship Program was more cost-effective, meaning that stewardship strategies with such characteristics would be of special interest from a societal and clinical perspective. Copyright © 2016 Elsevier Editora Ltda. All rights reserved.
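
    The cost-effectiveness acceptability curve referenced above is conceptually simple: sample costs and effects for each strategy from their uncertainty distributions, and at each willingness-to-pay level record how often one strategy has the higher net monetary benefit. The sketch below illustrates the logic; all distributions and parameter values are invented, not taken from this study's model.

        import random

        def prob_cost_effective(wtp, n=10000, seed=42):
            """P(bundled program has higher net monetary benefit) at a given WTP,
            where NMB = effect * WTP - cost."""
            rng = random.Random(seed)
            wins = 0
            for _ in range(n):
                cost_bundle = rng.gauss(29669.0, 1500.0)  # illustrative distributions
                cost_usual = rng.gauss(27549.0, 1500.0)
                eff_bundle = rng.gauss(0.82, 0.03)        # e.g. 30-day survival
                eff_usual = rng.gauss(0.75, 0.03)
                if eff_bundle * wtp - cost_bundle > eff_usual * wtp - cost_usual:
                    wins += 1
            return wins / n

        for wtp in (10000.0, 50000.0, 100000.0):
            print(wtp, prob_cost_effective(wtp))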

  15. A cost-effectiveness analysis of two different antimicrobial stewardship programs

    Directory of Open Access Journals (Sweden)

    Lucas Miyake Okumura

    2016-05-01

    Full Text Available There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of Antimicrobial Stewardship Programs. A 30-day Markov model was developed to analyze how cost-effective a Bundled Antimicrobial Stewardship Program implemented in a university hospital in Brazil was. Clinical data were derived from a historical cohort that compared two different strategies of antimicrobial stewardship programs and had 30-day mortality as the main outcome. Selected costs included workload, cost of defined daily doses, length of stay, and laboratory and imaging resources used to diagnose infections. Data were analyzed by deterministic and probabilistic sensitivity analysis to assess the model's robustness, a tornado diagram and a Cost-Effectiveness Acceptability Curve. The Bundled Strategy was more expensive (cost difference US$ 2119.70) but more efficient (US$ 27,549.15 vs US$ 29,011.46). Deterministic and probabilistic sensitivity analyses suggested that critical variables did not alter the final Incremental Cost-Effectiveness Ratio. The Bundled Strategy had a higher probability of being cost-effective, which was endorsed by the cost-effectiveness acceptability curve. As health systems call for efficient technologies, this study concludes that the Bundled Antimicrobial Stewardship Program was more cost-effective, meaning that stewardship strategies with such characteristics would be of special interest from a societal and clinical perspective.

  16. Antibiotic prophylaxis for haematogenous bacterial arthritis in patients with joint disease: a cost effectiveness analysis

    NARCIS (Netherlands)

    P. Krijnen (Pieta); C.J. Kaandorp; E.W. Steyerberg (Ewout); D. van Schaardenburg (Dirkjan); H.J. Moens; J.D.F. Habbema (Dik)

    2001-01-01

    OBJECTIVE: To assess the cost effectiveness of antibiotic prophylaxis for haematogenous bacterial arthritis in patients with joint disease. METHODS: In a decision analysis, data from a prospective study on bacterial arthritis in 4907 patients with joint

  17. Cost Effectiveness Analysis of Converting a Classroom Course to a Network Based Instruction Module

    National Research Council Canada - National Science Library

    green, Samantha

    1997-01-01

    ...) classes into NBL modules. This thesis performs a cost effectiveness analysis on converting the two modules and discusses the intangible costs and benefits associated with converting traditional classroom courses...

  18. Qgui: A high-throughput interface for automated setup and analysis of free energy calculations and empirical valence bond simulations in biological systems.

    Science.gov (United States)

    Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav

    2015-07-01

    Structural information and activity data have increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present calculation of high-precision Arrhenius plots to obtain the thermodynamic activation enthalpy and entropy with Qgui from running a large number of EVB simulations. Copyright © 2015 Elsevier Inc. All rights reserved.
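
    Of the methods named here, LIE has the simplest closed form: ΔG_bind ≈ α·Δ⟨V_vdW⟩ + β·Δ⟨V_el⟩ + γ, where each Δ⟨·⟩ is the difference in the average ligand-surrounding interaction energy between the bound and free MD simulations. A sketch with commonly used coefficient values; the energy averages are invented for illustration.

        def lie_binding_free_energy(dv_vdw, dv_el, alpha=0.18, beta=0.5, gamma=0.0):
            """LIE estimate of binding free energy (kcal/mol)."""
            return alpha * dv_vdw + beta * dv_el + gamma

        # Invented MD averages for a hypothetical ERalpha ligand (kcal/mol):
        dv_vdw = -35.2   # <V_vdW>_bound - <V_vdW>_free
        dv_el = -4.1     # <V_el>_bound - <V_el>_free
        print(round(lie_binding_free_energy(dv_vdw, dv_el), 2))  # about -8.4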

  19. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas; Castro, David; Foulds, Ian G.

    2013-01-01

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  20. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  1. A comprehensive analysis of in vitro and in vivo genetic fitness of Pseudomonas aeruginosa using high-throughput sequencing of transposon libraries.

    Directory of Open Access Journals (Sweden)

    David Skurnik

    Full Text Available High-throughput sequencing of transposon (Tn) libraries created within entire genomes identifies and quantifies the contribution of individual genes and operons to the fitness of organisms in different environments. We used insertion-sequencing (INSeq) to analyze the contribution to fitness of all non-essential genes in the chromosome of Pseudomonas aeruginosa strain PA14 based on a library of ∼300,000 individual Tn insertions. In vitro growth in LB provided a baseline for comparison with the survival of the Tn insertion strains following 6 days of colonization of the murine gastrointestinal tract, as well as a comparison with Tn-inserts subsequently able to systemically disseminate to the spleen following induction of neutropenia. Sequencing was performed following DNA extraction from the recovered bacteria, digestion with the MmeI restriction enzyme that hydrolyzes DNA 16 bp away from the end of the Tn insert, and fractionation into fragments of 1,200-1,500 bp that were prepared for high-throughput sequencing. Changes in the frequency of Tn inserts in the P. aeruginosa genome were used to quantify in vivo fitness resulting from the loss of a gene. 636 genes had <10 sequencing reads in LB; strains with Tn inserts in these genes were thus defined as unable to grow in this medium. During in vivo infection there were major losses of strains with Tn inserts in almost all known virulence factors, as well as in respiration, energy utilization, ion pump, nutritional and prophage genes. Many new candidate virulence factors were also identified. There were consistent changes in the recovery of Tn inserts in genes within most operons, and Tn insertions into some genes enhanced in vivo fitness. Strikingly, 90% of the non-essential genes were required for in vivo survival following systemic dissemination during neutropenia. These experiments resulted in the identification of the P. aeruginosa strain PA14 genes necessary for optimal survival in the mucosal and systemic environments of a mammalian host.
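
    Quantifying fitness from such data is typically a matter of comparing the relative frequency of reads per gene between the input (LB) and output (in vivo) pools on a log2 scale. A minimal sketch of that computation; the counts are invented and the pseudocount handling is a simplifying assumption, not the study's exact statistics.

        import math

        def fitness_log2(input_counts, output_counts, pseudo=1.0):
            """log2 ratio of output vs. input Tn-insertion frequencies per gene."""
            in_total = sum(input_counts.values())
            out_total = sum(output_counts.values())
            scores = {}
            for gene, c_in in input_counts.items():
                f_in = (c_in + pseudo) / in_total
                f_out = (output_counts.get(gene, 0) + pseudo) / out_total
                scores[gene] = math.log2(f_out / f_in)
            return scores

        # Invented read counts (input = LB growth, output = murine GI tract):
        lb = {"pilA": 900, "lasR": 850, "neutral1": 1000}
        gi = {"pilA": 40, "lasR": 95, "neutral1": 1100}
        print(fitness_log2(lb, gi))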

  2. Comparing Usage and Cost- Effectiveness Analysis of English Printed and Electronic Books for University of Tehran

    OpenAIRE

    Davoud Haseli; Nader Naghshineh; fatemeh Fahimnia

    2014-01-01

    Libraries operate in a competitive environment, and it is essential for them to prove their benefits to stakeholders and to continuously evaluate and compare the advantages of printed and electronic resources. For such cases, economic evaluation methods such as cost-effectiveness analysis are among the best methods, because they comprehensively study both the use and the cost of library resources. The purpose of this study is to discover the use and cost-effectiveness of English printed and ebo...

  3. Smoking Cessation for Smokers Not Ready to Quit: Meta-analysis and Cost-effectiveness Analysis.

    Science.gov (United States)

    Ali, Ayesha; Kaplan, Cameron M; Derefinko, Karen J; Klesges, Robert C

    2018-06-11

    To provide a systematic review and cost-effectiveness analysis on smoking interventions targeting smokers not ready to quit, a population that makes up approximately 32% of current smokers. Twenty-two studies on pharmacological, behavioral, and combination smoking-cessation interventions targeting smokers not ready to quit (defined as those who reported they were not ready to quit at the time of the study) published between 2000 and 2017 were analyzed. The effectiveness (measured by the number needed to treat) and cost effectiveness (measured by costs per quit) of interventions were calculated. All data collection and analyses were performed in 2017. Smoking interventions targeting smokers not ready to quit can be as effective as similar interventions for smokers ready to quit; however, costs of intervening on this group may be higher for some intervention types. The most cost-effective interventions identified for this group were those using varenicline and those using behavioral interventions. Updating clinical recommendations to provide cessation interventions for this group is recommended. Further research on development of cost-effective treatments and effective strategies for recruitment and outreach for this group are needed. Additional studies may allow for more nuanced comparisons of treatment types among this group. Copyright © 2018 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
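
    Both summary measures used above are one-liners: the number needed to treat (NNT) is the reciprocal of the absolute difference in quit rates, and cost per quit is the per-person intervention cost multiplied by the NNT. A sketch with invented inputs:

        def number_needed_to_treat(quit_rate_treated, quit_rate_control):
            """Smokers who must be treated for one additional quit."""
            return 1.0 / (quit_rate_treated - quit_rate_control)

        def cost_per_quit(cost_per_person, nnt):
            return cost_per_person * nnt

        # Invented quit rates and per-person cost for a varenicline arm vs. control:
        nnt = number_needed_to_treat(0.14, 0.06)
        print(round(nnt, 1), round(cost_per_quit(400.0, nnt)))  # 12.5 5000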

  4. Cost-effectiveness analysis of rotavirus vaccination among Libyan ...

    African Journals Online (AJOL)

    Methods: We used a published decision tree model that has been adapted to the Libyan situation to analyze a birth cohort of 160,000 children. The evaluation of diarrhea events in three public hospitals helped to estimate the rotavirus burden. The economic analysis was done from two perspectives: health care provider and ...

  5. In House HSV PCR, Process Improvement and Cost Effectiveness Analysis

    Science.gov (United States)

    2017-09-15

    Only fragments of a report documentation page (SF 298) are recoverable from this record: a poster dated 09/15/2017, with a title field reading "Cost-Analysis: In-house HSV PCR capabilities"; the author and abstract fields are illegible in the source.

  6. A cost-effectiveness analysis of two different antimicrobial stewardship programs

    OpenAIRE

    Okumura, Lucas Miyake; Riveros, Bruno Salgado; Gomes-da-Silva, Monica Maria; Veroneze, Izelandia

    2016-01-01

    Abstract There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of Antimicrobial Stewardship Programs. A 30-day Markov model was developed to analyze how cost-effective was a Bundled Antimicrobial Stewardship implemented in a university hospital in Brazil. Clinical data derived from a historical cohort that compared two different strategies of antimicrobial s...

  7. A cost-effectiveness analysis of two different antimicrobial stewardship programs

    OpenAIRE

    Lucas Miyake Okumura; Bruno Salgado Riveros; Monica Maria Gomes-da-Silva; Izelandia Veroneze

    2016-01-01

    There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of Antimicrobial Stewardship Programs. A 30-day Markov model was developed to analyze how cost-effective was a Bundled Antimicrobial Stewardship implemented in a university hospital in Brazil. Clinical data derived from a historical cohort that compared two different strategies of antimicrobial stewardshi...

  8. Heterogeneous Deployment Analysis for Cost-Effective Mobile Network Evolution

    DEFF Research Database (Denmark)

    Coletti, Claudio

    2013-01-01

    network coverage and boosting network capacity in traffic hot-spot areas. The thesis deals with the deployment of both outdoor small cells and indoor femto cells. Amongst the outdoor solutions, particular emphasis is put on relay base stations, as backhaul costs can be reduced by utilizing LTE spectrum...... statistical models of deployment areas, the performance analysis is carried out in the form of operator case studies for large-scale deployment scenarios, including realistic macro network layouts and inhomogeneous spatial traffic distributions. Deployment of small cells is performed by means of proposed...... heuristic deployment algorithms, which combine network coverage and spatial user density information. As a secondary aspect, deployment solutions achieving the same coverage performance are compared in terms of Total Cost of Ownership (TCO), in order to investigate the viability of different deployment

  9. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    Science.gov (United States)

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The methodological approach involves linking hydro-chemical modelling with the economic costs of mitigation measures. The utility of the approach was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using STREAM-N and phosphorus using the INCA-P model. Both models were first run for baseline conditions and then their effectiveness for changes in land management was simulated. Costs were based on farm income foregone, capital and operational expenditures. The costs and effects data were integrated using 'Risk Solver Platform' optimization in Excel to produce the most cost-effective combination of measures by which target nutrient reductions could be attained at a minimum economic cost. The analysis identified different combinations of measures as most cost-effective for the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic cost of measures for cost-effectiveness analysis of land and water management options. The methodological approach developed is not limited to the two pollutants and the selected agri-environmental measures considered in the paper; the approach can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
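
    The optimization step described above (a least-cost combination of measures subject to nutrient-reduction targets) can be posed as a small linear program. The authors used 'Risk Solver Platform' in Excel; the sketch below expresses the same idea with SciPy instead, and every measure name, cost and reduction figure in it is an invented placeholder.

        import numpy as np
        from scipy.optimize import linprog

        # Illustrative measures with annual cost (£k) and N/P load reductions (kg/yr):
        costs = np.array([120.0, 60.0, 200.0])   # buffer strips, fertilizer cut, wetland
        n_red = np.array([900.0, 1500.0, 700.0])
        p_red = np.array([40.0, 10.0, 90.0])
        targets = np.array([1800.0, 90.0])       # required N and P reductions

        # Minimize total cost subject to meeting both targets; x_i in [0, 1] is the
        # adopted fraction of each measure (a simplifying assumption).
        res = linprog(
            c=costs,
            A_ub=-np.vstack([n_red, p_red]),     # -reduction @ x <= -target
            b_ub=-targets,
            bounds=[(0.0, 1.0)] * 3,
            method="highs",
        )
        print(res.x, round(res.fun, 1))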

  10. A Cost-Effectiveness Analysis of the Swedish Universal Parenting Program All Children in Focus

    Science.gov (United States)

    Ulfsdotter, Malin

    2015-01-01

    Objective There are few health economic evaluations of parenting programs with quality-adjusted life-years (QALYs) as the outcome measure. The objective of this study was, therefore, to conduct a cost-effectiveness analysis of the universal parenting program All Children in Focus (ABC). The goals were to estimate the costs of program implementation, investigate the health effects of the program, and examine its cost-effectiveness. Methods A cost-effectiveness analysis was conducted. Costs included setup costs and operating costs. A parent proxy Visual Analog Scale was used to measure QALYs in children, whereas the General Health Questionnaire-12 was used for parents. A societal perspective was adopted, and the incremental cost-effectiveness ratio was calculated. To account for uncertainty in the estimate, the probability of cost-effectiveness was investigated, and sensitivity analyses were used to account for the uncertainty in cost data. Results The cost was €326.3 per parent, of which €53.7 represented setup costs under the assumption that group leaders on average run 10 groups, and €272.6 was the operating costs. For health effects, the QALY gain was 0.0042 per child and 0.0027 per parent. These gains resulted in an incremental cost-effectiveness ratio for the base case of €47 290 per gained QALY. The sensitivity analyses resulted in ratios from €41 739 to €55 072. With the common Swedish threshold value of €55 000 per QALY, the probability of the ABC program being cost-effective was 50.8 percent. Conclusion Our analysis of the ABC program demonstrates cost-effectiveness ratios below or just above the QALY threshold in Sweden. However, due to great uncertainty about the data, the health economic rationale for implementation should be further studied considering a longer time perspective, effects on siblings, and validated measuring techniques, before full scale implementation. PMID:26681349

  11. A Cost-Effectiveness Analysis of the Swedish Universal Parenting Program All Children in Focus.

    Directory of Open Access Journals (Sweden)

    Malin Ulfsdotter

    Full Text Available There are few health economic evaluations of parenting programs with quality-adjusted life-years (QALYs) as the outcome measure. The objective of this study was, therefore, to conduct a cost-effectiveness analysis of the universal parenting program All Children in Focus (ABC). The goals were to estimate the costs of program implementation, investigate the health effects of the program, and examine its cost-effectiveness. A cost-effectiveness analysis was conducted. Costs included setup costs and operating costs. A parent proxy Visual Analog Scale was used to measure QALYs in children, whereas the General Health Questionnaire-12 was used for parents. A societal perspective was adopted, and the incremental cost-effectiveness ratio was calculated. To account for uncertainty in the estimate, the probability of cost-effectiveness was investigated, and sensitivity analyses were used to account for the uncertainty in cost data. The cost was € 326.3 per parent, of which € 53.7 represented setup costs under the assumption that group leaders on average run 10 groups, and € 272.6 was the operating costs. For health effects, the QALY gain was 0.0042 per child and 0.0027 per parent. These gains resulted in an incremental cost-effectiveness ratio for the base case of € 47 290 per gained QALY. The sensitivity analyses resulted in ratios from € 41 739 to € 55 072. With the common Swedish threshold value of € 55 000 per QALY, the probability of the ABC program being cost-effective was 50.8 percent. Our analysis of the ABC program demonstrates cost-effectiveness ratios below or just above the QALY threshold in Sweden. However, due to great uncertainty about the data, the health economic rationale for implementation should be further studied considering a longer time perspective, effects on siblings, and validated measuring techniques, before full scale implementation.

  12. A Cost-Effectiveness Analysis of the Swedish Universal Parenting Program All Children in Focus.

    Science.gov (United States)

    Ulfsdotter, Malin; Lindberg, Lene; Månsdotter, Anna

    2015-01-01

    There are few health economic evaluations of parenting programs with quality-adjusted life-years (QALYs) as the outcome measure. The objective of this study was, therefore, to conduct a cost-effectiveness analysis of the universal parenting program All Children in Focus (ABC). The goals were to estimate the costs of program implementation, investigate the health effects of the program, and examine its cost-effectiveness. A cost-effectiveness analysis was conducted. Costs included setup costs and operating costs. A parent proxy Visual Analog Scale was used to measure QALYs in children, whereas the General Health Questionnaire-12 was used for parents. A societal perspective was adopted, and the incremental cost-effectiveness ratio was calculated. To account for uncertainty in the estimate, the probability of cost-effectiveness was investigated, and sensitivity analyses were used to account for the uncertainty in cost data. The cost was € 326.3 per parent, of which € 53.7 represented setup costs under the assumption that group leaders on average run 10 groups, and € 272.6 was the operating costs. For health effects, the QALY gain was 0.0042 per child and 0.0027 per parent. These gains resulted in an incremental cost-effectiveness ratio for the base case of € 47 290 per gained QALY. The sensitivity analyses resulted in ratios from € 41 739 to € 55 072. With the common Swedish threshold value of € 55 000 per QALY, the probability of the ABC program being cost-effective was 50.8 percent. Our analysis of the ABC program demonstrates cost-effectiveness ratios below or just above the QALY threshold in Sweden. However, due to great uncertainty about the data, the health economic rationale for implementation should be further studied considering a longer time perspective, effects on siblings, and validated measuring techniques, before full scale implementation.

  13. Integrated analysis of RNA-binding protein complexes using in vitro selection and high-throughput sequencing and sequence specificity landscapes (SEQRS).

    Science.gov (United States)

    Lou, Tzu-Fang; Weidmann, Chase A; Killingsworth, Jordan; Tanaka Hall, Traci M; Goldstrohm, Aaron C; Campbell, Zachary T

    2017-04-15

    RNA-binding proteins (RBPs) collaborate to control virtually every aspect of RNA function. Tremendous progress has been made in the area of global assessment of RBP specificity using next-generation sequencing approaches both in vivo and in vitro. Understanding how protein-protein interactions enable precise combinatorial regulation of RNA remains a significant problem. Addressing this challenge requires tools that can quantitatively determine the specificities of both individual proteins and multimeric complexes in an unbiased and comprehensive way. One approach utilizes in vitro selection, high-throughput sequencing, and sequence-specificity landscapes (SEQRS). We outline a SEQRS experiment focused on obtaining the specificity of a multi-protein complex between Drosophila RBPs Pumilio (Pum) and Nanos (Nos). We discuss the necessary controls in this type of experiment and examine how the resulting data can be complemented with structural and cell-based reporter assays. Additionally, SEQRS data can be integrated with functional genomics data to uncover biological function. Finally, we propose extensions of the technique that will enhance our understanding of multi-protein regulatory complexes assembled onto RNA. Copyright © 2016 Elsevier Inc. All rights reserved.
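
    At its core, scoring a SEQRS experiment asks which k-mers become over-represented in later selection rounds relative to the input library. A minimal sketch of that enrichment calculation; k, the floor value and the toy reads are illustrative (the motif shown is a DNA rendering of a Pum-like UGUA-containing element), not the protocol's actual parameters.

        from collections import Counter

        def kmer_freqs(reads, k=8):
            """Frequency of each k-mer across a pool of reads."""
            counts = Counter()
            for read in reads:
                for i in range(len(read) - k + 1):
                    counts[read[i:i + k]] += 1
            total = sum(counts.values())
            return {kmer: c / total for kmer, c in counts.items()}

        def enrichment(round0_reads, roundN_reads, k=8, floor=1e-9):
            """Fold enrichment of each k-mer in the final round vs. the input."""
            f0, fn = kmer_freqs(round0_reads, k), kmer_freqs(roundN_reads, k)
            return {kmer: f / max(f0.get(kmer, 0.0), floor) for kmer, f in fn.items()}

        r0 = ["ACGTACGTACGTACGT", "TGTACATACCCTGTAC"]  # toy input reads
        rN = ["TGTACATATGTACATA", "ATGTACATAGGGCCAT"]  # toy final-round reads
        top = max(enrichment(r0, rN).items(), key=lambda kv: kv[1])
        print(top)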

  14. Cost-effectiveness analysis of combination therapies for visceral leishmaniasis in the Indian subcontinent.

    Directory of Open Access Journals (Sweden)

    Filip Meheus

    2010-09-01

    Full Text Available Visceral leishmaniasis is a systemic parasitic disease that is fatal unless treated. We assessed the cost and cost-effectiveness of alternative strategies for the treatment of visceral leishmaniasis in the Indian subcontinent. In particular we examined whether combination therapies are a cost-effective alternative compared to monotherapies. We assessed the cost-effectiveness of all possible mono- and combination therapies for the treatment of visceral leishmaniasis in the Indian subcontinent (India, Nepal and Bangladesh) from a societal perspective using a decision analytical model based on a decision tree. Primary data collected in each country were combined with data from the literature and an expert poll (Delphi method). The cost per patient treated and average and incremental cost-effectiveness ratios expressed as cost per death averted were calculated. Extensive sensitivity analysis was done to evaluate the robustness of our estimations and conclusions. With a cost of US$92 per death averted, the combination miltefosine-paromomycin was the most cost-effective treatment strategy. The next best alternative was a combination of liposomal amphotericin B with paromomycin, with an incremental cost-effectiveness of $652 per death averted. All other strategies were dominated, with the exception of a single dose of 10 mg per kg of liposomal amphotericin B. While strategies based on liposomal amphotericin B (AmBisome) were found to be the most effective, its current drug cost of US$20 per vial resulted in a higher average cost-effectiveness. Sensitivity analysis showed the conclusion to be robust to variations in the input parameters over their plausible range. Combination treatments are a cost-effective alternative to current monotherapy for VL. Given their expected impact on the emergence of drug resistance, a switch to combination therapy should be considered once final results from clinical trials are available.

  15. Automated high-throughput measurement of body movements and cardiac activity of Xenopus tropicalis tadpoles

    Directory of Open Access Journals (Sweden)

    Kay Eckelt

    2014-07-01

    Full Text Available Xenopus tadpoles are an emerging model for developmental, genetic and behavioral studies. A small size, optical accessibility of most of their organs, together with a close genetic and structural relationship to humans make them a convenient experimental model. However, there is only a limited toolset available to measure behavior and organ function of these animals at medium or high throughput. Herein, we describe an imaging-based platform to quantify body and autonomic movements of Xenopus tropicalis tadpoles of advanced developmental stages. Animals alternate periods of quiescence and locomotor movements and display buccal pumping for oxygen uptake from water and rhythmic cardiac movements. We imaged up to 24 animals in parallel and automatically tracked and quantified their movements by using image analysis software. Animal trajectories, moved distances, activity time, buccal pumping rates and heart beat rates were calculated and used to characterize the effects of test compounds. We evaluated the effects of propranolol and atropine, observing a dose-dependent bradycardia and tachycardia, respectively. This imaging and analysis platform is a simple, cost-effective, high-throughput in vivo assay system for genetic, toxicological or pharmacological characterizations.
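
    The movement measures listed above derive directly from per-frame centroid positions produced by the tracking software. A minimal sketch, assuming tracking has already yielded (x, y) centroids in millimetres; the coordinates, frame rate and speed threshold are illustrative.

        import math

        def moved_distance(track):
            """Total path length of a centroid track [(x, y), ...]."""
            return sum(math.dist(a, b) for a, b in zip(track, track[1:]))

        def activity_time(track, fps, speed_threshold=0.5):
            """Seconds during which instantaneous speed exceeds a threshold (mm/s)."""
            active_frames = sum(
                1 for a, b in zip(track, track[1:]) if math.dist(a, b) * fps > speed_threshold
            )
            return active_frames / fps

        track = [(0.0, 0.0), (0.2, 0.1), (0.2, 0.1), (1.0, 0.6)]  # toy centroids, mm
        print(moved_distance(track), activity_time(track, fps=10))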

  16. Challenges from variation across regions in cost effectiveness analysis in multi-regional clinical trials

    Directory of Open Access Journals (Sweden)

    Yunbo Chu

    2016-10-01

    Full Text Available Economic evaluation in the form of cost-effectiveness analysis has become a popular means to inform decisions in healthcare. With multi-regional clinical trials in a global development program becoming a new venue for drug efficacy testing in recent decades, questions about methods for cost-effectiveness analysis in the multi-regional clinical trial setting have also emerged. This paper addresses some challenges arising from variation across regions in cost-effectiveness analysis in multi-regional clinical trials. Several discussion points are raised for further attention and a multi-regional clinical trial example is presented to illustrate the implications for industrial application. A general message is delivered to call for an in-depth discussion by all stakeholders to reach agreement on good practice in cost-effectiveness analysis in multi-regional clinical trials. Meanwhile, we recommend additional consideration of cost-effectiveness analysis results based on the clinical evidence from a certain homogeneous population, as sensitivity or scenario analysis, upon data availability.

  17. Nonintravenous rescue medications for pediatric status epilepticus: A cost-effectiveness analysis.

    Science.gov (United States)

    Sánchez Fernández, Iván; Gaínza-Lein, Marina; Loddenkemper, Tobias

    2017-08-01

    To quantify the cost-effectiveness of rescue medications for pediatric status epilepticus: rectal diazepam, nasal midazolam, buccal midazolam, intramuscular midazolam, and nasal lorazepam. Decision analysis model populated with effectiveness data from the literature and cost data from publicly available market prices. The primary outcome was cost per seizure stopped ($/SS). One-way sensitivity analyses and second-order Monte Carlo simulations evaluated the robustness of the results across wide variations of the input parameters. The most cost-effective rescue medication was buccal midazolam (incremental cost-effectiveness ratio [ICER]: $13.16/SS), followed by nasal midazolam (ICER: $38.19/SS). Nasal lorazepam (ICER: -$3.8/SS), intramuscular midazolam (ICER: -$64/SS), and rectal diazepam (ICER: -$2,246.21/SS) are never more cost-effective than the other options at any willingness to pay. One-way sensitivity analysis showed the following: (1) at its current effectiveness, rectal diazepam would become the most cost-effective option only if its cost was $6 or less, and (2) at its current cost, rectal diazepam would become the most cost-effective option only if effectiveness was higher than 0.89 (and only with a very high willingness to pay of $2,859/SS to $31,447/SS). Second-order Monte Carlo simulations showed the following: (1) nasal midazolam and intramuscular midazolam were the more effective options; (2) the more cost-effective option was buccal midazolam for a willingness to pay from $14/SS to $41/SS and nasal midazolam for a willingness to pay above $41/SS; (3) cost-effectiveness overlapped for buccal midazolam, nasal lorazepam, intramuscular midazolam, and nasal midazolam; and (4) rectal diazepam was not cost-effective at any willingness to pay, and this conclusion remained extremely robust to wide variations of the input parameters. For pediatric status epilepticus, buccal midazolam and nasal midazolam are the most cost-effective nonintravenous rescue medications.
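
    Conclusions like "never more cost-effective at any willingness to pay" come from dominance screening: a strategy is dominated when another option is at least as cheap and at least as effective. A sketch of that screen; the costs and effectiveness values below are illustrative, not the study's inputs.

        def dominated(strategies):
            """Names of strategies for which some alternative is no more costly
            and no less effective (strategies: name -> (cost, effectiveness))."""
            losers = set()
            for name, (cost, eff) in strategies.items():
                for other, (c2, e2) in strategies.items():
                    if other != name and c2 <= cost and e2 >= eff and (c2, e2) != (cost, eff):
                        losers.add(name)
                        break
            return losers

        # Illustrative (cost per episode in $, probability the seizure stops):
        options = {
            "buccal midazolam": (20.0, 0.74),
            "nasal midazolam": (45.0, 0.76),
            "rectal diazepam": (320.0, 0.66),
        }
        print(dominated(options))  # {'rectal diazepam'}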

  18. Development of a high-throughput real time PCR based on a hot-start alternative for Pfu mediated by quantum dots

    Science.gov (United States)

    Sang, Fuming; Yang, Yang; Yuan, Lin; Ren, Jicun; Zhang, Zhizhou

    2015-09-01

    Hot start (HS) PCR is an excellent alternative for high-throughput real time PCR due to its ability to prevent nonspecific amplification at low temperature. Development of a cost-effective and simple HS PCR technique to guarantee high-throughput PCR specificity and consistency still remains a great challenge. In this study, we systematically investigated the HS characteristics of QDs triggered in real time PCR with EvaGreen and SYBR Green I dyes by the analysis of amplification curves, standard curves and melting curves. Two different kinds of DNA polymerases, Pfu and Taq, were employed. Here we showed that high specificity and efficiency of real time PCR were obtained in a plasmid DNA and an error-prone two-round PCR assay using QD-based HS PCR, even after an hour preincubation at 50 °C before real time PCR. Moreover, the results obtained by QD-based HS PCR were comparable to a commercial Taq antibody DNA polymerase. However, no obvious HS effect of QDs was found in real time PCR using Taq DNA polymerase. The findings of this study demonstrated that a cost-effective high-throughput real time PCR based on QD triggered HS PCR could be established with high consistency, sensitivity and accuracy.
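
    The standard-curve analysis mentioned here yields amplification efficiency from the slope of Ct against log10(template amount): E = 10^(-1/slope) - 1, with E near 1.0 (100%) for ideal doubling each cycle. A sketch using an ordinary least-squares fit; the dilution series and Ct values are invented.

        def slope_ols(xs, ys):
            """Ordinary least-squares slope of ys against xs."""
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            den = sum((x - mx) ** 2 for x in xs)
            return num / den

        def pcr_efficiency(log10_copies, ct_values):
            """Amplification efficiency from a dilution-series standard curve."""
            return 10.0 ** (-1.0 / slope_ols(log10_copies, ct_values)) - 1.0

        # Invented 10-fold dilution series of plasmid template:
        dilutions = [3.0, 4.0, 5.0, 6.0, 7.0]   # log10(copies)
        cts = [30.1, 26.8, 23.4, 20.1, 16.7]    # measured Ct
        print(round(pcr_efficiency(dilutions, cts), 3))  # ~0.99, i.e. ~99%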

  19. The analysis of cost-effectiveness of implant and conventional fixed dental prosthesis.

    Science.gov (United States)

    Chun, June Sang; Har, Alix; Lim, Hyun-Pil; Lim, Hoi-Jeong

    2016-02-01

    This study conducted an analysis of the cost-effectiveness of the implant and the conventional fixed dental prosthesis (CFDP) from a single treatment perspective. A Markov model for cost-effectiveness analysis of the implant and CFDP was run over a maximum of 50 years. Probabilistic sensitivity analysis was performed with 10,000 Monte Carlo simulations, and cost-effectiveness acceptability curves (CEAC) were also presented. Results from meta-analysis studies were used to determine the survival rates and complication rates of the implant and CFDP. Data regarding the cost of each treatment method were collected from University Dental Hospital and Statistics Korea for 2013. Using the results of a patient satisfaction survey study, the quality-adjusted prosthesis year (QAPY) of the implant and CFDP strategies was evaluated with an annual discount rate. When only the direct cost was considered, implants were more cost-effective when the willingness to pay (WTP) was more than 10,000 won at the 10th year after treatment, and more cost-effective regardless of the WTP from the 20th year after the prosthodontic treatment. When the indirect cost was added to the direct cost, implants were more cost-effective only when the WTP was more than 75,000 won at the 10th year after the prosthodontic treatment, and more than 35,000 won at the 20th year after prosthodontic treatment. The CFDP was more cost-effective unless the WTP was more than 75,000 won at the 10th year after prosthodontic treatment, but the cost-effectiveness tendency shifted from CFDP to implant as time passed.

  20. Assessing the value of mepolizumab for severe eosinophilic asthma: a cost-effectiveness analysis.

    Science.gov (United States)

    Whittington, Melanie D; McQueen, R Brett; Ollendorf, Daniel A; Tice, Jeffrey A; Chapman, Richard H; Pearson, Steven D; Campbell, Jonathan D

    2017-02-01

    Adding mepolizumab to standard treatment with inhaled corticosteroids and controller medications could decrease asthma exacerbations and the use of long-term oral steroids in patients with severe disease and increased eosinophils; however, mepolizumab is costly and its cost effectiveness is unknown. This study therefore estimated the cost effectiveness of mepolizumab. A Markov model was used to determine the incremental cost per quality-adjusted life year (QALY) gained for mepolizumab plus standard of care (SoC) and for SoC alone. The population, adults with severe eosinophilic asthma, was modeled over a lifetime time horizon. A responder scenario analysis was conducted to determine the cost effectiveness for a cohort able to achieve and maintain asthma control. Over a lifetime treatment horizon, 23.96 exacerbations were averted per patient receiving mepolizumab plus SoC. Avoidance of exacerbations and decreased long-term oral steroid use resulted in more than $18,000 in cost offsets among those receiving mepolizumab, but treatment costs increased by more than $600,000. Treatment with mepolizumab plus SoC vs SoC alone resulted in a cost-effectiveness estimate of $386,000 per QALY. To achieve cost effectiveness of approximately $150,000 per QALY, mepolizumab would require a more than 60% price discount. At current pricing, treating a responder cohort yielded cost-effectiveness estimates near $160,000 per QALY. The estimated cost effectiveness of mepolizumab exceeds value thresholds; achieving these thresholds would require significant discounts from the current list price. Alternatively, treatment limited to responders improves the cost effectiveness toward, but still remains slightly above, these thresholds. Payers interested in improving the efficiency of health care resources should consider negotiating the mepolizumab price and ways to predict and assess the response to mepolizumab. Copyright © 2016 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
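
    The decision rule in this record reduces to simple incremental arithmetic. The Python sketch below uses round numbers chosen only to land near the reported $386,000/QALY and ">60% discount" figures; the actual trial-based inputs are not reproduced here.

        # ICER sketch with illustrative round numbers (not the study's inputs).
        def icer(cost_new, cost_old, qaly_new, qaly_old):
            """Incremental cost-effectiveness ratio: extra dollars per extra QALY."""
            return (cost_new - cost_old) / (qaly_new - qaly_old)

        soc_cost, soc_qaly = 200_000.0, 10.00    # standard of care alone
        mep_cost, mep_qaly = 800_000.0, 11.55    # ~$600k extra cost, ~1.55 extra QALYs

        print(f"ICER: ${icer(mep_cost, soc_cost, mep_qaly, soc_qaly):,.0f}/QALY")

        # Price discount needed to reach a $150,000/QALY threshold, assuming the
        # incremental cost scales linearly with the drug's list price.
        threshold = 150_000.0
        target_cost = threshold * (mep_qaly - soc_qaly)
        print(f"required discount: {1 - target_cost / (mep_cost - soc_cost):.0%}")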

  1. Identification and Analysis of Red Sea Mangrove (Avicennia marina) microRNAs by High-Throughput Sequencing and Their Association with Stress Responses

    KAUST Repository

    Khraiwesh, Basel; Pugalenthi, Ganesan; Fedoroff, Nina V.

    2013-01-01

    Although RNA silencing has been studied primarily in model plants, advances in high-throughput sequencing technologies have enabled profiling of the small RNA components of many more plant species, providing insights into the ubiquity and conservation of some miRNA-based regulatory mechanisms. Small RNAs of 20 to 24 nucleotides (nt) are important regulators of gene transcript levels, acting through either transcriptional or posttranscriptional gene silencing, contributing to genome maintenance and controlling a variety of developmental and physiological processes. Here, we used deep sequencing and molecular methods to create an inventory of the small RNAs in the mangrove species Avicennia marina. We identified 26 novel mangrove miRNAs and 193 conserved miRNAs belonging to 36 families. We determined that 2 of the novel miRNAs were produced from known miRNA precursors and that 4 were likely to be species-specific, by the criterion that we found no homologs in other plant species. We used qRT-PCR to analyze the expression of miRNAs and their target genes in different tissue sets, and some demonstrated tissue-specific expression. Furthermore, we predicted potential targets of these putative miRNAs based on sequence homology and validated them experimentally through endonucleolytic cleavage assays. Our results suggest that expression profiles of miRNAs and their predicted targets could be useful in exploring the significance of the conservation patterns of plants, particularly in response to abiotic stress. Because of their well-developed abilities in this regard, mangroves and other extremophiles are excellent models for such exploration. © 2013 Khraiwesh et al.

  2. Identification and analysis of Red Sea mangrove (Avicennia marina) microRNAs by high-throughput sequencing and their association with stress responses.

    Directory of Open Access Journals (Sweden)

    Basel Khraiwesh

    Full Text Available Although RNA silencing has been studied primarily in model plants, advances in high-throughput sequencing technologies have enabled profiling of the small RNA components of many more plant species, providing insights into the ubiquity and conservation of some miRNA-based regulatory mechanisms. Small RNAs of 20 to 24 nucleotides (nt) are important regulators of gene transcript levels, acting through either transcriptional or posttranscriptional gene silencing, contributing to genome maintenance and controlling a variety of developmental and physiological processes. Here, we used deep sequencing and molecular methods to create an inventory of the small RNAs in the mangrove species Avicennia marina. We identified 26 novel mangrove miRNAs and 193 conserved miRNAs belonging to 36 families. We determined that 2 of the novel miRNAs were produced from known miRNA precursors and that 4 were likely to be species-specific, by the criterion that we found no homologs in other plant species. We used qRT-PCR to analyze the expression of miRNAs and their target genes in different tissue sets, and some demonstrated tissue-specific expression. Furthermore, we predicted potential targets of these putative miRNAs based on sequence homology and validated them experimentally through endonucleolytic cleavage assays. Our results suggest that expression profiles of miRNAs and their predicted targets could be useful in exploring the significance of the conservation patterns of plants, particularly in response to abiotic stress. Because of their well-developed abilities in this regard, mangroves and other extremophiles are excellent models for such exploration.

  3. Identification and Analysis of Red Sea Mangrove (Avicennia marina) microRNAs by High-Throughput Sequencing and Their Association with Stress Responses

    KAUST Repository

    Khraiwesh, Basel

    2013-04-08

    Although RNA silencing has been studied primarily in model plants, advances in high-throughput sequencing technologies have enabled profiling of the small RNA components of many more plant species, providing insights into the ubiquity and conservation of some miRNA-based regulatory mechanisms. Small RNAs of 20 to 24 nucleotides (nt) are important regulators of gene transcript levels, acting through either transcriptional or posttranscriptional gene silencing, contributing to genome maintenance and controlling a variety of developmental and physiological processes. Here, we used deep sequencing and molecular methods to create an inventory of the small RNAs in the mangrove species Avicennia marina. We identified 26 novel mangrove miRNAs and 193 conserved miRNAs belonging to 36 families. We determined that 2 of the novel miRNAs were produced from known miRNA precursors and that 4 were likely to be species-specific, by the criterion that we found no homologs in other plant species. We used qRT-PCR to analyze the expression of miRNAs and their target genes in different tissue sets, and some demonstrated tissue-specific expression. Furthermore, we predicted potential targets of these putative miRNAs based on sequence homology and validated them experimentally through endonucleolytic cleavage assays. Our results suggest that expression profiles of miRNAs and their predicted targets could be useful in exploring the significance of the conservation patterns of plants, particularly in response to abiotic stress. Because of their well-developed abilities in this regard, mangroves and other extremophiles are excellent models for such exploration. © 2013 Khraiwesh et al.

  4. A high-throughput quantitative expression analysis of cancer-related genes in human HepG2 cells in response to limonene, a potential anticancer agent.

    Science.gov (United States)

    Hafidh, Rand R; Hussein, Saba Z; MalAllah, Mohammed Q; Abdulamir, Ahmed S; Abu Bakar, Fatimah

    2017-11-14

    Citrus bioactive compounds, as active anticancer agents, have been the focus of several studies worldwide. However, the underlying genes responsible for this anticancer potential have not been sufficiently characterized. The current study investigated the gene expression profile of hepatocellular carcinoma (HepG2) cells after treatment with limonene. The concentration that killed 50% of HepG2 cells was used to elucidate the genetic mechanisms of limonene's anticancer activity. Apoptotic induction was detected by flow cytometry and confocal fluorescence microscopy. Two pro-apoptotic events, caspase-3 activation and phosphatidylserine translocation, were demonstrated by confocal fluorescence microscopy. High-throughput real-time PCR was used to profile 1023 cancer-related genes in 16 different gene families related to cancer development. In comparison to untreated cells, limonene increased the percentage of apoptotic cells up to 89.61% by flow cytometry and 48.2% by fluorescence microscopy. There was significant limonene-driven differential gene expression in HepG2 cells across 15 different gene families. Limonene significantly (>2-log) up-regulated 14 genes and down-regulated 59 genes. The affected gene families, from most to least affected, were apoptosis induction, signal transduction, cancer gene augmentation, alteration in kinase expression, inflammation, DNA damage repair, and cell cycle proteins. The current study reveals that limonene could be a promising, cheap, and effective anticancer compound. The broad spectrum of limonene's anticancer activity makes it interesting for anticancer drug development. Further research is needed to confirm the current findings and to examine the anticancer potential of limonene, along with its underlying mechanisms, in different cell lines. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  5. Cost-effectiveness analysis: what it really means for transfusion medicine decision making.

    Science.gov (United States)

    Custer, Brian; Hoch, Jeffrey S

    2009-01-01

    Some have suggested that "blood is different," and that the role for cost-effectiveness analysis is thus circumscribed. In this article, the authors start by reviewing key concepts in health economics and economic analysis methods. Examples are drawn from published blood safety studies. After explaining the underlying reasoning behind cost-effectiveness analysis, the authors point out how economic thinking is evident in some aspects of transfusion medicine. Some cost-effectiveness study results for blood safety are discussed to provide context, followed by consideration of prominent decisions that have been made in the transfusion medicine field. In the last section, the authors conjecture as to why cost-effectiveness analysis appears to have greater impact in some cases than in others, noting the terrible price that is paid in mortality and morbidity when cost-effectiveness analysis is ignored. In this context, the implications of opportunity cost are discussed; it is noted that opportunity cost should be viewed not as benefits forgone by concentrating on one aspect of blood safety, but rather as our societal willingness to misallocate resources to achieve less health for the same cost.

  6. Cost-Effectiveness Analysis of Second-Line Chemotherapy Agents for Advanced Gastric Cancer.

    Science.gov (United States)

    Lam, Simon W; Wai, Maya; Lau, Jessica E; McNamara, Michael; Earl, Marc; Udeh, Belinda

    2017-01-01

    Gastric cancer is the fifth most common malignancy and the second leading cause of cancer-related mortality. Chemotherapy options for patients who fail first-line treatment are limited. The objective of this study was therefore to assess the cost-effectiveness of second-line treatment options for patients with advanced or metastatic gastric cancer. A cost-effectiveness analysis using a Markov model was performed to compare six possible second-line treatment options for patients with advanced gastric cancer who have failed previous chemotherapy: irinotecan, docetaxel, paclitaxel, ramucirumab, paclitaxel plus ramucirumab, and palliative care. The model took a third-party payer's perspective to compare lifetime costs and health benefits associated with the studied second-line therapies. Costs included only relevant direct medical costs. The model assumed chemotherapy cycle lengths of 30 days and a maximum of 24 cycles. A systematic review of the literature was performed to identify clinical data sources as well as utility and cost data. Quality-adjusted life years (QALYs) and incremental cost-effectiveness ratios (ICERs) were calculated. The primary outcome measure for this analysis was the ICER between different therapies, where the incremental cost was divided by the number of QALYs saved. The ICER was compared with a willingness-to-pay (WTP) threshold set at $50,000/QALY gained; an exploratory analysis using $160,000/QALY gained was also performed. The model's robustness was tested using 1-way sensitivity analyses and a probabilistic sensitivity analysis (PSA) with 10,000 Monte Carlo simulations. Irinotecan had the lowest lifetime cost and was associated with a QALY gain of 0.35 years. Docetaxel, ramucirumab alone, and palliative care were dominated strategies. Paclitaxel and the combination of paclitaxel plus ramucirumab led to higher QALYs gained, at an incremental cost of $86,815 and $1,056,125 per QALY gained, respectively. Based on our prespecified
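
    Several strategies in this record are "dominated" (another option costs less and yields more QALYs). The Python sketch below shows the standard screening step on invented placeholder numbers, deliberately arranged so the surviving frontier reproduces ICERs close to the reported $86,815 and $1,056,125 per QALY; it handles strong dominance only, not extended dominance.

        # Strong-dominance screening and frontier ICERs; numbers are placeholders.
        strategies = {
            "palliative care":         ( 20_000.0, 0.20),   # (lifetime cost, QALYs)
            "irinotecan":              ( 30_000.0, 0.35),
            "docetaxel":               ( 45_000.0, 0.33),
            "paclitaxel":              ( 60_000.0, 0.70),
            "ramucirumab":             ( 90_000.0, 0.34),
            "paclitaxel+ramucirumab":  (250_000.0, 0.88),
        }

        def dominated(name, cost, qaly):
            # Dominated if some other strategy costs no more and yields no fewer QALYs.
            return any(other != name and c <= cost and q >= qaly
                       for other, (c, q) in strategies.items())

        frontier = sorted(((n, c, q) for n, (c, q) in strategies.items()
                           if not dominated(n, c, q)), key=lambda t: t[1])

        # Incremental cost per QALY moving up the non-dominated frontier.
        for (n0, c0, q0), (n1, c1, q1) in zip(frontier, frontier[1:]):
            print(f"{n1} vs {n0}: ${(c1 - c0) / (q1 - q0):,.0f}/QALY")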

  7. [Cost-effectiveness analysis and diet quality index applied to the WHO Global Strategy].

    Science.gov (United States)

    Machado, Flávia Mori Sarti; Simões, Arlete Naresse

    2008-02-01

    To test the use of cost-effectiveness analysis as a decision-making tool, in the production of meals, for incorporating the recommendations published in the World Health Organization's Global Strategy. Five alternative breakfast menu options were assessed prior to their adoption in a food service at a university in the state of Sao Paulo, Southeastern Brazil, in 2006. Costs of the different options were based on market prices of food items (direct cost). Health benefits were estimated based on an adaptation of the Diet Quality Index (DQI). Cost-effectiveness ratios were estimated by dividing benefits by costs, and incremental cost-effectiveness ratios were estimated as the cost differential per unit of additional benefit. The meal choice was based on health benefit units associated with direct production cost, as well as on incremental effectiveness per unit of differential cost. The analysis showed the simplest option with the addition of a fruit (DQI = 64 / cost = R$ 1.58) to be the best alternative. Higher effectiveness was seen in the options with a fruit portion (DQI1 = 64 / DQI3 = 58 / DQI5 = 72) compared with the others (DQI2 = 48 / DQI4 = 58). The estimation of cost-effectiveness ratios allowed identification of the best breakfast option based on cost-effectiveness analysis and the Diet Quality Index. These instruments offer ease of application and objectivity of evaluation, which are key to the process of bringing public or private institutions under the Global Strategy directives.

  8. Cost-effectiveness of competing strategies for management of recurrent Clostridium difficile infection: a decision analysis.

    Science.gov (United States)

    Konijeti, Gauree G; Sauk, Jenny; Shrime, Mark G; Gupta, Meera; Ananthakrishnan, Ashwin N

    2014-06-01

    Clostridium difficile infection (CDI) is an important cause of morbidity and healthcare costs, and is characterized by high rates of disease recurrence. The cost-effectiveness of newer treatments for recurrent CDI has not been examined, yet would be important to inform clinical practice. The aim of this study was to analyze the cost-effectiveness of competing strategies for recurrent CDI. We constructed a decision-analytic model comparing 4 strategies for first-line treatment of recurrent CDI in a population with a median age of 65 years: metronidazole, vancomycin, fidaxomicin, and fecal microbiota transplant (FMT). We modeled up to 2 additional recurrences following the initial recurrence. We assumed FMT delivery via colonoscopy as our base case, but conducted sensitivity analyses based on different modes of delivery. The willingness-to-pay threshold was set at $50,000 per quality-adjusted life-year. At our base case estimates, initial treatment of recurrent CDI using FMT colonoscopy was the most cost-effective strategy, with an incremental cost-effectiveness ratio of $17,016 relative to oral vancomycin. Fidaxomicin and metronidazole were both dominated by FMT colonoscopy. On sensitivity analysis, FMT colonoscopy remained the most cost-effective strategy at cure rates >88.4%; results were sensitive to CDI recurrence rates and to the cost of FMT relative to the cost-effectiveness threshold. In clinical settings where FMT is not available or applicable, the preferred strategy appears to be initial treatment with oral vancomycin. In this decision analysis examining treatment strategies for recurrent CDI, we demonstrate that FMT colonoscopy is the most cost-effective initial strategy for management of recurrent CDI.

  9. Different Imaging Strategies in Patients With Possible Basilar Artery Occlusion: Cost-Effectiveness Analysis.

    Science.gov (United States)

    Beyer, Sebastian E; Hunink, Myriam G; Schöberl, Florian; von Baumgarten, Louisa; Petersen, Steffen E; Dichgans, Martin; Janssen, Hendrik; Ertl-Wagner, Birgit; Reiser, Maximilian F; Sommer, Wieland H

    2015-07-01

    This study evaluated the cost-effectiveness of different noninvasive imaging strategies in patients with possible basilar artery occlusion. A Markov decision analytic model was used to evaluate long-term outcomes resulting from strategies using computed tomographic angiography (CTA), magnetic resonance imaging, nonenhanced CT, or duplex ultrasound with intravenous (IV) thrombolysis being administered after positive findings. The analysis was performed from the societal perspective based on US recommendations. Input parameters were derived from the literature. Costs were obtained from United States costing sources and published literature. Outcomes were lifetime costs, quality-adjusted life-years (QALYs), incremental cost-effectiveness ratios, and net monetary benefits, with a willingness-to-pay threshold of $80,000 per QALY. The strategy with the highest net monetary benefit was considered the most cost-effective. Extensive deterministic and probabilistic sensitivity analyses were performed to explore the effect of varying parameter values. In the reference case analysis, CTA dominated all other imaging strategies. CTA yielded 0.02 QALYs more than magnetic resonance imaging and 0.04 QALYs more than duplex ultrasound followed by CTA. At a willingness-to-pay threshold of $80,000 per QALY, CTA yielded the highest net monetary benefits. The probability that CTA is cost-effective was 96% at a willingness-to-pay threshold of $80,000/QALY. Sensitivity analyses showed that duplex ultrasound was cost-effective only for a prior probability of ≤0.02 and that these results were only minimally influenced by duplex ultrasound sensitivity and specificity. Nonenhanced CT and magnetic resonance imaging never became the most cost-effective strategy. Our results suggest that CTA in patients with possible basilar artery occlusion is cost-effective. © 2015 The Authors.
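
    The "net monetary benefit" decision rule used here is easy to state in code: NMB = WTP x QALYs - cost, and the strategy with the highest NMB at the chosen threshold is preferred. The Python sketch below uses invented cost/QALY pairs (arranged so CTA leads MRI by 0.02 QALYs, echoing the record), not the study's outputs.

        # Net monetary benefit ranking at an $80,000/QALY threshold.
        # Costs and QALYs are illustrative placeholders.
        WTP = 80_000.0

        imaging = {
            "CTA":                  (41_000.0, 10.04),
            "MRI":                  (43_500.0, 10.02),
            "duplex US, then CTA":  (40_500.0, 10.00),
            "nonenhanced CT":       (42_000.0,  9.90),
        }

        nmb = {name: WTP * qalys - cost for name, (cost, qalys) in imaging.items()}
        for name, value in sorted(nmb.items(), key=lambda kv: -kv[1]):
            print(f"{name:20s} NMB = ${value:,.0f}")
        print("most cost-effective:", max(nmb, key=nmb.get))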

  10. Cost-Effectiveness Analysis of the Self-Management Program for Thai Patients with Metabolic Syndrome.

    Science.gov (United States)

    Sakulsupsiri, Anut; Sakthong, Phantipa; Winit-Watjana, Win

    2016-05-01

    Lifestyle modification programs have been only partly evaluated for their usefulness. This study aimed to assess the cost-effectiveness and healthy-lifestyle persistence of a self-management program (SMP) for patients with metabolic syndrome (MetS) in Thai health care settings. A cost-effectiveness analysis was performed on the basis of an intervention study of 90 patients with MetS randomly allocated to the SMP and control groups. A Markov model with the difference-in-differences method was used to predict the lifetime costs from a societal perspective and quality-adjusted life-years (QALYs), for which 95% confidence intervals (CIs) were estimated by bootstrapping. The cost-effectiveness analysis, along with healthy-lifestyle persistence, was performed using a discount rate of 3% per annum. Parameter uncertainties were identified using one-way and probabilistic sensitivity analyses. The lifetime costs tended to decrease in both groups. The SMP could save lifetime costs (-2310 baht; 95% CI -5960 to 1400) and gain QALYs (0.0098; 95% CI -0.0003 to 0.0190) compared with ordinary care. The probability of cost-effectiveness was 99.4% from the Monte Carlo simulation, and the program was deemed cost-effective at dropout rates below 69% per year, as determined by the threshold of 160,000 baht per QALY gained. The cost of macrovascular complications was the most influential variable for the overall incremental cost-effectiveness ratio. The SMP provided in these health care settings is marginally cost-effective, and the persistence results support the implementation of the program to minimize the complications and economic burden of patients with MetS. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  11. Cost-Effectiveness of Endovascular Stroke Therapy: A Patient Subgroup Analysis From a US Healthcare Perspective.

    Science.gov (United States)

    Kunz, Wolfgang G; Hunink, M G Myriam; Sommer, Wieland H; Beyer, Sebastian E; Meinel, Felix G; Dorn, Franziska; Wirth, Stefan; Reiser, Maximilian F; Ertl-Wagner, Birgit; Thierfelder, Kolja M

    2016-11-01

    Endovascular therapy in addition to standard care (EVT+SC) has been demonstrated to be more effective than SC in acute ischemic large vessel occlusion stroke. Our aim was to determine the cost-effectiveness of EVT+SC depending on patients' initial National Institutes of Health Stroke Scale (NIHSS) score, time from symptom onset, Alberta Stroke Program Early CT Score (ASPECTS), and occlusion location. A decision model based on Markov simulations estimated lifetime costs and quality-adjusted life years (QALYs) associated with both strategies applied in a US setting. Model input parameters were obtained from the literature, including recently pooled outcome data of 5 randomized controlled trials (ESCAPE [Endovascular Treatment for Small Core and Proximal Occlusion Ischemic Stroke], EXTEND-IA [Extending the Time for Thrombolysis in Emergency Neurological Deficits-Intra-Arterial], MR CLEAN [Multicenter Randomized Clinical Trial of Endovascular Treatment for Acute Ischemic Stroke in the Netherlands], REVASCAT [Randomized Trial of Revascularization With Solitaire FR Device Versus Best Medical Therapy in the Treatment of Acute Stroke Due to Anterior Circulation Large Vessel Occlusion Presenting Within 8 Hours of Symptom Onset], and SWIFT PRIME [Solitaire With the Intention for Thrombectomy as Primary Endovascular Treatment]). Probabilistic sensitivity analysis was performed to estimate uncertainty of the model results. Net monetary benefits, incremental costs, incremental effectiveness, and incremental cost-effectiveness ratios were derived from the probabilistic sensitivity analysis. The willingness-to-pay was set to $50 000/QALY. Overall, EVT+SC was cost-effective compared with SC (incremental cost: $4938, incremental effectiveness: 1.59 QALYs, and incremental cost-effectiveness ratio: $3110/QALY) in 100% of simulations. In all patient subgroups, EVT+SC led to gained QALYs (range: 0.47-2.12), and mean incremental cost-effectiveness ratios were considered cost-effective
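
    The probabilistic sensitivity analysis reported here has a compact generic form: draw incremental costs and effects from assumed distributions, then report the share of draws that are cost-effective at the willingness-to-pay threshold; sweeping the threshold traces an acceptability curve. The gamma/normal distributions and parameters in the Python sketch below are illustrative assumptions, not the pooled trial inputs.

        # Monte Carlo probabilistic sensitivity analysis sketch.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000
        # Right-skewed incremental cost (gamma), roughly $5,000 on average;
        # normally distributed incremental effectiveness, roughly 1.6 QALYs.
        inc_cost = rng.gamma(shape=4.0, scale=1_250.0, size=n)
        inc_qaly = rng.normal(loc=1.6, scale=0.4, size=n)

        for wtp in (0, 5_000, 10_000, 50_000):
            p = ((wtp * inc_qaly - inc_cost) > 0).mean()   # P(positive net benefit)
            print(f"WTP ${wtp:>6,}/QALY: cost-effective in {p:.1%} of draws")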

  12. The cost effectiveness of pandemic influenza interventions: a pandemic severity based analysis.

    Directory of Open Access Journals (Sweden)

    George J Milne

    Full Text Available BACKGROUND: The impact of a newly emerged influenza pandemic will depend on its transmissibility and severity. Understanding how these pandemic features impact on the effectiveness and cost effectiveness of alternative intervention strategies is important for pandemic planning. METHODS: A cost effectiveness analysis of a comprehensive range of social distancing and antiviral drug strategies intended to mitigate a future pandemic was conducted using a simulation model of a community of ∼30,000 in Australia. Six pandemic severity categories were defined based on case fatality ratio (CFR), using data from the 2009/2010 pandemic to relate hospitalisation rates to CFR. RESULTS: Intervention strategies combining school closure with antiviral treatment and prophylaxis are the most cost effective strategies in terms of cost per life year saved (LYS) for all severity categories. The cost component in the cost per LYS ratio varies depending on pandemic severity: for a severe pandemic (CFR of 2.5%) the cost is ∼$9 k per LYS; for a low severity pandemic (CFR of 0.1%) this strategy costs ∼$58 k per LYS; for a pandemic with very low severity similar to the 2009 pandemic (CFR of 0.03%) the cost is ∼$155 k per LYS. With high severity pandemics (CFR >0.75%) the most effective attack rate reduction strategies are also the most cost effective. During low severity pandemics costs are dominated by productivity losses due to illness and social distancing interventions, while for high severity pandemics costs are dominated by hospitalisation costs and productivity losses due to death. CONCLUSIONS: The most cost effective strategies for mitigating an influenza pandemic involve combining sustained social distancing with the use of antiviral agents. For low severity pandemics the most cost effective strategies involve antiviral treatment, prophylaxis and short durations of school closure; while these are cost effective they are less effective than other strategies in reducing the attack rate.

  13. The Cost Effectiveness of Pandemic Influenza Interventions: A Pandemic Severity Based Analysis

    Science.gov (United States)

    Milne, George J.; Halder, Nilimesh; Kelso, Joel K.

    2013-01-01

    Background The impact of a newly emerged influenza pandemic will depend on its transmissibility and severity. Understanding how these pandemic features impact on the effectiveness and cost effectiveness of alternative intervention strategies is important for pandemic planning. Methods A cost effectiveness analysis of a comprehensive range of social distancing and antiviral drug strategies intended to mitigate a future pandemic was conducted using a simulation model of a community of ∼30,000 in Australia. Six pandemic severity categories were defined based on case fatality ratio (CFR), using data from the 2009/2010 pandemic to relate hospitalisation rates to CFR. Results Intervention strategies combining school closure with antiviral treatment and prophylaxis are the most cost effective strategies in terms of cost per life year saved (LYS) for all severity categories. The cost component in the cost per LYS ratio varies depending on pandemic severity: for a severe pandemic (CFR of 2.5%) the cost is ∼$9 k per LYS; for a low severity pandemic (CFR of 0.1%) this strategy costs ∼$58 k per LYS; for a pandemic with very low severity similar to the 2009 pandemic (CFR of 0.03%) the cost is ∼$155 k per LYS. With high severity pandemics (CFR >0.75%) the most effective attack rate reduction strategies are also the most cost effective. During low severity pandemics costs are dominated by productivity losses due to illness and social distancing interventions, while for high severity pandemics costs are dominated by hospitalisation costs and productivity losses due to death. Conclusions The most cost effective strategies for mitigating an influenza pandemic involve combining sustained social distancing with the use of antiviral agents. For low severity pandemics the most cost effective strategies involve antiviral treatment, prophylaxis and short durations of school closure; while these are cost effective they are less effective than other strategies in reducing the attack rate.

  14. A cost-effectiveness threshold analysis of a multidisciplinary structured educational intervention in pediatric asthma.

    Science.gov (United States)

    Rodriguez-Martinez, Carlos E; Sossa-Briceño, Monica P; Castro-Rodriguez, Jose A

    2018-05-01

    Asthma educational interventions have been shown to improve several clinically and economically important outcomes. However, these interventions are costly in themselves and could lead to even higher disease costs. A cost-effectiveness threshold analysis is helpful in determining the cost below which an educational intervention remains cost-effective. The aim of the present study was to perform a cost-effectiveness threshold analysis to determine the cost level at which a pediatric asthma educational intervention would be cost-effective and cost-saving. A Markov-type model was developed to estimate the costs and health outcomes of a simulated cohort of pediatric patients with persistent asthma treated over a 12-month period. Effectiveness parameters were obtained from a single uncontrolled before-and-after study performed with Colombian asthmatic children. Cost data were obtained from official databases provided by the Colombian Ministry of Health. The main outcome was quality-adjusted life-years (QALYs). A deterministic threshold sensitivity analysis showed that the asthma educational intervention will be cost-saving for the health system if its cost is under US$513.20, and cost-effective if its cost is below US$967.40. This study identified the levels at which the cost of a pediatric asthma educational intervention will be cost-effective and cost-saving for the health system in Colombia. Our findings could be a useful aid for decision makers in efficiently allocating limited resources when planning asthma educational interventions for pediatric patients.
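
    The two thresholds in this record follow from a single inequality: an intervention is cost-saving when its cost does not exceed the downstream costs it avoids, and cost-effective when cost - savings <= WTP x QALY gain. The Python sketch below solves both; the QALY gain and WTP are hypothetical placeholders picked only so the outputs reproduce the reported US$513.20 and US$967.40.

        # Threshold analysis sketch; qaly_gain and wtp are assumed placeholders.
        savings = 513.20            # avoided disease costs per child (reported figure)
        qaly_gain = 0.025           # hypothetical QALY gain per child
        wtp = 18_168.0              # hypothetical willingness to pay per QALY

        cost_saving_below = savings                       # cost <= avoided costs
        cost_effective_below = savings + wtp * qaly_gain  # cost - savings <= wtp * gain

        print(f"cost-saving below:    US${cost_saving_below:,.2f}")
        print(f"cost-effective below: US${cost_effective_below:,.2f}")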

  15. Cost-effectiveness analysis of implants versus autologous perforator flaps using the BREAST-Q.

    Science.gov (United States)

    Matros, Evan; Albornoz, Claudia R; Razdan, Shantanu N; Mehrara, Babak J; Macadam, Sheina A; Ro, Teresa; McCarthy, Colleen M; Disa, Joseph J; Cordeiro, Peter G; Pusic, Andrea L

    2015-04-01

    Reimbursement has been recognized as a physician barrier to autologous reconstruction. Autologous reconstructions are more expensive than prosthetic reconstructions, but provide greater health-related quality of life. The authors' hypothesis is that autologous tissue reconstructions are cost-effective compared with prosthetic techniques when health-related quality of life and patient satisfaction are considered. A cost-effectiveness analysis from the payer perspective, including patient input, was performed for unilateral and bilateral reconstructions with deep inferior epigastric perforator (DIEP) flaps and implants. The effectiveness measure was derived using the BREAST-Q and interpreted as the cost of obtaining 1 year of perfect breast health (one breast health-related quality-adjusted life-year). Costs were obtained from the 2010 Nationwide Inpatient Sample. The incremental cost-effectiveness ratio was generated. A sensitivity analysis for age and stage at diagnosis was performed. BREAST-Q scores from 309 patients with implants and 217 DIEP flap reconstructions were included. The additional cost of obtaining 1 year of perfect breast-related health with a unilateral DIEP flap, compared with implant reconstruction, was $11,941. For bilateral DIEP flaps compared with implant reconstructions, the cost of an additional breast health-related quality-adjusted life-year was $28,017. The sensitivity analysis demonstrated that the cost of an additional breast health-related quality-adjusted life-year for DIEP flaps compared with implants was lower for younger patients and earlier-stage breast cancer. DIEP flaps are cost-effective compared with implants, especially for unilateral reconstructions. The cost-effectiveness of autologous techniques is maximized in women with longer life expectancy. Patient-reported outcome findings can be incorporated into cost-effectiveness analyses to demonstrate the relative value of reconstructive procedures.

  16. Scaling-up essential neuropsychiatric services in Ethiopia: a cost-effectiveness analysis.

    Science.gov (United States)

    Strand, Kirsten Bjerkreim; Chisholm, Dan; Fekadu, Abebaw; Johansson, Kjell Arne

    2016-05-01

    There is an immense need for scaling up neuropsychiatric care in low-income countries. Contextualized cost-effectiveness analyses (CEAs) provide relevant information for local policies. The aim of this study is to perform a contextualized CEA of neuropsychiatric interventions in Ethiopia and to illustrate expected population health and budget impacts across neuropsychiatric disorders. A mathematical population model (PopMod) was used to estimate intervention costs and effectiveness. Existing variables from a previous WHO-CHOICE regional CEA model were substantially revised. Treatments for depression, schizophrenia, bipolar disorder and epilepsy were analysed. The best available local data on epidemiology, intervention efficacy, current and target coverage, resource prices and salaries were used. Data were obtained from expert opinion, local hospital information systems, the Ministry of Health and literature reviews. Treatment of epilepsy with a first-generation antiepileptic drug is the most cost-effective treatment (US$ 321 per DALY averted). Treatments for depression have mid-range values compared with other interventions (US$ 457-1026 per DALY averted). Treatments for schizophrenia and bipolar disorder are least cost-effective (US$ 1168-3739 per DALY averted). This analysis gives the Ethiopian government a comprehensive overview of the expected costs, effectiveness and cost-effectiveness of introducing basic neuropsychiatric interventions. © The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  17. Cost-effectiveness analysis of cochlear dose reduction by proton beam therapy for medulloblastoma in childhood

    International Nuclear Information System (INIS)

    Hirano, Emi; Kawabuchi, Koichi; Fuji, Hiroshi; Onoe, Tsuyoshi; Kumar, Vinay; Shirato, Hiroki

    2014-01-01

    The aim of this study is to evaluate the cost-effectiveness of proton beam therapy with cochlear dose reduction compared with conventional X-ray radiotherapy for medulloblastoma in childhood. We developed a Markov model to describe the health states of 6-year-old children with medulloblastoma after treatment with proton or X-ray radiotherapy. The risks of hearing loss were calculated from the cochlear dose for each treatment. Three health-related quality of life (HRQOL) measures, EQ-5D, HUI3 and SF-6D, were used for the estimation of quality-adjusted life years (QALYs). The incremental cost-effectiveness ratio (ICER) for proton beam therapy compared with X-ray radiotherapy was calculated for each HRQOL measure. Sensitivity analyses were performed to model uncertainty in these parameters. The ICERs for EQ-5D, HUI3 and SF-6D were $21,716/QALY, $11,773/QALY, and $20,150/QALY, respectively. One-way sensitivity analyses found that the results were sensitive to the discount rate, the risk of hearing loss after proton therapy, and the cost of proton irradiation. Cost-effectiveness acceptability curve analysis revealed a 99% probability of proton therapy being cost-effective at a societal willingness-to-pay value. Proton beam therapy with cochlear dose reduction improves health outcomes at a cost that is within the acceptable cost-effectiveness range from the payer's standpoint. (author)

  18. Beyond cost-effectiveness: Using systems analysis for infectious disease preparedness.

    Science.gov (United States)

    Phelps, Charles; Madhavan, Guruprasad; Rappuoli, Rino; Colwell, Rita; Fineberg, Harvey

    2017-01-20

    Until the recent outbreaks, Ebola vaccines ranked low in decision makers' priority lists based on cost-effectiveness analysis and/or corporate profitability. Despite a relatively small number of Ebola-related cases and deaths (compared with other causes), Ebola vaccines suddenly leapt to highest priority among international health agencies and vaccine developers. Clearly, earlier cost-effectiveness analyses badly missed some factors affecting real-world decisions. Multi-criteria systems analysis can improve the evaluation and prioritization of vaccine development, and also of many other health policy and investment decisions. Neither cost-effectiveness nor cost-benefit analysis can capture important aspects of problems such as Ebola or the emerging threat of Zika, especially issues of inequality and disparity, issues that dominate the planning of many global health and economic organizations. Cost-benefit analysis requires assumptions about the specific value of life, an idea objectionable to many analysts and policy makers. Additionally, standard cost-effectiveness calculations cannot generally capture effects on people uninfected with Ebola, for example, but nevertheless affected through such factors as contagion, herd immunity, and fear of dread disease, reduction of travel and commerce, and even the hope of disease eradication. Using SMART Vaccines, we demonstrate how systems analysis can visibly include important "other factors" and more usefully guide decision making and beneficially alter priority-setting processes. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. High-throughput genotyping for species identification and diversity assessment in germplasm collections.

    Science.gov (United States)

    Mason, Annaliese S; Zhang, Jing; Tollenaere, Reece; Vasquez Teuber, Paula; Dalton-Morgan, Jessica; Hu, Liyong; Yan, Guijun; Edwards, David; Redden, Robert; Batley, Jacqueline

    2015-09-01

    Germplasm collections provide an extremely valuable resource for breeders and researchers. However, misclassification of accessions by species often hinders the effective use of these collections. We propose that high-throughput genotyping tools can provide a fast, efficient and cost-effective way of confirming species in germplasm collections, as well as providing valuable genetic diversity data. We genotyped 180 Brassicaceae samples sourced from the Australian Grains Genebank using the recently released Illumina Infinium Brassica 60K SNP array. Of these, 76 were provided on the basis of suspected misclassification and another 104 were sourced independently from the germplasm collection. Presence of the A- and C-genomes combined with principal component analysis clearly separated Brassica rapa, B. oleracea, B. napus, B. carinata and B. juncea samples into distinct species groups. Several lines were further validated using chromosome counts. Overall, 18% of samples (32/180) were misclassified on the basis of species. Within these 180 samples, 23/76 (30%) supplied on the basis of suspected misclassification were misclassified, and 9/104 (9%) of the samples randomly sourced from the Australian Grains Genebank were misclassified. Surprisingly, several individuals were also found to be the product of interspecific hybridization events. The SNP (single nucleotide polymorphism) array proved effective at confirming species, and provided useful information related to genetic diversity. As similar genomic resources become available for different crops, high-throughput molecular genotyping will offer an efficient and cost-effective method to screen germplasm collections worldwide, facilitating more effective use of these valuable resources by breeders and researchers. © 2015 John Wiley & Sons Ltd.
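
    The species-separation step in this record is ordinary principal component analysis on a numeric genotype matrix. The Python sketch below builds a toy 0/1/2-coded SNP matrix for two simulated species and projects it onto the first two components via SVD; the data are random placeholders, not array output from the study.

        # Toy PCA on SNP genotypes; two simulated species with different
        # allele frequencies separate along PC1.
        import numpy as np

        rng = np.random.default_rng(1)
        species_a = rng.binomial(2, 0.15, size=(3, 20))   # 3 accessions x 20 SNPs
        species_b = rng.binomial(2, 0.85, size=(3, 20))
        G = np.vstack([species_a, species_b]).astype(float)

        Gc = G - G.mean(axis=0)                 # centre each SNP column
        _, _, vt = np.linalg.svd(Gc, full_matrices=False)
        scores = Gc @ vt[:2].T                  # first two principal components

        for i, (pc1, pc2) in enumerate(scores):
            label = "A" if i < 3 else "B"
            print(f"accession {i} (species {label}): PC1={pc1:+6.2f} PC2={pc2:+6.2f}")
        # An accession landing in the wrong cluster would flag possible
        # misclassification in the collection.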

  20. Cost-effectiveness analysis of a statewide media campaign to promote adolescent physical activity.

    Science.gov (United States)

    Peterson, Michael; Chandlee, Margaret; Abraham, Avron

    2008-10-01

    A cost-effectiveness analysis of a statewide social marketing campaign was performed using a statewide surveillance survey distributed to 6th through 12th graders, media production and placement costs, and 2000 census data. Exposure to all three advertisements had the highest impact on both intent and behavior, with 65.6% of respondents considering becoming more active and 58.3% reporting becoming more active. The average cost of the entire campaign was $4.01 per person who saw an ad, $7.35 per person who considered being more active, and $8.87 per person who actually became more active, with billboards yielding the best cost-effectiveness. The findings highlight market research as an essential part of social marketing campaigns and the importance of using multiple marketing modalities to enhance cost-effectiveness and impact.

  1. High throughput analysis of red wine and grape phenolics - adaptation and validation of methyl cellulose precipitable tannin assay and modified Somers color assay to a rapid 96 well plate format.

    Science.gov (United States)

    Mercurio, Meagan D; Dambergs, Robert G; Herderich, Markus J; Smith, Paul A

    2007-06-13

    The methyl cellulose precipitable (MCP) tannin assay and a modified version of the Somers and Evans color assay were adapted to high-throughput (HTP) analysis. To improve the efficiency of the MCP tannin assay, a miniaturized 1 mL format and an HTP format using 96 well plates were developed. The Somers color assay was modified to allow standardization of the pH and ethanol concentrations of wine samples in a simple one-step dilution with a buffer solution, thus removing inconsistencies between wine matrices prior to analysis and allowing its adaptation to an HTP format. Validation studies showed that all new formats were efficient, and results were reproducible and analogous to those of the original formats.

  2. Cost-effectiveness analysis of neonatal hearing screening program in China: should universal screening be prioritized?

    Directory of Open Access Journals (Sweden)

    Huang Li-Hui

    2012-04-01

    Full Text Available Abstract Background Neonatal hearing screening (NHS) has been routinely offered as a vital component of early childhood care in developed countries, whereas such screening programs are still at the pilot or preliminary stage with regard to nationwide implementation in developing countries. To provide evidence for health policy making in China, this study aims to determine the cost-effectiveness of NHS program implementation in eight provinces of China. Methods A cost-effectiveness model was constructed, and all neonates born annually from 2007 to 2009 in eight provinces of China were simulated in this model. The model parameters were estimated from established databases in the general hospitals or maternal and child health hospitals of these eight provinces, supplemented from the published literature. The model estimated changes in program implementation costs, disability-adjusted life years (DALYs), the average cost-effectiveness ratio (ACER), and the incremental cost-effectiveness ratio (ICER) for universal screening compared with targeted screening in the eight provinces. Results and discussion A multivariate sensitivity analysis was performed to determine uncertainty in health effect estimates and cost-effectiveness ratios using a probabilistic modeling technique. The targeted strategy tended to be cost-effective in Guangxi, Jiangxi, Henan, Guangdong, Zhejiang, Hebei, Shandong, and Beijing from the level of 9%, 9%, 8%, 4%, 3%, 7%, 5%, and 2%, respectively, while the universal strategy tended to be cost-effective in those provinces from the level of 70%, 70%, 48%, 10%, 8%, 28%, 15%, and 4%, respectively. This study showed that although there was a huge disparity in the implementation of the NHS program across the surveyed provinces, both the universal and targeted strategies were cost-effective in the relatively developed provinces, while neither screening strategy was cost-effective in the relatively less developed provinces.

  3. Guiding the Development and Use of Cost-Effectiveness Analysis in Education

    Science.gov (United States)

    Levin, Henry M.; Belfield, Clive

    2015-01-01

    Cost-effectiveness analysis is rarely used in education. When it is used, it often fails to meet methodological standards, especially with regard to cost measurement. Although there are occasional criticisms of these failings, we believe that it is useful to provide a listing of the more common concerns and how they might be addressed. Based upon…

  4. Cost-effectiveness analysis for sector-wide priority setting in health

    NARCIS (Netherlands)

    R.C.W. Hutubessy (Raymond)

    2003-01-01

    Cost-effectiveness analysis (CEA) provides one means by which decision-makers may assess and potentially improve the performance of health systems. The process can help to ensure that resources devoted to health systems are achieving the maximum possible benefit in terms of outcomes.

  5. Cost effectiveness analysis of strategies to combat HIV/AIDS in developing countries.

    NARCIS (Netherlands)

    Hogan, D.R.; Baltussen, R.M.P.M.; Hayashi, C.; Lauer, J.A.; Salomon, J.A.

    2005-01-01

    OBJECTIVE: To assess the costs and health effects of a range of interventions for preventing the spread of HIV and for treating people with HIV/AIDS in the context of the millennium development goal for combating HIV/AIDS. DESIGN: Cost effectiveness analysis based on an epidemiological model.

  6. Cost-effectiveness analysis of risk reduction at nuclear power plants

    International Nuclear Information System (INIS)

    Lochard, J.; Maccia, C.; Pages, P.

    1985-01-01

    Cost-effectiveness analysis of risk reduction is now widely accepted as a rational analytical framework to consistently address the resource allocation problem underlying any risk management process. This paper presents how this technique can be usefully applied to complex systems such as the management of radioactive releases from nuclear power plants into the environment. (orig.)

  7. Cost-effectiveness Analysis of Rivaroxaban in the Secondary Prevention of Acute Coronary Syndromes in Sweden

    NARCIS (Netherlands)

    Begum, N.; Stephens, S.; Schoeman, O.; Fraschke, A.; Kirsch, B.; Briere, J.B.; Verheugt, F.W.A.; Hout, B.A. van

    2015-01-01

    BACKGROUND: Worldwide, coronary heart disease accounts for 7 million deaths each year. In Sweden, acute coronary syndrome (ACS) is a leading cause of hospitalization and is responsible for 1 in 4 deaths. OBJECTIVE: The aim of this analysis was to assess the cost-effectiveness of rivaroxaban 2.5 mg

  8. Cost-effectiveness analysis for the implementation of the EU Water Framework Directive

    NARCIS (Netherlands)

    van Engelen, D.M.; Seidelin, Christian; van der Veeren, Rob; Barton, David N.; Queb, Kabir

    2008-01-01

    The EU Water Framework Directive (WFD) prescribes cost-effectiveness analysis (CEA) as an economic tool for the minimisation of costs when formulating programmes of measures to be implemented in the European river basins by the year 2009. The WFD does not specify, however, which approach to CEA has

  9. Cost-Effectiveness Analysis of Different Genetic Testing Strategies for Lynch Syndrome in Taiwan.

    Directory of Open Access Journals (Sweden)

    Ying-Erh Chen

    Full Text Available Patients with Lynch syndrome (LS) have a significantly increased risk of developing colorectal cancer (CRC) and other cancers. Genetic screening for LS among patients with newly diagnosed CRC aims to identify mutations in the disease-causing genes (i.e., the DNA mismatch repair genes) in the patients, to offer genetic testing to relatives of the patients with the mutations, and then to provide early prevention for the relatives with the mutations. Several genetic tests are available for LS, such as DNA sequencing for the MMR genes and tumor testing using microsatellite instability and immunohistochemical analyses. Cost-effectiveness analyses of different genetic testing strategies for LS have been performed in several studies from different countries such as the US and Germany. However, a cost-effectiveness analysis for such testing had not yet been performed in Taiwan. In this study, we evaluated the cost-effectiveness of four genetic testing strategies for LS described in previous studies, using population-specific parameters such as the mutation rates of the DNA mismatch repair genes and treatment costs for CRC in Taiwan. The incremental cost-effectiveness ratios based on discounted life years gained due to genetic screening were calculated for the strategies relative to no screening and to the previous strategy. Using the World Health Organization standard, which was defined based on Taiwan's Gross Domestic Product per capita, the strategy based on immunohistochemistry as a genetic test followed by BRAF mutation testing was considered to be highly cost-effective relative to no screening. Our probabilistic sensitivity analysis results also suggest that this strategy has a probability of 0.939 of being cost-effective relative to no screening, based on the commonly used threshold of $50,000 to determine cost-effectiveness. To the best of our knowledge, this is the first cost-effectiveness analysis evaluating different genetic testing strategies for LS in Taiwan.

  10. A gas trapping method for high-throughput metabolic experiments.

    Science.gov (United States)

    Krycer, James R; Diskin, Ciana; Nelson, Marin E; Zeng, Xiao-Yi; Fazakerley, Daniel J; James, David E

    2018-01-01

    Research into cellular metabolism has become more high-throughput, with typical cell-culture experiments being performed in multiwell plates (microplates). This format presents a challenge when trying to collect gaseous products, such as carbon dioxide (CO2), which requires a sealed environment and a vessel separate from the biological sample. To address this limitation, we developed a gas trapping protocol using perforated plastic lids in sealed cell-culture multiwell plates. We used this trap design to measure CO2 production from glucose and fatty acid metabolism, as well as hydrogen sulfide production from cysteine-treated cells. Our data clearly show that this gas trap can be applied to liquid and solid gas-collection media and can be used to study gaseous product generation by both adherent cells and cells in suspension. Since our gas traps can be adapted to multiwell plates of various sizes, they present a convenient, cost-effective solution that can accommodate the trend toward high-throughput measurements in metabolic research.

  11. Using cost-effectiveness analysis for formulary decision making: from theory into practice.

    Science.gov (United States)

    Detsky, A S

    1994-10-01

    The growth of expenditures on healthcare and pharmaceutical products is a concern to third-party payers because of the absence of market discipline (price signals that consumers face). Cost-effectiveness analysis is a method that allows third-party payers to systematically make judgements about the 'value for money' of these products. It moves beyond simple unit price comparisons of alternate interventions/products to consider the full stream of relevant cost and benefits. As formulary committees begin to adopt the systematic use of cost-effectiveness analyses to inform the debate, the exercise will move from an academic to a more practical application. This transition will require several important changes including defining the purpose of cost-effectiveness analysis, measurement of outcomes and data, format of reports, and contractual arrangements between the pharmaceutical industry and analysts. As more 'real world' experience is gained in the practical application of cost-effectiveness analysis, the quality of data will improve as will its value as an aid to decision making.

  12. COST-EFFECTIVENESS ANALYSIS OF OUTSOURCING SERVICES IN REGARD TO FOREIGN ECONOMIC ACTIVITY OF ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Evgenia Sergeevna Gracheva

    2017-06-01

    Full Text Available Although enterprises rendering outsourcing services appeared on the Russian services market at the beginning of market relations, there is little research dealing with competition and cost-effectiveness analysis of outsourcing services with regard to foreign economic activity. The economic integration of Russian business into international economic relations complicates the entire national foreign economic complex and necessitates the development of international economic infrastructure. Among its most important parts are services supporting international economic operations (interpreting and translation, transport, and customs services, etc.) and the conduct of foreign economic activity on behalf of a client enterprise (complex outsourcing of FEA). A welcoming environment has now formed for the development of outsourcing business with regard to foreign economic activity, which dictates the need for a more thorough study of this type of business activity and for the development of an indicator system for the cost-effectiveness analysis of outsourcing with regard to foreign economic activity. Purpose: to define complex outsourcing of FEA and to develop an indicator system for the cost-effectiveness analysis of outsourcing services with regard to foreign economic activity. Methodology: the article uses functional and statistical methods. Results: an original definition of complex outsourcing of FEA is given, and an indicator system for the cost-effectiveness analysis of outsourcing is developed. Practical implications: the results of this research may be used by businesses that render outsourcing and intermediary services with regard to foreign economic activity.

  13. Cost-effectiveness analysis of endoscopic tympanoplasty versus microscopic tympanoplasty for chronic otitis media in Taiwan.

    Science.gov (United States)

    Tseng, Chih-Chieh; Lai, Ming-Tang; Wu, Chia-Che; Yuan, Sheng-Po; Ding, Yi-Fang

    2018-03-01

    Health care systems and physicians need to conform to budgets and streamline resources to provide cost-effective, quality care. Although endoscopic tympanoplasty (ET) has been performed for decades, no studies on the cost-effectiveness of ET versus microscopic tympanoplasty (MT) for treating chronic otitis media have been published. The present study aimed to compare the cost-effectiveness of ET and MT for treating chronic otitis media. The study was performed using a cohort-style Markov decision-tree economic model with a 30-year time horizon. The economic perspective was that of a third-party payer (the Taiwan National Health Insurance System). Two treatment strategies were compared, namely ET and MT. The primary outcome was the incremental cost per quality-adjusted life year (QALY). Probabilities were obtained from meta-analyses. Costs were obtained from the published literature and the Taiwan National Health Insurance System database. Multiple sensitivity analyses were performed to account for data uncertainty. The reference case revealed that the total cost of ET was $NT 20,901 for 17.08 QALYs per patient. By contrast, the total cost of MT was $NT 21,171 for 17.15 QALYs per patient. The incremental cost-effectiveness ratio for ET versus MT was $NT 3,703 per QALY. The cost-effectiveness acceptability curve indicated that ET was comparable to MT at willingness-to-pay thresholds larger than $NT 35,000 per QALY. This cost-effectiveness analysis indicates that ET is comparable to MT for treating chronic otitis media in Taiwan. This result provides the latest information for physicians, the government, and third-party payers in selecting proper clinical practice. Copyright © 2017. Published by Elsevier Taiwan LLC.

  14. Cost-Effectiveness Analysis of Regorafenib for Gastrointestinal Stromal Tumour (GIST) in Germany.

    Science.gov (United States)

    Tamoschus, David; Draexler, Katja; Chang, Jane; Ngai, Christopher; Madin-Warburton, Matthew; Pitcher, Ashley

    2017-06-01

    No study has compared the cost-effectiveness of active treatment options for unresectable or metastatic gastrointestinal stromal tumours in patients who progressed on or are intolerant to prior treatment with imatinib and sunitinib. The aim of this study was to estimate the cost-effectiveness of regorafenib compared to imatinib rechallenge in this setting in Germany. Hazard ratios for progression-free (PFS) and overall survival (OS) with regorafenib versus imatinib rechallenge were estimated by indirect comparison. A state distribution model was used to simulate progression, mortality and treatment costs over a lifetime horizon. Drug acquisition costs and utilities were derived from clinical trial data and published literature; non-drug costs were not included. The outcomes measured were treatment costs, life-years (LYs) and quality-adjusted life-years (QALYs). The indirect comparison suggested that median PFS and OS were longer with regorafenib compared to imatinib but results were not statistically significant. Regorafenib versus imatinib rechallenge was estimated to have hazard ratios of 0.58 (95% CI 0.31-1.11) for PFS and 0.77 (95% CI 0.34-1.77) for OS, with substantial uncertainty due to the rarity of the disease and small number of patients within the trials. Regorafenib treatment per patient over a lifetime horizon provided an additional 0.61 LYs and 0.42 QALYs over imatinib rechallenge, with additional direct drug costs of €8,773. The incremental cost-effectiveness ratio was €21,127 per QALY gained. At a cost-effectiveness threshold of €50,000 per QALY, regorafenib had a 67% probability of being cost-effective. Based on the currently available clinical data, this analysis suggests that regorafenib is cost-effective compared with imatinib rechallenge in Germany.
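
    For rare diseases like GIST, models of this kind often convert a hazard ratio into life-years via a parametric survival assumption. The Python sketch below does this under an exponential overall-survival assumption; the median OS and the resulting gain are illustrative placeholders, with only the 0.77 hazard ratio taken from the record.

        # Life-year gain from a hazard ratio, assuming exponential survival.
        import math

        median_os_comparator = 8.0 / 12.0   # assumed median OS (years) on imatinib rechallenge
        hr = 0.77                           # reported OS hazard ratio, regorafenib vs imatinib

        rate = math.log(2) / median_os_comparator   # exponential hazard from the median
        ly_comparator = 1.0 / rate                  # mean survival = 1 / hazard
        ly_regorafenib = 1.0 / (rate * hr)

        print(f"undiscounted life-years gained: {ly_regorafenib - ly_comparator:.2f}")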

  15. A cost-effectiveness analysis of hormone replacement therapy in the menopause.

    Science.gov (United States)

    Cheung, A P; Wren, B G

    1992-03-02

    To evaluate the cost-effectiveness of hormone replacement therapy in the menopause with particular reference to osteoporotic fracture and myocardial infarction. The multiple-decrement form of the life table was the mathematical model used to follow women of age 50 through their lifetime under the "no hormone replacement" and "hormone replacement" assumptions. Standard demographic and health economic techniques were used to calculate the corresponding lifetime differences in direct health care costs (net costs in dollars) and health effects ("net effectiveness" in terms of life expectancy and quality, in "quality-adjusted life-years"). This was then expressed as a cost-effectiveness ratio or the cost ($) per quality-adjusted life-year (QALY) for each of the chosen hormone replacement regimens. Subjects: all women of age 50 in New South Wales, Australia (n = 27,021). The analysis showed that the lifetime net increments in direct medical care costs were largely contributed by hormone drug and consultation costs. Hormone replacement was associated with increased quality-adjusted life expectancy, a large percentage of which was attributed to relief of menopausal symptoms. Cost-effectiveness ratios ranged from under $10,000 to over a million dollars per QALY. Factors associated with improved cost-effectiveness were prolonged treatment duration, the presence of menopausal symptoms, minimum progestogen side effects (in the case of oestrogen with progestogen regimens), oestrogen use after hysterectomy and the inclusion of cardiac benefits in addition to fracture prevention. Hormone replacement therapy for symptomatic women is cost-effective when factors that enhance its efficiency are considered. Short-term treatment of asymptomatic women for prevention of osteoporotic fractures and myocardial infarction is an inefficient use of health resources. Cost-effectiveness of hormone replacement in asymptomatic women is dependent on the magnitude of cardiac benefits associated with hormone replacement therapy.

  16. Cost-effectiveness analysis of ultrasonography screening for nonalcoholic fatty liver disease in metabolic syndrome patients

    Science.gov (United States)

    Phisalprapa, Pochamana; Supakankunti, Siripen; Charatcharoenwitthaya, Phunchai; Apisarnthanarak, Piyaporn; Charoensak, Aphinya; Washirasaksiri, Chaiwat; Srivanichakorn, Weerachai; Chaiyakunapruk, Nathorn

    2017-01-01

    Abstract Background: Nonalcoholic fatty liver disease (NAFLD) can be diagnosed early by noninvasive ultrasonography; however, the cost-effectiveness of ultrasonography screening with an intensive weight reduction program in metabolic syndrome patients is not clear. This study aims to estimate the economic and clinical outcomes of ultrasonography in Thailand. Methods: A cost-effectiveness analysis used decision tree and Markov models to estimate lifetime costs and health benefits from a societal perspective, based on a cohort of 509 metabolic syndrome patients in Thailand. Data were obtained from the published literature and Thai databases. Results were reported as incremental cost-effectiveness ratios (ICERs) in 2014 US dollars (USD) per quality-adjusted life year (QALY) gained, with a discount rate of 3%. Sensitivity analyses were performed to assess the influence of parameter uncertainty on the results. Results: The ICER of ultrasonography screening of 50-year-old metabolic syndrome patients with an intensive weight reduction program was 958 USD/QALY gained when compared with no screening. The probability of being cost-effective was 67% using the willingness-to-pay threshold in Thailand (4848 USD/QALY gained). Screening before 45 years was cost-saving, while screening at 45 to 64 years was cost-effective. Conclusions: For patients with metabolic syndrome, ultrasonography screening for NAFLD with an intensive weight reduction program is a cost-effective program in Thailand. The study can be used as part of evidence-informed decision making. Translational Impacts: Findings could contribute to changes in NAFLD diagnosis practice in settings where economic evidence is used as part of the decision-making process. Furthermore, the study design, model structure, and input parameters could also be used for future research addressing similar questions. PMID:28445256
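
    The 3% discount rate mentioned above enters lifetime models by down-weighting each future year's costs and QALYs. A small sketch of the arithmetic (the 30-year stream is illustrative, not the study's model):

        def discounted_total(values, rate=0.03):
            """Present value of a yearly stream of costs or QALYs."""
            return sum(v / (1 + rate) ** t for t, v in enumerate(values))

        # One full QALY accrued each year for 30 years, discounted at 3%:
        print(round(discounted_total([1.0] * 30), 2))  # ~20.19 discounted QALYs, not 30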

  17. Multicapillary SDS-gel electrophoresis for the analysis of fluorescently labeled mAb preparations: a high throughput quality control process for the production of QuantiPlasma and PlasmaScan mAb libraries.

    Science.gov (United States)

    Székely, Andrea; Szekrényes, Akos; Kerékgyártó, Márta; Balogh, Attila; Kádas, János; Lázár, József; Guttman, András; Kurucz, István; Takács, László

    2014-08-01

    Molecular heterogeneity of mAb preparations is the result of various co- and post-translational modifications and of contaminants related to the production process. Changes in molecular composition result in alterations of functional performance; therefore, quality control and validation of therapeutic or diagnostic protein products are essential. A special case is the consistent production of mAb libraries (QuantiPlasma™ and PlasmaScan™) for proteome profiling, quality control of which represents a challenge because of the high number of mAbs (>1000). Here, we devise a generally applicable multicapillary SDS-gel electrophoresis process for the analysis of fluorescently labeled mAb preparations, for the high throughput quality control of mAbs of the QuantiPlasma™ and PlasmaScan™ libraries. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. A cost-effectiveness analysis of two different repositioning strategies for the prevention of pressure ulcers.

    Science.gov (United States)

    Marsden, Grace; Jones, Katie; Neilson, Julie; Avital, Liz; Collier, Mark; Stansby, Gerard

    2015-12-01

    To assess the cost-effectiveness of two repositioning strategies and inform the 2014 National Institute for Health and Care Excellence clinical guideline recommendations on pressure ulcer prevention. Pressure ulcers are distressing events, caused when skin and underlying tissues are placed under pressure sufficient to impair blood supply. They can have a substantial impact on quality of life and have significant resource implications. Repositioning is a key prevention strategy, but can be resource intensive, leading to variation in practice. This economic analysis was conducted to identify the most cost-effective repositioning strategy for the prevention of pressure ulcers. The economic analysis took the form of a cost-utility model. The clinical inputs to the model were taken from a systematic review of clinical data. The population in the model was older people in a nursing home. The economic model was developed with members of the guideline development group and included costs borne by the UK National Health Service. Outcomes were expressed as costs and quality-adjusted life years. Despite being marginally more clinically effective, alternating 2- and 4-hourly repositioning is not a cost-effective use of UK National Health Service resources (compared with 4-hourly repositioning) for this high-risk group of patients at a cost-effectiveness threshold of £20,000 per quality-adjusted life year. These results were used to inform the clinical guideline recommendations for those who are at high risk of developing pressure ulcers. © 2015 John Wiley & Sons Ltd.

  19. Cost-effectiveness analysis of thermotherapy versus pentavalent antimonials for the treatment of cutaneous leishmaniasis.

    Science.gov (United States)

    Cardona-Arias, Jaiberth Antonio; López-Carvajal, Liliana; Tamayo Plata, Mery Patricia; Vélez, Iván Darío

    2017-05-01

    The treatment of cutaneous leishmaniasis is toxic, has contraindications, and carries a high cost. The objective of this study was to estimate the cost-effectiveness of thermotherapy versus pentavalent antimonials for the treatment of cutaneous leishmaniasis. Effectiveness was measured as the proportion of patients healed, and safety as the occurrence of adverse effects; these parameters were estimated from a controlled clinical trial and a meta-analysis. A standard costing was conducted. Average and incremental cost-effectiveness ratios were estimated. The uncertainty regarding effectiveness, safety, and costs was determined through sensitivity analyses. The total costs were $66,807 with Glucantime and $14,079 with thermotherapy. The therapeutic effectiveness rates were 64.2% for thermotherapy and 85.1% for Glucantime. The average cost-effectiveness ratios ranged between $721 and $1,275 for Glucantime and between $187 and $390 for thermotherapy. Based on the meta-analysis, thermotherapy may be a dominant strategy. The excellent cost-effectiveness ratio of thermotherapy shows the relevance of its inclusion in treatment guidelines. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  20. Cost-Effectiveness Analysis of Helicobacter pylori Diagnostic Methods in Patients with Atrophic Gastritis

    Directory of Open Access Journals (Sweden)

    Fumio Omata

    2017-01-01

    Background. There are several diagnostic methods for Helicobacter pylori (H. pylori) infection. A cost-effectiveness analysis is needed to decide on the optimal diagnostic method. The aim of this study was to determine a cost-effective diagnostic method in patients with atrophic gastritis (AG). Methods. A decision-analysis model including seven diagnostic methods was constructed for patients with AG diagnosed by esophagogastroduodenoscopy. Expected values of cost and effectiveness were calculated for each test. Results. If the prevalence of H. pylori in the patients with AG is 85% and that of CAM-resistant H. pylori is 30%, histology, stool H. pylori antigen (SHPAg), bacterial culture (BC), and urine H. pylori antibody (UHPAb) were dominated by serum H. pylori IgG antibody (SHPAb), rapid urease test (RUT), and urea breath test (UBT). Among the three undominated methods, the incremental cost-effectiveness ratios (ICERs) of RUT versus SHPAb and UBT versus RUT were $214 and $1,914, respectively. If the prevalence of CAM-sensitive H. pylori was less than 55%, BC was not dominated, but its H. pylori eradication success rate was 0.86. Conclusions. RUT was the most cost-effective at the current prevalence of CAM-resistant H. pylori. BC could not be selected due to its poor effectiveness even if CAM-resistant H. pylori was more than 45%.
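
    The dominance reasoning above is mechanical: options that cost no more and are at least as effective eliminate the rest, and ICERs are then computed between adjacent survivors ordered by cost. A sketch of that procedure; the cost and effectiveness values below are invented for illustration, chosen only so the resulting ratios land near the reported $214 and $1,914:

        def undominated(strategies):
            """Drop strategies that are simply dominated: some other option
            costs no more and is at least as effective."""
            keep = {}
            for name, (c, e) in strategies.items():
                dominated = any(c2 <= c and e2 >= e and (c2, e2) != (c, e)
                                for c2, e2 in strategies.values())
                if not dominated:
                    keep[name] = (c, e)
            return keep

        def frontier_icers(strategies):
            """ICER of each surviving strategy versus the next cheaper survivor."""
            ordered = sorted(undominated(strategies).items(), key=lambda kv: kv[1][0])
            return [(hi[0], (hi[1][0] - lo[1][0]) / (hi[1][1] - lo[1][1]))
                    for lo, hi in zip(ordered, ordered[1:])]

        # Invented (cost USD, effectiveness) pairs; Histology is dominated by UBT here.
        tests = {"SHPAb": (60, 0.90), "RUT": (75, 0.97),
                 "UBT": (120, 0.993), "Histology": (140, 0.96)}
        for name, icer in frontier_icers(tests):
            print(f"{name}: ICER ~ ${icer:,.0f} per unit of effectiveness")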

  1. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S rRNA gene amplicon sequencing can be used to reveal factors of importance for the operation of full-scale nutrient removal plants related to settling problems and floc properties. Using optimized DNA extraction protocols, indexed primers and our in-house Illumina platform, we prepared multiple samples […] be correlated to the presence of the species that are regarded as "strong" and "weak" floc formers. In conclusion, 16S rRNA gene amplicon sequencing provides a high throughput approach for a rapid and cheap community profiling of activated sludge that in combination with multivariate statistics can be used […]

  2. Cost-effectiveness of antibiotics for COPD management: observational analysis using CPRD data

    Directory of Open Access Journals (Sweden)

    Sarah J. Ronaldson

    2017-06-01

    It is often difficult to determine the cause of chronic obstructive pulmonary disease (COPD) exacerbations, and antibiotics are frequently prescribed. This study conducted an observational cost-effectiveness analysis of prescribing antibiotics for exacerbations of COPD based on routinely collected data from patient electronic health records. A cohort of 45 375 patients aged 40 years or more who attended their general practice for a COPD exacerbation during 2000–2013 was identified from the Clinical Practice Research Datalink. Two groups were formed (“immediate antibiotics” or “no antibiotics”) based on whether antibiotics were prescribed during the index general practice (GP) consultation, with data analysed according to subsequent healthcare resource use. A cost-effectiveness analysis was undertaken from the perspective of the UK National Health Service, using a time horizon of 4 weeks in the base case. The use of antibiotics for COPD exacerbations resulted in cost savings and an improvement in all outcomes analysed; i.e. GP visits, hospitalisations, community respiratory team referrals, all referrals, infections and subsequent antibiotics prescriptions were lower for the antibiotics group. Hence, the use of antibiotics was dominant over no antibiotics. The economic analysis suggests that use of antibiotics for COPD exacerbations is a cost-effective alternative to not prescribing antibiotics for patients who present to their GP, and remains cost-effective when longer time horizons of 3 months and 12 months are considered. It would be useful for a definitive trial to be undertaken in this area to determine the cost-effectiveness of antibiotics for COPD exacerbations.

  3. Cost-effectiveness of antibiotics for COPD management: observational analysis using CPRD data.

    Science.gov (United States)

    Ronaldson, Sarah J; Raghunath, Anan; Torgerson, David J; Van Staa, Tjeerd

    2017-04-01

    It is often difficult to determine the cause of chronic obstructive pulmonary disease (COPD) exacerbations, and antibiotics are frequently prescribed. This study conducted an observational cost-effectiveness analysis of prescribing antibiotics for exacerbations of COPD based on routinely collected data from patient electronic health records. A cohort of 45 375 patients aged 40 years or more who attended their general practice for a COPD exacerbation during 2000-2013 was identified from the Clinical Practice Research Datalink. Two groups were formed ("immediate antibiotics" or "no antibiotics") based on whether antibiotics were prescribed during the index general practice (GP) consultation, with data analysed according to subsequent healthcare resource use. A cost-effectiveness analysis was undertaken from the perspective of the UK National Health Service, using a time horizon of 4 weeks in the base case. The use of antibiotics for COPD exacerbations resulted in cost savings and an improvement in all outcomes analysed; i.e. GP visits, hospitalisations, community respiratory team referrals, all referrals, infections and subsequent antibiotics prescriptions were lower for the antibiotics group. Hence, the use of antibiotics was dominant over no antibiotics. The economic analysis suggests that use of antibiotics for COPD exacerbations is a cost-effective alternative to not prescribing antibiotics for patients who present to their GP, and remains cost-effective when longer time horizons of 3 months and 12 months are considered. It would be useful for a definitive trial to be undertaken in this area to determine the cost-effectiveness of antibiotics for COPD exacerbations.

  4. Cost-Effectiveness Analysis of an Automated Medication System Implemented in a Danish Hospital Setting.

    Science.gov (United States)

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    To evaluate the cost-effectiveness of an automated medication system (AMS) implemented in a Danish hospital setting. An economic evaluation was performed alongside a controlled before-and-after effectiveness study with one control ward and one intervention ward. The primary outcome measure was the number of errors in the medication administration process observed prospectively before and after implementation. To determine the difference in proportion of errors after implementation of the AMS, logistic regression was applied with the presence of error(s) as the dependent variable. Time, group, and interaction between time and group were the independent variables. The cost analysis used the hospital perspective with a short-term incremental costing approach. The total 6-month costs with and without the AMS were calculated as well as the incremental costs. The number of avoided administration errors was related to the incremental costs to obtain the cost-effectiveness ratio expressed as the cost per avoided administration error. The AMS resulted in a statistically significant reduction in the proportion of errors in the intervention ward compared with the control ward. The cost analysis showed that the AMS increased the ward's 6-month cost by €16,843. The cost-effectiveness ratio was estimated at €2.01 per avoided administration error, €2.91 per avoided procedural error, and €19.38 per avoided clinical error. The AMS was effective in reducing errors in the medication administration process at a higher overall cost. The cost-effectiveness analysis showed that the AMS was associated with affordable cost-effectiveness rates. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
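
    The headline ratio also lets one back-calculate the implied error reduction: dividing the incremental cost by the cost per avoided administration error gives the approximate number of errors avoided (a derived figure, not one reported in the abstract):

        incremental_cost = 16_843                # EUR over 6 months, as reported
        cost_per_avoided_error = 2.01            # EUR per avoided administration error
        print(round(incremental_cost / cost_per_avoided_error))  # ~8,380 avoided errors (implied)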

  5. Reconstruction versus conservative treatment after rupture of the anterior cruciate ligament: cost effectiveness analysis

    Directory of Open Access Journals (Sweden)

    Farshad Mazda

    2011-11-01

    Abstract Background: The decision whether to treat conservatively or reconstruct surgically a torn anterior cruciate ligament (ACL) is an ongoing subject of debate. The high prevalence and associated public health burden of torn ACL has led to continuous efforts to determine the best therapeutic approach. A critical evaluation of the benefits and expenditures of both treatment options, as in a cost-effectiveness analysis, seems well suited to provide valuable information for treating physicians and healthcare policymakers. Methods: A literature review identified four of 7,410 searched articles providing sufficient outcome probabilities for the two treatment options for modeling. A transformation key based on the expert opinions of 25 orthopedic surgeons was used to derive utilities from available evidence. The cost data for both treatment strategies were based on average figures compiled by Orthopaedic University Hospital Balgrist and reinforced by Swiss national statistics. A decision tree was constructed to derive the cost-effectiveness of each strategy, which was then tested for robustness using Monte Carlo simulation. Results: Decision tree analysis revealed a cost-effectiveness of 16,038 USD/0.78 QALY for ACL reconstruction and 15,466 USD/0.66 QALY for conservative treatment, implying an incremental cost-effectiveness of 4,890 USD/QALY for ACL reconstruction. Sensitivity analysis of utilities did not change the trend. Conclusion: ACL reconstruction for reestablishment of knee stability seems cost-effective in the Swiss setting based on currently available evidence. This, however, should be reinforced with randomized controlled trials comparing the two treatment strategies.
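
    The incremental ratio follows directly from the two cost/QALY pairs, and the Monte Carlo robustness check can be imitated by jittering the utilities. The noise levels below are assumptions, not the study's distributions:

        import random

        # Point estimates from the abstract: 16,038 USD / 0.78 QALY vs 15,466 USD / 0.66 QALY.
        print(round((16_038 - 15_466) / (0.78 - 0.66)))   # ~4,767 USD/QALY (paper: 4,890)

        # Hypothetical Monte Carlo check: jitter the utilities and count how often the
        # ICER stays below a 50,000 USD/QALY willingness-to-pay cap.
        random.seed(0)
        ok = 0
        for _ in range(5_000):
            delta_e = random.gauss(0.78, 0.03) - random.gauss(0.66, 0.03)
            ok += 0 < (16_038 - 15_466) / delta_e < 50_000
        print(ok / 5_000)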

  6. Cost-effectiveness analysis: adding value to assessment of animal health welfare and production.

    Science.gov (United States)

    Babo Martins, S; Rushton, J

    2014-12-01

    Cost-effectiveness analysis (CEA) has been extensively used in economic assessments in fields related to animal health, namely in human health where it provides a decision-making framework for choices about the allocation of healthcare resources. Conversely, in animal health, cost-benefit analysis has been the preferred tool for economic analysis. In this paper, the use of CEA in related areas and the role of this technique in assessments of animal health, welfare and production are reviewed. Cost-effectiveness analysis can add further value to these assessments, particularly in programmes targeting animal welfare or animal diseases with an impact on human health, where outcomes are best valued in natural effects rather than in monetary units. Importantly, CEA can be performed during programme implementation stages to assess alternative courses of action in real time.

  7. Genome-wide generation and use of informative intron-spanning and intron-length polymorphism markers for high-throughput genetic analysis in rice

    Science.gov (United States)

    Badoni, Saurabh; Das, Sweta; Sayal, Yogesh K.; Gopalakrishnan, S.; Singh, Ashok K.; Rao, Atmakuri R.; Agarwal, Pinky; Parida, Swarup K.; Tyagi, Akhilesh K.

    2016-01-01

    We developed genome-wide 84634 ISM (intron-spanning marker) and 16510 InDel-fragment length polymorphism-based ILP (intron-length polymorphism) markers from genes physically mapped on 12 rice chromosomes. These genic markers revealed much higher amplification-efficiency (80%) and polymorphic-potential (66%) among rice accessions even by a cost-effective agarose gel-based assay. A wider level of functional molecular diversity (17–79%) and well-defined precise admixed genetic structure was assayed by 3052 genome-wide markers in a structured population of indica, japonica, aromatic and wild rice. Six major grain weight QTLs (11.9–21.6% phenotypic variation explained) were mapped on five rice chromosomes of a high-density (inter-marker distance: 0.98 cM) genetic linkage map (IR 64 x Sonasal) anchored with 2785 known/candidate gene-derived ISM and ILP markers. The designing of multiple ISM and ILP markers (2 to 4 markers/gene) in an individual gene will broaden the user-preference to select suitable primer combination for efficient assaying of functional allelic variation/diversity and realistic estimation of differential gene expression profiles among rice accessions. The genomic information generated in our study is made publicly accessible through a user-friendly web-resource, “Oryza ISM-ILP marker” database. The known/candidate gene-derived ISM and ILP markers can be enormously deployed to identify functionally relevant trait-associated molecular tags by optimal-resource expenses, leading towards genomics-assisted crop improvement in rice. PMID:27032371

  8. Cost-Effectiveness Analysis of Diagnosis of Duchenne/Becker Muscular Dystrophy in Colombia.

    Science.gov (United States)

    Atehortúa, Sara C; Lugo, Luz H; Ceballos, Mateo; Orozco, Esteban; Castro, Paula A; Arango, Juan C; Mateus, Heidi E

    2018-03-09

    To determine the cost-effectiveness ratio of different courses of action for the diagnosis of Duchenne or Becker muscular dystrophy in Colombia. The cost-effectiveness analysis was performed from the perspective of the Colombian health system. Decision trees were constructed, and different courses of action were compared considering the following tests: immunohistochemistry (IHC), Western blot (WB), multiplex polymerase chain reaction, multiplex ligation-dependent probe amplification (MLPA), and complete sequencing of the dystrophin gene. The time horizon matched the duration of sample extraction and analysis. Transition probabilities were obtained from a systematic review. Costs were constructed with a type-case methodology using the consensus of experts and the valuation of resources from consulting laboratories and the 2001 Social Security Institute cost manual. Deterministic sensitivity analyses were performed, along with scenario analyses in which one or more alternatives were unavailable. Costs were converted from Colombian pesos to US dollars using the 2014 exchange rate. In the base case, WB was the dominant strategy, with a cost of US $419.07 and a sensitivity of 100%. This approach remains the dominant strategy for sensitivities as low as 98.2%, provided its cost does not exceed US $837.38. If WB is not available, IHC has the best cost-effectiveness ratio, followed by MLPA and sequencing. WB is a cost-effective alternative for the diagnosis of patients suspected of having Duchenne or Becker muscular dystrophy in the Colombian health system. The IHC test is rated as the second-best detection method. If these tests are not available, MLPA followed by sequencing would be the most cost-effective alternative. Copyright © 2018. Published by Elsevier Inc.

  9. Vasa previa screening strategies: a decision and cost-effectiveness analysis.

    Science.gov (United States)

    Sinkey, R G; Odibo, A O

    2018-05-22

    The aim of this study is to perform a decision and cost-effectiveness analysis comparing four screening strategies for the antenatal diagnosis of vasa previa among singleton pregnancies. A decision-analytic model was constructed comparing vasa previa screening strategies. Published probabilities and costs were applied to four transvaginal screening scenarios which occurred at the time of mid-trimester ultrasound: no screening, ultrasound-indicated screening, screening pregnancies conceived by in vitro fertilization (IVF), and universal screening. Ultrasound-indicated screening was defined as performing a transvaginal ultrasound at the time of routine anatomy ultrasound in response to one of the following sonographic findings associated with an increased risk of vasa previa: low-lying placenta, marginal or velamentous cord insertion, or bilobed or succenturiate lobed placenta. The primary outcome was cost per quality-adjusted life year (QALY) in U.S. dollars. The analysis was from a healthcare system perspective with a willingness to pay (WTP) threshold of $100,000 per QALY selected. One-way and multivariate sensitivity analyses (Monte-Carlo simulation) were performed. This decision-analytic model demonstrated that screening pregnancies conceived by IVF was the most cost-effective strategy with an incremental cost-effectiveness ratio (ICER) of $29,186.50 / QALY. Ultrasound-indicated screening was the second most cost-effective with an ICER of $56,096.77 / QALY. These data were robust to all one-way and multivariate sensitivity analyses performed. Within our baseline assumptions, transvaginal ultrasound screening for vasa previa appears to be most cost-effective when performed among IVF pregnancies. However, both IVF and ultrasound-indicated screening strategies fall within contemporary willingness-to-pay thresholds, suggesting that both strategies may be appropriate to apply in clinical practice. This article is protected by copyright. All rights reserved.

  10. Cost-Effectiveness Analysis of Isavuconazole vs. Voriconazole as First-Line Treatment for Invasive Aspergillosis.

    Science.gov (United States)

    Harrington, Rachel; Lee, Edward; Yang, Hongbo; Wei, Jin; Messali, Andrew; Azie, Nkechi; Wu, Eric Q; Spalding, James

    2017-01-01

    Invasive aspergillosis (IA) is associated with a significant clinical and economic burden. The phase III SECURE trial demonstrated non-inferiority in clinical efficacy between isavuconazole and voriconazole. No studies have evaluated the cost-effectiveness of isavuconazole compared to voriconazole. The objective of this study was to evaluate the costs and cost-effectiveness of isavuconazole vs. voriconazole for the first-line treatment of IA from the US hospital perspective. An economic model was developed to assess the costs and cost-effectiveness of isavuconazole vs. voriconazole in hospitalized patients with IA. The time horizon was the duration of hospitalization. Length of stay for the initial admission, incidence of readmission, clinical response, overall survival rates, and experience of adverse events (AEs) came from the SECURE trial. Unit costs were from the literature. Total costs per patient were estimated, composed of drug costs, costs of AEs, and costs of hospitalizations. Incremental costs per death avoided and per additional clinical responders were reported. Deterministic and probabilistic sensitivity analyses (DSA and PSA) were conducted. Base case analysis showed that isavuconazole was associated with a $7418 lower total cost per patient than voriconazole. In both incremental costs per death avoided and incremental costs per additional clinical responder, isavuconazole dominated voriconazole. Results were robust in sensitivity analysis. Isavuconazole was cost saving and dominant vs. voriconazole in most DSA. In PSA, isavuconazole was cost saving in 80.2% of the simulations and cost-effective in 82.0% of the simulations at the $50,000 willingness to pay threshold per additional outcome. Isavuconazole is a cost-effective option for the treatment of IA among hospitalized patients. Astellas Pharma Global Development, Inc.

  11. Cost-effectiveness analysis of FET PET-guided target selection for the diagnosis of gliomas

    International Nuclear Information System (INIS)

    Heinzel, Alexander; Stock, Stephanie; Mueller, Dirk; Langen, Karl-Josef

    2012-01-01

    Several diagnostic trials have indicated that the combined use of 18F-fluoroethyl-L-tyrosine (FET) PET and MRI may be superior to MRI alone in selecting the biopsy site for the diagnosis of gliomas. We estimated the cost-effectiveness of the use of amino acid PET compared to MRI alone from the perspective of the German statutory health insurance. To evaluate the incremental cost-effectiveness of the use of amino acid PET, a decision tree model was built. The effectiveness of FET PET was determined by the probability of a correct diagnosis. Costs were estimated for a baseline scenario and for a more expensive scenario in which disease severity was considered. The robustness of the results was tested using deterministic and probabilistic sensitivity analyses. The combined use of PET and MRI resulted in an increase of 18.5% in the likelihood of a correct diagnosis. The incremental cost-effectiveness ratio for one additional correct diagnosis using FET PET was EUR 6,405 for the baseline scenario and EUR 9,114 for the scenario based on higher disease severity. The probabilistic sensitivity analysis confirmed the robustness of the results. The model indicates that the use of amino acid PET may be cost-effective in patients with glioma. As a result of several limitations in the data used for the model, further studies are needed to confirm the results. (orig.)

  12. Recommendations for Methicillin-Resistant Staphylococcus aureus Prevention in Adult ICUs: A Cost-Effectiveness Analysis.

    Science.gov (United States)

    Whittington, Melanie D; Atherly, Adam J; Curtis, Donna J; Lindrooth, Richard C; Bradley, Cathy J; Campbell, Jonathan D

    2017-08-01

    Patients in the ICU are at the greatest risk of contracting healthcare-associated infections like methicillin-resistant Staphylococcus aureus. This study calculates the cost-effectiveness of methicillin-resistant S aureus prevention strategies and recommends specific strategies based on screening test implementation. A cost-effectiveness analysis using a Markov model from the hospital perspective was conducted to determine if the implementation costs of methicillin-resistant S aureus prevention strategies are justified by associated reductions in methicillin-resistant S aureus infections and improvements in quality-adjusted life years. Univariate and probabilistic sensitivity analyses determined the influence of input variation on the cost-effectiveness. Setting: ICU. Patients: a hypothetical cohort of adults admitted to the ICU. Three prevention strategies were evaluated, including universal decolonization, targeted decolonization, and screening and isolation. Because prevention strategies have a screening component, the screening test in the model was varied to reflect commonly used screening test categories, including conventional culture, chromogenic agar, and polymerase chain reaction. Universal and targeted decolonization are less costly and more effective than screening and isolation. This is consistent for all screening tests. When compared with targeted decolonization, universal decolonization is cost-saving to cost-effective, with maximum cost savings occurring when a hospital uses more expensive screening tests like polymerase chain reaction. Results were robust to sensitivity analyses. As compared with screening and isolation, the current standard practice in ICUs, targeted decolonization and universal decolonization are less costly and more effective. This supports updating the standard practice to a decolonization approach.
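
    A Markov model of the kind described runs a cohort through health states cycle by cycle, accumulating state-specific costs and QALY weights. The states, transition probabilities, costs, and utilities below are hypothetical placeholders, not the study's inputs:

        # Hypothetical three-state Markov cohort model: the structure mirrors the kind
        # of model described above, but every number here is a placeholder.
        P = {
            "well":     {"well": 0.93, "infected": 0.05, "dead": 0.02},
            "infected": {"well": 0.60, "infected": 0.25, "dead": 0.15},
            "dead":     {"dead": 1.0},
        }
        cost    = {"well": 100, "infected": 8_000, "dead": 0}      # cost per cycle
        utility = {"well": 0.95, "infected": 0.60, "dead": 0.0}    # QALY weight per cycle

        dist = {"well": 1.0, "infected": 0.0, "dead": 0.0}         # whole cohort starts well
        total_cost = total_qalys = 0.0
        for _ in range(20):                                        # 20 one-year cycles
            total_cost  += sum(dist[s] * cost[s] for s in dist)
            total_qalys += sum(dist[s] * utility[s] for s in dist)
            dist = {t: sum(dist[s] * P[s].get(t, 0.0) for s in dist) for t in dist}
        print(round(total_cost), round(total_qalys, 2))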

  13. Cost-effectiveness analysis of FET PET-guided target selection for the diagnosis of gliomas

    Energy Technology Data Exchange (ETDEWEB)

    Heinzel, Alexander [Research Centre Juelich, Department of Nuclear Medicine of the Heinrich-Heine University of Duesseldorf, Juelich (Germany); Stock, Stephanie; Mueller, Dirk [University Hospital of Cologne, Institute for Health Economics and Clinical Epidemiology, Cologne (Germany); Langen, Karl-Josef [Research Centre Juelich, Institute for Neuroscience and Medicine 4, Juelich (Germany)

    2012-07-15

    Several diagnostic trials have indicated that the combined use of 18F-fluoroethyl-L-tyrosine (FET) PET and MRI may be superior to MRI alone in selecting the biopsy site for the diagnosis of gliomas. We estimated the cost-effectiveness of the use of amino acid PET compared to MRI alone from the perspective of the German statutory health insurance. To evaluate the incremental cost-effectiveness of the use of amino acid PET, a decision tree model was built. The effectiveness of FET PET was determined by the probability of a correct diagnosis. Costs were estimated for a baseline scenario and for a more expensive scenario in which disease severity was considered. The robustness of the results was tested using deterministic and probabilistic sensitivity analyses. The combined use of PET and MRI resulted in an increase of 18.5% in the likelihood of a correct diagnosis. The incremental cost-effectiveness ratio for one additional correct diagnosis using FET PET was EUR 6,405 for the baseline scenario and EUR 9,114 for the scenario based on higher disease severity. The probabilistic sensitivity analysis confirmed the robustness of the results. The model indicates that the use of amino acid PET may be cost-effective in patients with glioma. As a result of several limitations in the data used for the model, further studies are needed to confirm the results. (orig.)

  14. Cost-Effectiveness Analysis of Three Leprosy Case Detection Methods in Northern Nigeria

    Science.gov (United States)

    Ezenduka, Charles; Post, Erik; John, Steven; Suraj, Abdulkarim; Namadi, Abdulahi; Onwujekwe, Obinna

    2012-01-01

    Background: Despite several leprosy control measures in Nigeria, the child proportion and disability grade 2 cases remain high while new cases have not significantly reduced, suggesting continuous spread of the disease. Hence, there is a need to review detection methods to enhance identification of early cases for effective control and prevention of permanent disability. This study evaluated the cost-effectiveness of three leprosy case detection methods in Northern Nigeria to identify the most cost-effective approach for detection of leprosy. Methods: A cross-sectional study was carried out to evaluate the additional benefits of using several case detection methods in addition to routine practice in two north-eastern states of Nigeria. Primary and secondary data were collected from routine practice records and the Nigerian Tuberculosis and Leprosy Control Programme of 2009. The methods evaluated were Rapid Village Survey (RVS), Household Contact Examination (HCE) and Traditional Healers incentive method (TH). Effectiveness was measured as the number of new leprosy cases detected and cost-effectiveness was expressed as cost per case detected. Costs were measured from both providers' and patients' perspectives. Additional costs and effects of each method were estimated by comparing each method against routine practice and expressed as an incremental cost-effectiveness ratio (ICER). All costs were converted to the U.S. dollar at the 2010 exchange rate. Univariate sensitivity analysis was used to evaluate uncertainties around the ICER. Results: The ICER for HCE was $142 per additional case detected at all contact levels and it was the most cost-effective method. At an ICER of $194 per additional case detected, the TH method detected more cases at a lower cost than the RVS, which was not cost-effective at $313 per additional case detected. Sensitivity analysis showed that varying the proportion of shared costs and the subsistence wage for valuing unpaid time did not significantly change the results.

  15. Cost-effectiveness analysis of online hemodiafiltration versus high-flux hemodialysis

    Directory of Open Access Journals (Sweden)

    Ramponi F

    2016-09-01

    Background: Clinical studies suggest that hemodiafiltration (HDF) may lead to better clinical outcomes than high-flux hemodialysis (HF-HD), but concerns have been raised about the cost-effectiveness of HDF versus HF-HD. The aim of this study was to investigate whether clinical benefits, in terms of longer survival and better health-related quality of life, are worth the possibly higher costs of HDF compared to HF-HD. Methods: The analysis comprised a simulation based on the combined results of previously published studies, with the following steps: (1) estimation of the survival function of HF-HD patients from a clinical trial and of HDF patients using the risk reduction estimated in a meta-analysis; (2) simulation of the survival of the same sample of patients as if allocated to HF-HD or HDF using three-state Markov models; and (3) application of state-specific health-related quality of life coefficients and differential costs derived from the literature. Several Monte Carlo simulations were performed, including simulations for patients with different […]

  16. A cost-effectiveness analysis of typhoid fever vaccines in US military personnel.

    Science.gov (United States)

    Warren, T A; Finder, S F; Brier, K L; Ries, A J; Weber, M P; Miller, M R; Potyk, R P; Reeves, C S; Moran, E L; Tornow, J J

    1996-11-01

    Typhoid fever has been a problem for military personnel throughout history. A cost-effectiveness analysis of typhoid fever vaccines from the perspective of the US military was performed. Currently 3 vaccine preparations are available in the US: an oral live Ty21a whole-cell vaccine; a single-dose parenteral, cell subunit vaccine; and a 2-dose parenteral heat-phenol-killed, whole-cell vaccine. This analysis assumed all vaccinees were US military personnel. Two pharmacoeconomic models were developed, one for personnel who have not yet been deployed, and the other for personnel who are deployed to an area endemic for typhoid fever. Drug acquisition, administration, adverse effect and lost work costs, as well as the costs associated with typhoid fever, were included in this analysis. Unique military issues, typhoid fever attack rates, vaccine efficacy, and compliance with each vaccine's dosage regimen were also included. A sensitivity analysis was performed to test the robustness of the models. Typhoid fever immunisation is not cost-effective for US military personnel unless they are considered imminently deployable or are deployed. The most cost-effective vaccine for US military personnel is the single-dose, cell subunit parenteral vaccine.

  17. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Abstract Background: The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results: This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high-end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion: MUMmerGPU is a low-cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
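
    MUMmerGPU's speedup comes from the fact that each query read can be matched against the shared reference independently, so thousands of reads map naturally onto parallel hardware. A toy CPU analogue of that data-parallel structure, with a naive prefix-match kernel standing in for the real suffix-tree traversal and hypothetical sequences throughout:

        from concurrent.futures import ProcessPoolExecutor
        from functools import partial

        REFERENCE = "ACGTACGTTAGCACGTACGATCGATCGGCTAGCTAACGT"   # toy reference sequence

        def best_exact_match(reference, query):
            """Toy per-query 'kernel': longest query prefix found exactly in the
            reference; stands in for MUMmerGPU's suffix-tree maximal-match step."""
            for k in range(len(query), 0, -1):
                pos = reference.find(query[:k])
                if pos >= 0:
                    return query, pos, k
            return query, -1, 0

        if __name__ == "__main__":
            queries = ["ACGTACGA", "TAGCACGT", "GGGGGGGG", "CGATCGGC"]
            # Each query is independent of the others, so the work maps cleanly onto
            # parallel workers -- the same property a GPU exploits across thousands of reads.
            with ProcessPoolExecutor() as pool:
                for q, pos, k in pool.map(partial(best_exact_match, REFERENCE), queries):
                    print(f"{q}: matched {k} bases at reference position {pos}")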

  18. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Abstract Background: Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results: We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: (1) a web-based, graphical user interface (GUI) that enables a pipeline operator to manage the system; (2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and (3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion: The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high-throughput, flexible bioinformatics pipelines.
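
    The Scheduler/Executor split described above amounts to dependency tracking: a job becomes runnable once all of its input data exist, and the Executor then runs it. A compact sketch of that core loop, with hypothetical job and data names rather than Cyrille2's actual API:

        # Hypothetical jobs: each declares the data it needs and the data it produces.
        jobs = {
            "trim_reads": {"needs": ["raw_reads"],            "makes": "trimmed"},
            "align":      {"needs": ["trimmed", "reference"], "makes": "alignments"},
            "call_peaks": {"needs": ["alignments"],           "makes": "peaks"},
        }
        available = {"raw_reads", "reference"}    # data already tracked by the system

        pending = dict(jobs)
        while pending:
            runnable = [n for n, j in pending.items() if set(j["needs"]) <= available]
            if not runnable:
                raise RuntimeError(f"stuck: unmet inputs for {sorted(pending)}")
            for name in runnable:                 # in Cyrille2 the Executor would run
                print("running", name)            # these on a compute cluster
                available.add(pending.pop(name)["makes"])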

  19. Bevacizumab for Metastatic Colorectal Cancer: A Global Cost-Effectiveness Analysis.

    Science.gov (United States)

    Goldstein, Daniel A; Chen, Qiushi; Ayer, Turgay; Chan, Kelvin K W; Virik, Kiran; Hammerman, Ariel; Brenner, Baruch; Flowers, Christopher R; Hall, Peter S

    2017-06-01

    In the U.S., the addition of bevacizumab to first-line chemotherapy in metastatic colorectal cancer (mCRC) has been demonstrated to provide 0.10 quality-adjusted life years (QALYs) at an incremental cost-effectiveness ratio (ICER) of $571,000/QALY. Due to variability in pricing, value for money may be different in other countries. Our objective was to establish the cost-effectiveness of bevacizumab in mCRC in the U.S., U.K., Canada, Australia, and Israel. We performed the analysis using a previously established Markov model for mCRC. Input data for efficacy, adverse events, and quality of life were considered to be generalizable and therefore identical for all countries. We used country-specific prices for medications, administration, and other health service costs. All costs were converted from local currency to U.S. dollars at the exchange rates in March 2016. We conducted one-way and probabilistic sensitivity analyses (PSA) to assess the model robustness across parameter uncertainties. Base case results demonstrated that the highest ICER was in the U.S. ($571,000/QALY) and the lowest was in Australia ($277,000/QALY). In Canada, the U.K., and Israel, ICERs ranged between $351,000 and $358,000 per QALY. PSA demonstrated 0% likelihood of bevacizumab being cost-effective in any country at a willingness to pay threshold of $150,000 per QALY. The addition of bevacizumab to first-line chemotherapy for mCRC consistently fails to be cost-effective in all five countries. There are large differences in cost-effectiveness between countries. This study provides a framework for analyzing the value of a cancer drug from the perspectives of multiple international payers. The cost-effectiveness of bevacizumab varies significantly between multiple countries. By conventional thresholds, bevacizumab is not cost-effective in metastatic colon cancer in the U.S., the U.K., Australia, Canada, and Israel. © AlphaMed Press 2017.

  20. Cost-effectiveness analysis of fidaxomicin versus vancomycin in Clostridium difficile infection.

    Science.gov (United States)

    Nathwani, Dilip; Cornely, Oliver A; Van Engen, Anke K; Odufowora-Sita, Olatunji; Retsa, Peny; Odeyemi, Isaac A O

    2014-11-01

    Fidaxomicin was non-inferior to vancomycin with respect to clinical cure rates in the treatment of Clostridium difficile infections (CDIs) in two Phase III trials, but was associated with significantly fewer recurrences than vancomycin. This economic analysis investigated the cost-effectiveness of fidaxomicin compared with vancomycin in patients with severe CDI and in patients with their first CDI recurrence. A 1 year time horizon Markov model with seven health states was developed from the perspective of Scottish public healthcare providers. Model inputs for effectiveness, resource use, direct costs and utilities were obtained from published sources and a Scottish expert panel. The main model outcome was the incremental cost-effectiveness ratio (ICER), expressed as cost per quality-adjusted life year (QALY), for fidaxomicin versus vancomycin; ICERs were interpreted using willingness-to-pay thresholds of £20,000/QALY and £30,000/QALY. One-way and probabilistic sensitivity analyses were performed. Total costs were similar with fidaxomicin and vancomycin in patients with severe CDI (£14,515 and £14,344, respectively) and in patients with a first recurrence (£16,535 and £16,926, respectively). Improvements in clinical outcomes with fidaxomicin resulted in small QALY gains versus vancomycin (severe CDI, +0.010; patients with first recurrence, +0.019). Fidaxomicin was cost-effective in severe CDI (ICER £16,529/QALY) and dominant (i.e. more effective and less costly) in patients with a first recurrence. The probability that fidaxomicin was cost-effective at a willingness-to-pay threshold of £30,000/QALY was 60% for severe CDI and 68% in a first recurrence. Fidaxomicin is cost-effective in patients with severe CDI and in patients with a first CDI recurrence versus vancomycin. © The Author 2014. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy.

  1. Cost-effectiveness analysis of different embryo transfer strategies in England.

    Science.gov (United States)

    Dixon, S; Faghih Nasiri, F; Ledger, W L; Lenton, E A; Duenas, A; Sutcliffe, P; Chilcott, J B

    2008-05-01

    The objective of this study was to assess the cost-effectiveness of different embryo transfer strategies for a single cycle when two embryos are available, taking the NHS cost perspective. Design: cost-effectiveness model. Setting: five in vitro fertilisation (IVF) centres in England between 2003/04 and 2004/05. Population: women with two embryos available for transfer, in three age groups (under 30, 30-35 and 36-39 years). Costs and adverse outcomes are estimated up to 5 years after the birth. Incremental cost per live birth was calculated for different embryo transfer strategies and for three separate age groups: less than 30, 30-35 and 36-39 years. Main outcome measures: premature birth, neonatal intensive care unit admissions and days, cerebral palsy and incremental cost-effectiveness ratios. Single fresh embryo transfer (SET) plus frozen single embryo transfer (fzSET) is the more costly option in terms of IVF costs, but the lower rates of multiple births mean that, in terms of total costs, it is less costly than double embryo transfer (DET). Adverse events increase when moving from SET to SET+fzSET to DET. The probability of SET+fzSET being cost-effective decreases with age. When SET is included in the analysis, SET+fzSET no longer becomes a cost-effective option at any threshold value for all age groups studied. The analyses show that the choice of embryo transfer strategy is a function of four factors: the age of the mother, the relevance of the SET option, the value placed on a live birth and the relative importance placed on adverse outcomes. For each patient group, the choice of strategy is a trade-off between the value placed on a live birth and cost.

  2. Cost-effectiveness analysis of pneumococcal vaccination for infants in China.

    Science.gov (United States)

    Maurer, Kristin A; Chen, Huey-Fen; Wagner, Abram L; Hegde, Sonia T; Patel, Tejasi; Boulton, Matthew L; Hutton, David W

    2016-12-07

    Although China has a high burden of pneumococcal disease among young children, the government does not administer publicly-funded pneumococcal conjugate vaccines (PCV) through its Expanded Program on Immunization (EPI). We evaluated the cost-effectiveness of publicly-funded PCV-7, PCV-10, and PCV-13 vaccination programs for infants in China. Using a Markov model, we simulated a cohort of 16 million Chinese infants to estimate the impact of PCV-7, PCV-10, and PCV-13 vaccination programs from a societal perspective. We extrapolated health states to estimate the effects of the programs over the course of a lifetime of 75 years. Parameters in the model were derived from a review of the literature. We found that PCV-7, PCV-10, and PCV-13 vaccination programs would be cost-effective compared to no vaccination. However, PCV-13 had the lowest incremental cost-effectiveness ratio ($11,464/QALY vs $16,664/QALY for PCV-10 and $18,224/QALY for PCV-7) due to a reduction in overall costs. Our sensitivity analysis revealed that the incremental cost-effectiveness ratios were most sensitive to the utility of acute otitis media, the cost of PCV-13, and the incidence of pneumonia and acute otitis media. The Chinese government should take steps to reduce the burden of pneumococcal diseases among young children through the inclusion of a pneumococcal conjugate vaccine in its EPI. Although all vaccinations would be cost-effective, PCV-13 would save more costs to the healthcare system and would be the preferred strategy. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. The cost effectiveness of vancomycin for preventing infections after shoulder arthroplasty: a break-even analysis.

    Science.gov (United States)

    Hatch, M Daniel; Daniels, Stephen D; Glerum, Kimberly M; Higgins, Laurence D

    2017-03-01

    Increasing methicillin resistance and recognition of Propionibacterium acnes as a cause of infection in shoulder arthroplasty have led to the adoption of local vancomycin powder application as a more effective method to prevent expensive periprosthetic infections. However, no study has analyzed the cost-effectiveness of vancomycin powder for preventing infection after shoulder replacement. Cost data for infection-related care of 16 patients treated for deep periprosthetic shoulder infection were collected from our institution for the break-even analysis. An equation was developed and applied to the data to determine how effective vancomycin powder would need to be at reducing a baseline infection rate to make prophylactic use cost-effective. The efficacy of vancomycin (absolute risk reduction [ARR]) was evaluated at different unit costs, baseline infection rates, and average costs of treating infection. We determined vancomycin to be cost-effective if the initial infection rate decreased by 0.04% (ARR). Using the current costs of vancomycin reported in the literature (range: $2.50/1000 mg to $44/1000 mg), we determined vancomycin to be cost-effective with an ARR range of 0.01% at a cost of $2.50/1000 mg to 0.19% at $44/1000 mg. Baseline infection rate does not influence the ARR obtained at any specific cost of vancomycin or the cost of treating infection. We have derived and used a break-even equation to assess the efficacy of prophylactic antibiotics during shoulder surgery. We further demonstrated the prophylactic administration of local vancomycin powder during shoulder arthroplasty to be a highly cost-effective practice. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
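
    The break-even logic reduces to a simple ratio: prophylaxis pays for itself once the absolute risk reduction equals the per-patient cost of the prophylaxis divided by the cost of treating one infection. A sketch using a treatment cost of $25,000 back-solved from the abstract's $2.50 → 0.01% pairing (an assumption, not a figure reported directly):

        def break_even_arr(prophylaxis_cost, infection_treatment_cost):
            """Absolute risk reduction at which prophylaxis pays for itself."""
            return prophylaxis_cost / infection_treatment_cost

        # Assumed ~25,000 USD to treat one deep periprosthetic infection.
        for vanc in (2.50, 44.00):
            # The second case gives ~0.18%, near the reported 0.19% (their implied
            # treatment cost is slightly lower than the assumed 25,000 USD).
            print(f"${vanc:>5}/1000 mg -> break-even ARR = {break_even_arr(vanc, 25_000):.4%}")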

  4. Preoperative paravertebral blocks for the management of acute pain following mastectomy: a cost-effectiveness analysis.

    Science.gov (United States)

    Offodile, Anaeze C; Sheckter, Clifford C; Tucker, Austin; Watzker, Anna; Ottino, Kevin; Zammert, Martin; Padula, William V

    2017-10-01

    Preoperative paravertebral blocks (PPVBs) are routinely used for treating post-mastectomy pain, yet uncertainties remain about the cost-effectiveness of this modality. We aim to evaluate the cost-effectiveness of PPVBs at common willingness-to-pay (WTP) thresholds. A decision analytic model compared two strategies: general anesthesia (GA) alone versus GA with multilevel PPVB. For the GA plus PPVB arm, patients were subjected to successful block placement versus varying severity of complications based on literature-derived probabilities. The need for rescue pain medication was the terminal node for all postoperative scenarios. Patient-reported pain scores sourced from published meta-analyses measured treatment effectiveness. Costing was derived from wholesale acquisition costs, the Medicare fee schedule, and publicly available hospital charge masters. Charges were converted to costs and adjusted for 2016 US dollars. A commercial payer perspective was adopted. Incremental cost-effectiveness ratios (ICERs) were evaluated against WTP thresholds of $500 and $50,000 for postoperative pain control. The ICER for preoperative paravertebral blocks was $154.49 per point reduction in pain score. A 15% variation in inpatient costs resulted in ICER values ranging from $124.40 to $180.66 per pain point score reduction. Altering the probability of block success by 5% generated ICER values of $144.71 to $163.81 per pain score reduction. Probabilistic sensitivity analysis yielded cost-effective trials 69.43% of the time at the $500 WTP threshold. Over a broad range of probabilities, PPVB in mastectomy reduces postoperative pain at an acceptable incremental cost compared to GA. Commercial payers should be persuaded to reimburse this technique based on convincing evidence of cost-effectiveness.

  5. Cost-effectiveness analysis of countermeasures using accident consequence assessment models

    International Nuclear Information System (INIS)

    Alonso, A.; Gallego, E.

    1987-01-01

    In the event of a large release of radionuclides from a nuclear power plant, protective actions for the population potentially affected must be implemented. Cost-effectiveness analysis will be useful to define the countermeasures and the criteria needed to implement them. This paper shows the application of Accident Consequence Assessment (ACA) models to cost-effectiveness analysis of emergency and long-term countermeasures, making use of the different relationships between dose, contamination levels, affected areas and population distribution, included in such a model. The procedure is illustrated with the new Melcor Accident Consequence Code System (MACCS 1.3), developed at Sandia National Laboratories (USA), for a fixed accident scenario. Different alternative actions are evaluated with regard to their radiological and economical impact, searching for an 'optimum' strategy. (author)

  6. Cost-effectiveness analysis of lifestyle intervention in obese infertile women.

    Science.gov (United States)

    van Oers, A M; Mutsaerts, M A Q; Burggraaff, J M; Kuchenbecker, W K H; Perquin, D A M; Koks, C A M; van Golde, R; Kaaijk, E M; Schierbeek, J M; Klijn, N F; van Kasteren, Y M; Land, J A; Mol, B W J; Hoek, A; Groen, H

    2017-07-01

    control group. Exploratory scenario analyses showed that after changing the effectiveness outcome to all live births conceived within 24 months, irrespective of delivery within or after 24 months, cost-effectiveness of the lifestyle intervention improved. Using this effectiveness outcome, the probability that lifestyle intervention preceding infertility treatment was cost-effective in anovulatory women was 40%, in completers of the lifestyle intervention 39%, and in women ≥36 years 29%. In contrast to the study protocol, we were not able to perform the analysis from a societal perspective. Besides the primary outcome of the LIFEstyle study, we performed exploratory analyses using outcomes observed at longer follow-up times and we evaluated subgroups of women; the trial was not powered on these additional outcomes or subgroup analyses. Cost-effectiveness of a lifestyle intervention is more likely for longer follow-up times, and with live births conceived within 24 months as the effectiveness outcome. This effect was most profound in anovulatory women, in completers of the lifestyle intervention and in women ≥36 years old. This result indicates that the follow-up period of lifestyle interventions in obese infertile women is important. The scenario analyses performed in this study suggest that offering and reimbursing lifestyle intervention programmes in certain patient categories may be cost-effective and it provides directions for future research in this field. The study was supported by a grant from ZonMw, the Dutch Organization for Health Research and Development (50-50110-96-518). The department of obstetrics and gynaecology of the UMCG received an unrestricted educational grant from Ferring pharmaceuticals BV, The Netherlands. B.W.J.M. is a consultant for ObsEva, Geneva. The LIFEstyle RCT was registered at the Dutch trial registry (NTR 1530). http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=1530. © The Author 2017. Published by Oxford University Press

  7. The practical problems of applying cost-effectiveness analysis to joint finance programmes

    OpenAIRE

    Karen Gerard; Ken Wright

    1990-01-01

    Joint finance is money allocated by the Department of Health to NHS authorities to promote policies of inter-agency collaboration which prevent people being admitted to hospital or facilitate earlier discharge from hospital or save on NHS resources generally. Worries have been expressed that joint finance has not been used as effectively or efficiently as it might have been. This paper is concerned with the practical application of cost-effectiveness analysis to policies or schemes which typic...

  8. Screening strategies for atrial fibrillation: a systematic review and cost-effectiveness analysis.

    Science.gov (United States)

    Welton, Nicky J; McAleenan, Alexandra; Thom, Howard Hz; Davies, Philippa; Hollingworth, Will; Higgins, Julian Pt; Okoli, George; Sterne, Jonathan Ac; Feder, Gene; Eaton, Diane; Hingorani, Aroon; Fawsitt, Christopher; Lobban, Trudie; Bryden, Peter; Richards, Alison; Sofat, Reecha

    2017-05-01

    Atrial fibrillation (AF) is a common cardiac arrhythmia that increases the risk of thromboembolic events. Anticoagulation therapy to prevent AF-related stroke has been shown to be cost-effective. A national screening programme for AF may prevent AF-related events, but would involve a substantial investment of NHS resources. To conduct a systematic review of the diagnostic test accuracy (DTA) of screening tests for AF, update a systematic review of comparative studies evaluating screening strategies for AF, develop an economic model to compare the cost-effectiveness of different screening strategies and review observational studies of AF screening to provide inputs to the model. Systematic review, meta-analysis and cost-effectiveness analysis. Primary care. Adults. Screening strategies, defined by screening test, age at initial and final screens, screening interval and format of screening {systematic opportunistic screening [individuals offered screening if they consult with their general practitioner (GP)] or systematic population screening (when all eligible individuals are invited to screening)}. Sensitivity, specificity and diagnostic odds ratios; the odds ratio of detecting new AF cases compared with no screening; and the mean incremental net benefit compared with no screening. Two reviewers screened the search results, extracted data and assessed the risk of bias. A DTA meta-analysis was performed, and a decision tree and Markov model were used to evaluate the cost-effectiveness of the screening strategies. Diagnostic test accuracy depended on the screening test and how it was interpreted. In general, the screening tests identified in our review had high sensitivity (> 0.9). Systematic population and systematic opportunistic screening strategies were found to be similarly effective, with an estimated 170 individuals needed to be screened to detect one additional AF case compared with no screening. Systematic opportunistic screening was more likely to be cost-effective
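    The 'mean incremental net benefit' outcome used here has a standard form, written below in our notation (the report's exact parameterization is not given in this record):

```latex
\mathrm{INB}(\lambda) \;=\; \lambda\,\Delta E \;-\; \Delta C
```

    where ΔE and ΔC are the incremental QALYs and incremental costs of a screening strategy relative to no screening, and λ is the willingness-to-pay threshold per QALY; a strategy is cost-effective at threshold λ when its mean INB is positive.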

  9. Missing data in trial-based cost-effectiveness analysis: An incomplete journey.

    Science.gov (United States)

    Leurent, Baptiste; Gomes, Manuel; Carpenter, James R

    2018-06-01

    Cost-effectiveness analyses (CEA) conducted alongside randomised trials provide key evidence for informing healthcare decision making, but missing data pose substantive challenges. Recently, there have been a number of developments in methods and guidelines addressing missing data in trials. However, it is unclear whether these developments have permeated CEA practice. This paper critically reviews the extent of and methods used to address missing data in recently published trial-based CEA. Issues of the Health Technology Assessment journal from 2013 to 2015 were searched. Fifty-two eligible studies were identified. Missing data were very common; the median proportion of trial participants with complete cost-effectiveness data was 63% (interquartile range: 47%-81%). The most common approach for the primary analysis was to restrict analysis to those with complete data (43%), followed by multiple imputation (30%). Half of the studies conducted some sort of sensitivity analyses, but only 2 (4%) considered possible departures from the missing-at-random assumption. Further improvements are needed to address missing data in cost-effectiveness analyses conducted alongside randomised trials. These should focus on limiting the extent of missing data, choosing an appropriate method for the primary analysis that is valid under contextually plausible assumptions, and conducting sensitivity analyses to departures from the missing-at-random assumption. © 2018 The Authors Health Economics published by John Wiley & Sons Ltd.
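    Where multiple imputation is used, per-imputation estimates (for example, incremental costs) are conventionally combined with Rubin's rules; a minimal sketch of that pooling step follows (the generic method, not code from any study in the review):

```python
import numpy as np

def pool_rubin(estimates, variances):
    """Pool estimates across m imputed datasets via Rubin's rules."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    q_bar = estimates.mean()          # pooled point estimate
    w_bar = variances.mean()          # mean within-imputation variance
    b = estimates.var(ddof=1)         # between-imputation variance
    total = w_bar + (1 + 1 / m) * b   # total variance of the pooled estimate
    return q_bar, total

# Illustrative incremental-cost estimates from m = 5 imputations (invented)
est = [1200.0, 1350.0, 1180.0, 1420.0, 1290.0]
var = [210.0**2, 190.0**2, 220.0**2, 205.0**2, 198.0**2]
pooled, total_var = pool_rubin(est, var)
print(f"pooled Δcost = {pooled:.0f}, SE = {total_var ** 0.5:.0f}")
```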

  10. Introduction to cost-effectiveness analysis of risk reduction measures in energy systems

    International Nuclear Information System (INIS)

    1986-07-01

    The aim of this report is to introduce readers to methods of cost-effectiveness analysis and their application in risk reduction, especially in connection with the energy-producing industries. The background to the assessment of risk and the problems in estimating it quantitatively are outlined. The methodology of cost-effectiveness analysis is then described, particular attention being given to the way in which results are derived and the overall use that can be made of them. This is followed by a discussion of quantitative applications and an outline of the methods that may be used to derive estimates both of risk and the cost of reducing it. The use of cost-effectiveness analysis is illustrated in an appendix, which gives as a worked example a case study on the reduction of public risk associated with radioactive releases during normal operation of a PWR. After drawing some general conclusions the report recommends that such analyses should normally be used as an aid to risk management whenever several alternative risk reduction measures are under consideration

  11. Effectiveness and cost-effectiveness of antidepressants in primary care: a multiple treatment comparison meta-analysis and cost-effectiveness model.

    Directory of Open Access Journals (Sweden)

    Joakim Ramsberg

    OBJECTIVE: To determine effectiveness and cost-effectiveness over a one-year time horizon of pharmacological first-line treatment in primary care for patients with moderate to severe depression. DESIGN: A multiple treatment comparison meta-analysis was employed to determine the relative efficacy in terms of remission of 10 antidepressants (citalopram, duloxetine, escitalopram, fluoxetine, fluvoxamine, mirtazapine, paroxetine, reboxetine, sertraline and venlafaxine). The estimated remission rates were then applied in a decision-analytic model in order to estimate costs and quality of life with different treatments at one year. DATA SOURCES: Meta-analyses of remission rates from randomised controlled trials, and cost and quality-of-life data from published sources. RESULTS: The most favourable pharmacological treatment in terms of remission was escitalopram with an 8- to 12-week probability of remission of 0.47. Despite a high acquisition cost, this clinical effectiveness translated into escitalopram being both more effective and having a lower total cost than all other comparators from a societal perspective. From a healthcare perspective, the cost per QALY of escitalopram was €3732 compared with venlafaxine. CONCLUSION: Of the investigated antidepressants, escitalopram has the highest probability of remission and is the most effective and cost-effective pharmacological treatment in a primary care setting, when evaluated over a one-year time horizon. Small differences in remission rates may be important when assessing costs and cost-effectiveness of antidepressants.

  12. Cost-Effectiveness Analysis of Breast Cancer Control Interventions in Peru

    Science.gov (United States)

    Zelle, Sten G.; Vidaurre, Tatiana; Abugattas, Julio E.; Manrique, Javier E.; Sarria, Gustavo; Jeronimo, José; Seinfeld, Janice N.; Lauer, Jeremy A.; Sepulveda, Cecilia R.; Venegas, Diego; Baltussen, Rob

    2013-01-01

    Objectives In Peru, a country with constrained health resources, breast cancer control is characterized by late-stage treatment and poor survival. To support breast cancer control in Peru, this study aims to determine the cost-effectiveness of different breast cancer control interventions relevant for the Peruvian context. Methods We performed a cost-effectiveness analysis (CEA) according to WHO-CHOICE guidelines, from a healthcare perspective. Different screening, early detection, palliative, and treatment interventions were evaluated using mathematical modeling. Effectiveness estimates were based on observational studies, modeling, and on information from Instituto Nacional de Enfermedades Neoplásicas (INEN). Resource utilizations and unit costs were based on estimates from INEN and observational studies. Cost-effectiveness estimates are in 2012 United States dollars (US$) per disability-adjusted life year (DALY) averted. Results The current breast cancer program in Peru ($8,426 per DALY averted) could be improved through implementing triennial or biennial screening strategies. These strategies seem the most cost-effective in Peru, particularly when mobile mammography is applied (from $4,125 per DALY averted), or when both CBE screening and mammography screening are combined (from $4,239 per DALY averted). Implemented triennially, these interventions cost between $63 million and $72 million per year. Late-stage treatment, trastuzumab therapy and annual screening strategies are the least cost-effective. Conclusions Our analysis suggests that breast cancer control in Peru should be oriented towards early detection through combining fixed and mobile mammography screening (age 45-69) triennially. However, a phased introduction of triennial CBE screening (age 40-69) with upfront FNA in non-urban settings, and both CBE (age 40-49) and fixed mammography screening (age 50-69) in urban settings, seems a more feasible option and is also cost-effective. The implementation of this

  13. Cost-effectiveness analysis of tenofovir disoproxil fumarate for treatment of chronic hepatitis B in China.

    Science.gov (United States)

    Ke, Weixia; Zhang, Chi; Liu, Li; Gao, Yanhui; Yao, Zhenjiang; Ye, Xiaohua; Zhou, Shudong; Yang, Yi

    2016-11-01

    Tenofovir disoproxil fumarate (TDF) is newly available for treatment of chronic hepatitis B patients in China. To date, no study has been conducted to examine the cost-effectiveness of this treatment. The aim of this study was to estimate the cost-effectiveness of TDF versus four oral nucleos(t)ide analogs [lamivudine (LAM), adefovir (ADV), telbivudine (LdT), and entecavir (ETV)] and from a pharmacoeconomic perspective to assess current drug pricing for TDF. Based on Chinese healthcare perspectives, a Markov model was applied to simulate the lifetime (40-year time span) costs and quality-adjusted life-years (QALYs) for five different monotherapy strategies. Two kinds of rescue combination strategies (base-case: LAM + ADV then ETV + ADV; alternative: directly using ETV + ADV) were separately considered for treatment of patients refractory to monotherapy. Model parameters (including disease transition, cost, and utility) were obtained from previous Chinese population studies. Both branded and generic drugs were separately analyzed. Study model uncertainties were assessed by one-way and probabilistic sensitivity analyses. Two-way sensitivity analysis was used to explore uncertainties between efficacy and price of TDF. In the base-case analysis, the lowest lifetime cost and the best cost-effectiveness ratio were obtained by ETV, which was considered the reference treatment. LAM, ADV, and LdT treatments had significantly greater costs and lower efficacies. Compared to ETV, TDF was more effective but also more expensive. The incremental cost-effectiveness ratios of TDF versus ETV were much higher than the willingness-to-pay threshold of $20,466 (USD) per QALY gained (3 × gross domestic product per capita of China, 2014). TDF would be the most cost-effective strategy if the annual cost did not exceed $2260 USD and $1600 USD for branded and generic drugs, respectively. For Chinese chronic hepatitis B patients, ETV is still the most cost-effective
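    To make the Markov machinery concrete, the sketch below runs a deliberately tiny cohort model over the study's 40-year horizon; the states, transition probabilities, costs, utilities and 3% discount rate are all invented for illustration and are not the paper's parameters (half-cycle correction is also omitted):

```python
import numpy as np

states = ["CHB", "Cirrhosis", "HCC", "Death"]   # simplified state space
P = np.array([                                  # annual transitions (invented)
    [0.92, 0.05, 0.02, 0.01],
    [0.00, 0.88, 0.07, 0.05],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],
])
cost = np.array([1600.0, 3000.0, 9000.0, 0.0])  # annual cost per state, USD
utility = np.array([0.85, 0.69, 0.45, 0.0])     # QALY weight per state
disc = 0.03                                     # assumed annual discount rate

x = np.array([1.0, 0.0, 0.0, 0.0])  # whole cohort starts in chronic hepatitis B
total_cost = total_qaly = 0.0
for year in range(40):              # 40-year time span, as in the study
    d = 1.0 / (1.0 + disc) ** year  # discount factor for this cycle
    total_cost += d * (x @ cost)
    total_qaly += d * (x @ utility)
    x = x @ P                       # advance the cohort one annual cycle

print(f"lifetime cost ≈ ${total_cost:,.0f}; QALYs ≈ {total_qaly:.2f}")
```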

  14. Transcriptome profiling to identify ATRA-responsive genes in human iPSC-derived endoderm for high-throughput point of departure analysis (SOT Annual Meeting)

    Science.gov (United States)

    Toxicological tipping points occur at chemical concentrations that overwhelm a cell’s adaptive response leading to permanent effects. We focused on retinoid signaling in differentiating endoderm to identify developmental pathways for tipping point analysis. Human induced pluripot...

  15. High-throughput fractionation of human plasma for fast enrichment of low- and high-abundance proteins.

    Science.gov (United States)

    Breen, Lucas; Cao, Lulu; Eom, Kirsten; Srajer Gajdosik, Martina; Camara, Lila; Giacometti, Jasminka; Dupuy, Damian E; Josic, Djuro

    2012-05-01

    Fast, cost-effective and reproducible isolation of IgM from plasma is invaluable to the study of IgM and subsequent understanding of the human immune system. Additionally, vast amounts of information regarding human physiology and disease can be derived from analysis of the low-abundance proteome of the plasma. In this study, methods were optimized for both the high-throughput isolation of IgM from human plasma and the high-throughput isolation and fractionation of low-abundance plasma proteins. To optimize the chromatographic isolation of IgM from human plasma, many variables were examined including chromatography resin, mobile phases, and order of chromatographic separations. Purification of IgM was achieved most successfully through isolation of immunoglobulin from human plasma using Protein A chromatography with a specific resin followed by subsequent fractionation using QA strong anion exchange chromatography. Through these optimization experiments, an additional method was established to prepare plasma for analysis of low-abundance proteins. This method involved chromatographic depletion of high-abundance plasma proteins and reduction of plasma proteome complexity through further chromatographic fractionation. Purification of IgM was achieved with high purity as confirmed by SDS-PAGE and IgM-specific immunoblot. Isolation and fractionation of low-abundance proteins were also performed successfully, as confirmed by SDS-PAGE and mass spectrometry analysis followed by label-free quantitative spectral analysis. The level of purity of the isolated IgM allows for further IgM-specific analysis of plasma samples. The developed fractionation scheme can be used for high-throughput screening of human plasma in order to identify low- and high-abundance proteins as potential prognostic and diagnostic disease biomarkers.

  16. High throughput screening method for assessing heterogeneity of microorganisms

    NARCIS (Netherlands)

    Ingham, C.J.; Sprenkels, A.J.; van Hylckama Vlieg, J.E.T.; Bomer, Johan G.; de Vos, W.M.; van den Berg, Albert

    2006-01-01

    The invention relates to the field of microbiology. Provided is a method which is particularly powerful for High Throughput Screening (HTS) purposes. More specifically, a high-throughput method for determining the heterogeneity or interactions of microorganisms is provided.

  17. Application of ToxCast High-Throughput Screening and ...

    Science.gov (United States)

    Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors

  18. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  19. Timing of prophylactic surgery in prevention of diverticulitis recurrence: a cost-effectiveness analysis.

    Science.gov (United States)

    Richards, Robert J; Hammitt, James K

    2002-09-01

    Although surgery is recommended after two or more attacks of uncomplicated diverticulitis, the optimal timing for surgery in terms of cost-effectiveness is unknown. A Markov model was used to compare the costs and outcomes of performing surgery after one, two, or three uncomplicated attacks in 60-year-old hypothetical cohorts. Transition state probabilities were assigned values using published data and expert opinion. Costs were estimated from Medicare reimbursement rates. Surgery after the third attack is cost saving, yielding more years of life and quality-adjusted life years at a lower cost than the other two strategies. The results were not sensitive to many of the variables tested in the model or to changes made in the discount rate (0-5%). In conclusion, performing prophylactic resection after the third attack of diverticulitis is cost saving in comparison to resection performed after the first or second attacks and remains cost-effective during sensitivity analysis.

  20. Monodisperse Water-in-Oil-in-Water (W/O/W) Double Emulsion Droplets as Uniform Compartments for High-Throughput Analysis via Flow Cytometry

    Directory of Open Access Journals (Sweden)

    Jing Yan

    2013-12-01

    Here we report the application of monodisperse double emulsion droplets, produced in a single step within partially hydrophilic/partially hydrophobic microfluidic devices, as defined containers for quantitative flow cytometric analysis. Samples with varying fluorophore concentrations were generated, and a clear correlation between dye concentration and fluorescence signals was observed.

  1. Cytosolic Glutamine Synthetase is Important for Photosynthetic Efficiency and Water Use Efficiency in Potato as Revealed by High Throughput Sequencing QTL analysis

    DEFF Research Database (Denmark)

    Kaminski, Kacper Piotr; Sørensen, Kirsten Kørup; Andersen, Mathias Neumann

    2015-01-01

    was observed. Two extreme WUE bulks of clones were identified and pools of genomic DNA from them as well as the parents were sequenced and mapped to the reference potato genome. Following a novel data analysis approach, two highly resolved QTLs were found on chromosomes 1 and 9. Interestingly, three genes encoding

  2. A cost-effectiveness analysis of screening for silent atrial fibrillation after ischaemic stroke.

    Science.gov (United States)

    Levin, Lars-Åke; Husberg, Magnus; Sobocinski, Piotr Doliwa; Kull, Viveka Frykman; Friberg, Leif; Rosenqvist, Mårten; Davidson, Thomas

    2015-02-01

    The purpose of this study was to estimate the cost-effectiveness of two screening methods for detection of silent AF, intermittent electrocardiogram (ECG) recordings using a handheld recording device at regular time intervals for 30 days, and short-term 24 h continuous Holter ECG, in comparison with a no-screening alternative in 75-year-old patients with a recent ischaemic stroke. The long-term (20-year) costs and effects of all alternatives were estimated with a decision analytic model combining the result of a clinical study and epidemiological data from Sweden. The study was structured as a cost-effectiveness analysis. The short-term decision tree model analysed the screening procedure until the onset of anticoagulant treatment. The second part of the decision model followed a Markov design, simulating the patients' health states for 20 years. Continuous 24 h ECG recording was inferior to intermittent ECG in terms of cost-effectiveness, due to both lower sensitivity and higher costs. The base-case analysis compared intermittent ECG screening with no screening of patients with recent stroke. The implementation of the screening programme on 1000 patients resulted over a 20-year period in 11 avoided strokes and the gain of 29 life-years, or 23 quality-adjusted life years, and cost savings of €55 400. Screening of silent AF by intermittent ECG recordings in patients with a recent ischaemic stroke is a cost-effective use of health care resources saving costs and lives and improving the quality of life. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2014. For permissions please email: journals.permissions@oup.com.

  3. Cost-Effectiveness Analysis of Tyrosine Kinase Inhibitors for Patients with Advanced Gastrointestinal Stromal Tumors.

    Science.gov (United States)

    Nerich, Virginie; Fleck, Camille; Chaigneau, Loïc; Isambert, Nicolas; Borg, Christophe; Kalbacher, Elsa; Jary, Marine; Simon, Pauline; Pivot, Xavier; Blay, Jean-Yves; Limat, Samuel

    2017-01-01

    The management of advanced gastrointestinal stromal tumors (GISTs) has been modified considerably by the availability of costly tyrosine kinase inhibitors (TKIs); however, the best therapeutic sequence in terms of cost and effectiveness remains unknown. The aim of this study was to compare four potential strategies (reflecting potential daily practice), each including imatinib 400 mg/day as first-line treatment: S1 (imatinib 400/best supportive care [BSC]); S2 (imatinib 400/imatinib 800/BSC); S3 (imatinib 400/sunitinib/BSC); and S4 (imatinib 400/imatinib 800/sunitinib/BSC). A Markov model was developed with a hypothetical cohort of patients and a lifetime horizon. Transition probabilities were estimated from the results of clinical trials. The analysis was performed from the French payer perspective, and only direct medical costs were included. Clinical and economic parameters were discounted, and the robustness of results was assessed. The least costly and least effective strategy was S1, at a cost of €65,744 for 32.9 life-months (reference). S3 was the most cost-effective strategy, with an incremental cost-effectiveness ratio (ICER) of €48,277/life-year saved (LYS). S2 was dominated, and S4 yielded an ICER of €363,320/LYS compared with S3. Sensitivity analyses confirmed the robustness of these results; however, when taking into account a price reduction of 80% for imatinib, S2 and S4 become the most cost-effective strategies. Our approach is innovative to the extent that our analysis takes into account the sequential application of TKIs. The results suggest that the S1 strategy offers the best cost-effectiveness, but a price reduction for imatinib affects the results. This approach must continue, including new drugs and their impact on the quality of life of patients with advanced GISTs.
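    The sequencing comparison rests on discarding dominated strategies and computing ICERs along the cost-ordered frontier. In the sketch below only S1's cost and effect are the reported values; the other cost/life-month pairs are invented so the computed ICERs land near the quoted €48,277 and €363,320 per LYS, and extended dominance is deliberately not handled:

```python
strategies = {  # name: (lifetime cost in euros, mean survival in life-months)
    "S1": (65_744, 32.9),   # reported reference values
    "S2": (80_000, 32.5),   # hypothetical: costlier and less effective
    "S3": (90_285, 39.0),   # hypothetical
    "S4": (150_838, 41.0),  # hypothetical
}

frontier = []  # non-dominated strategies, in increasing order of cost
for name, (cost, months) in sorted(strategies.items(), key=lambda kv: kv[1][0]):
    if frontier and months <= frontier[-1][2]:
        print(f"{name} is dominated by {frontier[-1][0]}")
        continue
    if frontier:
        d_cost = cost - frontier[-1][1]
        d_ly = (months - frontier[-1][2]) / 12.0  # life-months to life-years
        print(f"{name}: ICER = {d_cost / d_ly:,.0f} EUR per life-year saved")
    frontier.append((name, cost, months))
```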

  4. Cost-effectiveness analysis of HPV vaccination: comparing the general population with socially vulnerable individuals.

    Science.gov (United States)

    Han, Kyu-Tae; Kim, Sun Jung; Lee, Seo Yoon; Park, Eun-Cheol

    2014-01-01

    After the WHO recommended HPV vaccination of the general population in 2009, government support of HPV vaccination programs was increased in many countries. However, this policy was not implemented in Korea due to perceived low cost-effectiveness. Thus, the aim of this study was to analyze the cost-utility of HPV vaccination programs targeted to high-risk populations as compared to vaccination programs for the general population. Each study population was set to 100,000 people in a simulation study to determine the incremental cost-utility ratio (ICUR); standard prevalence rates, costs, vaccination rates, vaccine efficacy, and quality-adjusted life-years (QALYs) were then applied in the analysis. In addition, sensitivity analysis was performed by assuming discounted vaccination cost. In the socially vulnerable population, QALYs gained through HPV vaccination were higher than in the general population (general population: 1,019; socially vulnerable population: 5,582). The ICUR was higher for the general population (52,279,255 KRW/QALY) than for the socially vulnerable population (9,547,347 KRW/QALY). Compared with the social threshold of 24 million KRW/QALY, vaccination of the general population was not cost-effective. In contrast, vaccination of the socially vulnerable population was strongly cost-effective. The results suggest the importance and necessity of government support of HPV vaccination programs targeted to socially vulnerable populations because a targeted approach is much more cost-effective. The implementation of government support for such vaccination programs is a critical strategy for decreasing the burden of HPV infection in Korea.

  5. Cost-effectiveness analysis of a patient-centered care model for management of psoriasis.

    Science.gov (United States)

    Parsi, Kory; Chambers, Cindy J; Armstrong, April W

    2012-04-01

    Cost-effectiveness analyses help policymakers make informed decisions regarding funding allocation of health care resources. Cost-effectiveness analysis of technology-enabled models of health care delivery is necessary to assess sustainability of novel online, patient-centered health care models. We sought to compare cost-effectiveness of conventional in-office care with a patient-centered, online model for follow-up treatment of patients with psoriasis. Cost-effectiveness analysis was performed from a societal perspective on a randomized controlled trial comparing a patient-centered online model with in-office visits for treatment of patients with psoriasis during a 24-week period. Quality-adjusted life expectancy was calculated using the life table method. Costs were generated from the original study parameters and national averages for salaries and services. No significant difference existed in the mean change in Dermatology Life Quality Index scores between the two groups (online: 3.51 ± 4.48 and in-office: 3.88 ± 6.65, P value = .79). Mean improvement in quality-adjusted life expectancy was not significantly different between the groups (P value = .93), with a gain of 0.447 ± 0.48 quality-adjusted life years for the online group and a gain of 0.463 ± 0.815 quality-adjusted life years for the in-office group. The cost of follow-up psoriasis care with online visits was 1.7 times less than the cost of in-person visits ($315 vs $576). Variations in travel time existed among patients depending on their distance from the dermatologist's office. From a societal perspective, the patient-centered online care model appears to be cost saving, while maintaining similar effectiveness to standard in-office care. Copyright © 2011 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.

  6. Rapid high-throughput analysis of DNaseI hypersensitive sites using a modified Multiplex Ligation-dependent Probe Amplification approach

    Directory of Open Access Journals (Sweden)

    Sinclair Andrew H

    2009-09-01

    Background Mapping DNaseI hypersensitive sites is commonly used to identify regulatory regions in the genome. However, currently available methods are either time consuming and laborious, expensive, or require large numbers of cells. We aimed to develop a quick and straightforward method for the analysis of DNaseI hypersensitive sites that overcomes these problems. Results We have developed a modified Multiplex Ligation-dependent Probe Amplification (MLPA) approach for the identification and analysis of genomic regulatory regions. The utility of this approach was demonstrated by simultaneously analysing 20 loci from the ENCODE project for DNaseI hypersensitivity in a range of different cell lines. We were able to obtain reproducible results with as little as 5 × 10⁴ cells per DNaseI treatment. Our results broadly matched those previously reported by the ENCODE project, and both technical and biological replicates showed high correlations, indicating the sensitivity and reproducibility of this method. Conclusion This new method will considerably facilitate the identification and analysis of DNaseI hypersensitive sites. Due to the multiplexing potential of MLPA (up to 50 loci can be examined), it is possible to analyse dozens of DNaseI hypersensitive sites in a single reaction. Furthermore, the high sensitivity of MLPA means that fewer than 10⁵ cells per DNaseI treatment can be used, allowing the discovery and analysis of tissue-specific regulatory regions without the need for pooling. This method is quick and easy and results can be obtained within 48 hours after harvesting of cells or tissues. As no special equipment is required, this method can be applied by any laboratory interested in the analysis of DNaseI hypersensitive regions.

  7. BioXTAS RAW, a software program for high-throughput automated small-angle X-ray scattering data reduction and preliminary analysis

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Toft, K.N.; Snakenborg, Detlef

    2009-01-01

    A fully open source software program for automated two-dimensional and one-dimensional data reduction and preliminary analysis of isotropic small-angle X-ray scattering (SAXS) data is presented. The program is freely distributed, following the open-source philosophy, and does not rely on any commercial software packages. BioXTAS RAW is a fully automated program that, via an online feature, reads raw two-dimensional SAXS detector output files and processes and plots data as the data files are created during measurement sessions. The software handles all steps in the data reduction. This includes mask creation, radial averaging, error bar calculation, artifact removal, normalization and q calibration. Further data reduction such as background subtraction and absolute intensity scaling is fast and easy via the graphical user interface. BioXTAS RAW also provides preliminary analysis of one...
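    Of the reduction steps listed, radial averaging is the central one; the following numpy sketch shows a generic version of that step under our own simplifying assumptions (it is not BioXTAS RAW's implementation and omits q calibration and error-bar propagation):

```python
import numpy as np

def radial_average(image, center, n_bins=200, mask=None):
    """Azimuthally average a 2-D detector image into a 1-D intensity profile."""
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - center[0], y - center[1])     # radius of every pixel
    if mask is None:
        mask = np.ones_like(image, dtype=bool)     # keep all pixels
    edges = np.linspace(0.0, r[mask].max(), n_bins + 1)
    idx = np.clip(np.digitize(r[mask], edges) - 1, 0, n_bins - 1)
    sums = np.bincount(idx, weights=image[mask], minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    with np.errstate(invalid="ignore"):            # empty bins become NaN
        return edges[:-1], sums / counts           # mean intensity per ring

# Synthetic isotropic pattern: the 2-D image collapses cleanly to 1-D
img = np.random.poisson(100, size=(512, 512)).astype(float)
radii, profile = radial_average(img, center=(256.0, 256.0))
```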

  8. Risk-benefit analysis and cost-effectiveness analysis of lung cancer screening by spiral CT

    International Nuclear Information System (INIS)

    Iinuma, Takeshi

    1999-01-01

    Mass screening for lung cancer has been widely performed in Japan using the indirect chest X-ray method. However, whether it reduces lung cancer mortality has been questioned. We have proposed that recently developed spiral CT should be adopted for lung cancer screening, since CT has excellent detectability for small nodules. A Lung Cancer Screening CT (LSCT) unit has been developed by the authors' group using spiral CT, with low dose and light weight so that a mobile unit could be built. In this paper, risk-benefit analysis and cost-effectiveness analysis are described for LSCT screening of lung cancer. On the risk side, radiation carcinogenesis due to exposure from LSCT is compared with the gain in life expectancy from screening; screening is justified for men aged 40 years or more and women aged 45 years or more. The estimated cost per person-year for LSCT screening is better than that of the present method, although the total cost is higher. LSCT screening could be recommended if the total cost is affordable. (author)
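    Neither formula is reproduced in this record; written generically (our notation, which may differ from the paper's), the two quantities being weighed are:

```latex
\Delta LE_{\text{net}}
\;=\; \Delta LE_{\text{early detection}} \;-\; \Delta LE_{\text{radiation}} \;>\; 0,
\qquad
\text{cost per person-year}
\;=\; \frac{C_{\text{programme}}}{\text{person-years of life gained}}
```

    Screening is justified for a given age and sex group when the net life-expectancy gain is positive, which in the abstract holds for men aged 40 years or more and women aged 45 years or more.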

  9. Guideline adherence is worth the effort: a cost-effectiveness analysis in intrauterine insemination care.

    Science.gov (United States)

    Haagen, E C; Nelen, W L D M; Adang, E M; Grol, R P T M; Hermens, R P M G; Kremer, J A M

    2013-02-01

    Is optimal adherence to guideline recommendations in intrauterine insemination (IUI) care cost-effective from a societal perspective when compared with suboptimal adherence to guideline recommendations? Optimal guideline adherence in IUI care has substantial economic benefits when compared with suboptimal guideline adherence. Fertility guidelines are tools to help health-care professionals and patients make better decisions about clinically effective, safe and cost-effective care. Up to now, there has been limited published evidence about the association between guideline adherence and cost-effectiveness in fertility care. In a retrospective cohort study involving medical record analysis and a patient survey (n = 415), interviews with staff members (n = 13) and a review of hospitals' financial department reports and literature, data were obtained about patient characteristics, process aspects and clinical outcomes of IUI care and resources consumed. In the cost-effectiveness analyses, restricted to four relevant guideline recommendations, the ongoing pregnancy rate per couple (effectiveness), the average medical and non-medical costs of IUI care, possible additional IVF treatment, pregnancy, delivery and period from birth up to 6 weeks after birth for both mother and offspring per couple (costs) and the incremental net monetary benefits were calculated to investigate if optimal guideline adherence is cost-effective from a societal perspective when compared with suboptimal guideline adherence. Seven hundred and sixty-five of 1100 randomly selected infertile couples from the databases of the fertility laboratories of 10 Dutch hospitals, including 1 large university hospital providing tertiary care and 9 public hospitals providing secondary care, were willing to participate, but 350 couples were excluded because of ovulatory disorders or the use of donated spermatozoa (n = 184), still ongoing IUI treatment (n = 143) or no access to their medical records (n = 23). As

  10. Toward a high-throughput method for determining vicine and convicine levels in faba bean seeds using flow injection analysis combined with tandem mass spectrometry.

    Science.gov (United States)

    Purves, Randy W; Khazaei, Hamid; Vandenberg, Albert

    2018-08-01

    Although faba bean provides environmental and health benefits, vicine and convicine (v-c) limit its use as a source of vegetable protein. Crop improvement efforts to minimize v-c concentration require low-cost, rapid screening methods to distinguish between high and low v-c genotypes to accelerate development of new cultivars and to detect out-crossing events. To assist crop breeders, we developed a unique and rapid screening method that uses a 60 s instrumental analysis step to accurately distinguish between high and low v-c genotypes. The method involves flow injection analysis (FIA) coupled with tandem mass spectrometry (i.e., selective reaction monitoring, SRM). Using seeds with known v-c levels as calibrants, measured v-c levels were comparable with liquid chromatography (LC)-SRM results and the method was used to screen 370 faba bean genotypes. Widespread use of FIA-SRM will accelerate breeding of low v-c faba bean, thereby alleviating concerns about anti-nutritional effects of v-c in this crop. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Bevacizumab in Treatment of High-Risk Ovarian Cancer—A Cost-Effectiveness Analysis

    Science.gov (United States)

    Herzog, Thomas J.; Hu, Lilian; Monk, Bradley J.; Kiet, Tuyen; Blansit, Kevin; Kapp, Daniel S.; Yu, Xinhua

    2014-01-01

    Objective. The objective of this study was to evaluate a cost-effectiveness strategy of bevacizumab in a subset of high-risk advanced ovarian cancer patients with survival benefit. Methods. A subset analysis of the International Collaboration on Ovarian Neoplasms 7 trial showed that additions of bevacizumab (B) and maintenance bevacizumab (mB) to paclitaxel (P) and carboplatin (C) improved the overall survival (OS) of high-risk advanced cancer patients. Actual and estimated costs of treatment were determined from Medicare payments. The incremental cost-effectiveness ratio per life-year saved was calculated. Results. The estimated cost of PC is $535 per cycle; PCB + mB (7.5 mg/kg) is $3,760 per cycle for the first 6 cycles and then $3,225 per cycle for 12 mB cycles. Among 465 high-risk stage IIIC (>1 cm residual) or stage IV patients, previously reported OS was 28.8 months after PC versus 36.6 months after PCB + mB. With an estimated 8-month improvement in OS, the incremental cost-effectiveness ratio of B was $167,771 per life-year saved. Conclusion. In this clinically relevant subset of women with high-risk advanced ovarian cancer with overall survival benefit after bevacizumab, our economic model suggests that the incremental cost of bevacizumab was approximately $170,000. PMID:24721817
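    The reported ratio can be sanity-checked from the survival figures alone; the back-calculation below is ours, not the authors':

```latex
\mathrm{ICER} \;=\; \frac{\Delta C}{\Delta\mathrm{OS}}
\;=\; \frac{\Delta C}{(36.6 - 28.8)/12 \ \text{life-years}}
\;=\; \frac{\Delta C}{0.65\ \text{LY}} \;\approx\; \$167{,}771/\text{LY}
\;\;\Longrightarrow\;\;
\Delta C \;\approx\; 0.65 \times \$167{,}771 \;\approx\; \$109{,}000
```

    per patient of total incremental cost in the model.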

  12. [Cost-effectiveness analysis of adjuvant anastrozole in post-menopausal women with breast cancer].

    Science.gov (United States)

    Sasse, Andre Deeke; Sasse, Emma Chen

    2009-01-01

    Carry out an economic analysis of the incorporation of anastrozole as adjuvant hormone therapy in postmenopausal women with breast cancer in a Brazilian setting. The cost-effectiveness estimate comparing anastrozole to tamoxifen was made from the perspectives of the patient, private health insurance, and government. A Markov model was designed based on data from the ATAC trial after 100 months of follow-up in a hypothetical cohort of 1000 postmenopausal women in Brazil, using outcome projections for a 25-year period. Resource utilization and associated costs were obtained from preselected sources and specialists' opinions. Treatment costs varied according to the perspective used. The incremental benefit was inserted in the model to obtain the cost per quality-adjusted life-year (QALY) gained. Benefit extrapolations over a 25-year timeline showed an estimate of 0.29 QALY gained with anastrozole compared to tamoxifen. The cost-effectiveness ratio per QALY gained depended on which perspective was used. There was an increment of R$32,403.00/QALY for the public health system/government, R$32,230.00/QALY for the private health system, and R$55,270.00/QALY for patients. The benefit from adjuvant anastrozole in postmenopausal patients with breast cancer is associated with major differences in cost-effectiveness ratio and varies with the different perspectives. According to current WHO parameters, the increment is considered acceptable under the public and private health system perspectives, but not from that of the patient.

  13. Cost-effectiveness analysis of cataract surgery with intraocular lens implantation: extracapsular cataract extraction versus phacoemulsification

    Directory of Open Access Journals (Sweden)

    Mohd R.A. Manaf

    2007-03-01

    A randomized single-blinded clinical trial to compare the cost-effectiveness of cataract surgery between extracapsular cataract extraction (ECCE) and phacoemulsification (PEA) was conducted at Hospital Universiti Kebangsaan Malaysia (HUKM) from March 2000 until August 2001. The costs of cataract surgery incurred by the hospital, patients and households were calculated preoperatively, and at one week and two months (for both techniques) and six months (for ECCE only) postoperatively. Effectiveness of cataract surgery was assessed using the Visual Function 14 (VF-14), a quality-of-life measure specific to vision. The cost analysis results from 50 subjects in each of the ECCE and PEA groups showed that the average cost for one ECCE at six months post-operation is USD 458 (± USD 72) and for PEA is USD 528 (± USD 12