WorldWideScience

Sample records for cost-effective high-throughput analysis

  1. Next-generation phage display: integrating and comparing available molecular tools to enable cost-effective high-throughput analysis.

    Directory of Open Access Journals (Sweden)

    Emmanuel Dias-Neto

    BACKGROUND: Combinatorial phage display has been used over the last 20 years to identify protein ligands and protein-protein interactions, uncovering relevant molecular recognition events. The rate-limiting steps of combinatorial phage display library selection are (i) the counting of transducing units and (ii) the sequencing of the encoded displayed ligands. Here, we adapted emerging genomic technologies to minimize such challenges. METHODOLOGY/PRINCIPAL FINDINGS: We gained efficiency by applying in tandem real-time PCR for rapid quantification, enabling bacteria-free phage display library screening, and next-generation sequencing of phage DNA for large-scale ligand analysis, reporting a fully integrated set of high-throughput quantitative and analytical tools. The approach is far less labor-intensive and allows rigorous quantification; for medical applications, including selections in patients, it also represents an advance for quantitative distribution analysis and ligand identification of hundreds of thousands of targeted particles from patient-derived biopsy or autopsy over a longer timeframe after library administration. Additional advantages over current methods include increased sensitivity, less variability, enhanced linearity, scalability, and accuracy at much lower cost. Sequences obtained by qPhage plus pyrosequencing were similar to a dataset produced from conventional Sanger-sequenced transducing units (TU), with no biases due to GC content, codon usage, or amino acid or peptide frequency. These tools allow phage display selection and ligand analysis at a >1,000-fold faster rate, and reduce costs approximately 250-fold for generating 10⁶ ligand sequences. CONCLUSIONS/SIGNIFICANCE: Our analyses demonstrate that while this approach correlates with traditional colony counting, it is also capable of much larger sampling, allowing a faster, less expensive, more accurate and consistent analysis of phage enrichment.
Overall
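
    The qPCR quantification step described above rests on a standard curve relating cycle threshold (Ct) to phage titer, which replaces colony counting. A minimal sketch of that idea (the dilution-series values and function names below are hypothetical illustrations, not data or code from the paper):

```python
def fit_standard_curve(log10_titers, cts):
    # Least-squares line: Ct = slope * log10(titer) + intercept
    n = len(cts)
    mx, my = sum(log10_titers) / n, sum(cts) / n
    sxx = sum((x - mx) ** 2 for x in log10_titers)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_titers, cts))
    slope = sxy / sxx
    return slope, my - slope * mx

def titer_from_ct(ct, slope, intercept):
    # Invert the curve to estimate transducing units without plating
    return 10 ** ((ct - intercept) / slope)

# Hypothetical dilution series of a phage stock with known titers
log10_titers = [3, 4, 5, 6, 7]
cts = [33.2, 29.9, 26.6, 23.3, 20.0]  # slope ~ -3.3 (near-ideal efficiency)

slope, intercept = fit_standard_curve(log10_titers, cts)
estimate = titer_from_ct(26.6, slope, intercept)
print(round(estimate))  # → 100000
```

    A slope near -3.32 corresponds to 100% PCR efficiency, which is one common sanity check on such a curve.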

  2. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product need not differ optically from the substrate. UV absorption detection allows nearly universal detection of organic molecules; thus, no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue or in screening multiple immobilized catalysts. A second high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of an enzyme microarray is focused onto the CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on a spot gives information on the reaction rate. The same microarray can be reused many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different

  3. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Romao, Joana; Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for lar
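
    Dye-decolorization kinetics such as the methyl orange decay measured here are typically fitted to a pseudo-first-order model, ln(A₀/Aₜ) = k·t, from the per-well absorbance readings. A minimal sketch of that fit on synthetic data (names and values are illustrative, not from the paper):

```python
import math

def first_order_rate(times, absorbances):
    # Pseudo-first-order fit: ln(A0/At) = k * t, least squares through origin
    ys = [math.log(absorbances[0] / a) for a in absorbances]
    num = sum(t * y for t, y in zip(times, ys))
    den = sum(t * t for t in times)
    return num / den

times = [0.0, 10.0, 20.0, 30.0]                  # minutes
absorb = [math.exp(-0.05 * t) for t in times]    # synthetic decay, k = 0.05
k = first_order_rate(times, absorb)
print(round(k, 4))  # → 0.05
```

    In a 96-well workflow the same fit would simply be applied column-wise to every well's absorbance time series.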

  4. A perspective on high throughput analysis of pesticide residues in foods

    Institute of Scientific and Technical Information of China (English)

    Kai ZHANG; Jon W WONG; Perry G WANG

    2011-01-01

    The screening of pesticide residues plays a vital role in food safety. Applications of high throughput analytical procedures are desirable for screening a large number of pesticides and food samples in a time-efficient and cost-effective manner. This review discusses how the sample throughput of pesticide analysis can be improved, with an emphasis on sample preparation, instrumentation and data analysis.

  5. High-throughput Binary Vectors for Plant Gene Function Analysis

    Institute of Scientific and Technical Information of China (English)

    Zhi-Yong Lei; Ping Zhao; Min-Jie Cao; Rong Cui; Xi Chen; Li-Zhong Xiong; Qi-Fa Zhang; David J. Oliver; Cheng-Bin Xiang

    2007-01-01

    A series of high-throughput binary cloning vectors were constructed to facilitate gene function analysis in higher plants. This vector series consists of plasmids designed for plant expression, promoter analysis, gene silencing, and green fluorescent protein fusions for protein localization. These vectors provide for high-throughput and efficient cloning utilizing sites for λ phage integrase/excisionase. In addition, unique restriction sites are incorporated in a multiple cloning site and enable promoter replacement. The entire vector series is available with complete sequence information and detailed annotations and is freely distributed to the scientific community for non-commercial uses.

  6. Automatic Spot Identification for High Throughput Microarray Analysis

    Science.gov (United States)

    Wu, Eunice; Su, Yan A.; Billings, Eric; Brooks, Bernard R.; Wu, Xiongwu

    2013-01-01

    High throughput microarray analysis has great potential in scientific research, disease diagnosis, and drug discovery. A major hurdle toward high throughput microarray analysis is the time and effort needed to accurately locate gene spots in microarray images. An automatic microarray image processor will allow accurate and efficient determination of spot locations and sizes so that gene expression information can be reliably extracted in a high throughput manner. Current microarray image processing tools require intensive manual operations in addition to the input of grid parameters to correctly and accurately identify gene spots. This work developed a method, herein called auto-spot, to automate the spot identification process. Through a series of correlation and convolution operations, as well as pixel manipulations, this method makes spot identification an automatic and accurate process. Testing with real microarray images has demonstrated that this method is capable of automatically extracting subgrids from microarray images and determining spot locations and sizes within each subgrid, regardless of variations in array patterns and background noise. With this method, we are one step closer to the goal of high throughput microarray analysis. PMID:24298393
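
    The correlation step at the heart of such spot finding can be illustrated with a toy template-matching sketch: slide a spot-shaped template over the image and take the correlation peak as the spot location. This pure-Python example uses invented data and names; the actual tool combines more elaborate correlation, convolution and pixel operations:

```python
def cross_correlate(image, template):
    # Valid-mode 2D cross-correlation (no padding)
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    out = []
    for y in range(ih - th + 1):
        row = []
        for x in range(iw - tw + 1):
            s = sum(image[y + j][x + i] * template[j][i]
                    for j in range(th) for i in range(tw))
            row.append(s)
        out.append(row)
    return out

def peak(corr):
    # Location of the correlation maximum = top-left of the best match
    best = max((v, x, y) for y, row in enumerate(corr) for x, v in enumerate(row))
    return best[2], best[1]  # (row, col)

# Toy 8x8 image with one 2x2 bright "spot" at rows 3-4, cols 5-6
img = [[0.0] * 8 for _ in range(8)]
for y in (3, 4):
    for x in (5, 6):
        img[y][x] = 1.0
spot = [[1.0, 1.0], [1.0, 1.0]]
corr = cross_correlate(img, spot)
print(peak(corr))  # → (3, 5)
```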

  7. MIPHENO: data normalization for high throughput metabolite analysis

    Directory of Open Access Journals (Sweden)

    Bell Shannon M

    2012-01-01

    Abstract Background High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course of months and years, often without the controls needed to compare directly across the dataset. Few methods are available to facilitate comparisons of high throughput metabolic data generated in batches where explicit in-group controls for normalization are lacking. Results Here we describe MIPHENO (Mutant Identification by Probabilistic High throughput-Enabled Normalization), an approach for post-hoc normalization of quantitative first-pass screening data in the absence of explicit in-group controls. This approach includes a quality control step and facilitates cross-experiment comparisons that decrease the false non-discovery rates, while maintaining the high accuracy needed to limit false positives in first-pass screening. Results from simulation show an improvement in both accuracy and false non-discovery rate over a range of population parameters (p < 10⁻¹⁶) and a modest but significant (p < 10⁻¹⁶) improvement in area under the receiver operator characteristic curve of 0.955 for MIPHENO vs 0.923 for a group-based statistic (z-score). Analysis of the high throughput phenotypic data from the Arabidopsis Chloroplast 2010 Project (http://www.plastid.msu.edu/) showed a ~4-fold increase in the ability to detect previously described or expected phenotypes over the group-based statistic. Conclusions Results demonstrate that MIPHENO offers substantial benefit in improving the ability to detect putative mutant phenotypes from post-hoc analysis of large data sets. Additionally, it facilitates data interpretation and permits cross-dataset comparison where group-based controls are missing. MIPHENO is applicable to a wide range of high throughput screenings and the code is
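
    The core idea of post-hoc normalization without in-group controls can be sketched as median scaling: if most samples in a batch behave like wild type, each batch's median can serve as an implicit control. This is a deliberate simplification of MIPHENO with invented names and numbers, shown only to make the concept concrete:

```python
import statistics

def median_scale(batches):
    # Scale each batch so its median matches the grand median, making
    # values comparable across batches that lack explicit controls.
    # Assumption: the bulk of each batch is phenotypically "normal".
    grand = statistics.median(v for b in batches for v in b)
    return [[v * grand / statistics.median(b) for v in b] for b in batches]

batch_a = [10.0, 11.0, 9.0, 10.0, 30.0]   # one putative mutant (30)
batch_b = [20.0, 22.0, 18.0, 20.0, 60.0]  # same biology, 2x instrument gain
norm = median_scale([batch_a, batch_b])
```

    After scaling, the putative mutant scores identically in both batches even though the raw batches differed by a factor of two, which is what makes cross-batch comparison possible.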

  8. Mass spectrometry for high-throughput metabolomics analysis of urine

    OpenAIRE

    Abdelrazig, Salah M.A.

    2015-01-01

    Direct electrospray ionisation-mass spectrometry (direct ESI-MS), by omitting the chromatographic step, has great potential for application as a high-throughput approach for untargeted urine metabolomics analysis compared to liquid chromatography-mass spectrometry (LC-MS). The rapid development and technical innovations revealed in the field of ambient ionisation MS such as nanoelectrospray ionisation (nanoESI) chip-based infusion and liquid extraction surface analysis mass spectrometry (LESA...

  9. Spotsizer: High-throughput quantitative analysis of microbial growth

    Science.gov (United States)

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582

  10. A colorimetric bioassay for high-throughput and cost-effectively assessing anti-foot-and-mouth disease virus activity.

    Science.gov (United States)

    Ramanathan, Palaniappan; Zhu, James J; Bishop, Elizabeth A; Puckette, Michael C; Hartwig, Ethan; Grubman, Marvin J; Rodriguez, Luis L

    2015-03-15

    Foot-and-mouth disease virus (FMDV) is one of the most contagious animal viruses. This virus is very sensitive to inhibition by type I interferons. Currently, a bioassay based on plaque reduction is used to measure anti-FMDV activity of porcine IFNs. The plaque reduction assay is tedious and difficult to utilize for high-throughput analysis. Using available FMDV susceptible bovine and porcine cells, we developed and tested a colorimetric assay based on cytopathic effect reduction for its ability to quantify FMDV-specific antiviral activity of bovine and porcine type I interferons. Our results show that this new method has significant advantages over other assays in terms of labor intensity, cost, high-throughput capability and/or anti-FMDV specific activity because of simpler procedures and direct measurement of antiviral activity. Several assay conditions were tested to optimize the procedures. The test results show that the assay can be standardized with fixed conditions and a standard or a reference for measuring antiviral activity as units. This is an excellent assay in terms of sensitivity and accuracy based on a statistical evaluation. The results obtained with this assay were highly correlated with a conventional virus titration method.
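
    Standardizing such a cytopathic-effect (CPE) reduction assay into antiviral units typically means finding the interferon dilution that protects 50% of the cell monolayer. A minimal interpolation sketch (invented numbers; real assays commonly interpolate on a log-dilution scale rather than linearly, as done here for brevity):

```python
def fifty_percent_endpoint(dilutions, protection):
    # Linear interpolation of the reciprocal dilution giving 50% CPE
    # reduction; protection[i] = fraction of viability preserved.
    pairs = list(zip(dilutions, protection))
    for (d1, p1), (d2, p2) in zip(pairs, pairs[1:]):
        if p1 >= 0.5 > p2:
            frac = (p1 - 0.5) / (p1 - p2)
            return d1 + frac * (d2 - d1)
    raise ValueError("50% endpoint not bracketed by the dilution series")

dilutions = [10, 100, 1000, 10000]      # reciprocal IFN dilutions
protection = [0.95, 0.80, 0.30, 0.05]   # measured colorimetrically
endpoint = fifty_percent_endpoint(dilutions, protection)
print(endpoint)
```

    The reciprocal dilution at the endpoint is then reported as units per volume against the assay's standard or reference.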

  11. High-throughput synthesis and analysis of acylated cyanohydrins.

    Science.gov (United States)

    Hamberg, Anders; Lundgren, Stina; Wingstrand, Erica; Moberg, Christina; Hult, Karl

    2007-01-01

    The yields and optical purities of products obtained from chiral Lewis acid/Lewis base-catalysed additions of alpha-ketonitriles to prochiral aldehydes could be accurately determined by an enzymatic method. The amount of remaining aldehyde was determined after its reduction to an alcohol, whilst the two product enantiomers were analysed after subsequent hydrolysis first by the (S)-selective Candida antarctica lipase B and then by the unselective pig liver esterase. The method could be used for analysis of products obtained from a number of aromatic aldehydes and aliphatic ketonitriles. Microreactor technology was successfully combined with high-throughput analysis for efficient catalyst optimization.

  12. Evaluation of a high throughput starch analysis optimised for wood.

    Directory of Open Access Journals (Sweden)

    Chandra Bellasio

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  13. Evaluation of a high throughput starch analysis optimised for wood.

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  14. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  15. High-throughput protein analysis integrating bioinformatics and experimental assays.

    Science.gov (United States)

    del Val, Coral; Mehrle, Alexander; Falkenhahn, Mechthild; Seiler, Markus; Glatting, Karl-Heinz; Poustka, Annemarie; Suhai, Sandor; Wiemann, Stefan

    2004-01-01

    The wealth of transcript information that has been made publicly available in recent years requires the development of high-throughput functional genomics and proteomics approaches for its analysis. Such approaches need suitable data integration procedures and a high level of automation in order to gain maximum benefit from the results generated. We have designed an automatic pipeline to analyse annotated open reading frames (ORFs) stemming from full-length cDNAs produced mainly by the German cDNA Consortium. The ORFs are cloned into expression vectors for use in large-scale assays such as the determination of subcellular protein localization or kinase reaction specificity. Additionally, all identified ORFs undergo exhaustive bioinformatic analysis such as similarity searches, protein domain architecture determination and prediction of physicochemical characteristics and secondary structure, using a wide variety of bioinformatic methods in combination with the most up-to-date public databases (e.g. PRINTS, BLOCKS, INTERPRO, PROSITE, SWISSPROT). Data from experimental results and from the bioinformatic analysis are integrated and stored in a relational database (MS SQL-Server), which makes it possible for researchers to find answers to biological questions easily, thereby speeding up the selection of targets for further analysis. The designed pipeline constitutes a new automatic approach to obtaining and administering relevant biological data from high-throughput investigations of cDNAs in order to systematically identify and characterize novel genes, as well as to comprehensively describe the function of the encoded proteins.

  16. Fluorescent foci quantitation for high-throughput analysis

    Science.gov (United States)

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
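
    Foci quantitation of this kind usually reduces to summing pixel intensities inside the focus and subtracting a local-background estimate taken from the surrounding pixels. A toy sketch of that measurement (invented helper and data, not FociQuant's actual algorithm):

```python
import statistics

def focus_intensity(image, cy, cx, r, bg_r):
    # Sum pixels within radius r of the focus centre, minus the median of
    # a surrounding annulus used as the per-pixel background estimate.
    inner, ring = [], []
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            d2 = (y - cy) ** 2 + (x - cx) ** 2
            if d2 <= r * r:
                inner.append(v)
            elif d2 <= bg_r * bg_r:
                ring.append(v)
    background = statistics.median(ring)
    return sum(v - background for v in inner)

# Toy 7x7 image: uniform background 10, one focus of intensity 50 at (3, 3)
img = [[10.0] * 7 for _ in range(7)]
for y, x in [(3, 3), (2, 3), (4, 3), (3, 2), (3, 4)]:
    img[y][x] = 50.0
print(focus_intensity(img, 3, 3, r=1, bg_r=2))  # → 200.0
```

    Using the annulus median rather than a global background makes the measurement robust to uneven illumination across the field, which matters in high-throughput batches.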

  17. Interactive Visual Analysis of High Throughput Text Streams

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Potok, Thomas E [ORNL; Patton, Robert M [ORNL; Goodall, John R [ORNL; Maness, Christopher S [ORNL; Senter, James K [ORNL; Potok, Thomas E [ORNL

    2012-01-01

    The scale, velocity, and dynamic nature of large-scale social media systems like Twitter demand a new set of visual analytics techniques that support near real-time situational awareness. Social media systems are credited with escalating social protest during recent large-scale riots. Virtual communities form rapidly in these online systems, and they occasionally foster violence and unrest, which is conveyed in the users' language. Techniques for analyzing broad trends over these networks or reconstructing conversations within small groups have been demonstrated in recent years, but state-of-the-art tools are inadequate at supporting near real-time analysis of these high-throughput streams of unstructured information. In this paper, we present an adaptive system to discover and interactively explore these virtual networks, as well as detect sentiment, highlight change, and discover spatio-temporal patterns.

  18. A cost-effective high-throughput metabarcoding approach powerful enough to genotype ~44 000 year-old rodent remains from Northern Africa.

    Science.gov (United States)

    Guimaraes, S; Pruvost, M; Daligault, J; Stoetzel, E; Bennett, E A; Côté, N M-L; Nicolas, V; Lalis, A; Denys, C; Geigl, E-M; Grange, T

    2016-07-04

    We present a cost-effective metabarcoding approach, aMPlex Torrent, which relies on an improved multiplex PCR adapted to highly degraded DNA, combining barcoding and next-generation sequencing to simultaneously analyse many heterogeneous samples. We demonstrate the strength of these improvements by generating a phylochronology through the genotyping of ancient rodent remains from a Moroccan cave whose stratigraphy covers the last 120 000 years. Rodents are important for epidemiology, agronomy and ecological investigations and can act as bioindicators for human- and/or climate-induced environmental changes. Efficient and reliable genotyping of ancient rodent remains has the potential to deliver valuable phylogenetic and paleoecological information. The analysis of multiple ancient skeletal remains of very small size with poor DNA preservation, however, requires a sensitive high-throughput method to generate sufficient data. We show this approach to be particularly well suited to accessing this otherwise difficult taxonomic and genetic resource. As a highly scalable, lower cost and less labour-intensive alternative to targeted sequence capture approaches, we propose the aMPlex Torrent strategy to be a useful tool for the genetic analysis of multiple degraded samples in studies involving ecology, archaeology, conservation and evolutionary biology.
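
    The barcoding described here implies a demultiplexing step downstream of sequencing that routes each read to its sample by its index. A minimal exact-match sketch (invented barcodes; real pipelines also tolerate sequencing errors in the index):

```python
def demultiplex(reads, barcodes):
    # Assign each read to a sample by its leading barcode, trimming it off;
    # reads matching no barcode are set aside rather than guessed.
    by_sample = {sample: [] for sample in barcodes.values()}
    unassigned = []
    for read in reads:
        for bc, sample in barcodes.items():
            if read.startswith(bc):
                by_sample[sample].append(read[len(bc):])
                break
        else:
            unassigned.append(read)
    return by_sample, unassigned

barcodes = {"ACGT": "sample_1", "TGCA": "sample_2"}
reads = ["ACGTGGGG", "TGCAAAAA", "CCCCCCCC"]
assigned, leftover = demultiplex(reads, barcodes)
print(assigned["sample_1"], leftover)  # → ['GGGG'] ['CCCCCCCC']
```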

  19. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  20. Comprehensive analysis of high-throughput screening data

    Science.gov (United States)

    Heyse, Stephan

    2002-06-01

    High-Throughput Screening (HTS) data in its entirety is a valuable raw material for the drug-discovery process. It provides the most complete information about the biological activity of a company's compounds. However, its quantity, complexity and heterogeneity require novel, sophisticated approaches in data analysis. At GeneData, we are developing methods for large-scale, synoptical mining of screening data in a five-step analysis: (1) Quality Assurance: Checking data for experimental artifacts and eliminating low-quality data. (2) Biological Profiling: Clustering and ranking of compounds based on their biological activity, taking into account specific characteristics of HTS data. (3) Rule-based Classification: Applying user-defined rules to biological and chemical properties, and providing hypotheses on the biological mode-of-action of compounds. (4) Joint Biological-Chemical Analysis: Associating chemical compound data to HTS data, providing hypotheses for structure-activity relationships. (5) Integration with Genomic and Gene Expression Data: Linking into other components of GeneData's bioinformatics platform, and assessing the compounds' modes-of-action, toxicity, and metabolic properties. These analyses address issues that are crucial for a correct interpretation and full exploitation of screening data. They lead to a sound rating of assays and compounds at an early stage of the lead-finding process.
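
    Quality-assurance steps like (1) are commonly implemented with the Z'-factor, a standard HTS plate-quality metric that compares the separation of positive and negative controls to their noise; plates scoring below about 0.5 are typically rejected. A minimal sketch with illustrative numbers (not GeneData's implementation):

```python
import statistics

def z_prime(pos_controls, neg_controls):
    # Z' = 1 - 3 * (sd_pos + sd_neg) / |mean_pos - mean_neg|
    sd_p = statistics.stdev(pos_controls)
    sd_n = statistics.stdev(neg_controls)
    sep = abs(statistics.mean(pos_controls) - statistics.mean(neg_controls))
    return 1 - 3 * (sd_p + sd_n) / sep

pos = [100.0, 102.0, 98.0, 100.0]  # e.g. full-signal control wells
neg = [10.0, 9.0, 11.0, 10.0]      # e.g. background control wells
score = z_prime(pos, neg)
print(score > 0.5)  # → True
```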

  1. Computational Proteomics: High-throughput Analysis for Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Cannon, William R.; Webb-Robertson, Bobbie-Jo M.

    2007-01-03

    High-throughput (HTP) proteomics is a rapidly developing field that offers the global profiling of proteins from a biological system. The HTP technological advances are fueling a revolution in biology, enabling analyses at the scales of entire systems (e.g., whole cells, tumors, or environmental communities). However, simply identifying the proteins in a cell is insufficient for understanding the underlying complexity and operating mechanisms of the overall system. Systems-level investigations are relying more and more on computational analyses, especially in fields such as proteomics that generate large-scale global data.

  2. Computational analysis of high-throughput flow cytometry data

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2015-01-01

    Introduction Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology focused on its unique ability to study single cells, and appropriate analytical tools are readily available to handle this traditional role. Areas covered Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires advances in both hardware and analytical tools. The historical perspective of flow cytometry operation is discussed, as well as how the field has changed and what the key changes have been. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. Expert opinion There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible. PMID:22708834

  3. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Directory of Open Access Journals (Sweden)

    Soichi Inagaki

    Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.
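
    The bioinformatic tool mentioned here looks for read pairs that span the genome/T-DNA junction. The underlying predicate can be sketched as follows, with naive substring search standing in for real read alignment and all sequences invented:

```python
def spans_junction(read1, read2, genome, tdna):
    # A pair supports a junction when one mate aligns to the genome and
    # the other to the T-DNA; substring search stands in for alignment.
    def maps(read, ref):
        return read in ref
    return (maps(read1, genome) and maps(read2, tdna)) or \
           (maps(read1, tdna) and maps(read2, genome))

genome = "ACGTACGTTTGACCAGT"  # invented host sequence
tdna = "GGCCTTAAGGCCTTAA"     # invented T-DNA border sequence

print(spans_junction("ACGTTTGA", "GGCCTTAA", genome, tdna))  # → True
print(spans_junction("ACGTACGT", "TTGACCAG", genome, tdna))  # → False
```

    In practice each mate would be aligned with a real aligner and the pair's mapping coordinates would also localize the insertion site.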

  4. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  5. High-throughput genetic analysis using time-resolved fluorometry and closed-tube detection.

    Science.gov (United States)

    Nurmi, J; Kiviniemi, M; Kujanpää, M; Sjöroos, M; Ilonen, J; Lövgren, T

    2001-12-15

    Robust methods for genetic analysis are required for efficient exploitation of the constantly accumulating genetic information. We describe a closed-tube genotyping method suitable for high-throughput screening of genetic markers. The method is based on allele-specific probes labeled with an environment-sensitive lanthanide chelate, the fluorescence intensity of which is significantly increased upon PCR amplification of a complementary target. Genomic DNA samples were analyzed in an insulin gene single nucleotide polymorphism (SNP) assay using universal amplification primers and probes that recognized the two different alleles. The feasibility of dry reagent based all-in-one PCR assays was tested using another diabetes-related genetic marker, human leukocyte antigen DQB1 allele *0302, as a model analyte in a dual-color, closed-tube end-point assay. There was a 100% correlation between the novel SNP assay and a conventional PCR restriction fragment length polymorphism assay. It was also demonstrated that using real-time monitoring, accurate genotyping results can be obtained despite strongly cross-reacting probes, minimizing the time and effort needed for optimization of probe sequence. Throughput can be maximized by using predried PCR mixtures that are stable for at least 6 months. This homogeneous, all-in-one dry reagent assay chemistry permits cost-effective genetic screening on a large scale.

  6. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high-content analysis of high-throughput screens in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis methods.

  8. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on transcriptional start sites: a) tag-based technologies allow higher throughput; b) CAGE technology to define promoters; c) CAGE data analysis to understand transcription

  9. High-throughput behavioral analysis in C. elegans.

    Science.gov (United States)

    Swierczek, Nicholas A; Giles, Andrew C; Rankin, Catharine H; Kerr, Rex A

    2011-06-05

    We designed a real-time computer vision system, the Multi-Worm Tracker (MWT), which can simultaneously quantify the behavior of dozens of Caenorhabditis elegans on a Petri plate at video rates. We examined three traditional behavioral paradigms using this system: spontaneous movement on food, where the behavior changes over tens of minutes; chemotaxis, where turning events must be detected accurately to determine strategy; and habituation of response to tap, where the response is stochastic and changes over time. In each case, manual analysis or automated single-worm tracking would be tedious and time-consuming, but the MWT system allowed rapid quantification of behavior with minimal human effort. Thus, this system will enable large-scale forward and reverse genetic screens for complex behaviors.

  10. Regulatory pathway analysis by high-throughput in situ hybridization.

    Directory of Open Access Journals (Sweden)

    Axel Visel

    2007-10-01

    Full Text Available Automated in situ hybridization enables the construction of comprehensive atlases of gene expression patterns in mammals. Such atlases can become Web-searchable digital expression maps of individual genes and thus offer an entryway to elucidate genetic interactions and signaling pathways. Towards this end, an atlas housing approximately 1,000 spatial gene expression patterns of the midgestation mouse embryo was generated. Patterns were textually annotated using a controlled vocabulary comprising >90 anatomical features. Hierarchical clustering of annotations was carried out using distance scores calculated from the similarity between pairs of patterns across all anatomical structures. This process ordered hundreds of complex expression patterns into a matrix that reflects the embryonic architecture and the relatedness of patterns of expression. Clustering yielded 12 distinct groups of expression patterns. Because of the similarity of expression patterns within a group, members of each group may be components of regulatory cascades. We focused on the group containing Pax6, an evolutionarily conserved transcriptional master mediator of development. Seventeen of the 82 genes in this group showed a change of expression in the developing neocortex of Pax6-deficient embryos. Electrophoretic mobility shift assays were used to test for the presence of Pax6-paired domain binding sites. This led to the identification of 12 genes not previously known as potential targets of Pax6 regulation. These findings suggest that cluster analysis of annotated gene expression patterns obtained by automated in situ hybridization is a novel approach for identifying components of signaling cascades.
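    The clustering step above can be illustrated with a toy sketch. The annotation vectors and gene names here are invented; the study used distance scores over >90 anatomical terms and hierarchical clustering, whereas this sketch uses a simple Jaccard distance with greedy single-linkage grouping:

    ```python
    # Toy sketch: group expression patterns annotated against a controlled
    # anatomical vocabulary. Each gene is a binary vector over anatomical
    # terms (values invented for illustration).
    def jaccard_distance(a, b):
        inter = sum(1 for x, y in zip(a, b) if x and y)
        union = sum(1 for x, y in zip(a, b) if x or y)
        return 1.0 - inter / union

    def cluster(patterns, threshold=0.5):
        """Greedy single-linkage grouping at a fixed distance threshold."""
        clusters = []
        for name, vec in patterns.items():
            for members in clusters:
                if any(jaccard_distance(vec, patterns[m]) <= threshold
                       for m in members):
                    members.append(name)
                    break
            else:
                clusters.append([name])
        return clusters

    patterns = {
        "Pax6":  [1, 1, 0, 0],   # e.g. annotated in neocortex, eye
        "geneX": [1, 1, 0, 0],   # similar pattern -> candidate co-regulation
        "geneY": [0, 0, 1, 1],   # e.g. annotated in limb, heart
    }
    groups = cluster(patterns)
    ```

    Genes sharing an annotation profile land in the same group, mirroring how pattern similarity nominated candidate Pax6 targets.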

  11. High-throughput analysis of the impact of antibiotics on the human intestinal microbiota composition

    NARCIS (Netherlands)

    Ladirat, S.E.; Schols, H.A.; Nauta, A.; Schoterman, M.H.C.; Keijser, B.J.F.; Montijn, R.C.; Gruppen, H.; Schuren, F.H.J.

    2013-01-01

    Antibiotic treatments can lead to a disruption of the human microbiota. In this in-vitro study, the impact of antibiotics on adult intestinal microbiota was monitored in a new high-throughput approach: a fermentation screening-platform was coupled with a phylogenetic microarray analysis.

  13. High-throughput data pipelines for metabolic flux analysis in plants.

    Science.gov (United States)

    Poskar, C Hart; Huege, Jan; Krach, Christian; Shachar-Hill, Yair; Junker, Björn H

    2014-01-01

    In this chapter we illustrate the methodology for high-throughput metabolic flux analysis. Central to this is developing an end-to-end data pipeline, crucial for integrating wet-lab experiments and analytics, combining hardware and software automation, and standardizing data representation with importers and exporters to support third-party tools. Using existing software at the start (data extraction from the chromatogram) and at the end (MFA analysis) allows for the most flexibility in this workflow. Developing iMS2Flux provided a standard, extensible, platform-independent tool to act as the "glue" between these end points. Most importantly, this tool can be easily adapted to support different data formats and data verification and correction steps, allowing it to be central to managing the data necessary for high-throughput MFA. An additional tool was needed to automate the MFA software and, in particular, to take advantage of the coarse-grained parallel nature of high-throughput analysis and available high-performance computing facilities. In combination, these methods show the development of high-throughput pipelines that allow metabolic flux analysis to join as a full member of the omics family.
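    The coarse-grained parallelism mentioned above, where each sample's flux fit is independent of the others, can be sketched like this. `analyze_sample` is a stand-in for the real per-sample MFA step, not part of the described pipeline:

    ```python
    # Sketch of coarse-grained parallelism over samples: each sample's flux
    # analysis is independent, so samples can be farmed out to a worker pool.
    from concurrent.futures import ThreadPoolExecutor

    def analyze_sample(sample_id):
        # placeholder for the per-sample flux-fitting step
        return sample_id, f"flux-result-{sample_id}"

    with ThreadPoolExecutor(max_workers=4) as pool:
        results = dict(pool.map(analyze_sample, range(8)))
    ```

    On a cluster the same pattern scales to one job per sample; no coordination between workers is needed because the samples never share state.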

  14. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    Methods for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis, multivariate analysis and data interpretation. We furthermore discuss the potential of future developments that will help to gain deep insight into the PTM-ome and its biological role in cells.

  15. High-throughput STR analysis for DNA database using direct PCR.

    Science.gov (United States)

    Sim, Jeong Eun; Park, Su Jeong; Lee, Han Chul; Kim, Se-Yong; Kim, Jong Yeol; Lee, Seung Hwan

    2013-07-01

    Since the Korean criminal DNA database was launched in 2010, we have focused on establishing an automated DNA database profiling system that analyzes short tandem repeat loci in a high-throughput and cost-effective manner. We established a DNA database profiling system without DNA purification using a direct PCR buffer system. The quality of direct PCR procedures was compared with that of a conventional PCR system under their respective optimized conditions. The results revealed not only perfect concordance but also an excellent PCR success rate, good electropherogram quality, and an optimal intra/inter-loci peak height ratio. In particular, the proportion of samples requiring DNA extraction due to direct PCR failure could be minimized. The direct PCR system can be adopted for automated DNA database profiling to replace or supplement the conventional PCR system in a time- and cost-saving manner.

  16. Optically encoded microspheres for high-throughput analysis of genes and proteins

    Science.gov (United States)

    Gao, Xiaohu; Han, Mingyong; Nie, Shuming

    2002-06-01

    We have developed a novel optical coding technology for massively parallel and high-throughput analysis of biological molecules. Its unprecedented multiplexing capability is based on the unique optical properties of semiconductor quantum dots (QDs) and the ability to incorporate multicolor QDs into small polymer beads at precisely controlled ratios. The use of 10 intensity levels and 6 colors could theoretically code one million nucleic acid or protein sequences. Imaging and spectroscopic studies indicate that the QD-tagged beads are highly uniform and reproducible, yielding bead identification accuracies as high as 99.99 percent under favorable conditions. DNA hybridization results demonstrate that the coding and target signals can be simultaneously read at the single-bead level. This spectral coding technology is expected to open new opportunities in gene expression studies, high-throughput screening, and medical diagnosis.
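    The quoted multiplexing capacity follows directly from the combinatorics of intensity levels and colors:

    ```python
    # Coding capacity: with m distinguishable intensity levels for each of
    # n QD colors, a bead can carry one of m**n distinct codes (in practice
    # the all-zero code is unusable, leaving m**n - 1).
    def coding_capacity(levels, colors):
        return levels ** colors

    codes = coding_capacity(10, 6)   # the "one million" figure in the abstract
    ```

    Adding a color multiplies the capacity by the number of intensity levels, which is why capacity grows so quickly with only modest hardware changes.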

  17. High-throughput metagenomic technologies for complex microbial community analysis: open and closed formats.

    Science.gov (United States)

    Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa

    2015-01-27

    Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions.

  18. High-throughput engineering and analysis of peptide binding to class II MHC.

    Science.gov (United States)

    Jiang, Wei; Boder, Eric T

    2010-07-27

    Class II major histocompatibility complex (MHC-II) proteins govern stimulation of adaptive immunity by presenting antigenic peptides to CD4+ T lymphocytes. Many allelic variants of MHC-II exist with implications in peptide presentation and immunity; thus, high-throughput experimental tools for rapid and quantitative analysis of peptide binding to MHC-II are needed. Here, we present an expression system wherein peptide and MHC-II are codisplayed on the surface of yeast in an intracellular association-dependent manner and assayed by flow cytometry. Accordingly, the relative binding of different peptides and/or MHC-II variants can be assayed by genetically manipulating either partner, enabling the application of directed evolution approaches for high-throughput characterization or engineering. We demonstrate the application of this tool to map the side-chain preference for peptides binding to HLA-DR1 and to evolve novel HLA-DR1 mutants with altered peptide-binding specificity.

  19. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
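    The cross-referencing idea at the heart of such a tool can be sketched minimally. This is not CrossCheck's actual implementation; the dataset names and gene symbols are invented:

    ```python
    # Minimal sketch of cross-referencing: intersect a user gene list with
    # published hit lists and keep only datasets with a nonempty overlap.
    published = {
        "RNAi_screen_A":   {"TP53", "ATM", "CHEK2"},
        "CRISPR_screen_B": {"MTOR", "RPTOR", "ATM"},
    }

    def cross_check(user_genes, datasets):
        user = set(user_genes)
        overlaps = {name: sorted(user & genes) for name, genes in datasets.items()}
        return {name: hits for name, hits in overlaps.items() if hits}

    overlap = cross_check(["ATM", "MTOR", "BRCA1"], published)
    ```

    Ranking datasets by overlap size (or an enrichment statistic) would then surface the published screens most relevant to the user's hits.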

  20. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest was developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets.

  1. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    Science.gov (United States)

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenization. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  2. High-throughput SHAPE and hydroxyl radical analysis of RNA structure and ribonucleoprotein assembly.

    Science.gov (United States)

    McGinnis, Jennifer L; Duncan, Caia D S; Weeks, Kevin M

    2009-01-01

    RNA folds to form complex structures vital to many cellular functions. Proteins facilitate RNA folding at both the secondary and tertiary structure levels. An absolute prerequisite for understanding RNA folding and ribonucleoprotein (RNP) assembly reactions is a complete understanding of the RNA structure at each stage of the folding or assembly process. Here we provide a guide for comprehensive and high-throughput analysis of RNA secondary and tertiary structure using SHAPE and hydroxyl radical footprinting. As an example of the strong and sometimes surprising conclusions that can emerge from high-throughput analysis of RNA folding and RNP assembly, we summarize the structure of the bI3 group I intron RNA in four distinct states. Dramatic structural rearrangements occur in both secondary and tertiary structure as the RNA folds from the free state to the active, six-component, RNP complex. As high-throughput and high-resolution approaches are applied broadly to large protein-RNA complexes, other proteins previously viewed as making simple contributions to RNA folding are also likely to be found to exert multifaceted, long-range, cooperative, and nonadditive effects on RNA folding. These protein-induced contributions add another level of control, and potential regulatory function, in RNP complexes.

  3. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high-throughput instrumentation to analyze huge sets of clinical samples quantitatively and reproducibly, in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be more accurate with rapid processing time. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples in a quantitative, qualitative, and reproducible manner. The present study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and was validated on a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The present study shows that the online system, capable of handling high-throughput samples in 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase of peptide quantities with increasing concentration, with much better linearity than the off-line method. Hence, we suggest that inclusion of this online system in the proteomic pipeline will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  4. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequence analysis.

  5. Green Infrastructure Siting and Cost Effectiveness Analysis

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Parcel scale green infrastructure siting and cost effectiveness analysis. You can find more details at the project's website.

  6. Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis

    Directory of Open Access Journals (Sweden)

    Na Wen

    2016-07-01

    Full Text Available This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on the key performance measures of throughput, multifunctionality, and absolute quantification.

  7. Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping.

    Science.gov (United States)

    Klukas, Christian; Chen, Dijun; Pape, Jean-Michel

    2014-06-01

    High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input data and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment containing 33 maize (Zea mays 'Fernandez') plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy, up to 0.98 and 0.95, respectively. In summary, IAP provides multiple functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable.
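    The validation step above, correlating an image-derived trait with manual measurements, reduces to a Pearson correlation; a self-contained sketch with invented values follows (the data are illustrative, not the study's measurements):

    ```python
    # Pearson correlation between an image-derived trait (e.g. digital
    # volume) and manual measurements; values invented for illustration.
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    digital_volume = [1.0, 2.1, 2.9, 4.2]   # computed from images
    manual_biomass = [1.1, 2.0, 3.1, 4.0]   # measured by hand
    r = pearson(digital_volume, manual_biomass)
    ```

    A correlation near 1 indicates the digital trait is a faithful proxy for the manual measurement, as reported for digital volume and leaf count.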

  8. Automated analysis of NF-κB nuclear translocation kinetics in high-throughput screening.

    Directory of Open Access Journals (Sweden)

    Zi Di

    Full Text Available Nuclear entry and exit of the NF-κB family of dimeric transcription factors plays an essential role in regulating cellular responses to inflammatory stress. The dynamics of this nuclear translocation can vary significantly within a cell population and may change dramatically, e.g., upon drug exposure. Furthermore, there is significant heterogeneity in individual cell responses upon stress signaling. In order to systematically determine factors that define NF-κB translocation dynamics, high-throughput screens that enable the analysis of dynamic NF-κB responses in individual cells in real time are essential. Thus far, only NF-κB downstream signaling responses of whole cell populations at the transcriptional level have been measured in high-throughput mode. In this study, we developed a fully automated image analysis method to determine the time course of NF-κB translocation in individual cells, suitable for high-throughput screening in the context of compound screening and functional genomics. Two novel segmentation methods were used for defining the individual nuclear and cytoplasmic regions: watershed masked clustering (WMC) and best-fit ellipse of Voronoi cell (BEVC). The dynamic NF-κB oscillatory response at the single-cell and population level was coupled to automated extraction of 26 analogue translocation parameters, including number of peaks, time to reach each peak, and amplitude of each peak. Our automated image analysis method was validated through a series of statistical tests demonstrating the computational efficiency and accuracy of our algorithm in quantifying NF-κB translocation dynamics. Both pharmacological inhibition of NF-κB and short interfering RNAs targeting the inhibitor of NF-κB, IκBα, demonstrated the ability of our method to identify compounds and genetic players that interfere with the nuclear transition of NF-κB.
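    A few of the translocation parameters described above (peak count, time to each peak, amplitude) can be illustrated on a single-cell trace. The nuclear/cytoplasmic ratio values here are invented for the example; the published method extracts 26 parameters from segmented image time series:

    ```python
    # Extract simple peak parameters from an oscillatory nuclear/cytoplasmic
    # NF-kB intensity ratio trace (invented values).
    def translocation_peaks(times, ratio):
        """Return local maxima as {time, amplitude} records."""
        peaks = []
        for i in range(1, len(ratio) - 1):
            if ratio[i] > ratio[i - 1] and ratio[i] > ratio[i + 1]:
                peaks.append({"time": times[i], "amplitude": ratio[i]})
        return peaks

    times = [0, 10, 20, 30, 40, 50, 60]          # minutes after stimulation
    ratio = [1.0, 2.5, 1.2, 1.8, 1.1, 1.4, 1.0]  # oscillatory N/C ratio
    peaks = translocation_peaks(times, ratio)
    ```

    Per-cell peak records like these, aggregated over a well, are what let a screen distinguish compounds that damp, delay, or abolish the oscillation.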

  9. Quantitative dot blot analysis (QDB), a versatile high throughput immunoblot method.

    Science.gov (United States)

    Tian, Geng; Tang, Fangrong; Yang, Chunhua; Zhang, Wenfeng; Bergquist, Jonas; Wang, Bin; Mi, Jia; Zhang, Jiandi

    2017-08-29

    Lacking access to an affordable method of high-throughput immunoblot analysis for daily use remains a big challenge for scientists worldwide. We propose here Quantitative Dot Blot analysis (QDB) to meet this demand. With a defined linear range, QDB analysis fundamentally transforms the traditional immunoblot method into a true quantitative assay. Its convenience in analyzing large numbers of samples also enables bench scientists to examine protein expression levels across multiple parameters. In addition, the small amount of sample lysate needed for analysis means significant savings in resources and effort. This method was evaluated at both the cellular and tissue levels, with unexpected observations that would otherwise be hard to achieve using conventional immunoblot methods such as Western blot analysis. Using the QDB technique, we were able to observe an age-dependent, significant alteration of CAPG protein expression levels in TRAMP mice. We believe that the adoption of QDB analysis will have an immediate impact on biological and biomedical research by providing much-needed high-throughput information at the protein level in this "Big Data" era.

  10. High Throughput Plasmid Sequencing with Illumina and CLC Bio (Seventh Annual Sequencing, Finishing, Analysis in the Future (SFAF) Meeting 2012)

    Energy Technology Data Exchange (ETDEWEB)

    Athavale, Ajay [Monsanto

    2012-06-01

    Ajay Athavale (Monsanto) presents "High Throughput Plasmid Sequencing with Illumina and CLC Bio" at the 7th Annual Sequencing, Finishing, Analysis in the Future (SFAF) Meeting held in June, 2012 in Santa Fe, NM.

  11. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry applications. Screening multiple genomic reagents targeting any given gene creates additional challenges, and so methods that prioritize individual gene targets have been developed. The article reviews many of the open-source data analysis methods that are now available and which are helping to define a consensus on best practices for analyzing screening data. As data sets become larger and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality-control monitoring and interpretation of the results, will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations, and the best features in massive data sets will be required. Ease of use will be important, as these tools will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
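
    One primitive that recurs in arrayed-screen analyses of the kind reviewed here is the robust z-score, which uses the median and MAD instead of the mean and standard deviation so that strong hits do not distort their own detection threshold. A minimal sketch with invented plate values:

```python
# Robust z-score hit calling for one (tiny, invented) plate of readouts.
import numpy as np

plate = np.array([0.9, 1.1, 1.0, 0.95, 1.05, 5.0])   # one strong hit at index 5
med = np.median(plate)
mad = np.median(np.abs(plate - med)) * 1.4826        # consistent with sd for normal data
robust_z = (plate - med) / mad
hits = np.where(robust_z > 3)[0]                     # 3-MAD hit threshold
```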

  12. Rapid screening of classic galactosemia patients: a proof-of-concept study using high-throughput FTIR analysis of plasma.

    Science.gov (United States)

    Lacombe, Caroline; Untereiner, Valérie; Gobinet, Cyril; Zater, Mokhtar; Sockalingum, Ganesh D; Garnotel, Roselyne

    2015-04-07

    Classic galactosemia is an autosomal recessive metabolic disease involving the galactose pathway, caused by deficiency of galactose-1-phosphate uridyltransferase. Galactose accumulation induces many symptoms in newborns, such as liver disease, cataracts, and sepsis, leading to death if untreated. Neonatal screening has been developed and applied in many countries, using several methods to detect the accumulation of galactose or its derived products in blood or urine. High-throughput FTIR spectroscopy was investigated as a potential addition to these screening methods. IR spectra were obtained from the blood plasma of healthy, diabetic, and galactosemic patients. The major spectral differences lay in the carbohydrate region, which was first analysed in an exploratory manner using principal component analysis (PCA). PCA score plots showed a clear discrimination between diabetic and galactosemic patients, which became more marked with increasing glucose and galactose concentrations in these patients' plasma, respectively. A support vector machine leave-one-out cross-validation (SVM-LOOCV) classifier was then built with the PCA scores as input, and the model was tested on median, mean, and all spectra from the three population groups. This classifier was able to discriminate healthy/diabetic, healthy/galactosemic, and diabetic/galactosemic patients with sensitivity and specificity rates ranging from 80% to 94%. The total accuracy rate ranged from 87% to 96%. High-throughput FTIR spectroscopy combined with the SVM-LOOCV classification procedure appears to be a promising tool in the screening of galactosemia patients, with good sensitivity and specificity. Furthermore, the approach has the advantages of being cost-effective, fast, and straightforward.
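
    The PCA-then-SVM-with-leave-one-out pipeline described above can be sketched with scikit-learn; the toy "spectra" below are synthetic (random features with one shifted band), not FTIR data, and class sizes are invented.

```python
# Sketch of a PCA -> linear SVM pipeline scored by leave-one-out
# cross-validation, on synthetic two-class "spectra".
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(15, 50))
patient = rng.normal(0.0, 1.0, size=(15, 50))
patient[:, 10:20] += 3.0          # a "carbohydrate band" shifted in patients

X = np.vstack([healthy, patient])
y = np.array([0] * 15 + [1] * 15)

model = make_pipeline(PCA(n_components=5), SVC(kernel="linear"))
acc = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()  # LOOCV accuracy
```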

  13. 'PACLIMS': a component LIM system for high-throughput functional genomic analysis.

    Science.gov (United States)

    Donofrio, Nicole; Rajagopalon, Ravi; Brown, Douglas; Diener, Stephen; Windham, Donald; Nolin, Shelly; Floyd, Anna; Mitchell, Thomas; Galadima, Natalia; Tucker, Sara; Orbach, Marc J; Patel, Gayatri; Farman, Mark; Pampanwar, Vishal; Soderlund, Cari; Lee, Yong-Hwan; Dean, Ralph A

    2005-04-12

    Recent advances in sequencing techniques leading to cost reduction have resulted in the generation of a growing number of sequenced eukaryotic genomes. Computational tools greatly assist in defining open reading frames and assigning tentative annotations. However, gene functions cannot be asserted without biological support through, among other things, mutational analysis. In taking a genome-wide approach to functionally annotate an entire organism, in this case the approximately 11,000 predicted genes in the rice blast fungus (Magnaporthe grisea), an effective platform was required for tracking and storing both the biological materials created and the data produced across several participating institutions. The platform designed, named PACLIMS, was built to support our high throughput pipeline for generating 50,000 random insertion mutants of Magnaporthe grisea. To be a useful tool for materials and data tracking and storage, PACLIMS was designed to be simple to use, modifiable to accommodate refinement of research protocols, and cost-efficient. Data entry into PACLIMS was simplified through the use of barcodes and scanners, reducing potential human error, time, and labor. The platform was designed in concert with our experimental protocol so that it leads researchers through each step of the process, from mutant generation through phenotypic assays, ensuring that every mutant produced is handled in an identical manner and all necessary data are captured. Many sequenced eukaryotes have reached the point where computational analyses are no longer sufficient and biological support for their predicted genes is required. Consequently, there is an increasing need for platforms that support high throughput genome-wide mutational analyses. While PACLIMS was designed specifically for this project, the source and ideas present in its implementation can be used as a model for other high throughput mutational endeavors.

  14. 'PACLIMS': A component LIM system for high-throughput functional genomic analysis

    Directory of Open Access Journals (Sweden)

    Farman Mark

    2005-04-01

    Full Text Available Abstract Background Recent advances in sequencing techniques leading to cost reduction have resulted in the generation of a growing number of sequenced eukaryotic genomes. Computational tools greatly assist in defining open reading frames and assigning tentative annotations. However, gene functions cannot be asserted without biological support through, among other things, mutational analysis. In taking a genome-wide approach to functionally annotate an entire organism, in this case the ~11,000 predicted genes in the rice blast fungus (Magnaporthe grisea), an effective platform was required for tracking and storing both the biological materials created and the data produced across several participating institutions. Results The platform designed, named PACLIMS, was built to support our high throughput pipeline for generating 50,000 random insertion mutants of Magnaporthe grisea. To be a useful tool for materials and data tracking and storage, PACLIMS was designed to be simple to use, modifiable to accommodate refinement of research protocols, and cost-efficient. Data entry into PACLIMS was simplified through the use of barcodes and scanners, reducing potential human error, time, and labor. The platform was designed in concert with our experimental protocol so that it leads researchers through each step of the process, from mutant generation through phenotypic assays, ensuring that every mutant produced is handled in an identical manner and all necessary data are captured. Conclusion Many sequenced eukaryotes have reached the point where computational analyses are no longer sufficient and biological support for their predicted genes is required. Consequently, there is an increasing need for platforms that support high throughput genome-wide mutational analyses. 
While PACLIMS was designed specifically for this project, the source and ideas present in its implementation can be used as a model for other high throughput mutational endeavors.

  15. Towards high-throughput single cell/clone cultivation and analysis.

    Science.gov (United States)

    Lindström, Sara; Larsson, Rolf; Svahn, Helene Andersson

    2008-03-01

    In order to better understand cellular processes and behavior, a controlled way of studying large numbers of single cells and their clone formation is greatly needed. Numerous ways of ordering single cells into arrays have previously been described, but platforms in which each cell/clone can be addressed at an exact position in the microplate, cultivated for weeks, and treated separately in a high-throughput manner have until now been missing. Here, a novel microplate developed for high-throughput single cell/clone cultivation and analysis is presented. Rapid single-cell seeding into microwells, using conventional flow cytometry, allows several thousand single cells to be cultivated, short-term (72 h) or long-term (10-14 days), and analyzed individually. By controlled sorting of individual cells to predefined locations in the microplate, analysis of single-cell heterogeneity and of clonogenic properties related to drug sensitivity can be accomplished. Additionally, the platform requires a remarkably low number of cells, a major advantage when screening limited amounts of patient cell samples. By seeding single cells into the microplate it is possible to analyze the cells for over 14 generations, ending up with more than 10 000 cells in each well. Described here is a proof-of-concept of the compartmentalization and cultivation of thousands of individual cells, enabling heterogeneity analysis of various cells/clones and their response to different drugs.
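
    The arithmetic behind the closing figure (one founding cell followed for over 14 generations yields more than 10 000 cells per well) is simply repeated doubling:

```python
# One founding cell doubling once per generation for 14 generations.
generations = 14
cells_per_well = 2 ** generations
```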

  16. High-throughput alternative splicing detection using dually constrained correspondence analysis (DCCA).

    Science.gov (United States)

    Baty, Florent; Klingbiel, Dirk; Zappa, Francesco; Brutsche, Martin

    2015-12-01

    Alternative splicing is an important component of tumorigenesis. The recent advent of exon array technology enables the detection of alternative splicing at a genome-wide scale. The analysis of high-throughput alternative splicing data is not yet standard, and methodological developments are still needed. We propose a novel statistical approach, Dually Constrained Correspondence Analysis (DCCA), for the detection of splicing changes in exon array data. Using this methodology, we investigated the genome-wide alteration of alternative splicing in patients with non-small cell lung cancer treated with bevacizumab/erlotinib. Splicing candidates reveal a series of genes related to carcinogenesis (SFTPB), cell adhesion (STAB2, PCDH15, HABP2), tumor aggressiveness (ARNTL2), apoptosis, proliferation and differentiation (PDE4D, FLT3, IL1R2), cell invasion (ETV1), as well as tumor growth (OLFM4, FGF14), tumor necrosis (AFF3) or tumor suppression (TUSC3, CSMD1, RHOBTB2, SERPINB5), with indications of known alternative splicing in a majority of genes. DCCA facilitates the identification of putative biologically relevant alternative splicing events in high-throughput exon array data.

  17. Compositional analysis: a valid approach to analyze microbiome high-throughput sequencing data.

    Science.gov (United States)

    Gloor, Gregory B; Reid, Gregor

    2016-08-01

    A workshop held at the 2015 annual meeting of the Canadian Society of Microbiologists highlighted compositional data analysis methods and the importance of exploratory data analysis for the analysis of microbiome data sets generated by high-throughput DNA sequencing. A summary of the content of that workshop, a review of new methods of analysis, and information on the importance of careful analyses are presented herein. The workshop focussed on explaining the rationale behind the use of compositional data analysis, and a demonstration of these methods for the examination of 2 microbiome data sets. A clear understanding of bioinformatics methodologies and the type of data being analyzed is essential, given the growing number of studies uncovering the critical role of the microbiome in health and disease and the need to understand alterations to its composition and function following intervention with fecal transplant, probiotics, diet, and pharmaceutical agents.
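
    A core operation in the compositional approach advocated here is the centred log-ratio (CLR) transform, which replaces raw sequencing counts with log-ratios so that analyses do not depend on the arbitrary total read count. A minimal numpy sketch, assuming a small pseudocount to handle zero counts:

```python
# Centred log-ratio (CLR) transform of a tiny (invented) count table:
# rows are samples, columns are taxa.
import numpy as np

counts = np.array([[120, 30, 50, 0],
                   [ 60, 15, 25, 0]], dtype=float)

pseudo = counts + 0.5                                 # zero replacement
props = pseudo / pseudo.sum(axis=1, keepdims=True)    # close each sample to proportions
clr = np.log(props) - np.log(props).mean(axis=1, keepdims=True)
```

Each CLR row sums to zero by construction, so downstream statistics operate on relative (ratio) information only.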

  18. Hydrogel Droplet Microfluidics for High-Throughput Single Molecule/Cell Analysis.

    Science.gov (United States)

    Zhu, Zhi; Yang, Chaoyong James

    2017-01-17

    Heterogeneity among individual molecules and cells poses significant challenges to traditional bulk assays, whose assumption of average behavior loses important biological information about heterogeneity and can result in misleading interpretations. Single molecule/cell analysis has therefore become an important and emerging field in biological and biomedical research, offering insights into the heterogeneity of large populations at high resolution. Compared with ensemble bulk methods, single molecule/cell analysis explores information on the time trajectories, conformational states, and interactions of individual molecules/cells, all key factors in the study of chemical and biological reaction pathways. Various powerful techniques have been developed for single molecule/cell analysis, including flow cytometry, atomic force microscopy, optical and magnetic tweezers, and single-molecule fluorescence spectroscopy. However, some of these suffer from low throughput, having to analyze single molecules/cells one at a time. Flow cytometry is a widely used high-throughput technique for single-cell analysis but lacks the ability to study intercellular interactions or to control the local environment. Droplet microfluidics has become attractive for single molecule/cell manipulation because single molecules/cells can be individually encased in monodisperse microdroplets, allowing high-throughput analysis and manipulation with precise control of the local environment. Moreover, hydrogels, cross-linked polymer networks that swell in the presence of water, have been introduced into droplet microfluidic systems as hydrogel droplet microfluidics. By replacing the aqueous phase with a monomer or polymer solution, hydrogel droplets can be generated on microfluidic chips for encapsulation of single molecules/cells according to the Poisson distribution. The sol-gel transition property endows the hydrogel droplets with new functionalities and diversified applications in single molecule/cell analysis.
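
    The Poisson loading mentioned above fixes the basic trade-off of droplet encapsulation: dilute loading keeps multi-cell droplets rare at the cost of many empties. A quick sketch of the occupancy probabilities at an illustrative loading density:

```python
# Droplet occupancy under Poisson loading with an average of lam cells/droplet.
import math

lam = 0.1                                  # dilute loading (illustrative value)
p_empty = math.exp(-lam)                   # P(0 cells)
p_single = lam * math.exp(-lam)            # P(exactly 1 cell)
p_multi = 1.0 - p_empty - p_single         # P(2 or more cells)
```

At lam = 0.1, roughly 9% of droplets hold exactly one cell while fewer than 0.5% hold more than one, which is why single-cell droplet workflows accept a large fraction of empty droplets.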

  19. High-throughput mouse phenotyping using non-rigid registration and robust principal component analysis

    Science.gov (United States)

    Xie, Zhongliu; Kitamoto, Asanobu; Tamura, Masaru; Shiroishi, Toshihiko; Gillies, Duncan

    2016-03-01

    Intensive international efforts are underway towards phenotyping the mouse genome, by knocking out each of its ≈25,000 genes one by one for comparative study. With vast amounts of data to analyze, the traditional method of time-consuming histological examination is clearly impractical, leading to an overwhelming demand for a high-throughput phenotyping framework, especially one employing biomedical image informatics to efficiently identify phenotypes involving morphological abnormality. Existing work has either relied excessively on volumetric analytics, which is insensitive to phenotypes not accompanied by severe volume variations, or has been tailored for specific defects and thus fails to serve a general phenotyping purpose. Furthermore, the prevailing requirement of an atlas for image segmentation, in contrast to its limited availability, further complicates the issue in practice. In this paper we propose a high-throughput general-purpose phenotyping framework that is able to efficiently perform batch-wise anomaly detection without prior knowledge of the phenotype or the need for atlas-based segmentation. Anomaly detection is centered on the combined use of group-wise non-rigid image registration and robust principal component analysis (RPCA) for feature extraction and decomposition.
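
    A compact sketch of the robust PCA idea (principal component pursuit, solved here with a plain ADMM loop of singular-value and soft thresholding); this illustrates the low-rank-plus-sparse decomposition on a toy matrix, not the paper's implementation:

```python
# Robust PCA sketch: split M into low-rank L ("normal" structure) plus
# sparse S (anomalies) via alternating thresholding (ADMM).
import numpy as np

def svt(M, tau):
    """Singular value thresholding: shrink singular values by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca(M, lam=None, mu=1.0, iters=500):
    lam = lam if lam is not None else 1.0 / np.sqrt(max(M.shape))
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(iters):
        L = svt(M - S + Y / mu, 1.0 / mu)            # low-rank update
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)  # sparse update
        Y = Y + mu * (M - L - S)                     # dual ascent
    return L, S

baseline = np.outer(np.linspace(0.1, 1.0, 8), np.ones(6))  # rank-1 "normal" data
M = baseline.copy()
M[3, 2] += 5.0                                       # one gross anomaly
L, S = rpca(M)
anomaly = np.unravel_index(np.abs(S).argmax(), S.shape)
```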

  20. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    Science.gov (United States)

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

    Modern high-throughput biotechnologies such as microarrays and next-generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, hence the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool for analyzing such data. However, the shrinkage effect, the most prominent characteristic of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework for enhancing the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrate the superior performance of the proposed strategy. The new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.
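
    The shrinkage behavior discussed above can be seen in the standard normal-normal update, where each feature's noisy estimate is pulled toward the shared prior mean with a weight set by the relative variances; a toy sketch with invented numbers (not the adaptiveHM method itself):

```python
# Normal-normal shrinkage: posterior mean is a precision-weighted average of
# the per-feature estimate and the shared prior mean.
import numpy as np

prior_mean, prior_var = 0.0, 1.0        # e.g. learned from historical data
obs = np.array([4.0, -3.0, 0.5])        # per-feature raw estimates
obs_var = np.array([2.0, 2.0, 2.0])     # per-feature sampling variance

w = prior_var / (prior_var + obs_var)   # weight on the observed data
posterior = w * obs + (1 - w) * prior_mean
```

Every estimate moves toward the prior mean; a feature whose true value is far from the prior (like the first one here) is "over-corrected", which is the failure mode the paper targets.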

  1. Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines

    Science.gov (United States)

    Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.

    2017-01-01

    Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black box processes in a directed graph. This strategy to modularize code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU ’circular memory’ data buffers that enable ready introduction of arbitrary functions into the processing path for ’streams’ of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.
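
    The 'circular memory' buffers are essentially ring buffers that let producers and consumers run at their own rates over a bounded memory footprint; a tiny pure-Python sketch of the idea (not Bifrost's CPU/GPU implementation):

```python
# Minimal ring buffer: fixed capacity, the oldest frame is evicted when full,
# so memory use stays bounded for an endless data stream.
from collections import deque

class RingBuffer:
    def __init__(self, capacity):
        self.buf = deque(maxlen=capacity)

    def push(self, frame):
        self.buf.append(frame)          # silently drops the oldest when full

    def latest(self, n):
        return list(self.buf)[-n:]      # most recent n frames for a consumer

rb = RingBuffer(capacity=3)
for frame in range(5):                  # stream in frames 0..4
    rb.push(frame)
window = rb.latest(2)
```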

  2. Making choices in health: WHO guide to cost effectiveness analysis

    National Research Council Canada - National Science Library

    Tan Torres Edejer, Tessa

    2003-01-01

    Table of contents (excerpt): PART ONE: METHODS FOR GENERALIZED COST-EFFECTIVENESS ANALYSIS. 1. What is Generalized Cost-Effectiveness Analysis? 2. Undertaking ...

  3. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Full Text Available Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have been developed as well. The focus of this mini-review is on recent advances in the analysis of metabolomics data, especially those utilizing Gaussian graphical models and independent component analysis.
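
    Gaussian graphical models, one of the tools highlighted above, read conditional dependencies off the inverse covariance matrix: a (near-)zero partial correlation means two metabolites are conditionally independent given the rest. A small sketch on a synthetic three-variable chain:

```python
# Partial correlations from the precision (inverse covariance) matrix for a
# synthetic chain a -> b -> c; a and c should be conditionally independent.
import numpy as np

rng = np.random.default_rng(3)
a = rng.normal(size=500)
b = a + rng.normal(scale=0.3, size=500)    # b depends on a
c = b + rng.normal(scale=0.3, size=500)    # c depends on b, not directly on a

X = np.column_stack([a, b, c])
prec = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(prec))
pcor = -prec / np.outer(d, d)              # off-diagonal: partial correlations
```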

  4. Cost-effectiveness analysis and innovation.

    Science.gov (United States)

    Jena, Anupam B; Philipson, Tomas J

    2008-09-01

    While cost-effectiveness (CE) analysis has provided a guide to allocating the often scarce resources spent on medical technologies, less emphasis has been placed on the effect of such criteria on the behavior of the innovators who make health care technologies available in the first place. A better understanding of the link between innovation and cost-effectiveness analysis is particularly important given the large role of technological change in the growth of health care spending and the growing interest in the explicit use of CE thresholds to guide technology adoption in several Western countries. We analyze CE analysis in a standard market context, and stress that a technology's cost-effectiveness is closely related to the consumer surplus it generates. Improved CE therefore often clashes with interventions to stimulate producer surplus, such as patents. We derive the inconsistency between technology adoption based on CE analysis and economic efficiency. Indeed, static efficiency, dynamic efficiency, and improved patient health may all be induced by the cost-effectiveness of the technology being at its worst level. As producer appropriation of the social surplus of an innovation is central to the dynamic efficiency that should guide CE adoption criteria, we exemplify how appropriation can be inferred from existing CE estimates. For an illustrative sample of technologies considered, we find that the median technology has an appropriation of about 15%. To the extent that such incentives are deemed either too low or too high compared to dynamically efficient levels, CE thresholds may be raised or lowered accordingly to improve dynamic efficiency.
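
    The threshold rule discussed above rests on the incremental cost-effectiveness ratio (ICER): extra cost per extra unit of health gained, compared against a willingness-to-pay threshold. A sketch with invented figures:

```python
# ICER computation and a threshold-based adoption decision (invented numbers).
new_cost, new_qaly = 60_000.0, 6.5     # new technology: cost and QALYs
old_cost, old_qaly = 20_000.0, 5.5     # comparator

icer = (new_cost - old_cost) / (new_qaly - old_qaly)   # cost per QALY gained
threshold = 50_000.0                                   # illustrative CE threshold
adopt = icer <= threshold
```

Raising or lowering `threshold` is exactly the lever the abstract suggests for tuning innovators' dynamic incentives.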

  5. High-throughput gender identification of penguin species using melting curve analysis.

    Science.gov (United States)

    Tseng, Chao-Neng; Chang, Yung-Ting; Chiu, Hui-Tzu; Chou, Yii-Cheng; Huang, Hurng-Wern; Cheng, Chien-Chung; Liao, Ming-Hui; Chang, Hsueh-Wei

    2014-04-03

    Most species of penguins are sexually monomorphic, and it is therefore difficult to identify their genders visually for monitoring population stability in terms of sex-ratio analysis. In this study, we evaluated the suitability of melting curve analysis (MCA) for high-throughput gender identification of penguins. Preliminary tests indicated that the Griffiths's P2/P8 primers were not suitable for MCA. Based on sequence alignment of the Chromo-Helicase-DNA binding protein (CHD)-W and CHD-Z genes from four species of penguins (Pygoscelis papua, Aptenodytes patagonicus, Spheniscus magellanicus, and Eudyptes chrysocome), we redesigned forward primers for the CHD-W/CHD-Z-common region (PGU-ZW2) and the CHD-W-specific region (PGU-W2) to be used in combination with the reverse Griffiths's P2 primer. When tested with P. papua samples, PCR using the P2/PGU-ZW2 and P2/PGU-W2 primer sets generated two amplicons of 148 and 356 bp, respectively, which were easily resolved in 1.5% agarose gels. MCA indicated that the melting temperature (Tm) values for the P2/PGU-ZW2 and P2/PGU-W2 amplicons of P. papua samples were 79.75°C-80.5°C and 81.0°C-81.5°C, respectively. Females displayed both the ZW-common and W-specific Tm peaks, whereas males were positive only for the ZW-common peak. Taken together, our redesigned primers coupled with MCA allow precise high-throughput gender identification for P. papua, and potentially for other penguin species such as A. patagonicus, S. magellanicus, and E. chrysocome as well.
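
    The calling logic implied by the reported Tm windows can be sketched directly; the windows below are the ranges reported for P. papua, while the helper function itself is hypothetical:

```python
# Sex calling from melt-peak temperatures: females show both the ZW-common and
# the W-specific peaks, males only the common one (Tm windows for P. papua).
def call_sex(tm_peaks, zw=(79.75, 80.5), w=(81.0, 81.5)):
    has_zw = any(zw[0] <= t <= zw[1] for t in tm_peaks)
    has_w = any(w[0] <= t <= w[1] for t in tm_peaks)
    if has_zw and has_w:
        return "female"
    if has_zw:
        return "male"
    return "indeterminate"      # e.g. failed amplification

calls = [call_sex([80.1, 81.2]), call_sex([80.2]), call_sex([78.0])]
```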

  6. A Quantitative High-Throughput Screening Data Analysis Pipeline for Activity Profiling.

    Science.gov (United States)

    Huang, Ruili

    2016-01-01

    The US Tox21 program has developed in vitro assays to test large collections of environmental chemicals in a quantitative high-throughput screening (qHTS) format, using triplicate 15-dose titrations to generate over 50 million data points to date. Counter screens are also employed to minimize interference from non-target-specific assay artifacts, such as compound autofluorescence and cytotoxicity. New data analysis approaches are needed to integrate these data and characterize the activities observed in these assays. Here, we describe a complete analysis pipeline that evaluates these qHTS data for technical quality in terms of signal reproducibility. We integrate signals from repeated assay runs, primary readouts, and counter screens to produce a final call on on-target compound activity.
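
    Activity characterization in qHTS rests on fitting each 15-dose titration with a concentration-response (Hill) model; below is a sketch on synthetic data with an invented AC50, not the Tox21 pipeline itself:

```python
# Fit a four-parameter Hill model to one synthetic 15-point titration
# (concentrations handled in log10 molar for numerical stability).
import numpy as np
from scipy.optimize import curve_fit

def hill(logc, bottom, top, logac50, n):
    return bottom + (top - bottom) / (1.0 + 10.0 ** (n * (logac50 - logc)))

logc = np.linspace(-9.0, -4.0, 15)                 # 15-dose titration, log10 M
resp = hill(logc, 0.0, 100.0, -6.0, 1.2)           # "true" curve, AC50 = 1 uM
resp = resp + np.random.default_rng(1).normal(0.0, 2.0, logc.size)  # assay noise

popt, _ = curve_fit(hill, logc, resp, p0=[0.0, 100.0, -6.5, 1.0])
logac50_fit = popt[2]                              # recovered potency
```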

  7. High-throughput FTIR-based bioprocess analysis of recombinant cyprosin production.

    Science.gov (United States)

    Sampaio, Pedro N; Sales, Kevin C; Rosa, Filipa O; Lopes, Marta B; Calado, Cecília R C

    2017-01-01

    To increase knowledge of the recombinant cyprosin production process in Saccharomyces cerevisiae cultures, it is relevant to implement efficient bioprocess monitoring techniques. The present work focuses on the implementation of a mid-infrared (MIR) spectroscopy-based tool for monitoring the recombinant culture in a rapid, economic, and high-throughput (using a microplate system) mode. Multivariate data analysis was conducted on the MIR spectra of culture samples. Principal component analysis (PCA) enabled capturing the general metabolic status of the yeast cells, as replicated samples appear grouped together in the score plot and groups of culture samples can be clearly distinguished according to the main growth phase. The PCA loading vectors also revealed the spectral regions, and the corresponding chemical functional groups and biomolecules, that contributed most to the cells' biomolecular fingerprint associated with the culture growth phase. These data were corroborated by analysis of the samples' second-derivative spectra. Partial least squares (PLS) regression models built on the MIR spectra showed high predictive ability for estimating the critical bioprocess variables: biomass (R² = 0.99, RMSEP 2.8%); cyprosin activity (R² = 0.98, RMSEP 3.9%); glucose (R² = 0.93, RMSECV 7.2%); galactose (R² = 0.97, RMSEP 4.6%); ethanol (R² = 0.97, RMSEP 5.3%); and acetate (R² = 0.95, RMSEP 7.0%). In conclusion, high-throughput MIR spectroscopy and multivariate data analysis were effective in identifying the main growth phases and specific cyprosin production phases along the yeast culture, as well as in quantifying the critical variables of the process. This knowledge will promote future optimization and control of the recombinant cyprosin bioprocess according to the Quality by Design framework.

  8. STAMP: Extensions to the STADEN sequence analysis package for high throughput interactive microsatellite marker design.

    Science.gov (United States)

    Kraemer, Lars; Beszteri, Bánk; Gäbler-Schwarz, Steffi; Held, Christoph; Leese, Florian; Mayer, Christoph; Pöhlmann, Kevin; Frickenhaus, Stephan

    2009-01-30

    Microsatellites (MSs) are DNA markers with high analytical power, widely used in population genetics, genetic mapping, and forensic studies. Currently available software solutions for high-throughput MS design (i) have shortcomings in detecting and distinguishing imperfect and perfect MSs, (ii) lack often-necessary interactive design steps, and (iii) do not allow for the development of primers for multiplex amplifications. We present a set of new tools implemented as extensions to the STADEN package, which provides the backbone functionality for flexible sequence analysis workflows. The ability to assemble overlapping reads into unique contigs (provided by the base functionality of the STADEN package) is important to avoid developing redundant markers, a feature missing from most other similar tools. Our extensions to the STADEN package provide the following functionality to facilitate microsatellite (and also minisatellite) marker design. The new modules (i) integrate the state-of-the-art tandem repeat detection and analysis software PHOBOS into workflows, (ii) provide two separate repeat detection steps, with different search criteria: one for masking repetitive regions during assembly of sequencing reads and the other for designing repeat-flanking primers for MS candidate loci, (iii) incorporate the widely used primer design program PRIMER3 into STADEN workflows, enabling the interactive design and visualization of flanking primers for microsatellites, and (iv) provide the functionality to find optimal locus and primer-pair combinations for multiplex primer design. Furthermore, our extensions include a module for storing analysis results in an SQLite database, providing a transparent solution for data access from within as well as from outside the STADEN package. The STADEN package is thus enhanced by our modules into a highly flexible, high-throughput, interactive tool for conventional and multiplex microsatellite marker design. 
It gives the user

  9. WormScan: a technique for high-throughput phenotypic analysis of Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mark D Mathew

    Full Text Available BACKGROUND: There are four main phenotypes that are assessed in whole-organism studies of Caenorhabditis elegans: mortality, movement, fecundity, and size. Procedures have been developed that focus on the digital analysis of some, but not all, of these phenotypes and may be limited by expense and throughput. We have developed WormScan, an automated image acquisition system that allows quantitative analysis of each of these four phenotypes on standard NGM plates seeded with E. coli. This system is very easy to implement and has the capacity to be used in high-throughput analysis. METHODOLOGY/PRINCIPAL FINDINGS: Our system employs a readily available consumer-grade flatbed scanner. The method uses light stimulus from the scanner, rather than physical stimulus, to induce movement. With two sequential scans it is possible to quantify the induced phototactic response. To demonstrate the utility of the method, we measured the phenotypic response of C. elegans to phosphine gas exposure. We found that stimulation of movement by the light of the scanner was equivalent to physical stimulation for the determination of mortality. WormScan also provided a quantitative assessment of health for the survivors. Habituation to light stimulation over continuous scans was similar to habituation caused by physical stimulus. CONCLUSIONS/SIGNIFICANCE: There are existing systems for automated phenotypic data collection in C. elegans. The specific advantages of our method over existing systems are high-throughput assessment of a greater range of phenotypic endpoints, including determination of mortality and quantification of the mobility of survivors. Our system is also inexpensive and very easy to implement. Even though we have focused on demonstrating the usefulness of WormScan in toxicology, it can be used in a wide range of additional C. elegans studies, including lifespan determination, development, pathology, and behavior. 
Moreover, we have even adapted the

  10. Quantitative high-throughput analysis of drugs in biological matrices by mass spectrometry.

    Science.gov (United States)

    Hopfgartner, Gérard; Bourgogne, Emmanuel

    2003-01-01

    To support pharmacokinetic and drug metabolism studies, LC-MS/MS plays an increasingly essential role in the quantitation of drugs and their metabolites in biological matrices. With the new challenges encountered in drug discovery and drug development, new strategies are being put in place to achieve high-throughput analysis, using serial and parallel approaches. To speed up method development and validation, generic approaches with direct injection of biological fluids are highly desirable. Column-switching, using various packing materials for the extraction columns, is widely applied. Improvements in mass spectrometer performance, in particular for triple quadrupoles, also strongly influence sample preparation strategies, which remain a key element in the bioanalytical process. Copyright 2003 Wiley Periodicals, Inc., Mass Spec Rev 22:195-214, 2003; Published online in Wiley Interscience (www.interscience.wiley.com). DOI 10.1002/mas.10050

  11. Resonant waveguide grating imagers for single cell analysis and high throughput screening

    Science.gov (United States)

    Fang, Ye

    2015-08-01

    Resonant waveguide grating (RWG) systems illuminate an array of diffractive nanograting waveguide structures in a microtiter plate to establish an evanescent wave for measuring tiny changes in local refractive index arising from the dynamic mass redistribution of living cells upon stimulation. A whole-plate RWG imager enables high-throughput profiling and screening of drugs. A microfluidic RWG imager not only manifests distinct receptor signaling waves, but also differentiates long-acting agonism and antagonism. A spatially resolved RWG imager allows for single-cell analysis, including receptor signaling heterogeneity and the invasion of cancer cells in a spheroidal structure through 3-dimensional extracellular matrix. A high-frequency RWG imager permits real-time detection of drug-induced cardiotoxicity. The wide coverage in target, pathway, assay, and cell phenotype has made RWG systems a powerful tool in both basic research and the early drug discovery process.

  12. Construction and analysis of an integrated regulatory network derived from high-throughput sequencing data.

    Directory of Open Access Journals (Sweden)

    Chao Cheng

    2011-11-01

    Full Text Available We present a network framework for analyzing multi-level regulation in higher eukaryotes based on systematic integration of various high-throughput datasets. The network, namely the integrated regulatory network, consists of three major types of regulation: TF→gene, TF→miRNA and miRNA→gene. We identified the target genes and target miRNAs for a set of TFs based on the ChIP-Seq binding profiles, and the predicted targets of miRNAs using annotated 3'UTR sequences and conservation information. Making use of the system-wide RNA-Seq profiles, we classified transcription factors into positive and negative regulators and assigned a sign to each regulatory interaction. Other types of edges, such as protein-protein interactions and potential intra-regulations between miRNAs based on the embedding of miRNAs in their host genes, were further incorporated. We examined the topological structures of the network, including its hierarchical organization and motif enrichment. We found that transcription factors downstream in the hierarchy distinguish themselves by expressing more uniformly across tissues, having more interacting partners, and being more likely to be essential. We found an over-representation of notable network motifs, including a feed-forward loop (FFL) in which a miRNA cost-effectively shuts down a transcription factor and its target. We used data of C. elegans from the modENCODE project as a primary model to illustrate our framework, but further verified the results using two other datasets. As more and more genome-wide ChIP-Seq and RNA-Seq data become available in the near future, our methods of data integration have various potential applications.
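    The miRNA-mediated feed-forward loop described above (a TF that regulates both a miRNA and that miRNA's target gene) can be counted directly from the three edge types of such a network. A minimal sketch, with made-up edge lists and node names rather than the paper's modENCODE data:

```python
# Hypothetical toy edges; node names are illustrative only.
tf_gene = {("TF1", "geneA"), ("TF2", "geneB")}
tf_mirna = {("TF1", "mir-1"), ("TF2", "mir-2")}
mirna_gene = {("mir-1", "geneA"), ("mir-2", "geneC")}

def count_ffls(tf_gene, tf_mirna, mirna_gene):
    """A miRNA-mediated FFL: TF->miRNA, TF->gene, and miRNA->gene."""
    ffls = []
    for tf, mir in sorted(tf_mirna):
        for tf2, gene in sorted(tf_gene):
            if tf2 == tf and (mir, gene) in mirna_gene:
                ffls.append((tf, mir, gene))
    return ffls

print(count_ffls(tf_gene, tf_mirna, mirna_gene))  # -> [('TF1', 'mir-1', 'geneA')]
```

    Motif enrichment then compares such counts against counts in degree-preserving randomized networks.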

  13. Transcriptome analysis of the silkworm (Bombyx mori) by high-throughput RNA sequencing.

    Science.gov (United States)

    Li, Yinü; Wang, Guozeng; Tian, Jian; Liu, Huifen; Yang, Huipeng; Yi, Yongzhu; Wang, Jinhui; Shi, Xiaofeng; Jiang, Feng; Yao, Bin; Zhang, Zhifang

    2012-01-01

    The domestic silkworm, Bombyx mori, is a model insect with important economic value for silk production that also acts as a bioreactor for biomaterial production. The functional complexity of the silkworm transcriptome has not yet been fully elucidated, although genomic sequencing and other tools have been widely used in its study. We explored the transcriptome of silkworm at different developmental stages using high-throughput paired-end RNA sequencing. A total of about 3.3 gigabases (Gb) of sequence was obtained, representing about a 7-fold coverage of the B. mori genome. From the reads that were mapped to the genome sequence, 23,461 transcripts were obtained, of which 5,428 were novel. Of the 14,623 predicted protein-coding genes in the silkworm genome database, 11,884 were found to be expressed in the silkworm transcriptome, giving a coverage of 81.3%. A total of 13,195 new exons were detected, of which 5,911 were found in the annotated genes in the Silkworm Genome Database (SilkDB). An analysis of alternative splicing in the transcriptome revealed that 3,247 genes had undergone alternative splicing. To help with the data analysis, a transcriptome database that integrates our transcriptome data with the silkworm genome data was constructed and is publicly available at http://124.17.27.136/gbrowse2/. To our knowledge, this is the first study to elucidate the silkworm transcriptome using high-throughput RNA sequencing technology. Our data indicate that the transcriptome of silkworm is much more complex than previously anticipated. This work provides tools and resources for the identification of new functional elements and paves the way for future functional genomics studies.

  14. PTMScout, a Web resource for analysis of high throughput post-translational proteomics studies.

    Science.gov (United States)

    Naegle, Kristen M; Gymrek, Melissa; Joughin, Brian A; Wagner, Joel P; Welsch, Roy E; Yaffe, Michael B; Lauffenburger, Douglas A; White, Forest M

    2010-11-01

    The rate of discovery of post-translational modification (PTM) sites is increasing rapidly and is significantly outpacing our biological understanding of the function and regulation of those modifications. To help meet this challenge, we have created PTMScout, a web-based interface for viewing, manipulating, and analyzing high throughput experimental measurements of PTMs in an effort to facilitate biological understanding of protein modifications in signaling networks. PTMScout is constructed around a custom database of PTM experiments and contains information from external protein and post-translational resources, including gene ontology annotations, Pfam domains, and Scansite predictions of kinase and phosphopeptide binding domain interactions. PTMScout functionality comprises data set comparison tools, data set summary views, and tools for protein assignments of peptides identified by mass spectrometry. Analysis tools in PTMScout focus on informed subset selection via common criteria and on automated hypothesis generation through subset labeling derived from identification of statistically significant enrichment of other annotations in the experiment. Subset selection can be applied through the PTMScout flexible query interface available for quantitative data measurements and data annotations as well as an interface for importing data set groupings by external means, such as unsupervised learning. We exemplify the various functions of PTMScout in application to data sets that contain relative quantitative measurements as well as data sets lacking quantitative measurements, producing a set of interesting biological hypotheses. PTMScout is designed to be a widely accessible tool, enabling generation of multiple types of biological hypotheses from high throughput PTM experiments and advancing functional assignment of novel PTM sites. PTMScout is available at http://ptmscout.mit.edu.

  15. Efficient statistical significance approximation for local similarity analysis of high-throughput time series data.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob; Fuhrman, Jed A; Sun, Fengzhu

    2013-01-15

    Local similarity analysis of biological time series data helps elucidate the varying dynamics of biological systems. However, its applications to large scale high-throughput data are limited by slow permutation procedures for statistical significance evaluation. We developed a theoretical approach to approximate the statistical significance of local similarity analysis based on the approximate tail distribution of the maximum partial sum of independent identically distributed (i.i.d.) random variables. Simulations show that the derived formula approximates the tail distribution reasonably well (starting at time points > 10 with no delay and > 20 with delay) and provides P-values comparable with those from permutations. The new approach enables efficient calculation of statistical significance for pairwise local similarity analysis, making possible all-to-all local association studies otherwise prohibitive. As a demonstration, local similarity analysis of human microbiome time series shows that core operational taxonomic units (OTUs) are highly synergetic and some of the associations are body-site specific across samples. The new approach is implemented in our eLSA package, which now provides pipelines for faster local similarity analysis of time series data. The tool is freely available from eLSA's website: http://meta.usc.edu/softs/lsa. Supplementary data are available at Bioinformatics online. fsun@usc.edu.
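    The local similarity score at the heart of this analysis can be illustrated as a maximum partial sum: normalize both series, multiply them pointwise, and track the best contiguous run. The sketch below, with assumed normalization choices and a plain permutation test (the slow baseline that the paper's theoretical approximation replaces), is illustrative rather than the eLSA implementation:

```python
import random

def znorm(xs):
    """Center and scale a series to zero mean and unit variance."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    s = v ** 0.5 or 1.0
    return [(x - m) / s for x in xs]

def local_similarity(x, y):
    """Max over contiguous runs of the partial sum of x_i * y_i (positive association, no delay)."""
    x, y = znorm(x), znorm(y)
    best = run = 0.0
    for xi, yi in zip(x, y):
        run = max(0.0, run + xi * yi)
        best = max(best, run)
    return best / len(x)

def permutation_pvalue(x, y, n_perm=1000, seed=0):
    """Empirical p-value by shuffling one series; this is the slow step the paper approximates."""
    rng = random.Random(seed)
    obs = local_similarity(x, y)
    y = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y)
        if local_similarity(x, y) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

    Two identical series score 1.0 and yield a small permutation p-value; the theoretical tail approximation replaces the shuffle loop entirely.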

  16. Commentary: Roles for Pathologists in a High-throughput Image Analysis Team.

    Science.gov (United States)

    Aeffner, Famke; Wilson, Kristin; Bolon, Brad; Kanaly, Suzanne; Mahrt, Charles R; Rudmann, Dan; Charles, Elaine; Young, G David

    2016-08-01

    Historically, pathologists perform manual evaluation of H&E- or immunohistochemically-stained slides, which can be subjective, inconsistent, and, at best, semiquantitative. As the complexity of staining and demand for increased precision of manual evaluation increase, the pathologist's assessment will include automated analyses (i.e., "digital pathology") to increase the accuracy, efficiency, and speed of diagnosis and hypothesis testing and as an important biomedical research and diagnostic tool. This commentary introduces the many roles for pathologists in designing and conducting high-throughput digital image analysis. Pathology review is central to the entire course of a digital pathology study, including experimental design, sample quality verification, specimen annotation, analytical algorithm development, and report preparation. The pathologist performs these roles by reviewing work undertaken by technicians and scientists with training and expertise in image analysis instruments and software. These roles require regular, face-to-face interactions between team members and the lead pathologist. Traditional pathology training is suitable preparation for entry-level participation on image analysis teams. The future of pathology is very exciting, with the expanding utilization of digital image analysis set to expand pathology roles in research and drug development with increasing and new career opportunities for pathologists.

  17. High-throughput and automated diagnosis of antimicrobial resistance using a cost-effective cellphone-based micro-plate reader.

    Science.gov (United States)

    Feng, Steve; Tseng, Derek; Di Carlo, Dino; Garner, Omai B; Ozcan, Aydogan

    2016-12-15

    Routine antimicrobial susceptibility testing (AST) can prevent deaths due to bacteria and reduce the spread of multi-drug-resistance, but cannot be regularly performed in resource-limited-settings due to technological challenges, high-costs, and lack of trained professionals. We demonstrate an automated and cost-effective cellphone-based 96-well microtiter-plate (MTP) reader, capable of performing AST without the need for trained diagnosticians. Our system includes a 3D-printed smartphone attachment that holds and illuminates the MTP using a light-emitting-diode array. An inexpensive optical fiber-array enables the capture of the transmitted light of each well through the smartphone camera. A custom-designed application sends the captured image to a server to automatically determine well-turbidity, with results returned to the smartphone in ~1 minute. We tested this mobile-reader using MTPs prepared with 17 antibiotics targeting Gram-negative bacteria on clinical isolates of Klebsiella pneumoniae, containing highly-resistant antimicrobial profiles. Using 78 patient isolate test-plates, we demonstrated that our mobile-reader meets the FDA-defined AST criteria, with a well-turbidity detection accuracy of 98.21%, minimum-inhibitory-concentration accuracy of 95.12%, and a drug-susceptibility interpretation accuracy of 99.23%, with no very major errors. This mobile-reader could eliminate the need for trained diagnosticians to perform AST, reduce the cost-barrier for routine testing, and assist in spatio-temporal tracking of bacterial resistance.
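    The turbidity call at the center of such a pipeline reduces to comparing each well's mean transmitted-light intensity against a threshold. A minimal sketch with an invented plate image, grid layout, and threshold value (not the authors' trained server-side pipeline):

```python
import numpy as np

def well_turbidity(image, rows=8, cols=12):
    """Split a grayscale plate image into an 8x12 grid and return the mean intensity per well."""
    h, w = image.shape
    wells = image[: h - h % rows, : w - w % cols]
    wells = wells.reshape(rows, h // rows, cols, w // cols)
    return wells.mean(axis=(1, 3))

def classify_wells(means, threshold=120.0):
    """Turbid wells (bacterial growth) transmit less light than the threshold."""
    return means < threshold

plate = np.full((160, 240), 200.0)   # uniformly clear synthetic plate
plate[0:20, 0:20] = 50.0             # one turbid well at position A1
growth = classify_wells(well_turbidity(plate))
print(growth.sum())  # -> 1
```

    Accuracy metrics like those reported (e.g., 98.21% well-turbidity detection) would then come from comparing such calls against expert reads.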

  19. High-throughput microfluidic device for single cell analysis using multiple integrated soft lithographic pumps.

    Science.gov (United States)

    Patabadige, Damith E W; Mickleburgh, Tom; Ferris, Lorin; Brummer, Gage; Culbertson, Anne H; Culbertson, Christopher T

    2016-05-01

    The ability to accurately control fluid transport in microfluidic devices is key for developing high-throughput methods for single cell analysis. Making small, reproducible changes to flow rates, however, to optimize lysis and injection using pumps external to the microfluidic device are challenging and time-consuming. To improve the throughput and increase the number of cells analyzed, we have integrated previously reported micropumps into a microfluidic device that can increase the cell analysis rate to ∼1000 cells/h and operate for over an hour continuously. In order to increase the flow rates sufficiently to handle cells at a higher throughput, three sets of pumps were multiplexed. These pumps are simple, low-cost, durable, easy to fabricate, and biocompatible. They provide precise control of the flow rate up to 9.2 nL/s. These devices were used to automatically transport, lyse, and electrophoretically separate T-Lymphocyte cells loaded with Oregon green and 6-carboxyfluorescein. Peak overlap statistics predicted the number of fully resolved single-cell electropherograms seen. In addition, there was no change in the average fluorescent dye peak areas indicating that the cells remained intact and the dyes did not leak out of the cells over the 1 h analysis time. The cell lysate peak area distribution followed that expected of an asynchronous steady-state population of immortalized cells.

  20. PhenStat: A Tool Kit for Standardized Analysis of High Throughput Phenotypic Data.

    Directory of Open Access Journals (Sweden)

    Natalja Kurbatova

    Full Text Available The lack of reproducibility with animal phenotyping experiments is a growing concern among the biomedical community. One contributing factor is the inadequate description of statistical analysis methods that prevents researchers from replicating results even when the original data are provided. Here we present PhenStat--a freely available R package that provides a variety of statistical methods for the identification of phenotypic associations. The methods have been developed for high throughput phenotyping pipelines implemented across various experimental designs with an emphasis on managing temporal variation. PhenStat is targeted to two user groups: small-scale users who wish to interact and test data from large resources and large-scale users who require an automated statistical analysis pipeline. The software provides guidance to the user for selecting appropriate analysis methods based on the dataset and is designed to allow for additions and modifications as needed. The package was tested on mouse and rat data and is used by the International Mouse Phenotyping Consortium (IMPC). By providing raw data and the version of PhenStat used, resources like the IMPC give users the ability to replicate and explore results within their own computing environment.

  1. PhenStat: A Tool Kit for Standardized Analysis of High Throughput Phenotypic Data.

    Science.gov (United States)

    Kurbatova, Natalja; Mason, Jeremy C; Morgan, Hugh; Meehan, Terrence F; Karp, Natasha A

    2015-01-01

    The lack of reproducibility with animal phenotyping experiments is a growing concern among the biomedical community. One contributing factor is the inadequate description of statistical analysis methods that prevents researchers from replicating results even when the original data are provided. Here we present PhenStat--a freely available R package that provides a variety of statistical methods for the identification of phenotypic associations. The methods have been developed for high throughput phenotyping pipelines implemented across various experimental designs with an emphasis on managing temporal variation. PhenStat is targeted to two user groups: small-scale users who wish to interact and test data from large resources and large-scale users who require an automated statistical analysis pipeline. The software provides guidance to the user for selecting appropriate analysis methods based on the dataset and is designed to allow for additions and modifications as needed. The package was tested on mouse and rat data and is used by the International Mouse Phenotyping Consortium (IMPC). By providing raw data and the version of PhenStat used, resources like the IMPC give users the ability to replicate and explore results within their own computing environment.

  2. FASTAptamer: A Bioinformatic Toolkit for High-throughput Sequence Analysis of Combinatorial Selections

    Directory of Open Access Journals (Sweden)

    Khalid K Alam

    2015-01-01

    Full Text Available High-throughput sequence (HTS) analysis of combinatorial selection populations accelerates lead discovery and optimization and offers dynamic insight into selection processes. An underlying principle is that selection enriches high-fitness sequences as a fraction of the population, whereas low-fitness sequences are depleted. HTS analysis readily provides the requisite numerical information by tracking the evolutionary trajectory of individual sequences in response to selection pressures. Unlike genomic data, for which a number of software solutions exist, user-friendly tools are not readily available for the combinatorial selections field, leading many users to create custom software. FASTAptamer was designed to address the sequence-level analysis needs of the field. The open source FASTAptamer toolkit counts, normalizes and ranks read counts in a FASTQ file, compares populations for sequence distribution, generates clusters of sequence families, calculates fold-enrichment of sequences throughout the course of a selection and searches for degenerate sequence motifs. While originally designed for aptamer selections, FASTAptamer can be applied to any selection strategy that can utilize next-generation DNA sequencing, such as ribozyme or deoxyribozyme selections, in vivo mutagenesis and various surface display technologies (peptide, antibody fragment, mRNA, etc.). FASTAptamer software, sample data and a user's guide are available for download at http://burkelab.missouri.edu/fastaptamer.html.
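    The count, normalize, and fold-enrichment steps described above can be sketched in a few lines. The 4-line FASTQ records and population names below are invented for illustration; FASTAptamer itself is the maintained implementation:

```python
from collections import Counter

def count_fastq(lines):
    """Count unique sequences; the read sits on line 2 of each 4-line FASTQ record."""
    return Counter(seq.strip() for i, seq in enumerate(lines) if i % 4 == 1)

def reads_per_million(counts):
    """Normalize raw counts to reads per million (RPM) for cross-population comparison."""
    total = sum(counts.values())
    return {seq: n * 1e6 / total for seq, n in counts.items()}

def fold_enrichment(rpm_early, rpm_late):
    """RPM ratio (late/early) for sequences observed in both populations."""
    return {s: rpm_late[s] / rpm_early[s] for s in rpm_late if s in rpm_early}

# Toy populations: 'AAAA' is enriched from 2/4 reads to 3/4 reads.
round3 = ["@r1", "AAAA", "+", "IIII", "@r2", "CCCC", "+", "IIII",
          "@r3", "AAAA", "+", "IIII", "@r4", "GGGG", "+", "IIII"]
round6 = ["@r1", "AAAA", "+", "IIII", "@r2", "AAAA", "+", "IIII",
          "@r3", "AAAA", "+", "IIII", "@r4", "CCCC", "+", "IIII"]

fe = fold_enrichment(reads_per_million(count_fastq(round3)),
                     reads_per_million(count_fastq(round6)))
print(fe)  # -> {'AAAA': 1.5, 'CCCC': 1.0}
```

    RPM normalization is what makes the enrichment ratio meaningful across populations sequenced to different depths.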

  3. A high-throughput DNA methylation analysis of a single cell.

    Science.gov (United States)

    Kantlehner, Martin; Kirchner, Roland; Hartmann, Petra; Ellwart, Joachim W; Alunni-Fabbroni, Marianna; Schumacher, Axel

    2011-04-01

    In recent years, the field of epigenetics has grown dramatically and has become one of the most dynamic and fast-growing branches of molecular biology. The number of diseases suspected of being influenced by DNA methylation is rising steadily and includes common diseases such as schizophrenia, bipolar disorder, Alzheimer's disease, diabetes, atherosclerosis, cancer, major psychosis, lupus and Parkinson's disease. Due to cellular heterogeneity of methylation patterns, epigenetic analyses of single cells become a necessity. One rationale is that DNA methylation profiles are highly variable across individual cells, even in the same organ, dependent on the function of the gene, disease state, exposure to environmental factors (e.g. radiation, drugs or nutrition), stochastic fluctuations and various other causes. Using a polymerase chain reaction (PCR)-slide microreaction system, we present here a methylation-sensitive PCR analysis, the restriction enzyme-based single-cell methylation assay (RSMA), for the analysis of DNA methylation patterns in single cells. This method addresses the problems of cell heterogeneity in epigenetics research; it is comparatively affordable, avoids complicated microfluidic systems and offers the opportunity for high-throughput screening, as many single cells can be screened in parallel. In addition to this study, critical principles and caveats of single-cell methylation analyses are discussed.

  4. Quantitative criteria for improving performance of buccal DNA for high-throughput genetic analysis

    Directory of Open Access Journals (Sweden)

    Woo Jessica G

    2012-08-01

    Full Text Available Abstract Background DNA from buccal brush samples is being used for high-throughput analyses in a variety of applications, but the impact of sample type on genotyping success and downstream statistical analysis remains unclear. The objective of the current study was to determine laboratory predictors of genotyping failure among buccal DNA samples, and to evaluate the successfully genotyped results with respect to analytic quality control metrics. Sample and genotyping characteristics were compared between buccal and blood samples collected in the population-based Genetic and Environmental Risk Factors for Hemorrhagic Stroke (GERFHS) study (https://gerfhs.phs.wfubmc.edu/public/index.cfm). Results Seven hundred eight (708) buccal and 142 blood DNA samples were analyzed for laboratory-based and analysis metrics. Overall genotyping failure rates were not statistically different between buccal (11.3%) and blood (7.0%, p = 0.18) samples; however, both the Contrast Quality Control (cQC) rate and the dynamic model (DM) call rates were lower among buccal DNA samples (p  Conclusions We identified a buccal sample characteristic, a ratio of ds/total DNA

  5. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    Science.gov (United States)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transductions, including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI), play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing times. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we proposed can reduce the process time and sample volume because the system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we present an automated mMAPS including an integrated microfluidic device, an automated stage, and electrical relays for high-throughput clinical screening. Based on this result, we estimate that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical application for analyzing tissue samples in a clinical setting.

  6. High-throughput lipidomic analysis of fatty acid derived eicosanoids and N-acylethanolamines.

    Science.gov (United States)

    Dumlao, Darren S; Buczynski, Matthew W; Norris, Paul C; Harkewicz, Richard; Dennis, Edward A

    2011-11-01

    Fatty acid-derived eicosanoids and N-acylethanolamines (NAE) are important bioactive lipid mediators involved in numerous biological processes, including cell signaling and disease progression. To facilitate research on these lipid mediators, we have developed a targeted high-throughput mass spectrometry-based methodology to monitor and quantitate both eicosanoids and NAEs, which can be analyzed separately or together in series. Each methodology utilizes scheduled multiple reaction monitoring (sMRM) pairs in conjunction with a 25 min reverse-phase HPLC separation. The eicosanoid methodology monitors 141 unique metabolites, and quantitative amounts can be determined for over 100 of these metabolites against standards. The analysis covers eicosanoids generated by cyclooxygenase, lipoxygenase, and cytochrome P450 enzymes, as well as those generated through non-enzymatic pathways. The NAE analysis monitors 36 metabolites, and quantitative amounts can be determined for 33 of these metabolites against standards. The NAE method covers metabolites derived from saturated fatty acids, unsaturated fatty acids, and eicosanoids. The lower limit of detection ranges from 0.1 pg to 1 pg for eicosanoids, and from 0.1 pg to 1000 pg for NAEs. The rationale and design of the methodology are discussed.

  7. Design and analysis of experiments with high throughput biological assay data.

    Science.gov (United States)

    Rocke, David M

    2004-12-01

    The design and analysis of experiments using gene expression microarrays is a topic of considerable current research, and work is beginning to appear on the analysis of proteomics and metabolomics data by mass spectrometry and NMR spectroscopy. The literature in this area is evolving rapidly, and commercial software for analysis of array or proteomics data is rarely up to date, and is essentially nonexistent for metabolomics data. In this paper, I review some of the issues that should concern any biologists planning to use such high-throughput biological assay data in an experimental investigation. Technical details are kept to a minimum, and may be found in the referenced literature, as well as in the many excellent papers which space limitations prevent me from describing. There are usually a number of viable options for design and analysis of such experiments, but unfortunately, there are even more non-viable ones, some of which have been used in the published literature. This is an area in which up-to-date knowledge of the literature is indispensable for efficient and effective design and analysis of these experiments. In general, we concentrate on relatively simple analyses, often focusing on identifying differentially expressed genes and the comparable issues in mass spectrometry and NMR spectroscopy (consistent differences in peak heights or areas, for example). Complex multivariate and pattern recognition methods also need much attention, but the issues we describe in this paper must be dealt with first. The literature on analysis of proteomics and metabolomics data is as yet sparse, so the main focus of this paper will be on methods devised for analysis of gene expression data that generalize to proteomics and metabolomics, with some specific comments near the end on analysis of metabolomics data by mass spectrometry and NMR spectroscopy.
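    One recurring issue in these analyses, testing thousands of genes or peaks at once, is commonly handled with false discovery rate control. A minimal sketch of the Benjamini-Hochberg step-up procedure (the p-values are invented for illustration; this is a generic method, not one prescribed by the paper):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # largest rank whose p-value clears its BH threshold
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])

pvals = [0.001, 0.008, 0.012, 0.041, 0.9]
print(benjamini_hochberg(pvals))  # -> [0, 1, 2]
```

    Note the step-up behavior: every hypothesis ranked at or below the largest passing rank is rejected, even if its own p-value misses its threshold.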

  8. ScreenMill: A freely available software suite for growth measurement, analysis and visualization of high-throughput screen data

    Directory of Open Access Journals (Sweden)

    Rothstein Rodney

    2010-06-01

    Full Text Available Abstract Background Many high-throughput genomic experiments, such as Synthetic Genetic Array and yeast two-hybrid, use colony growth on solid media as a screen metric. These experiments routinely generate over 100,000 data points, making data analysis a time-consuming and painstaking process. Here we describe ScreenMill, a new software suite that automates image analysis and simplifies data review and analysis for high-throughput biological experiments. Results The ScreenMill software suite includes three software tools or "engines": an open source Colony Measurement Engine (CM Engine) to quantitate colony growth data from plate images, a web-based Data Review Engine (DR Engine) to validate and analyze quantitative screen data, and a web-based Statistics Visualization Engine (SV Engine) to visualize screen data with statistical information overlaid. The methods and software described here can be applied to any screen in which growth is measured by colony size. In addition, the DR Engine and SV Engine can be used to visualize and analyze other types of quantitative high-throughput data. Conclusions ScreenMill automates quantification, analysis and visualization of high-throughput screen data. The algorithms implemented in ScreenMill are transparent, allowing users to be confident about the results ScreenMill produces. Taken together, the tools of ScreenMill offer biologists a simple and flexible way of analyzing their data, without requiring programming skills.
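    The core of a colony-measurement step like the CM Engine's can be reduced to thresholding the plate image and measuring connected foreground regions. A toy sketch (the tiny array stands in for a real plate image; ScreenMill's own algorithm is more involved):

```python
from collections import deque

def colony_sizes(image, threshold=128):
    """Return the pixel area of each 4-connected foreground region via BFS flood fill."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for sy in range(h):
        for sx in range(w):
            if image[sy][sx] > threshold and not seen[sy][sx]:
                area, queue = 0, deque([(sy, sx)])
                seen[sy][sx] = True
                while queue:
                    y, x = queue.popleft()
                    area += 1
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and image[ny][nx] > threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                sizes.append(area)
    return sizes

plate = [
    [0, 200, 200, 0, 0],
    [0, 200, 200, 0, 0],
    [0, 0, 0, 0, 255],
]
print(sorted(colony_sizes(plate)))  # -> [1, 4]
```

    Downstream review and statistics then operate on these per-colony size measurements.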

  9. Digital Biomass Accumulation Using High-Throughput Plant Phenotype Data Analysis.

    Science.gov (United States)

    Rahaman, Md Matiur; Ahsan, Md Asif; Gillani, Zeeshan; Chen, Ming

    2017-09-01

    Biomass is an important phenotypic trait in functional ecology and growth analysis. The typical methods for measuring biomass are destructive, and they require numerous individuals to be cultivated for repeated measurements. With the advent of image-based high-throughput plant phenotyping facilities, non-destructive biomass measuring methods have attempted to overcome this problem. Thus, the estimation of plant biomass of individual plants from their digital images is becoming more important. In this paper, we propose an approach to biomass estimation based on image-derived phenotypic traits. Several image-based biomass studies state that plant biomass is simply a linear function of the projected plant area in images. However, we modeled the plant volume as a function of plant area, plant compactness, and plant age to generalize the linear biomass model. The obtained results confirm the proposed model, which explains most of the observed variance during image-derived biomass estimation. Moreover, a small difference was observed between actual and estimated digital biomass, which indicates that our proposed approach can be used to estimate digital biomass accurately.
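    The generalized model described above, biomass as a linear function of projected area, compactness, and age, can be sketched as an ordinary least-squares fit. The data below are synthetic with assumed coefficient values; the paper's actual traits and fitted coefficients differ:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
area = rng.uniform(10, 100, n)          # projected plant area (arbitrary units)
compactness = rng.uniform(0.2, 0.9, n)  # e.g., area / convex-hull area
age = rng.uniform(5, 40, n)             # days after sowing
# Synthetic ground truth: biomass = 0.8*area + 12*compactness + 0.5*age + noise
biomass = 0.8 * area + 12.0 * compactness + 0.5 * age + rng.normal(0, 1.0, n)

# Design matrix with an intercept column, fitted by least squares.
X = np.column_stack([np.ones(n), area, compactness, age])
coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)
pred = X @ coef

# Fraction of variance explained by the fitted model.
r2 = 1 - np.sum((biomass - pred) ** 2) / np.sum((biomass - np.mean(biomass)) ** 2)
print(round(r2, 3))
```

    A projected-area-only model is the special case where the compactness and age coefficients are zero, which is why the three-trait model can only explain more variance.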

  10. Bacterioplankton community analysis in tilapia ponds by Illumina high-throughput sequencing.

    Science.gov (United States)

    Fan, Li Min; Barry, Kamira; Hu, Geng Dong; Meng, Shun long; Song, Chao; Wu, Wei; Chen, Jia Zhang; Xu, Pao

    2016-01-01

    Changes in the microbial community of aquaculture systems under the effects of stocking density and seasonality were investigated in tilapia ponds. Total DNA was extracted from the water samples, the 16S rRNA gene was amplified, and the bacterial community was analyzed by Illumina high-throughput sequencing, yielding 3486 OTUs from a total of 715,842 sequence reads. Based on the analysis of bacterial composition, richness, diversity, bacterial 16S rRNA gene abundance, water sample comparisons and the presence of specific bacterial taxa within three fish ponds over a 4-month period, the study found that the dominant phyla in all water samples were similar and included Proteobacteria, Cyanobacteria, Bacteroidetes, Actinobacteria, Planctomycetes and Chlorobi, distributed in different proportions across months and ponds. Seasonal changes had a more pronounced effect on the bacterioplankton community than stocking density; however, some differences between the ponds were more likely caused by feed coefficient than by stocking density. At the same time, most bacterial communities were affected by nutrient input, except the phylum Cyanobacteria, which was also affected by the feed control of tilapia.
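Diversity summaries of the kind reported here are computed from per-sample OTU read counts. As one hedged example using the common Shannon index (the counts below are invented, and this is only one of the diversity measures such studies use):

```python
import math

def shannon_diversity(otu_counts):
    """Shannon index H' = -sum(p_i * ln p_i) over OTU relative abundances."""
    total = sum(otu_counts)
    return -sum((c / total) * math.log(c / total) for c in otu_counts if c)

# Reads assigned to five OTUs in one hypothetical water sample.
sample = [500, 300, 150, 40, 10]
print(round(shannon_diversity(sample), 3))  # -> 1.167
```

A perfectly even community of k OTUs gives the maximum value ln(k), which makes the index easy to sanity-check.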

  11. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience.

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays and writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.
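Partitioning a spatial index across cluster nodes can be illustrated with a Morton (Z-order) key, a common linearization for 3-d image cuboids that keeps spatially nearby tiles close in key space. The modulo assignment below is an illustrative simplification, not necessarily the project's actual partitioning scheme:

```python
def morton3(x, y, z, bits=10):
    """Interleave the bits of (x, y, z) into a Z-order (Morton) code so
    that nearby cuboids map to nearby keys."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (3 * i)
        code |= ((y >> i) & 1) << (3 * i + 1)
        code |= ((z >> i) & 1) << (3 * i + 2)
    return code

def node_for_cuboid(x, y, z, n_nodes):
    """Assign a tile at integer cuboid coordinates to a cluster node
    by partitioning the Morton key space (here: simple modulo)."""
    return morton3(x, y, z) % n_nodes

print(morton3(1, 0, 0), morton3(0, 1, 0), morton3(0, 0, 1))  # -> 1 2 4
```

Range-partitioning the Morton key (rather than the modulo used here) would additionally keep contiguous 3-d regions on the same node, which matters for spatially local reads.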

  12. Galaxy Workflows for Web-based Bioinformatics Analysis of Aptamer High-throughput Sequencing Data

    Directory of Open Access Journals (Sweden)

    William H Thiel

    2016-01-01

    Full Text Available Development of RNA and DNA aptamers for diagnostic and therapeutic applications is a rapidly growing field. Aptamers are identified through iterative rounds of selection in a process termed SELEX (Systematic Evolution of Ligands by EXponential enrichment). High-throughput sequencing (HTS) revolutionized the modern SELEX process by identifying millions of aptamer sequences across multiple rounds of aptamer selection. However, these vast aptamer HTS datasets necessitate dedicated bioinformatics techniques. Herein, we describe a semiautomated approach to analyzing aptamer HTS datasets using the Galaxy Project, a web-based open source collection of bioinformatics tools originally developed to analyze genome, exome, and transcriptome HTS data. Using a series of Workflows created in the Galaxy webserver, we demonstrate efficient processing of aptamer HTS data and compilation of a database of unique aptamer sequences. Additional Workflows were created to characterize the abundance and persistence of aptamer sequences within a selection and to filter sequences based on these parameters. A key advantage of this approach is that the online nature of the Galaxy webserver and its graphical interface allow for the analysis of HTS data without the need to compile code or install multiple programs.
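The compilation of unique sequences and the abundance/persistence characterization that the Workflows perform can be sketched in a few lines of stdlib Python (the sequences and round structure below are invented; Galaxy performs these steps with its own tools):

```python
from collections import Counter

def aptamer_stats(rounds):
    """Given per-round lists of sequenced aptamers, report each unique
    sequence's total abundance (read count across all rounds) and
    persistence (number of selection rounds in which it appears)."""
    abundance = Counter()
    persistence = Counter()
    for reads in rounds:
        counts = Counter(reads)
        abundance.update(counts)            # add this round's read counts
        persistence.update(counts.keys())   # +1 per round of appearance
    return {seq: (abundance[seq], persistence[seq]) for seq in abundance}

rounds = [
    ["GGAUC", "GGAUC", "AAUGC"],           # round 1
    ["GGAUC", "AAUGC", "AAUGC", "CCGUA"],  # round 2
    ["GGAUC", "GGAUC", "GGAUC"],           # round 3
]
stats = aptamer_stats(rounds)
print(stats["GGAUC"])  # -> (6, 3): six reads total, seen in all three rounds
```

Filtering on these two parameters (e.g. keep sequences persisting in every round) then mirrors the filtering Workflows described in the abstract.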

  13. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    Science.gov (United States)

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are its fast analysis and short run times, making it well suited where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis, such as salting-out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4(+) QuEChERS, and (3) the need to improve the efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical capability and sample analysis throughput suitable for broad applications in life sciences, agricultural chemistry, consumer safety, and beyond.

  14. High-throughput Analysis of Large Microscopy Image Datasets on CPU-GPU Cluster Platforms.

    Science.gov (United States)

    Teodoro, George; Pan, Tony; Kurc, Tahsin M; Kong, Jun; Cooper, Lee A D; Podhorszki, Norbert; Klasky, Scott; Saltz, Joel H

    2013-05-01

    Analysis of large pathology image datasets offers significant opportunities for the investigation of disease morphology, but the resource requirements of analysis pipelines limit the scale of such studies. Motivated by a brain cancer study, we propose and evaluate a parallel image analysis application pipeline for high throughput computation of large datasets of high resolution pathology tissue images on distributed CPU-GPU platforms. To achieve efficient execution on these hybrid systems, we have built runtime support that allows us to express the cancer image analysis application as a hierarchical data processing pipeline. The application is implemented as a coarse-grain pipeline of stages, where each stage may be further partitioned into another pipeline of fine-grain operations. The fine-grain operations are efficiently managed and scheduled for computation on CPUs and GPUs using performance aware scheduling techniques along with several optimizations, including architecture aware process placement, data locality conscious task assignment, data prefetching, and asynchronous data copy. These optimizations are employed to maximize the utilization of the aggregate computing power of CPUs and GPUs and minimize data copy overheads. Our experimental evaluation shows that the cooperative use of CPUs and GPUs achieves significant improvements on top of GPU-only versions (up to 1.6×) and that the execution of the application as a set of fine-grain operations provides more opportunities for runtime optimizations and attains better performance than coarser-grain, monolithic implementations used in other works. An implementation of the cancer image analysis pipeline using the runtime support was able to process an image dataset consisting of 36,848 4Kx4K-pixel image tiles (about 1.8TB uncompressed) in less than 4 minutes (150 tiles/second) on 100 nodes of a state-of-the-art hybrid cluster system.

  15. High-throughput analysis of total nitrogen content that replaces the classic Kjeldahl method.

    Science.gov (United States)

    Yasuhara, T; Nokihara, K

    2001-10-01

    A high-throughput method for determination of total nitrogen content has been developed. The method involves decomposition of samples, followed by trapping and quantitative colorimetric determination of the resulting ammonia. The present method is rapid, facile, and economical. Thus, it can replace the classic Kjeldahl method through its higher efficiency for determining multiple samples. Compared to the classic method, the present method is economical and environmentally friendly. Based on the present method, a novel reactor was constructed to realize routine high-throughput analyses of multiple samples such as those found for pharmaceutical materials, foods, and/or excrements.

  16. WholePathwayScope: a comprehensive pathway-based analysis tool for high-throughput data

    Directory of Open Access Journals (Sweden)

    Cohen Jonathan C

    2006-01-01

    Full Text Available Abstract Background Analysis of high-throughput (HTP) data, such as microarray and proteomics data, has provided a powerful methodology to study patterns of gene regulation at genome scale. A major unresolved problem in the post-genomic era is to assemble the large amounts of data generated into a meaningful biological context. We have developed a comprehensive software tool, WholePathwayScope (WPS), for deriving biological insights from analysis of HTP data. Results WPS extracts gene lists with shared biological themes through color cue templates. WPS statistically evaluates global functional category enrichment of gene lists and pathway-level pattern enrichment of data. WPS incorporates well-known biological pathways from KEGG (Kyoto Encyclopedia of Genes and Genomes) and Biocarta, GO (Gene Ontology) terms, as well as user-defined pathways or relevant gene clusters or groups, and explores gene-term relationships within the derived gene-term association networks (GTANs). WPS simultaneously compares multiple datasets within biological contexts, either as pathways or as association networks. WPS also integrates the Genetic Association Database and the Partial MedGene Database for disease-association information. We have used this program to analyze and compare microarray and proteomics datasets derived from a variety of biological systems. Application examples demonstrated the capacity of WPS to significantly facilitate the analysis of HTP data for integrative discovery. Conclusion This tool represents a pathway-based platform for discovery integration to maximize analysis power. The tool is freely available at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
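The abstract does not specify WPS's enrichment statistic; a standard choice for functional-category enrichment of a gene list is the one-sided hypergeometric test, sketched here (the pathway and gene counts in the example are arbitrary):

```python
from math import comb

def enrichment_p(hits_in_path, path_size, hits_total, universe):
    """One-sided hypergeometric P(X >= hits_in_path): the probability of
    drawing at least this many pathway genes when sampling hits_total
    genes at random from a universe containing path_size pathway members."""
    return sum(
        comb(path_size, k) * comb(universe - path_size, hits_total - k)
        / comb(universe, hits_total)
        for k in range(hits_in_path, min(path_size, hits_total) + 1)
    )

# Hypothetical screen: 200 hit genes out of a 20,000-gene universe,
# 8 of which fall in a 100-gene pathway.
p = enrichment_p(8, 100, 200, 20000)
print(f"{p:.2e}")
```

Tools like WPS typically also correct such per-pathway p-values for multiple testing, a step omitted from this sketch.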

  17. Multiplex high-throughput gene mutation analysis in acute myeloid leukemia.

    Science.gov (United States)

    Dunlap, Jennifer; Beadling, Carol; Warrick, Andrea; Neff, Tanaya; Fleming, William H; Loriaux, Marc; Heinrich, Michael C; Kovacsovics, Tibor; Kelemen, Katalin; Leeborg, Nicky; Gatter, Ken; Braziel, Rita M; Press, Richard; Corless, Christopher L; Fan, Guang

    2012-12-01

    Classification of acute myeloid leukemia increasingly depends on genetic analysis. However, the number of known mutations in acute myeloid leukemia is expanding rapidly. Therefore, we tested a high-throughput screening method for acute myeloid leukemia mutation analysis using a multiplex mass spectrometry-based approach. To our knowledge, this is the first reported application of this approach to genotype leukemias in a clinical setting. One hundred seven acute myeloid leukemia cases were screened for mutations using a panel that covers 344 point mutations across 31 genes known to be associated with leukemia. The analysis was performed by multiplex polymerase chain reaction for mutations in genes of interest followed by primer extension reactions. Products were analyzed on a Sequenom MassARRAY system (San Diego, CA). The multiplex panel yielded mutations in 58% of acute myeloid leukemia cases with normal cytogenetics and 21% of cases with abnormal cytogenetics. Cytogenetics and routine polymerase chain reaction-based screening of NPM1, CEBPA, FLT3-ITD, and KIT was also performed on a subset of cases. When combined with the results of these standard polymerase chain reaction-based tests, the mutation frequency reached 78% in cases with normal cytogenetics. Of these, 42% harbored multiple mutations primarily involving NPM1 with NRAS, KRAS, CEBPA, PTPN11, IDH1, or FLT3. In contrast, cases with abnormal cytogenetics rarely harbored more than 1 mutation (1.5%), suggesting different underlying biology. This study demonstrates the feasibility and utility of broad-based mutation profiling of acute myeloid leukemia in a clinical setting. This approach will be helpful in defining prognostic subgroups of acute myeloid leukemia and contribute to the selection of patients for enrollment into trials with novel inhibitors.

  18. ZebraZoom: an automated program for high-throughput behavioral analysis and categorization.

    Science.gov (United States)

    Mirat, Olivier; Sternberg, Jenna R; Severi, Kristen E; Wyart, Claire

    2013-01-01

    The zebrafish larva stands out as an emergent model organism for translational studies involving gene or drug screening thanks to its size, genetics, and permeability. At the larval stage, locomotion occurs in short episodes punctuated by periods of rest. Although phenotyping behavior is a key component of large-scale screens, it has not yet been automated in this model system. We developed ZebraZoom, a program to automatically track larvae and identify maneuvers for many animals performing discrete movements. Our program detects each episodic movement and extracts large-scale statistics on motor patterns to produce a quantification of the locomotor repertoire. We used ZebraZoom to identify motor defects induced by a glycinergic receptor antagonist. The analysis of the blind mutant atoh7 revealed small locomotor defects associated with the mutation. Using multiclass supervised machine learning, ZebraZoom categorized all episodes of movement for each larva into one of three possible maneuvers: slow forward swim, routine turn, and escape. ZebraZoom reached 91% accuracy for categorization of stereotypical maneuvers that four independent experimenters unanimously identified. For all maneuvers in the data set, ZebraZoom agreed with four experimenters in 73.2-82.5% of cases. We modeled the series of maneuvers performed by larvae as Markov chains and observed that larvae often repeated the same maneuvers within a group. When analyzing subsequent maneuvers performed by different larvae, we found that larva-larva interactions occurred as series of escapes. Overall, ZebraZoom reached the level of precision found in manual analysis but accomplished tasks in a high-throughput format necessary for large screens.
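Modeling the series of maneuvers as Markov chains amounts to estimating a transition matrix from the categorized episodes; a minimal sketch with an invented bout sequence (ZebraZoom's own implementation is not reproduced here):

```python
from collections import Counter, defaultdict

MANEUVERS = ("slow_swim", "routine_turn", "escape")

def transition_matrix(episodes):
    """Estimate first-order Markov transition probabilities from a
    sequence of categorized movement episodes."""
    pair_counts = defaultdict(Counter)
    for prev, nxt in zip(episodes, episodes[1:]):
        pair_counts[prev][nxt] += 1
    return {
        prev: {m: counts[m] / sum(counts.values()) for m in MANEUVERS}
        for prev, counts in pair_counts.items()
    }

# Invented bout sequence for one larva.
bouts = ["slow_swim", "slow_swim", "routine_turn", "slow_swim",
         "slow_swim", "escape", "escape", "slow_swim"]
T = transition_matrix(bouts)
print(T["slow_swim"])  # probabilities of the next maneuver after a slow swim
```

A tendency to repeat the same maneuver, as the abstract reports, shows up as large diagonal entries in this matrix.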

  19. ZebraZoom: an automated program for high-throughput behavioral analysis and categorization

    Directory of Open Access Journals (Sweden)

    Olivier eMirat

    2013-06-01

    Full Text Available The zebrafish larva stands out as an emergent model organism for translational studies involving gene or drug screening thanks to its size, genetics, and permeability. At the larval stage, locomotion occurs in short episodes punctuated by periods of rest. Although phenotyping behavior is a key component of large-scale screens, it has not yet been automated in this model system. We developed ZebraZoom, a program to automatically track larvae and identify maneuvers for many animals performing discrete movements. Our program detects each episodic movement and extracts large-scale statistics on motor patterns to produce a quantification of the locomotor repertoire. We used ZebraZoom to identify motor defects induced by a glycinergic receptor antagonist. The analysis of the blind mutant atoh7 (lak) revealed small locomotor defects associated with the mutation. Using multiclass supervised machine learning, ZebraZoom categorized all episodes of movement for each larva into one of three possible maneuvers: slow forward swim, routine turn, and escape. ZebraZoom reached 91% accuracy for categorization of stereotypical maneuvers that four independent experimenters unanimously identified. For all maneuvers in the data set, ZebraZoom agreed with four independent experimenters in 73.2-82.5% of cases. We modeled the series of maneuvers performed by larvae as Markov chains and observed that larvae often repeated the same maneuvers within a group. When analyzing subsequent maneuvers performed by different larvae, we found that larva-larva interactions occurred as series of escapes. Overall, ZebraZoom reached the level of precision found in manual analysis but accomplished tasks in a high-throughput format necessary for large screens.

  20. Ontology-based meta-analysis of global collections of high-throughput public data.

    Directory of Open Access Journals (Sweden)

    Ilya Kupershmidt

    Full Text Available BACKGROUND: The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. METHODOLOGY/RESULTS: We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. CONCLUSIONS: Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.

  1. WormSizer: high-throughput analysis of nematode size and shape.

    Directory of Open Access Journals (Sweden)

    Brad T Moore

    Full Text Available The fundamental phenotypes of growth rate, size and morphology are the result of complex interactions between genotype and environment. We developed a high-throughput software application, WormSizer, which computes size and shape of nematodes from brightfield images. Existing methods for estimating volume either coarsely model the nematode as a cylinder or assume the worm shape or opacity is invariant. Our estimate is more robust to changes in morphology or optical density as it only assumes radial symmetry. This open source software is written as a plugin for the well-known image-processing framework Fiji/ImageJ. It may therefore be extended easily. We evaluated the technical performance of this framework, and we used it to analyze growth and shape of several canonical Caenorhabditis elegans mutants in a developmental time series. We confirm quantitatively that a Dumpy (Dpy) mutant is short and fat and that a Long (Lon) mutant is long and thin. We show that daf-2 insulin-like receptor mutants are larger than wild-type upon hatching but grow slowly, and WormSizer can distinguish dauer larvae from normal larvae. We also show that a Small (Sma) mutant is actually smaller than wild-type at all stages of larval development. WormSizer works with Uncoordinated (Unc) and Roller (Rol) mutants as well, indicating that it can be used with mutants despite their behavioral phenotypes. We used our complete data set to perform a power analysis, giving users a sense of how many images are needed to detect different effect sizes. Our analysis confirms and extends existing phenotypic characterization of well-characterized mutants, demonstrating the utility and robustness of WormSizer.
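The volume estimate that assumes only radial symmetry can be sketched by summing short cylindrical slices along the worm's centerline, using the locally measured width as the diameter (the widths and spacing below are invented; WormSizer itself is a Fiji/ImageJ plugin, not this code):

```python
import math

def worm_volume(widths, segment_length):
    """Estimate worm volume assuming radial symmetry only: treat each
    centerline segment as a short cylinder whose diameter is the local
    measured width, and sum pi * r^2 * ds along the body."""
    return sum(math.pi * (w / 2.0) ** 2 * segment_length for w in widths)

# Widths (um) sampled every 10 um along a hypothetical larva's centerline.
widths = [8, 14, 18, 20, 20, 18, 14, 8]
print(worm_volume(widths, 10.0))  # volume in cubic micrometers
```

Because only radial symmetry is assumed, the estimate tracks local width changes and needs no global cylinder or invariant-shape assumption.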

  2. Comprehensive analysis of high-throughput screens with HiTSeekR

    DEFF Research Database (Denmark)

    List, Markus; Schmidt, Steffen; Christiansen, Helle;

    2016-01-01

    High-throughput screening (HTS) is an indispensable tool for drug (target) discovery that currently lacks user-friendly software tools for the robust identification of putative hits from HTS experiments and for the interpretation of these findings in the context of systems biology. We developed HiTSeekR...

  3. Application of high-throughput technologies to a structural proteomics-type analysis of Bacillus anthracis

    NARCIS (Netherlands)

    Au, K.; Folkers, G.E.; Kaptein, R.

    2006-01-01

    A collaborative project between two Structural Proteomics In Europe (SPINE) partner laboratories, York and Oxford, aimed at high-throughput (HTP) structure determination of proteins from Bacillus anthracis, the aetiological agent of anthrax and a biomedically important target, is described. Based up

  4. Cancer panomics: computational methods and infrastructure for integrative analysis of cancer high-throughput "omics" data

    DEFF Research Database (Denmark)

    Brunak, Søren; De La Vega, Francisco M.; Rätsch, Gunnar

    2014-01-01

    Targeted cancer treatment is becoming the goal of newly developed oncology medicines and has already shown promise in some spectacular cases such as the case of BRAF kinase inhibitors in BRAF-mutant (e.g. V600E) melanoma. These developments are driven by the advent of high-throughput sequencing...

  5. High-throughput semiquantitative analysis of insertional mutations in heterogeneous tumors

    NARCIS (Netherlands)

    Koudijs, M.J.; Klijn, C.; van der Weyden, L.; Kool, J.; ten Hoeve, J.; Sie, D.; Prasetyanti, P.R.; Schut, E.; Kas, S.; Whipp, T.; Cuppen, E.; Wessels, L.; Adams, D.J.; Jonkers, J.

    2011-01-01

    Retroviral and transposon-based insertional mutagenesis (IM) screens are widely used for cancer gene discovery in mice. Exploiting the full potential of IM screens requires methods for high-throughput sequencing and mapping of transposon and retroviral insertion sites. Current protocols are based on

  6. Evaluation of polymeric gene delivery nanoparticles by nanoparticle tracking analysis and high-throughput flow cytometry.

    Science.gov (United States)

    Shmueli, Ron B; Bhise, Nupura S; Green, Jordan J

    2013-03-01

    Non-viral gene delivery using polymeric nanoparticles has emerged as an attractive approach for gene therapy to treat genetic diseases(1) and as a technology for regenerative medicine(2). Unlike viruses, which have significant safety issues, polymeric nanoparticles can be designed to be non-toxic, non-immunogenic, non-mutagenic, easier to synthesize, chemically versatile, capable of carrying larger nucleic acid cargo, and biodegradable and/or environmentally responsive. Cationic polymers self-assemble with negatively charged DNA via electrostatic interaction to form complexes on the order of 100 nm that are commonly termed polymeric nanoparticles. Examples of biomaterials used to form nanoscale polycationic gene delivery nanoparticles include polylysine, polyphosphoesters, poly(amidoamine)s and polyethylenimine (PEI), which is a non-degradable off-the-shelf cationic polymer commonly used for nucleic acid delivery(1,3). Poly(beta-amino ester)s (PBAEs) are a newer class of cationic polymers(4) that are hydrolytically degradable(5,6) and have been shown to be effective at gene delivery to hard-to-transfect cell types such as human retinal endothelial cells (HRECs)(7), mouse mammary epithelial cells(8), human brain cancer cells(9) and macrovascular (human umbilical vein, HUVECs) endothelial cells(10). A new protocol to characterize polymeric nanoparticles utilizing nanoparticle tracking analysis (NTA) is described. In this approach, both the particle size distribution and the distribution of the number of plasmids per particle are obtained(11). In addition, a high-throughput 96-well plate transfection assay for rapid screening of the transfection efficacy of polymeric nanoparticles is presented. In this protocol, poly(beta-amino ester)s (PBAEs) are used as model polymers and human retinal endothelial cells (HRECs) are used as model human cells. This protocol can be easily adapted to evaluate any polymeric nanoparticle and any cell type of interest in a multi

  7. Identification of microRNAs from Eugenia uniflora by high-throughput sequencing and bioinformatics analysis.

    Directory of Open Access Journals (Sweden)

    Frank Guzman

    Full Text Available BACKGROUND: microRNAs, or miRNAs, are small non-coding regulatory RNAs that play important roles in the regulation of gene expression at the post-transcriptional level by targeting mRNAs for degradation or inhibiting protein translation. Eugenia uniflora is a plant native to tropical America with pharmacological and ecological importance, and there have been no previous studies concerning its gene expression and regulation. To date, no miRNAs have been reported in Myrtaceae species. RESULTS: Small RNA and RNA-seq libraries were constructed to identify miRNAs and pre-miRNAs in Eugenia uniflora. Solexa technology was used to perform high-throughput sequencing of the libraries, and the data obtained were analyzed using bioinformatics tools. From 14,489,131 small RNA clean reads, we obtained 1,852,722 mature miRNA sequences representing 45 conserved families that have been identified in other plant species. Further analysis using contigs assembled from RNA-seq allowed the prediction of secondary structures of 25 known and 17 novel pre-miRNAs. The expression of twenty-seven identified miRNAs was also validated using RT-PCR assays. Potential targets were predicted for the most abundant mature miRNAs in the identified pre-miRNAs based on sequence homology. CONCLUSIONS: This study is the first large-scale identification of miRNAs and their potential targets from a species of the Myrtaceae family without genomic sequence resources. Our study provides more information about the evolutionary conservation of the regulatory network of miRNAs in plants and highlights species-specific miRNAs.

  8. Utility of lab-on-a-chip technology for high-throughput nucleic acid and protein analysis

    DEFF Research Database (Denmark)

    Hawtin, Paul; Hardern, Ian; Wittig, Rainer

    2005-01-01

    On-chip electrophoresis can provide size separations of nucleic acids and proteins similar to more traditional slab gel electrophoresis. Lab-on-a-chip (LoaC) systems utilize on-chip electrophoresis in conjunction with sizing calibration, sensitive detection schemes, and sophisticated data analysis ... to achieve rapid analysis times ... LoaC systems to enable and augment systems biology investigations. RNA quality, as assessed by an RNA integrity number score, is compared to existing quality control (QC) measurements. High-throughput DNA analysis of multiplex PCR ... samples is used to stratify gene sets for disease discovery. Finally, the applicability of a high-throughput LoaC system for assessing protein purification is demonstrated. The improvements in workflow processes, speed of analysis, data accuracy and reproducibility, and automated data analysis

  9. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    Full Text Available BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis without the usual limitations on sample capacity and reagent consumption. However, non-commercial and user-friendly software for the management and analysis of these data is not available. RESULTS: Recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is a freely available software tool that permits the automated analysis and visualization of high-throughput qPCR data. A detailed manual and a demo experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
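Relative quantification with standard curves rests on the linear relationship Ct = slope * log10(quantity) + intercept fitted from a dilution series; a hedged sketch of the arithmetic follows (the dilution series and Ct values are invented, and this is not DAG Expression's actual code):

```python
def fit_standard_curve(log10_qty, cts):
    """Least-squares line Ct = slope * log10(quantity) + intercept,
    fitted from an n-fold dilution series of a calibrator sample."""
    n = len(cts)
    mx = sum(log10_qty) / n
    my = sum(cts) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(log10_qty, cts)) \
            / sum((x - mx) ** 2 for x in log10_qty)
    return slope, my - slope * mx

def quantity(ct, slope, intercept):
    """Interpolate an unknown sample's quantity from its Ct."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical 10-fold dilution series of a calibrator.
dilutions = [5, 4, 3, 2, 1]            # log10(starting quantity)
cts = [15.1, 18.4, 21.7, 25.0, 28.3]   # one Ct per dilution
slope, icpt = fit_standard_curve(dilutions, cts)

# Normalize the gene of interest against a single reference gene.
target = quantity(24.0, slope, icpt)     # gene of interest
reference = quantity(22.5, slope, icpt)  # reference gene
print(round(target / reference, 3))      # normalized relative expression
```

A slope near -3.32 indicates ~100% amplification efficiency, which is one of the quality checks such software typically reports.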

  10. Micro-scaled high-throughput digestion of plant tissue samples for multi-elemental analysis

    Directory of Open Access Journals (Sweden)

    Husted Søren

    2009-09-01

    Full Text Available Abstract Background Quantitative multi-elemental analysis by inductively coupled plasma (ICP) spectrometry depends on a complete digestion of solid samples. However, fast and thorough sample digestion is a challenging analytical task which constitutes a bottleneck in modern multi-elemental analysis. Additional obstacles may be that sample quantities are limited and elemental concentrations low. In such cases, digestion in small volumes with minimum dilution and contamination is required in order to obtain high-accuracy data. Results We have developed a micro-scaled microwave digestion procedure and optimized it for accurate elemental profiling of plant materials (1-20 mg dry weight). A commercially available 64-position rotor with 5 ml disposable glass vials, originally designed for microwave-based parallel organic synthesis, was used as a platform for the digestion. The novel micro-scaled method was successfully validated by the use of various certified reference materials (CRMs) with matrices rich in starch, lipid or protein. When the micro-scaled digestion procedure was applied to single rice grains or small batches of Arabidopsis seeds (1 mg, corresponding to approximately 50 seeds), the obtained elemental profiles closely matched those obtained by conventional analysis using digestion in large-volume vessels. Accumulated elemental contents derived from separate analyses of rice grain fractions (aleurone, embryo and endosperm) closely matched the total content obtained by analysis of the whole rice grain. Conclusion A high-throughput micro-scaled method has been developed which enables digestion of small quantities of plant samples for subsequent elemental profiling by ICP spectrometry. The method constitutes a valuable tool for screening of mutants and transformants. In addition, the method facilitates studies of the distribution of essential trace elements between and within plant organs, which is relevant for, e.g., breeding programmes aiming at

  11. High-throughput genome editing and phenotyping facilitated by high resolution melting curve analysis.

    Directory of Open Access Journals (Sweden)

    Holly R Thomas

    facilitate future high-throughput mutation generation and analysis needed to establish mutants in all genes of an organism.

  12. Comparative study of machine-learning and chemometric tools for analysis of in-vivo high-throughput screening data.

    Science.gov (United States)

    Simmons, Kirk; Kinney, John; Owens, Aaron; Kleier, Dan; Bloch, Karen; Argentar, Dave; Walsh, Alicia; Vaidyanathan, Ganesh

    2008-08-01

    High-throughput screening (HTS) has become a central tool of many pharmaceutical and crop-protection discovery operations. If screening is carried out at the level of the intact organism, as is commonly done in crop protection, this strategy has the potential of uncovering completely new mechanisms of action. The challenge in running a cost-effective HTS operation is to identify ways to improve the overall success rate in discovering new biologically active compounds. To this end, we describe our efforts directed at making full use of the data stream arising from HTS. This paper describes a comparative study in which several machine-learning and chemometric methodologies were used to develop classifiers on the same data sets derived from in vivo HTS campaigns, and their predictive performances were compared in terms of false-negative and false-positive error profiles.
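The false-negative/false-positive comparison described in this record reduces to summarizing each classifier's confusion matrix; a minimal sketch (labels and data are hypothetical, not from the study):

```python
def error_profile(y_true, y_pred):
    """Summarize a binary classifier's errors as false-positive and
    false-negative rates, the two costs weighed in HTS triage."""
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    n_neg = sum(1 for t in y_true if t == 0)
    n_pos = sum(1 for t in y_true if t == 1)
    return {"fpr": fp / n_neg if n_neg else 0.0,
            "fnr": fn / n_pos if n_pos else 0.0}

# 1 = active compound, 0 = inactive
truth     = [1, 1, 1, 0, 0, 0, 0, 1]
predicted = [1, 0, 1, 0, 1, 0, 0, 1]
print(error_profile(truth, predicted))  # {'fpr': 0.25, 'fnr': 0.25}
```

Comparing several classifiers then amounts to ranking their (fpr, fnr) pairs on a common held-out set.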

  13. High-throughput proteomic analysis of human infiltrating ductal carcinoma of the breast.

    Science.gov (United States)

    Somiari, Richard I; Sullivan, Anthony; Russell, Stephen; Somiari, Stella; Hu, Hai; Jordan, Rick; George, Alisha; Katenhusen, Richard; Buchowiecka, Alicja; Arciero, Cletus; Brzeski, Henry; Hooke, Jeff; Shriver, Craig

    2003-10-01

    Large-scale proteomics will play a critical role in the rapid display, identification and validation of new protein targets, and in the elucidation of the underlying molecular events that are associated with disease development, progression and severity. However, because the proteome of most organisms is significantly more complex than the genome, the comprehensive analysis of protein expression changes will require an analytical effort beyond the capacity of standard laboratory equipment. We describe the first high-throughput proteomic analysis of human breast infiltrating ductal carcinoma (IDCA) using OCT (optimal cutting temperature)-embedded biopsies, two-dimensional difference gel electrophoresis (2-D DIGE) technology and a fully automated spot-handling workstation. Total proteins from four breast IDCAs (Stage I, IIA, IIB and IIIA) were individually compared to protein from non-neoplastic tissue obtained from a female donor with no personal or family history of breast cancer. We detected differences in protein abundance that ranged from 14.8% in stage I IDCA versus normal, to 30.6% in stage IIB IDCA versus normal. A total of 524 proteins that showed a three-fold or greater difference in abundance between IDCA and normal tissue were picked, processed and identified by mass spectrometry. Out of the proteins picked, approximately 80% were unambiguously assigned identities by matrix-assisted laser desorption/ionization-time of flight mass spectrometry or liquid chromatography-tandem mass spectrometry in the first pass. Bioinformatics tools were also used to mine databases to determine whether the identified proteins are involved in important pathways and/or interact with other proteins. Gelsolin, vinculin, lumican, alpha-1-antitrypsin, heat shock protein-60, cytokeratin-18, transferrin, enolase-1 and beta-actin showed differential abundance between IDCA and normal tissue, but the trend was not consistent in all samples. Out of the proteins with database hits, only heat shock

  14. Microfluidic Plastic Devices for Single-use Applications in High-Throughput Screening and DNA-Analysis

    OpenAIRE

    Gerlach, Andreas; Knebel, Günther; Guber, A. E.; Heckele, M.; Herrmann, D; Muslija, A.; Schaller, T.

    2001-01-01

    Microfluidic devices fabricated by mass production offer an immense potential for applications such as high-throughput drug screening, clinical diagnostics and gene analysis [1]. The low unit production costs of plastic substrates make it possible to produce single-use devices, eliminating the need for cleaning and reuse [2]. Microfluidic devices can be fabricated by microtechnical processes in combination with plastic molding techniques [3]. Basically, replication...

  15. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States); Gagnon, David [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Gjoerup, Ole [Molecular Oncology Research Institute, Tufts Medical Center, Boston, MA 02111 (United States); Archambault, Jacques [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Bullock, Peter A., E-mail: Peter.Bullock@tufts.edu [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States)

    2014-11-15

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF-1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitors of JCV DNA replication.

  16. web cellHTS2: a web-application for the analysis of high-throughput screening data.

    Science.gov (United States)

    Pelz, Oliver; Gilsdorf, Moritz; Boutros, Michael

    2010-04-12

    The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
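The normalization options that tools like cellHTS2 offer can be illustrated with per-plate z-scores, one common choice; this is a generic sketch, not cellHTS2's implementation:

```python
from statistics import mean, stdev

def zscore_plate(raw):
    """Per-plate z-score normalization: express each well's readout in
    standard deviations from the plate mean, so plates measured on
    different days become comparable (minimal illustrative version)."""
    m, s = mean(raw), stdev(raw)
    return [(x - m) / s for x in raw]

plate = [100, 102, 98, 250, 101, 99]   # one well is a strong hit
z = zscore_plate(plate)
hits = [i for i, v in enumerate(z) if abs(v) > 2]
print(hits)  # index of the outlier well
```

Hit lists such as the ranked list in the report above are then built by thresholding or ranking these normalized scores across all plates.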

  17. Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping

    Science.gov (United States)

    Klukas, Christian; Chen, Dijun; Pape, Jean-Michel

    2014-01-01

    High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present the Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input data and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment that contains 33 maize (Zea mays ‘Fernandez’) plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy (up to 0.98 and 0.95, respectively). In summary, IAP provides a comprehensive set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. PMID:24760818

  18. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to materials and materials chemistry development. The facility allows scientists...

  19. Investigation of ternary subsystems of superalloys by thin-film combinatorial synthesis and high-throughput analysis

    Directory of Open Access Journals (Sweden)

    König Dennis

    2014-01-01

    Full Text Available A Co-Ti-W thin film materials library was fabricated by magnetron sputtering. By using automated high-throughput measurement techniques (resistance mapping, automated XRD measurements and cluster analysis), a previously unknown phase region was revealed. The existence region of the new ternary phase is close to the composition Co60Ti15W25. In order to transfer the results from thin film to bulk material, a bulk sample was prepared by arc melting and subsequent heat treatment. Scanning electron microscopy and chemical micro-analysis data confirm that a previously unknown ternary phase exists in the system Co-Ti-W.

  20. The simple fool's guide to population genomics via RNA-Seq: An introduction to high-throughput sequencing data analysis

    DEFF Research Database (Denmark)

    De Wit, P.; Pespeni, M.H.; Ladner, J.T.;

    2012-01-01

    to Population Genomics via RNA-seq' (SFG), a document intended to serve as an easy-to-follow protocol, walking a user through one example of high-throughput sequencing data analysis of nonmodel organisms. It is by no means an exhaustive protocol, but rather serves as an introduction to the bioinformatic methods......://sfg.stanford.edu, that includes detailed protocols for data processing and analysis, along with a repository of custom-made scripts and sample files. Steps included in the SFG range from tissue collection to de novo assembly, blast annotation, alignment, gene expression, functional enrichment, SNP detection, principal components...

  1. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, Jacqueline Alexandra [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which helps avoid the data glut associated with high-throughput rapid analysis.

  2. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which helps avoid the data glut associated with high-throughput rapid analysis.

  3. Analysis of RNA folding and ligand binding by conventional and high-throughput calorimetry.

    Science.gov (United States)

    Sokoloski, Joshua E; Bevilacqua, Philip C

    2012-01-01

    Noncoding RNAs serve myriad functions in the cell, but their biophysical properties are not well understood. Calorimetry offers direct and label-free means for characterizing the ligand-binding and thermostability properties of these RNAs. We apply two main types of calorimetry--isothermal titration calorimetry (ITC) and differential scanning calorimetry (DSC)--to the characterization of these functional RNA molecules. ITC can describe ligand binding in terms of stoichiometry, affinity, and heat (enthalpy), while DSC can provide RNA stability in terms of heat capacity, melting temperature, and folding enthalpy. Here, we offer detailed experimental protocols for studying such RNA systems with commercially available conventional and high-throughput ITC and DSC instruments.

  4. Optimized methods for high-throughput analysis of hair samples for American black bears (Ursus americanus)

    Directory of Open Access Journals (Sweden)

    Thea V Kristensen

    2011-06-01

    Full Text Available Noninvasive sampling has revolutionized the study of species that are difficult or dangerous to study using traditional methods. Early studies were often confined to small populations as genotyping large numbers of samples was prohibitively costly and labor intensive. Here we describe optimized protocols designed to reduce the costs and effort required for microsatellite genotyping and sex determination for American black bears (Ursus americanus). We redesigned primers for six microsatellite loci, designed novel primers for the amelogenin gene for genetic determination of sex, and optimized conditions for a nine-locus multiplex PCR. Our high-throughput methods will enable researchers to include larger sample sizes in studies of black bears, providing data in a timely fashion that can be used to inform population management.

  5. A High-Throughput Method for the Analysis of Larval Developmental Phenotypes in Caenorhabditis elegans.

    Science.gov (United States)

    Olmedo, María; Geibel, Mirjam; Artal-Sanz, Marta; Merrow, Martha

    2015-10-01

    Caenorhabditis elegans postembryonic development consists of four discrete larval stages separated by molts. Typically, the speed of progression through these larval stages is investigated by visual inspection of the molting process. Here, we describe an automated method to monitor the timing of these discrete phases of C. elegans maturation, from the first larval stage through adulthood, using bioluminescence. The method was validated with a lin-42 mutant strain that shows delayed development relative to wild-type animals and with a daf-2 mutant that shows an extended second larval stage. This new method is inherently high-throughput and will finally allow dissecting the molecular machinery governing the speed of the developmental clock, which has so far been hampered by the lack of a method suitable for genetic screens.

  6. Analysis of JC virus DNA replication using a quantitative and high-throughput assay.

    Science.gov (United States)

    Shin, Jong; Phelan, Paul J; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert; Gagnon, David; Gjoerup, Ole; Archambault, Jacques; Bullock, Peter A

    2014-11-01

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication.

  7. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Directory of Open Access Journals (Sweden)

    Craig A Gedye

    Full Text Available Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are, however, currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell-preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell

  8. DAVID Knowledgebase: a gene-centered database integrating heterogeneous gene annotation resources to facilitate high-throughput gene functional analysis

    Directory of Open Access Journals (Sweden)

    Baseler Michael W

    2007-11-01

    Full Text Available Abstract Background Due to the complex and distributed nature of biological research, our current biological knowledge is spread over many redundant annotation databases maintained by many independent groups. Analysts usually need to visit many of these bioinformatics databases in order to integrate comprehensive annotation information for their genes, which becomes one of the bottlenecks, particularly for the analytic task associated with a large gene list. Thus, a highly centralized and ready-to-use gene-annotation knowledgebase is in demand for high throughput gene functional analysis. Description The DAVID Knowledgebase is built around the DAVID Gene Concept, a single-linkage method to agglomerate tens of millions of gene/protein identifiers from a variety of public genomic resources into DAVID gene clusters. The grouping of such identifiers improves the cross-reference capability, particularly across NCBI and UniProt systems, enabling more than 40 publicly available functional annotation sources to be comprehensively integrated and centralized by the DAVID gene clusters. The simple, pair-wise, text format files which make up the DAVID Knowledgebase are freely downloadable for various data analysis uses. In addition, a well organized web interface allows users to query different types of heterogeneous annotations in a high-throughput manner. Conclusion The DAVID Knowledgebase is designed to facilitate high throughput gene functional analysis. For a given gene list, it not only provides the quick accessibility to a wide range of heterogeneous annotation data in a centralized location, but also enriches the level of biological information for an individual gene. Moreover, the entire DAVID Knowledgebase is freely downloadable or searchable at http://david.abcc.ncifcrf.gov/knowledgebase/.
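The single-linkage agglomeration behind the DAVID Gene Concept can be sketched as a union-find pass over shared cross-reference links: any two identifiers joined by a link end up in one cluster. The identifiers below are hypothetical and this is not DAVID's code, only an illustration of the idea:

```python
def agglomerate(pairs):
    """Group identifiers by single linkage using union-find: each
    (id_a, id_b) pair merges the clusters containing id_a and id_b."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving keeps trees flat
            x = parent[x]
        return x

    for a, b in pairs:
        parent[find(a)] = find(b)  # union the two clusters

    clusters = {}
    for x in parent:
        clusters.setdefault(find(x), set()).add(x)
    return sorted(sorted(c) for c in clusters.values())

# Hypothetical cross-references: RefSeq <-> UniProt <-> Ensembl
links = [("NM_0001", "P12345"), ("P12345", "ENSG01"), ("NM_0002", "Q67890")]
print(agglomerate(links))
# [['ENSG01', 'NM_0001', 'P12345'], ['NM_0002', 'Q67890']]
```

At DAVID's scale the same pass runs over tens of millions of identifier pairs, which is why a near-linear union-find structure is the natural fit.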

  9. Rapid High-throughput Species Identification of Botanical Material Using Direct Analysis in Real Time High Resolution Mass Spectrometry.

    Science.gov (United States)

    Lesiak, Ashton D; Musah, Rabi A

    2016-10-02

    We demonstrate that direct analysis in real time-high resolution mass spectrometry can be used to produce mass spectral profiles of botanical material, and that these chemical fingerprints can be used for plant species identification. The mass spectral data can be acquired rapidly and in a high throughput manner without the need for sample extraction, derivatization or pH adjustment steps. The use of this technique bypasses challenges presented by more conventional techniques including lengthy chromatography analysis times and resource intensive methods. The high throughput capabilities of the direct analysis in real time-high resolution mass spectrometry protocol, coupled with multivariate statistical analysis processing of the data, provide not only class characterization of plants, but also yield species and varietal information. Here, the technique is demonstrated with two psychoactive plant products, Mitragyna speciosa (Kratom) and Datura (Jimsonweed), which were subjected to direct analysis in real time-high resolution mass spectrometry followed by statistical analysis processing of the mass spectral data. The application of these tools in tandem enabled the plant materials to be rapidly identified at the level of variety and species.

  10. Rapid High-throughput Species Identification of Botanical Material Using Direct Analysis in Real Time High Resolution Mass Spectrometry

    Science.gov (United States)

    Lesiak, Ashton D.; Musah, Rabi A.

    2016-01-01

    We demonstrate that direct analysis in real time-high resolution mass spectrometry can be used to produce mass spectral profiles of botanical material, and that these chemical fingerprints can be used for plant species identification. The mass spectral data can be acquired rapidly and in a high throughput manner without the need for sample extraction, derivatization or pH adjustment steps. The use of this technique bypasses challenges presented by more conventional techniques including lengthy chromatography analysis times and resource intensive methods. The high throughput capabilities of the direct analysis in real time-high resolution mass spectrometry protocol, coupled with multivariate statistical analysis processing of the data, provide not only class characterization of plants, but also yield species and varietal information. Here, the technique is demonstrated with two psychoactive plant products, Mitragyna speciosa (Kratom) and Datura (Jimsonweed), which were subjected to direct analysis in real time-high resolution mass spectrometry followed by statistical analysis processing of the mass spectral data. The application of these tools in tandem enabled the plant materials to be rapidly identified at the level of variety and species. PMID:27768072

  11. Quantitative digital image analysis of chromogenic assays for high throughput screening of alpha-amylase mutant libraries.

    Science.gov (United States)

    Shankar, Manoharan; Priyadharshini, Ramachandran; Gunasekaran, Paramasamy

    2009-08-01

    An image analysis-based method for high-throughput screening of an alpha-amylase mutant library using chromogenic assays was developed. Assays were performed in microplates, and high-resolution images of the assay plates were read using the Virtual Microplate Reader (VMR) script to quantify the concentration of the chromogen. This method is fast and sensitive in quantifying 0.025-0.3 mg starch/ml as well as 0.05-0.75 mg glucose/ml. It was also an effective screening method for improved alpha-amylase activity, with a coefficient of variation of 18%.
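Quantifying an unknown against a chromogenic standard curve, as the assay above does, amounts to a linear fit over known standards followed by inversion; a minimal sketch with hypothetical intensity values (not data from the study):

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for a standard curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical image-derived intensities for known glucose standards
std_conc      = [0.05, 0.15, 0.30, 0.45, 0.60, 0.75]   # mg glucose/ml
std_intensity = [12.0, 31.0, 61.0, 90.0, 121.0, 150.0]  # arbitrary units

slope, intercept = fit_line(std_conc, std_intensity)

# Invert the curve to estimate an unknown well's concentration
unknown_intensity = 75.0
conc = (unknown_intensity - intercept) / slope
print(round(conc, 3))  # mg glucose/ml
```

The same fit, repeated per plate, also lets the screen flag wells whose activity falls outside the calibrated range.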

  12. Illuminating plant biology: using fluorescent proteins for high-throughput analysis of protein localization and function in plants.

    Science.gov (United States)

    DeBlasio, Stacy L; Sylvester, Anne W; Jackson, David

    2010-03-01

    First discovered in jellyfish, fluorescent proteins (FPs) have been successfully optimized for use as effective biomarkers within living plant cells. When exposed to light, FPs fused to a protein or regulatory element will fluoresce, and non-invasively mark expression and protein localization, which allows for the in vivo monitoring of diverse cellular processes. In this review, we discuss how FP technology has evolved from small-scale analysis of individual genes to more high-throughput techniques for global expression and functional profiling in plants.

  13. High-Throughput Tissue Bioenergetics Analysis Reveals Identical Metabolic Allometric Scaling for Teleost Hearts and Whole Organisms.

    Science.gov (United States)

    Jayasundara, Nishad; Kozal, Jordan S; Arnold, Mariah C; Chan, Sherine S L; Di Giulio, Richard T

    2015-01-01

    Organismal metabolic rate, a fundamental metric in biology, demonstrates an allometric scaling relationship with body size. Fractal-like vascular distribution networks of biological systems are proposed to underlie metabolic rate allometric scaling laws from individual organisms to cells, mitochondria, and enzymes. Tissue-specific metabolic scaling is notably absent from this paradigm. In the current study, metabolic scaling relationships of hearts and brains with body size were examined by improving on a high-throughput whole-organ oxygen consumption rate (OCR) analysis method in five biomedically and environmentally relevant teleost model species. Tissue-specific metabolic scaling was compared with organismal routine metabolism (RMO2), which was measured using whole organismal respirometry. Basal heart OCR and organismal RMO2 scaled identically with body mass in a species-specific fashion across all five species tested. However, organismal maximum metabolic rates (MMO2) and pharmacologically-induced maximum cardiac metabolic rates in zebrafish Danio rerio did not show a similar relationship with body mass. Brain metabolic rates did not scale with body size. The identical allometric scaling of heart and organismal metabolic rates with body size suggests that hearts, the power generator of an organism's vascular distribution network, might be crucial in determining teleost metabolic rate scaling under routine conditions. Furthermore, these findings indicate the possibility of measuring heart OCR utilizing the high-throughput approach presented here as a proxy for organismal metabolic rate, a useful metric in characterizing organismal fitness. In addition to heart and brain OCR, the current approach was also used to measure whole liver OCR, partition cardiac mitochondrial bioenergetic parameters using pharmacological agents, and estimate heart and brain glycolytic rates. This high-throughput whole-organ bioenergetic analysis method has important applications in
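The allometric relationship at the heart of this study, rate = a * mass^b, is conventionally estimated by linear regression on log-transformed data, since log(rate) = log(a) + b * log(mass); a minimal sketch with synthetic values:

```python
import math

def scaling_exponent(masses, rates):
    """Estimate the allometric exponent b in rate = a * mass**b by
    least squares on log-transformed data."""
    xs = [math.log(m) for m in masses]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic data generated with b = 0.75 (Kleiber-like scaling)
masses = [1, 2, 4, 8, 16]
rates = [3.0 * m ** 0.75 for m in masses]
print(round(scaling_exponent(masses, rates), 3))  # 0.75
```

"Scaled identically" in the abstract then means the heart-OCR and whole-organism fits yield statistically indistinguishable exponents b.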

  14. High-Throughput Tissue Bioenergetics Analysis Reveals Identical Metabolic Allometric Scaling for Teleost Hearts and Whole Organisms.

    Directory of Open Access Journals (Sweden)

    Nishad Jayasundara

    Full Text Available Organismal metabolic rate, a fundamental metric in biology, demonstrates an allometric scaling relationship with body size. Fractal-like vascular distribution networks of biological systems are proposed to underlie metabolic rate allometric scaling laws from individual organisms to cells, mitochondria, and enzymes. Tissue-specific metabolic scaling is notably absent from this paradigm. In the current study, metabolic scaling relationships of hearts and brains with body size were examined by improving on a high-throughput whole-organ oxygen consumption rate (OCR) analysis method in five biomedically and environmentally relevant teleost model species. Tissue-specific metabolic scaling was compared with organismal routine metabolism (RMO2), which was measured using whole organismal respirometry. Basal heart OCR and organismal RMO2 scaled identically with body mass in a species-specific fashion across all five species tested. However, organismal maximum metabolic rates (MMO2) and pharmacologically-induced maximum cardiac metabolic rates in zebrafish Danio rerio did not show a similar relationship with body mass. Brain metabolic rates did not scale with body size. The identical allometric scaling of heart and organismal metabolic rates with body size suggests that hearts, the power generator of an organism's vascular distribution network, might be crucial in determining teleost metabolic rate scaling under routine conditions. Furthermore, these findings indicate the possibility of measuring heart OCR utilizing the high-throughput approach presented here as a proxy for organismal metabolic rate, a useful metric in characterizing organismal fitness. In addition to heart and brain OCR, the current approach was also used to measure whole liver OCR, partition cardiac mitochondrial bioenergetic parameters using pharmacological agents, and estimate heart and brain glycolytic rates. This high-throughput whole-organ bioenergetic analysis method has important

  15. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging

    Directory of Open Access Journals (Sweden)

    Piyush Pandey

    2017-08-01

    Full Text Available Image-based high-throughput plant phenotyping in the greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring, which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of the macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and the micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract a reflectance spectrum from each plant, and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best, followed by P, K, and S. The micronutrient group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient group. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2).
This study suggested
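The RPD metric reported above is simply the standard deviation of the reference (wet-chemistry) measurements divided by the root-mean-square error of prediction. A minimal sketch, with our own function and variable names rather than the study's code:

```python
import math

def rpd(reference, predicted):
    """Ratio of Performance to Deviation: sample SD of the reference
    values divided by the RMSE of prediction (RMSEP).
    Illustrative implementation, not the authors' code."""
    n = len(reference)
    mean_ref = sum(reference) / n
    # sample standard deviation of the reference measurements
    sd = math.sqrt(sum((r - mean_ref) ** 2 for r in reference) / (n - 1))
    # root-mean-square error of the model predictions
    rmsep = math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted)) / n)
    return sd / rmsep
```

An RPD of 3.8 for water content means prediction errors are roughly 3.8 times smaller than the natural spread of the trait, while values below about 1.4 (as for Na and B) indicate the model barely outperforms predicting the mean.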

  16. Information content and analysis methods for multi-modal high-throughput biomedical data.

    Science.gov (United States)

    Ray, Bisakha; Henaff, Mikael; Ma, Sisi; Efstathiadis, Efstratios; Peskin, Eric R; Picone, Marco; Poli, Tito; Aliferis, Constantin F; Statnikov, Alexander

    2014-03-21

    The spectrum of modern molecular high-throughput assaying includes diverse technologies such as microarray gene expression, miRNA expression, proteomics, and DNA methylation, among many others. Now that these technologies have matured and become increasingly accessible, the next frontier is to collect "multi-modal" data for the same set of subjects and conduct integrative, multi-level analyses. While multi-modal data do contain distinct biological information that can be useful for answering complex biology questions, their value for predicting clinical phenotypes and the contribution of each input type remain unknown. We obtained 47 datasets/predictive tasks that in total span over 9 data modalities and executed analytic experiments for predicting various clinical phenotypes and outcomes. First, we analyzed each modality separately using uni-modal approaches based on several state-of-the-art supervised classification and feature selection methods. Then, we applied integrative multi-modal classification techniques. We found that gene expression is the most predictively informative modality. Other modalities such as protein expression, miRNA expression, and DNA methylation also provide highly predictive results, which are often statistically comparable but not superior to gene expression data. Integrative multi-modal analyses generally do not increase predictive signal compared to gene expression data alone.

  17. High-throughput nucleotide sequence analysis of diverse bacterial communities in leachates of decomposing pig carcasses

    Directory of Open Access Journals (Sweden)

    Seung Hak Yang

    2015-09-01

    Full Text Available The leachate generated by the decomposition of animal carcasses has been implicated as an environmental contaminant surrounding burial sites. High-throughput nucleotide sequencing was conducted to investigate the bacterial communities in leachates from the decomposition of pig carcasses. We acquired 51,230 reads from six different samples (1-, 2-, 3-, 4-, 6- and 14-week-old carcasses) and found that sequences representing the phylum Firmicutes predominated. The diversity of bacterial 16S rRNA gene sequences in the leachate was highest at 6 weeks, in contrast to those at 2 and 14 weeks. The relative abundance of Firmicutes was reduced, while the proportion of Bacteroidetes and Proteobacteria increased, from 3-6 weeks. The representation of phyla was restored after 14 weeks. However, the community structures of the samples taken at 1-2 and 14 weeks differed at the bacterial classification level. The trend in pH was similar to the changes seen in the bacterial communities, indicating that the pH of the leachate could be related to the shift in the microbial community. The results indicate that the composition of bacterial communities in leachates of decomposing pig carcasses shifted continuously during the study period and might be influenced by the burial site.

  18. High-throughput sorting and analysis of human sperm with a ring-shaped laser trap.

    Science.gov (United States)

    Shao, Bing; Shi, Linda Z; Nascimento, Jaclyn M; Botvinick, Elliot L; Ozkan, Mihrimah; Berns, Michael W; Esener, Sadik C

    2007-06-01

    Sperm motility is an important concept in fertility research. To this end, single-spot laser tweezers have been used to quantitatively analyze the motility of individual sperm. However, this method is limited in throughput (a single sperm per spot), lacks the ability to sort in situ based on motility and chemotaxis, requires high laser power (hundreds of milliwatts), and cannot be used to dynamically monitor changes in sperm swimming behavior under the influence of a laser beam. Here, we report a continuous 3-D ring-shaped laser trap that can be used for multi-level, high-throughput (tens to hundreds of sperm per ring) sperm sorting based on motility and chemotaxis. At a laser power of only tens of milliwatts, human sperm with low to medium velocity are slowed down, stopped, or forced to change their trajectories to swim along the ring due to the optical gradient force in the radial direction. This is the first demonstration of parallel sperm sorting based on motility with optical trapping technology. In addition, by making sperm swim along the circumference of the ring, the effects of laser radiation, optical force, and external obstacles on sperm energetics can be investigated in a gentler and more quantitative way. This method could be extended to motility and bio-tropism studies of other self-propelled cells, such as algae and bacteria.

  19. FLIC: high-throughput, continuous analysis of feeding behaviors in Drosophila.

    Directory of Open Access Journals (Sweden)

    Jennifer Ro

    Full Text Available We present a complete hardware and software system for collecting and quantifying continuous measures of feeding behaviors in the fruit fly, Drosophila melanogaster. The FLIC (Fly Liquid-Food Interaction Counter) detects analog electronic signals as brief as 50 µs that occur when a fly makes physical contact with liquid food. Signal characteristics effectively distinguish between different types of behaviors, such as feeding and tasting events. The FLIC system performs as well or better than popular methods for simple assays, and it provides an unprecedented opportunity to study novel components of feeding behavior, such as time-dependent changes in food preference and individual levels of motivation and hunger. Furthermore, FLIC experiments can persist indefinitely without disturbance, and we highlight this ability by establishing a detailed picture of circadian feeding behaviors in the fly. We believe that the FLIC system will work hand-in-hand with modern molecular techniques to facilitate mechanistic studies of feeding behaviors in Drosophila using modern, high-throughput technologies.
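Event detection of the kind FLIC performs, grouping above-threshold signal samples into discrete contacts and separating longer feeding bouts from brief tastes, can be sketched as below. The threshold and duration cutoffs are illustrative assumptions, not the published calibration:

```python
def contact_events(signal, threshold=100):
    """Group consecutive above-threshold samples into contact events.
    Returns a list of (start_index, length) pairs. Threshold units are
    arbitrary ADC counts; real FLIC thresholds differ."""
    events, start = [], None
    for i, v in enumerate(signal):
        if v > threshold and start is None:
            start = i
        elif v <= threshold and start is not None:
            events.append((start, i - start))
            start = None
    if start is not None:  # signal ended mid-contact
        events.append((start, len(signal) - start))
    return events

def classify(events, min_feed_len=5):
    """Long contacts ~ feeding, short contacts ~ tasting (illustrative cutoff)."""
    return [("feed" if length >= min_feed_len else "taste", start)
            for start, length in events]
```

With a sampled trace, `classify(contact_events(signal))` yields a labeled event stream from which bout counts and circadian feeding profiles could be aggregated.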

  20. FLIC: high-throughput, continuous analysis of feeding behaviors in Drosophila.

    Science.gov (United States)

    Ro, Jennifer; Harvanek, Zachary M; Pletcher, Scott D

    2014-01-01

    We present a complete hardware and software system for collecting and quantifying continuous measures of feeding behaviors in the fruit fly, Drosophila melanogaster. The FLIC (Fly Liquid-Food Interaction Counter) detects analog electronic signals as brief as 50 µs that occur when a fly makes physical contact with liquid food. Signal characteristics effectively distinguish between different types of behaviors, such as feeding and tasting events. The FLIC system performs as well or better than popular methods for simple assays, and it provides an unprecedented opportunity to study novel components of feeding behavior, such as time-dependent changes in food preference and individual levels of motivation and hunger. Furthermore, FLIC experiments can persist indefinitely without disturbance, and we highlight this ability by establishing a detailed picture of circadian feeding behaviors in the fly. We believe that the FLIC system will work hand-in-hand with modern molecular techniques to facilitate mechanistic studies of feeding behaviors in Drosophila using modern, high-throughput technologies.

  1. High-throughput kinetic study of hydrogenation over palladium nanoparticles: combination of reaction and analysis.

    Science.gov (United States)

    Trapp, Oliver; Weber, Sven K; Bauch, Sabrina; Bäcker, Tobias; Hofstadt, Werner; Spliethoff, Bernd

    2008-01-01

    The hydrogenation of 1-acetylcyclohexene, cyclohex-2-enone, nitrobenzene, and trans-methylpent-3-enoate catalyzed by highly active palladium nanoparticles was studied by high-throughput on-column reaction gas chromatography. In these experiments, catalysis and separation of reactants and products are integrated through the use of a catalytically active gas-chromatographic stationary phase, which allows reaction rate measurements to be performed efficiently by employing reactant libraries. Palladium nanoparticles embedded in a stabilizing polysiloxane matrix serve as catalyst and selective chromatographic stationary phase for these multiphase (gas-liquid-solid) reactions and are coated in fused-silica capillaries (inner diameter 250 µm) as a thin film 250 nm thick. The palladium nanoparticles were prepared by reduction of palladium acetate with hydridomethylsiloxane-dimethylsiloxane copolymer and self-catalyzed hydrosilylation with methylvinylsiloxane-dimethylsiloxane copolymer to obtain a stabilizing matrix. Diphenylsiloxane-dimethylsiloxane copolymer (GE SE 52) was added to improve film stability over a wide range of compositions. Herein, we show by systematic TEM investigations that the size and morphology (crystalline or amorphous) of the nanoparticles depend strongly on the ratio of the stabilizing polysiloxanes, the conditions used to immobilize the stationary phase on the surface of the fused-silica capillary, and the loading of the palladium precursor. Furthermore, hydrogenations were performed with these catalytically active stationary phases between 60 and 100 °C at various contact times to determine the temperature-dependent reaction rate constants and to obtain activation parameters and diffusion coefficients.

  2. On Efficient Feature Ranking Methods for High-Throughput Data Analysis.

    Science.gov (United States)

    Liao, Bo; Jiang, Yan; Liang, Wei; Peng, Lihong; Peng, Li; Hanyurwimfura, Damien; Li, Zejun; Chen, Min

    2015-01-01

    Efficient mining of high-throughput data has become one of the most popular themes of the big data era. Existing biology-related feature ranking methods mainly focus on statistical and annotation information. In this study, two efficient feature ranking methods are presented. Multi-target regression and graph embedding are incorporated into an optimization framework, and feature ranking is achieved by introducing a structured sparsity norm. Unlike existing methods, the presented methods have two advantages: (1) the selected feature subset simultaneously accounts for global margin information as well as local manifold information, so both global and local structure are considered; (2) features are selected in batches rather than individually within the algorithmic framework, so interactions between features are considered and the optimal feature subset can be guaranteed. In addition, this study presents a theoretical justification. Empirical experiments demonstrate the effectiveness and efficiency of the two algorithms in comparison with state-of-the-art feature ranking methods on a set of real-world gene expression data sets.

  3. Perchlorate reduction by hydrogen autotrophic bacteria and microbial community analysis using high-throughput sequencing.

    Science.gov (United States)

    Wan, Dongjin; Liu, Yongde; Niu, Zhenhua; Xiao, Shuhu; Li, Daorong

    2016-02-01

    Hydrogen autotrophic reduction of perchlorate has the advantages of high removal efficiency and harmlessness to drinking water. However, reported information about the microbial community structure has been comparatively limited, and changes in biodiversity and in the dominant bacteria during the acclimation process require detailed study. In this study, perchlorate-reducing hydrogen autotrophic bacteria were acclimated from activated sludge by hydrogen aeration. For the first time, high-throughput sequencing was applied to analyze changes in biodiversity and in the dominant bacteria during the acclimation process. The Michaelis-Menten model described the perchlorate reduction kinetics well. Model parameters q(max) and K(s) were 2.521-3.245 (mg ClO4(-)/gVSS h) and 5.44-8.23 (mg/l), respectively. Microbial perchlorate reduction occurred across the pH range 5.0-11.0; removal was highest at pH 9.0. The enriched mixed bacteria could use perchlorate, nitrate and sulfate as electron acceptors, with the order of preference: NO3(-) > ClO4(-) > SO4(2-). Compared to the feed culture, biodiversity decreased greatly during the acclimation process, and the microbial community structure stabilized after 9 acclimation cycles. The Thauera genus, related to Rhodocyclales, was the dominant perchlorate-reducing bacterium (PRB) in the mixed culture.
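The Michaelis-Menten kinetics fitted above translate directly into a rate function. A generic sketch using the study's reported parameter ranges (function and argument names are ours, not the authors' code):

```python
def perchlorate_rate(s, q_max, k_s):
    """Michaelis-Menten specific degradation rate (mg ClO4-/gVSS h)
    at substrate concentration s (mg/L). The study reported
    q_max in 2.521-3.245 and K_s in 5.44-8.23."""
    return q_max * s / (k_s + s)
```

At s = K_s the rate is exactly half of q_max, and at s much greater than K_s it saturates toward q_max, which is why K_s characterizes the concentration range over which removal is substrate-limited.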

  4. Comprehensive analysis of high-throughput screens with HiTSeekR

    DEFF Research Database (Denmark)

    List, Markus; Schmidt, Steffen; Christiansen, Helle

    2016-01-01

    High-throughput screening (HTS) is an indispensable tool for drug (target) discovery that currently lacks user-friendly software tools for the robust identification of putative hits from HTS experiments and for the interpretation of these findings in the context of systems biology. We developed HiTSeekR as a one-stop solution for chemical compound screens, siRNA knock-down and CRISPR/Cas9 knock-out screens, as well as microRNA inhibitor and -mimics screens. We chose three use cases that demonstrate the potential of HiTSeekR to fully exploit HTS screening data in quite heterogeneous contexts to generate novel hypotheses for follow-up experiments: (i) a genome-wide RNAi screen to uncover modulators of TNFα, (ii) a combined siRNA and miRNA mimics screen on vorinostat resistance and (iii) a small compound screen on KRAS synthetic lethality. HiTSeekR is publicly available at http...

  5. High-throughput Image Analysis of Fibrillar Materials: A Case Study on Polymer Nanofiber Packing, Alignment, and Defects in OFETs.

    Science.gov (United States)

    Persson, Nils; Rafshoon, Joshua; Naghshpour, Kaylie; Fast, Tony; Chu, Ping-Hsun; McBride, Michael; Risteen, Bailey; Grover, Martha A; Reichmanis, Elsa

    2017-09-27

    High-throughput discovery of process-structure-property relationships in materials through an informatics-enabled empirical approach is an increasingly utilized technique in materials research due to the rapidly expanding availability of data. Here, process-structure-property relationships are extracted for the nucleation, growth and deposition of semiconducting poly(3-hexylthiophene) (P3HT) nanofibers used in organic field-effect transistors, via high-throughput image analysis. This study is performed using an automated image analysis pipeline combining existing open-source software and new algorithms, enabling the rapid evaluation of structural metrics for images of fibrillar materials, including local orientational order, fiber length density, and fiber length distributions. We observe that microfluidic processing leads to fibers that pack with unusually high density, while sonication yields fibers that pack sparsely with low alignment. This is attributed to differences in their crystallization mechanisms. P3HT nanofiber packing during thin-film deposition exhibits behavior suggesting that fibers are confined to packing in two-dimensional layers. We find that fiber alignment, a feature correlated with charge carrier mobility, is driven by increasing fiber length, and that shorter fibers tend to segregate to the buried dielectric interface during deposition, creating potentially performance-limiting defects in alignment. Another barrier to perfect alignment is the curvature of P3HT fibers; we propose a mechanistic simulation of fiber growth that reconciles both this curvature and the log-normal distribution of fiber lengths inherent to the fiber populations under consideration.
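Local orientational order of the kind evaluated by such a pipeline can be sketched with the standard 2D nematic order parameter over a set of fiber axis angles. This formulation and the function name are our own illustration, not necessarily the paper's exact metric:

```python
import math

def orientational_order(angles_rad):
    """2D nematic order parameter S = |<exp(2i*theta)>| for fiber
    axis angles in radians. The factor of 2 makes the measure
    head-tail symmetric (a fiber at theta and theta + pi is the
    same fiber). S = 1 for perfect alignment, S ~ 0 for an
    isotropic angle distribution."""
    n = len(angles_rad)
    c = sum(math.cos(2 * a) for a in angles_rad) / n
    s = sum(math.sin(2 * a) for a in angles_rad) / n
    return math.hypot(c, s)
```

Computed over sliding windows of a segmented fiber image, this yields a local alignment map; its spatial average is a scalar alignment score of the type correlated with charge carrier mobility above.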

  6. Analysis of high-throughput sequencing and annotation strategies for phage genomes.

    Directory of Open Access Journals (Sweden)

    Matthew R Henn

    Full Text Available BACKGROUND: Bacterial viruses (phages) play a critical role in shaping microbial populations as they influence both host mortality and horizontal gene transfer. As such, they have a significant impact on local and global ecosystem function and human health. Despite their importance, little is known about the genomic diversity harbored in phages, as methods to capture complete phage genomes have been hampered by the lack of knowledge about the target genomes and difficulties in generating sufficient quantities of genomic DNA for sequencing. Of the approximately 550 phage genomes currently available in the public domain, fewer than 5% are marine phage. METHODOLOGY/PRINCIPAL FINDINGS: To advance the study of phage biology through comparative genomic approaches we used marine cyanophage as a model system. We compared DNA preparation methodologies (DNA extraction directly from either phage lysates or CsCl-purified phage particles) and sequencing strategies that utilize either Sanger sequencing of a linker amplification shotgun library (LASL) or of a whole genome shotgun library (WGSL), or 454 pyrosequencing methods. We demonstrate that genomic DNA sample preparation directly from a phage lysate, combined with 454 pyrosequencing, is best suited for phage genome sequencing at scale, as this method is capable of capturing complete continuous genomes with high accuracy. In addition, we describe an automated annotation informatics pipeline that delivers high-quality annotation and yields few false positives and negatives in ORF calling. CONCLUSIONS/SIGNIFICANCE: These DNA preparation, sequencing and annotation strategies enable a high-throughput approach to the burgeoning field of phage genomics.

  7. High-throughput mutational analysis of TOR1A in primary dystonia

    Directory of Open Access Journals (Sweden)

    Truong Daniel D

    2009-03-01

    Full Text Available Abstract Background Although the c.904_906delGAG mutation in Exon 5 of TOR1A typically manifests as early-onset generalized dystonia, DYT1 dystonia is genetically and clinically heterogeneous. Recently, another Exon 5 mutation (c.863G>A) has been associated with early-onset generalized dystonia, and some ΔGAG mutation carriers present with late-onset focal dystonia. The aim of this study was to identify TOR1A Exon 5 mutations in a large cohort of subjects with mainly non-generalized primary dystonia. Methods High-resolution melting (HRM) was used to examine the entire TOR1A Exon 5 coding sequence in 1014 subjects with primary dystonia (422 spasmodic dysphonia, 285 cervical dystonia, 67 blepharospasm, 41 writer's cramp, 16 oromandibular dystonia, 38 other primary focal dystonia, 112 segmental dystonia, 16 multifocal dystonia, and 17 generalized dystonia) and 250 controls (150 neurologically normal and 100 with other movement disorders). Diagnostic sensitivity and specificity were evaluated in an additional 8 subjects with known ΔGAG DYT1 dystonia and 88 subjects with ΔGAG-negative dystonia. Results HRM of TOR1A Exon 5 showed high (100%) diagnostic sensitivity and specificity. HRM was rapid and economical. HRM reliably differentiated the TOR1A ΔGAG and c.863G>A mutations. Melting curves were normal in 250/250 controls and 1012/1014 subjects with primary dystonia. The two subjects with shifted melting curves were found to harbor the classic ΔGAG deletion: (1) a non-Jewish Caucasian female with childhood-onset multifocal dystonia and (2) an Ashkenazi Jewish female with adolescent-onset spasmodic dysphonia. Conclusion First, HRM is an inexpensive, diagnostically sensitive and specific, high-throughput method for mutation discovery. Second, Exon 5 mutations in TOR1A are rarely associated with non-generalized primary dystonia.

  8. Transcriptomic analysis of Petunia hybrida in response to salt stress using high throughput RNA sequencing.

    Directory of Open Access Journals (Sweden)

    Gonzalo H Villarino

    Full Text Available Salinity and drought stress are the primary cause of crop losses worldwide. In sodic saline soils sodium chloride (NaCl) disrupts normal plant growth and development. The complex interactions of plant systems with abiotic stress have made RNA sequencing a more holistic and appealing approach to study transcriptome level responses in a single cell and/or tissue. In this work, we determined the Petunia transcriptome response to NaCl stress by sequencing leaf samples and assembling 196 million Illumina reads with Trinity software. Using our reference transcriptome we identified more than 7,000 genes that were differentially expressed within 24 h of acute NaCl stress. The proposed transcriptome can also be used as an excellent tool for biological and bioinformatics in the absence of an available Petunia genome and it is available at the SOL Genomics Network (SGN) http://solgenomics.net. Genes related to regulation of reactive oxygen species, transport, and signal transductions as well as novel and undescribed transcripts were among those differentially expressed in response to salt stress. The candidate genes identified in this study can be applied as markers for breeding or to genetically engineer plants to enhance salt tolerance. Gene Ontology analyses indicated that most of the NaCl damage happened at 24 h inducing genotoxicity, affecting transport and organelles due to the high concentration of Na+ ions. Finally, we report a modification to the library preparation protocol whereby cDNA samples were bar-coded with non-HPLC purified primers, without affecting the quality and quantity of the RNA-seq data. The methodological improvement presented here could substantially reduce the cost of sample preparation for future high-throughput RNA sequencing experiments.

  9. Transcriptomic Analysis of Petunia hybrida in Response to Salt Stress Using High Throughput RNA Sequencing

    Science.gov (United States)

    Villarino, Gonzalo H.; Bombarely, Aureliano; Giovannoni, James J.; Scanlon, Michael J.; Mattson, Neil S.

    2014-01-01

    Salinity and drought stress are the primary cause of crop losses worldwide. In sodic saline soils sodium chloride (NaCl) disrupts normal plant growth and development. The complex interactions of plant systems with abiotic stress have made RNA sequencing a more holistic and appealing approach to study transcriptome level responses in a single cell and/or tissue. In this work, we determined the Petunia transcriptome response to NaCl stress by sequencing leaf samples and assembling 196 million Illumina reads with Trinity software. Using our reference transcriptome we identified more than 7,000 genes that were differentially expressed within 24 h of acute NaCl stress. The proposed transcriptome can also be used as an excellent tool for biological and bioinformatics in the absence of an available Petunia genome and it is available at the SOL Genomics Network (SGN) http://solgenomics.net. Genes related to regulation of reactive oxygen species, transport, and signal transductions as well as novel and undescribed transcripts were among those differentially expressed in response to salt stress. The candidate genes identified in this study can be applied as markers for breeding or to genetically engineer plants to enhance salt tolerance. Gene Ontology analyses indicated that most of the NaCl damage happened at 24 h inducing genotoxicity, affecting transport and organelles due to the high concentration of Na+ ions. Finally, we report a modification to the library preparation protocol whereby cDNA samples were bar-coded with non-HPLC purified primers, without affecting the quality and quantity of the RNA-seq data. The methodological improvement presented here could substantially reduce the cost of sample preparation for future high-throughput RNA sequencing experiments. PMID:24722556

  10. Bacterial pathogens and community composition in advanced sewage treatment systems revealed by metagenomics analysis based on high-throughput sequencing.

    Science.gov (United States)

    Lu, Xin; Zhang, Xu-Xiang; Wang, Zhu; Huang, Kailong; Wang, Yuan; Liang, Weigang; Tan, Yunfei; Liu, Bo; Tang, Junying

    2015-01-01

    This study used 454 pyrosequencing, Illumina high-throughput sequencing and metagenomic analysis to investigate bacterial pathogens and their potential virulence in a sewage treatment plant (STP) applying both conventional and advanced treatment processes. Pyrosequencing and Illumina sequencing consistently demonstrated that the Arcobacter genus occupied over 43.42% of the total abundance of potential pathogens in the STP. At the species level, the potential pathogens Arcobacter butzleri, Aeromonas hydrophila and Klebsiella pneumoniae dominated in raw sewage, which was also confirmed by quantitative real-time PCR. Illumina sequencing also revealed the prevalence of various types of pathogenicity islands and virulence proteins in the STP. Most of the potential pathogens and virulence factors were eliminated in the STP, and the removal efficiency depended mainly on the oxidation ditch. Compared with sand filtration, magnetic resin achieved higher removal of most of the potential pathogens and virulence factors. However, the presence of residual A. butzleri in the final effluent still deserves concern. The findings indicate that sewage acts as an important source of environmental pathogens, but STPs can effectively control their spread in the environment. Joint use of high-throughput sequencing technologies is considered a reliable method for a deep and comprehensive overview of environmental bacterial virulence.

  11. Bacterial pathogens and community composition in advanced sewage treatment systems revealed by metagenomics analysis based on high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Xin Lu

    Full Text Available This study used 454 pyrosequencing, Illumina high-throughput sequencing and metagenomic analysis to investigate bacterial pathogens and their potential virulence in a sewage treatment plant (STP) applying both conventional and advanced treatment processes. Pyrosequencing and Illumina sequencing consistently demonstrated that the Arcobacter genus occupied over 43.42% of the total abundance of potential pathogens in the STP. At the species level, the potential pathogens Arcobacter butzleri, Aeromonas hydrophila and Klebsiella pneumoniae dominated in raw sewage, which was also confirmed by quantitative real-time PCR. Illumina sequencing also revealed the prevalence of various types of pathogenicity islands and virulence proteins in the STP. Most of the potential pathogens and virulence factors were eliminated in the STP, and the removal efficiency depended mainly on the oxidation ditch. Compared with sand filtration, magnetic resin achieved higher removal of most of the potential pathogens and virulence factors. However, the presence of residual A. butzleri in the final effluent still deserves concern. The findings indicate that sewage acts as an important source of environmental pathogens, but STPs can effectively control their spread in the environment. Joint use of high-throughput sequencing technologies is considered a reliable method for a deep and comprehensive overview of environmental bacterial virulence.

  12. High throughput drug profiling

    OpenAIRE

    Entzeroth, Michael; Chapelain, Béatrice; Guilbert, Jacques; Hamon, Valérie

    2000-01-01

    High throughput screening has significantly contributed to advances in drug discovery. The great increase in the number of samples screened has been accompanied by increases in costs and in the data required for the investigated compounds. High throughput profiling addresses the issues of compound selectivity and specificity. It combines conventional screening with data mining technologies to give a full set of data, enabling development candidates to be more fully compared.

  13. High-throughput dynamic analysis of differentially expressed genes in splenic dendritic cells from mice infected with Schistosoma japonicum.

    Science.gov (United States)

    Chen, Lin; Chen, Qingzhou; Hou, Wei; He, Li

    2017-04-01

    Dendritic cells (DCs) are the initiators and key regulators of the immune response. We therefore explored the mechanisms of immune regulation by DCs against schistosomiasis using mice infected with Schistosoma japonicum. Splenic DCs from normal mice and from mice with acute and chronic S. japonicum infection were sorted by flow cytometry. The numbers and functions of differentially expressed genes (DEGs) in DCs were determined by high-throughput analysis. All DEGs with transcription-level fold changes of ≥2 were selected and matched to corresponding genes in databases. Annotation and cluster analysis of DEGs were performed to compare differences between groups. Six DEGs important for immune regulation (CD86, TLR2, DC-SIGN, Caspase3, PD-L2, and IL-7r) were selected, and their transcription levels at different stages of schistosomiasis were validated by qPCR. The Venn diagram of DEGs implied that some genes are functional at all stages of S. japonicum infection, while others are only involved at certain stages. GO and KEGG pathway annotations indicated that these DEGs mainly belong to biological regulation, regulation of biological process, regulation of cellular process, antigen processing and presentation, cell adhesion molecules, cytokine-cytokine receptor interaction and Toll-like receptor signaling. Cluster analysis revealed immune regulation in splenic DCs. These results indicate that the mechanisms underlying immune regulation in S. japonicum-infected mice are very complex. The present high-throughput dynamic analysis of DEGs in splenic DCs provides valuable insights into the molecular mechanisms underlying immune regulation in S. japonicum infection. Copyright © 2017 European Federation of Immunological Societies. Published by Elsevier B.V. All rights reserved.
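The ≥2-fold selection step described above amounts to a simple filter on log2 fold changes. A minimal sketch; the gene names and expression values below are hypothetical examples, not data from the study:

```python
import math

def select_degs(expression, min_fold=2.0):
    """Keep genes whose up- or down-regulation is at least min_fold.
    expression: dict mapping gene -> (infected_level, control_level).
    Returns dict gene -> log2 fold change. Zero or negative
    baselines are skipped to avoid division errors."""
    degs = {}
    for gene, (infected, control) in expression.items():
        if infected <= 0 or control <= 0:
            continue
        log2fc = math.log2(infected / control)
        # |log2fc| >= log2(min_fold) is equivalent to a >= min_fold
        # change in either direction
        if abs(log2fc) >= math.log2(min_fold):
            degs[gene] = log2fc
    return degs
```

For example, a gene at 40 units in infected DCs versus 10 in controls (log2fc = 2) passes the filter, while a 12-versus-10 gene (log2fc ≈ 0.26) does not.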

  14. Cost-effectiveness analysis of Mectizan treatment Programmes for ...

    African Journals Online (AJOL)

Cost-effectiveness analysis of Mectizan treatment Programmes for Onchocerciasis Control: Operational Experiences in two districts of Southwestern Nigeria. Vol 8, No 1 (2009).

  15. High throughput LC-MS/MS method for the simultaneous analysis of multiple vitamin D analytes in serum.

    Science.gov (United States)

    Jenkinson, Carl; Taylor, Angela E; Hassan-Smith, Zaki K; Adams, John S; Stewart, Paul M; Hewison, Martin; Keevil, Brian G

    2016-03-01

Recent studies suggest that vitamin D deficiency is linked to increased risk of common human health problems. To define vitamin D 'status', most routine analytical methods quantify one particular vitamin D metabolite, 25-hydroxyvitamin D3 (25OHD3). However, vitamin D is characterized by complex metabolic pathways, and simultaneous measurement of multiple vitamin D metabolites may provide a more accurate interpretation of vitamin D status. To address this, we developed a high-throughput liquid chromatography-tandem mass spectrometry (LC-MS/MS) method to analyse multiple vitamin D analytes, with particular emphasis on the separation of epimer metabolites. A supported liquid extraction (SLE) and LC-MS/MS method was developed to quantify 10 vitamin D metabolites, with separation of an interfering 7α-hydroxy-4-cholesten-3-one (7αC4) isobar (a bile acid precursor), and validated by analysis of human serum samples. In a cohort of 116 healthy subjects, circulating concentrations of 25-hydroxyvitamin D3 (25OHD3), 3-epi-25-hydroxyvitamin D3 (3-epi-25OHD3), 24,25-dihydroxyvitamin D3 (24R,25(OH)2D3), 1,25-dihydroxyvitamin D3 (1α,25(OH)2D3), and 25-hydroxyvitamin D2 (25OHD2) were quantifiable using 220 μL of serum, with 25OHD3 and 24R,25(OH)2D3 showing significant seasonal variation. This high-throughput LC-MS/MS method provides a novel strategy for assessing the impact of vitamin D on human health and disease.

  16. Bulk segregant analysis by high-throughput sequencing reveals a novel xylose utilization gene from Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Jared W Wenger

    2010-05-01

Full Text Available Fermentation of xylose is a fundamental requirement for the efficient production of ethanol from lignocellulosic biomass sources. Although they aggressively ferment hexoses, it has long been thought that native Saccharomyces cerevisiae strains cannot grow fermentatively or non-fermentatively on xylose. Population surveys have uncovered a few naturally occurring strains that are weakly xylose-positive, and some S. cerevisiae have been genetically engineered to ferment xylose, but no strain, either natural or engineered, has yet been reported to ferment xylose as efficiently as glucose. Here, we used a medium-throughput screen to identify Saccharomyces strains that can increase in optical density when xylose is presented as the sole carbon source. We identified 38 strains that have this xylose utilization phenotype, including strains of S. cerevisiae, other sensu stricto members, and hybrids between them. All the S. cerevisiae xylose-utilizing strains we identified are wine yeasts, and for those that could produce meiotic progeny, the xylose phenotype segregates as a single gene trait. We mapped this gene by Bulk Segregant Analysis (BSA) using tiling microarrays and high-throughput sequencing. The gene is a putative xylitol dehydrogenase, which we name XDH1, and is located in the subtelomeric region of the right end of chromosome XV in a region not present in the S288c reference genome. We further characterized the xylose phenotype by performing gene expression microarrays and by genetically dissecting the endogenous Saccharomyces xylose pathway. We have demonstrated that natural S. cerevisiae yeasts are capable of utilizing xylose as the sole carbon source, characterized the genetic basis for this trait as well as the endogenous xylose utilization pathway, and demonstrated the feasibility of BSA using high-throughput sequencing.

  17. High-throughput thermal stability analysis of a monoclonal antibody by attenuated total reflection FT-IR spectroscopic imaging.

    Science.gov (United States)

    Boulet-Audet, Maxime; Byrne, Bernadette; Kazarian, Sergei G

    2014-10-07

    The use of biotherapeutics, such as monoclonal antibodies, has markedly increased in recent years. It is thus essential that biotherapeutic production pipelines are as efficient as possible. For the production process, one of the major concerns is the propensity of a biotherapeutic antibody to aggregate. In addition to reducing bioactive material recovery, protein aggregation can have major effects on drug potency and cause highly undesirable immunological effects. It is thus essential to identify processing conditions which maximize recovery while avoiding aggregation. Heat resistance is a proxy for long-term aggregation propensity. Thermal stability assays are routinely performed using various spectroscopic and scattering detection methods. Here, we evaluated the potential of macro attenuated total reflection Fourier transform infrared (ATR-FT-IR) spectroscopic imaging as a novel method for the high-throughput thermal stability assay of a monoclonal antibody. This chemically specific visualization method has the distinct advantage of being able to discriminate between monomeric and aggregated protein. Attenuated total reflection is particularly suitable for selectively probing the bottom of vessels, where precipitated aggregates accumulate. With focal plane array detection, we tested 12 different buffer conditions simultaneously to assess the effect of pH and ionic strength on protein thermal stability. Applying the Finke model to our imaging kinetics allowed us to determine the rate constants of nucleation and autocatalytic growth. This analysis demonstrated the greater stability of our immunoglobulin at higher pH and moderate ionic strength, revealing the key role of electrostatic interactions. The high-throughput approach presented here has significant potential for analyzing the stability of biotherapeutics as well as any other biological molecules prone to aggregation.
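The Finke (Finke-Watzky) two-step model invoked above has a closed-form solution that can be fit to aggregation kinetics. A minimal sketch on synthetic data, assuming invented rate constants and grid bounds (a coarse grid search stands in for a proper nonlinear fit):

```python
import numpy as np

def fw_aggregated(t, k1, k2p):
    """Finke-Watzky two-step model (nucleation then autocatalytic growth):
    fraction aggregated at time t, with nucleation rate k1 and
    autocatalytic rate k2p = k2 * [monomer]_0 (closed-form solution)."""
    r = k1 / k2p
    monomer = (r + 1.0) / (1.0 + r * np.exp((k1 + k2p) * t))
    return 1.0 - monomer

# Synthetic "imaging kinetics" with known rate constants (invented values).
t = np.linspace(0, 10, 200)
data = fw_aggregated(t, k1=0.05, k2p=1.2)

# Coarse grid search for the best-fit rate constants.
k1_grid = np.linspace(0.01, 0.2, 40)
k2_grid = np.linspace(0.5, 2.0, 40)
best = min(((np.sum((fw_aggregated(t, a, b) - data) ** 2), a, b)
            for a in k1_grid for b in k2_grid))
print(f"k1 ≈ {best[1]:.3f}, k2' ≈ {best[2]:.3f}")
```

On real imaging kinetics one would fit with a least-squares optimizer and propagate measurement noise, but the sigmoidal form and the two rate constants are the same.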

  18. High-throughput avian molecular sexing by SYBR green-based real-time PCR combined with melting curve analysis

    Directory of Open Access Journals (Sweden)

    Chou Yii-Cheng

    2008-02-01

Full Text Available Abstract Background Combination of CHD (chromo-helicase-DNA binding protein)-specific polymerase chain reaction (PCR) with electrophoresis (PCR/electrophoresis) is the most common avian molecular sexing technique, but it is lab-intensive and gel-based. Gender determination often fails when the difference in length between the PCR products of the CHD-Z and CHD-W genes is too small to be resolved. Results Here, we are the first to introduce PCR-melting curve analysis (PCR/MCA) to identify the gender of birds from genomic DNA; the method is gel-free, quick, and inexpensive. Spilornis cheela hoya (S. c. hoya) and Pycnonotus sinensis (P. sinensis) were used to illustrate this novel molecular sexing technique. The difference in the length of the CHD genes in S. c. hoya and P. sinensis is 13 and 52 bp, respectively. Using Griffiths' P2/P8 primers, molecular sexing failed both in PCR/electrophoresis of S. c. hoya and in PCR/MCA of S. c. hoya and P. sinensis. In contrast, we redesigned sex-specific primers to yield 185- and 112-bp PCR products for the CHD-Z and CHD-W genes of S. c. hoya, respectively, using PCR/MCA. Using this specific primer set, at least 13 samples of S. c. hoya were examined simultaneously, and the Tm peaks of the CHD-Z and CHD-W PCR products were distinguished. Conclusion In this study, we introduced a high-throughput avian molecular sexing technique and successfully applied it to two species. This new method holds great potential for high-throughput sexing of other avian species as well.
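The Tm peaks that distinguish the CHD-Z and CHD-W products are the maxima of the negative derivative -dF/dT of the fluorescence melt curve. A minimal sketch on a synthetic curve (the temperatures and curve shape are invented, not instrument data):

```python
import numpy as np

def melting_tm(temps, fluorescence):
    """Estimate Tm as the peak of the negative first derivative -dF/dT,
    the signal inspected in SYBR Green melting curve analysis."""
    deriv = -np.gradient(fluorescence, temps)
    return temps[int(np.argmax(deriv))]

# Synthetic melt curve: fluorescence drops sigmoidally around Tm = 82 °C.
temps = np.linspace(70.0, 95.0, 251)
f = 1.0 / (1.0 + np.exp((temps - 82.0) / 0.8))
print(melting_tm(temps, f))  # ≈ 82.0
```

A two-peak curve (one amplicon per CHD gene) would be handled the same way, reporting one Tm per local maximum of -dF/dT.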

  20. High-throughput SHAPE analysis reveals structures in HIV-1 genomic RNA strongly conserved across distinct biological states.

    Directory of Open Access Journals (Sweden)

    Kevin A Wilkinson

    2008-04-01

Full Text Available Replication and pathogenesis of the human immunodeficiency virus (HIV) is tightly linked to the structure of its RNA genome, but genome structure in infectious virions is poorly understood. We invent high-throughput SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension) technology, which uses many of the same tools as DNA sequencing, to quantify RNA backbone flexibility at single-nucleotide resolution and from which robust structural information can be immediately derived. We analyze the structure of HIV-1 genomic RNA in four biologically instructive states, including the authentic viral genome inside native particles. Remarkably, given the large number of plausible local structures, the first 10% of the HIV-1 genome exists in a single, predominant conformation in all four states. We also discover that noncoding regions functioning in a regulatory role have significantly lower (p-value < 0.0001) SHAPE reactivities, and hence more structure, than do viral coding regions that function as the template for protein synthesis. By directly monitoring protein binding inside virions, we identify the RNA recognition motif for the viral nucleocapsid protein. Seven structurally homologous binding sites occur in a well-defined domain in the genome, consistent with a role in directing specific packaging of genomic RNA into nascent virions. In addition, we identify two distinct motifs that are targets for the duplex destabilizing activity of this same protein. The nucleocapsid protein destabilizes local HIV-1 RNA structure in ways likely to facilitate initial movement both of the retroviral reverse transcriptase from its tRNA primer and of the ribosome in coding regions. Each of the three nucleocapsid interaction motifs falls in a specific genome domain, indicating that local protein interactions can be organized by the long-range architecture of an RNA. High-throughput SHAPE reveals a comprehensive view of HIV-1 RNA genome structure, and further

  1. Cost Effectiveness Analysis, A DTIC Bibliography.

    Science.gov (United States)

    1980-07-01

DUGAS, DORIS J. A Model for Estimating Software Life Cycle Costs (Model Concept), Volume 1. Guidelines for Attracting Private Capital to Corps of ... Category II Test Program, AD-A023 442. An Econometric Analysis of Volunteer Enlistments of Service, AD-A021 258. HUMPHREYS, THOMAS H. Maintenance Data.

  2. Development of in-house methods for high-throughput DNA extraction

    Science.gov (United States)

    Given the high-throughput nature of many current biological studies, in particular field-based or applied environmental studies, optimization of cost-effective, efficient methods for molecular analysis of large numbers of samples is a critical first step. Existing methods are either based on costly ...

  3. Laser Desorption Mass Spectrometry for High Throughput DNA Analysis and Its Applications

    Energy Technology Data Exchange (ETDEWEB)

    Allman, S.L.; Chen, C.H.; Golovlev, V.V.; Isola, N.R.; Matteson, K.J.; Potter, N.T.; Taranenko, N.I.

    1999-01-23

Laser desorption mass spectrometry (LDMS) has been developed for DNA sequencing, disease diagnosis, and DNA fingerprinting for forensic applications. With LDMS, DNA analysis can be much faster than with conventional gel electrophoresis. No dye or radioactive tagging of DNA segments is needed for detection. LDMS is emerging as a new alternative technology for DNA analysis.

  4. Validation of a High-Throughput Multiplex Genetic Detection System for Helicobacter pylori Identification, Quantification, Virulence, and Resistance Analysis

    OpenAIRE

    Zhang, Yanmei; Zhao, Fuju; Kong, Mimi; Wang, Shiwen; Nan, Li; Hu, Binjie; Olszewski, Michal A.; Miao, Yingxin; Ji, Danian; Jiang, Wenrong; Fang, Yi; Zhang, Jinghao; Chen, Fei; Xiang, Ping; Wu, Yong

    2016-01-01

Helicobacter pylori (H. pylori) infection is closely related to various gastroduodenal diseases. Virulence factors and bacterial load of H. pylori are associated with clinical outcomes, and drug resistance severely impacts the clinical efficacy of eradication treatment. Existing detection methods are low-throughput, time-consuming, and labor-intensive. Therefore, a rapid and high-throughput method is needed for the clinical diagnosis, treatment, and monitoring of H. pylori. High-throughput Multip...

  5. Cost-effectiveness analysis of sandhill crane habitat management

    Science.gov (United States)

    Kessler, Andrew C.; Merchant, James W.; Shultz, Steven D.; Allen, Craig R.

    2013-01-01

Invasive species often threaten native wildlife populations and strain the budgets of agencies charged with wildlife management. We demonstrate the potential of cost-effectiveness analysis to improve the efficiency and value of efforts to enhance sandhill crane (Grus canadensis) roosting habitat. We focus on the central Platte River in Nebraska (USA), a region of international ecological importance for migrating avian species, including sandhill cranes. Cost-effectiveness analysis is a valuation process designed to compare alternative actions based on the cost of achieving a pre-determined objective. We estimated costs for removal of invasive vegetation using geographic information system simulations and calculated benefits as the increase in area of sandhill crane roosting habitat. We generated cost-effectiveness values for removing invasive vegetation on 7 land parcels and for the entire central Platte River to compare the cost-effectiveness of management at specific sites and across the central Platte River landscape. Median cost-effectiveness values for the 7 land parcels evaluated suggest that creating 1 additional hectare of sandhill crane roosting habitat costs US $1,595. By contrast, we found that creating an additional hectare of sandhill crane roosting habitat could cost as much as US $12,010 for some areas in the central Platte River, indicating that substantial cost savings can be achieved by using cost-effectiveness analysis to target specific land parcels for management. Cost-effectiveness analysis, used in conjunction with geographic information systems, can provide decision-makers with a new tool for identifying the most economically efficient allocation of resources to achieve habitat management goals.
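The parcel-level comparison described above reduces to a cost-per-hectare ratio. A minimal sketch with invented parcel values (not the study's data):

```python
# Hypothetical parcel data (removal cost in US$, habitat gained in ha);
# the values are illustrative, not from the study.
parcels = {
    "parcel_A": {"cost": 24000.0, "habitat_ha": 18.0},
    "parcel_B": {"cost": 9000.0,  "habitat_ha": 4.0},
    "parcel_C": {"cost": 52000.0, "habitat_ha": 40.0},
}

def cost_effectiveness(parcels):
    """Cost per additional hectare of roosting habitat; lower is better."""
    return {name: p["cost"] / p["habitat_ha"] for name, p in parcels.items()}

ratios = cost_effectiveness(parcels)
best = min(ratios, key=ratios.get)
print(best, round(ratios[best], 2))  # → parcel_C 1300.0
```

Ranking parcels by this ratio is exactly how targeting specific parcels yields the cost savings the abstract describes.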

  6. A new high-throughput LC-MS method for the analysis of complex fructan mixtures

    DEFF Research Database (Denmark)

    Verspreet, Joran; Hansen, Anders Holmgaard; Dornez, Emmie

    2014-01-01

    In this paper, a new liquid chromatography-mass spectrometry (LC-MS) method for the analysis of complex fructan mixtures is presented. In this method, columns with a trifunctional C18 alkyl stationary phase (T3) were used and their performance compared with that of a porous graphitized carbon (PGC...

  7. Validation of QuickScan dicentric chromosome analysis for high throughput radiation biological dosimetry.

    Science.gov (United States)

    Flegal, F N; Devantier, Y; Marro, L; Wilkins, R C

    2012-02-01

Currently, the dicentric chromosome assay (DCA) is used to estimate radiation doses to individuals following accidental radiological and nuclear overexposures when traditional dosimetry methods are not available. While exceptionally sensitive for estimating radiation doses, conventional DCA is time-intensive and requires highly trained scorers. For this reason, in a mass casualty situation, even triage-quality conventional DCA struggles to provide dose estimates in a timely manner. In Canada, a new scoring technique, termed DCA QuickScan, has been devised to increase the throughput of this assay. DCA QuickScan uses traditional DCA sample preparation methods while adopting a rapid scoring approach. In this study, the conventional and QuickScan methods of scoring the DCA were compared for accuracy and sensitivity. Dose-response curves were completed on four different donors based on the analysis of 1,000 metaphases or 200 events at eight to nine dose points by eight different scorers across two laboratories. Statistical analysis was performed on the data to compare the two methods within and across the laboratories and to test their respective sensitivities for dose estimation. This study demonstrated that QuickScan is statistically similar to conventional DCA analysis and is capable of producing dose estimates as low as 0.1 Gy, but up to six times faster. Therefore, DCA QuickScan can be used as a sensitive and accurate method for scoring samples for radiological biodosimetry in mass casualty situations or wherever faster dose assessment is required.
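Dose estimation from a scored dicentric yield typically inverts a linear-quadratic calibration curve Y = c + αD + βD². A minimal sketch, assuming illustrative coefficients rather than the study's calibration:

```python
import math

def estimate_dose(yield_obs, c, alpha, beta):
    """Invert the linear-quadratic dose response Y = c + alpha*D + beta*D^2
    for the absorbed dose D (Gy). The coefficients come from a laboratory
    calibration curve; the values used below are invented for illustration."""
    disc = alpha ** 2 - 4.0 * beta * (c - yield_obs)
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

# Illustrative calibration: c = 0.001, alpha = 0.03 /Gy, beta = 0.06 /Gy^2.
d = estimate_dose(yield_obs=0.1, c=0.001, alpha=0.03, beta=0.06)
print(round(d, 2))  # → 1.06
```

The positive root of the quadratic is taken because dose is non-negative; uncertainty on D would be propagated from the Poisson error on the dicentric count.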

  8. High-throughput metabolic state analysis: The missing link in integrated functional genomics of yeasts

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Moxley, Joel. F; Åkesson, Mats Fredrik

    2005-01-01

The lack of comparable metabolic state assays severely limits understanding of the metabolic changes caused by genetic or environmental perturbations. The present study reports the application of a novel derivatization method for metabolome analysis of yeast, coupled to data-mining software that achieves throughput, effort, and cost comparable with DNA arrays. Our sample workup method enables simultaneous metabolite measurements throughout central carbon metabolism and amino acid biosynthesis, using a standard GC-MS platform that was optimized for this purpose. As an implementation proof...

  9. GiA Roots: software for the high throughput analysis of plant root system architecture

    OpenAIRE

    Galkovskyi Taras; Mileyko Yuriy; Bucksch Alexander; Moore Brad; Symonova Olga; Price Charles A; Topp Christopher N; Iyer-Pascuzzi Anjali S; Zurek Paul R; Fang Suqin; Harer John; Benfey Philip N; Weitz Joshua S

    2012-01-01

    Abstract Background Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. Results We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically...

  10. 'PACLIMS': A component LIM system for high-throughput functional genomic analysis

    OpenAIRE

    Farman Mark; Patel Gayatri; Orbach Marc J; Tucker Sara; Galadima Natalia; Mitchell Thomas; Floyd Anna; Nolin Shelly; Windham Donald; Diener Stephen; Brown Douglas; Rajagopalon Ravi; Donofrio Nicole; Pampanwar Vishal; Soderlund Cari

    2005-01-01

    Abstract Background Recent advances in sequencing techniques leading to cost reduction have resulted in the generation of a growing number of sequenced eukaryotic genomes. Computational tools greatly assist in defining open reading frames and assigning tentative annotations. However, gene functions cannot be asserted without biological support through, among other things, mutational analysis. In taking a genome-wide approach to functionally annotate an entire organism, in this application the...

  11. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    Science.gov (United States)

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-08-08

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. High Throughput Phenotypic Analysis of Mycobacterium tuberculosis and Mycobacterium bovis Strains' Metabolism Using Biolog Phenotype Microarrays

    OpenAIRE

    Khatri, Bhagwati; Fielder, Mark; Jones, Gareth; Newell, William; Abu-Oun, Manal; Wheeler, Paul R.

    2013-01-01

    Tuberculosis is a major human and animal disease of major importance worldwide. Genetically, the closely related strains within the Mycobacterium tuberculosis complex which cause disease are well-characterized but there is an urgent need better to understand their phenotypes. To search rapidly for metabolic differences, a working method using Biolog Phenotype MicroArray analysis was developed. Of 380 substrates surveyed, 71 permitted tetrazolium dye reduction, the readout over 7 days in the m...

  13. BABELOMICS: a suite of web tools for functional annotation and analysis of groups of genes in high-throughput experiments.

    Science.gov (United States)

    Al-Shahrour, Fátima; Minguez, Pablo; Vaquerizas, Juan M; Conde, Lucía; Dopazo, Joaquín

    2005-07-01

We present Babelomics, a complete suite of web tools for the functional analysis of groups of genes in high-throughput experiments, which includes the use of information on Gene Ontology terms, InterPro motifs, KEGG pathways, Swiss-Prot keywords, analysis of predicted transcription factor binding sites, chromosomal positions and presence in tissues with determined histological characteristics, through five integrated modules: FatiGO (fast assignment and transference of information), FatiWise, transcription factor association test, GenomeGO and tissues mining tool, respectively. Additionally, another module, FatiScan, provides a new procedure that integrates biological information in combination with experimental results in order to find groups of genes with modest but coordinate significant differential behaviour. FatiScan is highly sensitive and is capable of finding significant asymmetries in the distribution of genes of common function across a list of ordered genes even if these asymmetries are not extreme. The strong multiple-testing nature of the contrasts made by the tools is taken into account. All the tools are integrated in the gene expression analysis package GEPAS. Babelomics is the natural evolution of our tool FatiGO (which analysed almost 22,000 experiments during the last year) to include more sources of information and new modes of using it. Babelomics can be found at http://www.babelomics.org.
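FatiGO-style term enrichment is, at its core, a one-sided Fisher's exact (hypergeometric) test on a 2x2 contrast. A minimal sketch with invented gene counts (not Babelomics' implementation):

```python
from math import comb

def hypergeom_enrichment_p(k, n, K, N):
    """One-sided hypergeometric p-value (Fisher's exact test) for observing
    >= k annotated genes in a selected list of n, given K annotated genes
    among N total: the kind of term-enrichment contrast FatiGO-style
    tools compute for each GO term or keyword."""
    denom = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / denom

# Toy example: 8 of 20 selected genes carry a GO term that annotates
# 40 of 1000 genes genome-wide (invented numbers).
p = hypergeom_enrichment_p(k=8, n=20, K=40, N=1000)
print(f"{p:.2e}")
```

In a real analysis this p-value would be computed per term and then corrected for the multiple testing the abstract emphasizes (e.g. FDR control).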

  14. High-throughput analysis of ammonia oxidiser community composition via a novel, amoA-based functional gene array.

    Directory of Open Access Journals (Sweden)

    Guy C J Abell

    Full Text Available Advances in microbial ecology research are more often than not limited by the capabilities of available methodologies. Aerobic autotrophic nitrification is one of the most important and well studied microbiological processes in terrestrial and aquatic ecosystems. We have developed and validated a microbial diagnostic microarray based on the ammonia-monooxygenase subunit A (amoA gene, enabling the in-depth analysis of the community structure of bacterial and archaeal ammonia oxidisers. The amoA microarray has been successfully applied to analyse nitrifier diversity in marine, estuarine, soil and wastewater treatment plant environments. The microarray has moderate costs for labour and consumables and enables the analysis of hundreds of environmental DNA or RNA samples per week per person. The array has been thoroughly validated with a range of individual and complex targets (amoA clones and environmental samples, respectively, combined with parallel analysis using traditional sequencing methods. The moderate cost and high throughput of the microarray makes it possible to adequately address broader questions of the ecology of microbial ammonia oxidation requiring high sample numbers and high resolution of the community composition.

  15. High-throughput sequence analysis reveals structural diversity and improved potency among RNA inhibitors of HIV reverse transcriptase.

    Science.gov (United States)

    Ditzler, Mark A; Lange, Margaret J; Bose, Debojit; Bottoms, Christopher A; Virkler, Katherine F; Sawyer, Andrew W; Whatley, Angela S; Spollen, William; Givan, Scott A; Burke, Donald H

    2013-02-01

Systematic evolution of ligands through exponential enrichment (SELEX) is a well-established method for generating nucleic acid populations that are enriched for specified functions. High-throughput sequencing (HTS) enhances the power of comparative sequence analysis to reveal details of how RNAs within these populations recognize their targets. We used HTS analysis to evaluate RNA populations selected to bind human immunodeficiency virus type 1 (HIV-1) reverse transcriptase (RT). The populations are enriched in RNAs of independent lineages that converge on shared motifs and in clusters of RNAs with nearly identical sequences that share common ancestry. Both of these features informed inferences of the secondary structures of enriched RNAs, their minimal structural requirements and their stabilities in RT-aptamer complexes. Monitoring population dynamics in response to increasing selection pressure revealed RNA inhibitors of RT that are more potent than the previously identified pseudoknots. Improved potency was observed for inhibition of both purified RT in enzymatic assays and viral replication in cell-based assays. Structural and functional details of converged motifs that are obscured by simple consensus descriptions are also revealed by the HTS analysis. The approach presented here can readily be generalized for the efficient and systematic post-SELEX development of aptamers for downstream applications.
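Monitoring population dynamics across selection rounds amounts to comparing per-sequence read frequencies between rounds. A minimal sketch using toy reads (the sequences and counts are invented):

```python
from collections import Counter

def enrichment(round_a, round_b):
    """Per-sequence fold enrichment between two sequenced selection rounds,
    computed from relative read frequencies: a minimal sketch of the
    population-dynamics readout that HTS enables post-SELEX."""
    ca, cb = Counter(round_a), Counter(round_b)
    na, nb = len(round_a), len(round_b)
    return {seq: (cb[seq] / nb) / (ca[seq] / na)
            for seq in ca if seq in cb}

# Toy read sets: "GGCAAU" expands under selection pressure (invented data).
early = ["GGCAAU", "UUCGAC", "GGCAAU", "ACGUAC", "UUCGAC", "CCAUGG"]
late = ["GGCAAU"] * 4 + ["UUCGAC", "ACGUAC"]
enr = enrichment(early, late)
print(max(enr, key=enr.get))  # → GGCAAU
```

Real pipelines would first collapse near-identical reads into clusters (lineages) and track cluster frequencies, but the enrichment ratio is the same quantity.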

  16. Phyloseq: a bioconductor package for handling and analysis of high-throughput phylogenetic sequence data.

    Science.gov (United States)

    McMurdie, Paul J; Holmes, Susan

    2012-01-01

    We present a detailed description of a new Bioconductor package, phyloseq, for integrated data and analysis of taxonomically-clustered phylogenetic sequencing data in conjunction with related data types. The phyloseq package integrates abundance data, phylogenetic information and covariates so that exploratory transformations, plots, and confirmatory testing and diagnostic plots can be carried out seamlessly. The package is built following the S4 object-oriented framework of the R language so that once the data have been input the user can easily transform, plot and analyze the data. We present some examples that highlight the methods and the ease with which we can leverage existing packages.

  17. High-throughput mathematical analysis identifies Turing networks for patterning with equally diffusing signals.

    Science.gov (United States)

    Marcon, Luciano; Diego, Xavier; Sharpe, James; Müller, Patrick

    2016-04-08

    The Turing reaction-diffusion model explains how identical cells can self-organize to form spatial patterns. It has been suggested that extracellular signaling molecules with different diffusion coefficients underlie this model, but the contribution of cell-autonomous signaling components is largely unknown. We developed an automated mathematical analysis to derive a catalog of realistic Turing networks. This analysis reveals that in the presence of cell-autonomous factors, networks can form a pattern with equally diffusing signals and even for any combination of diffusion coefficients. We provide a software (available at http://www.RDNets.com) to explore these networks and to constrain topologies with qualitative and quantitative experimental data. We use the software to examine the self-organizing networks that control embryonic axis specification and digit patterning. Finally, we demonstrate how existing synthetic circuits can be extended with additional feedbacks to form Turing reaction-diffusion systems. Our study offers a new theoretical framework to understand multicellular pattern formation and enables the wide-spread use of mathematical biology to engineer synthetic patterning systems.
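The automated screen described above generalizes the classical linear stability test for Turing patterning. A minimal two-component sketch (Jacobian and diffusion values invented) that checks stability without diffusion and instability for some spatial mode k; note the study's key result is that adding non-diffusing, cell-autonomous nodes relaxes the differential-diffusion requirement this two-node version exhibits:

```python
import numpy as np

def turing_unstable(J, D, k_values):
    """Linear stability screen: the steady state must be stable without
    diffusion (k = 0) but unstable for some spatial mode k > 0. This is
    the classical two-component test; the cataloging approach above
    extends it to larger networks with cell-autonomous factors."""
    def growth(k):
        # Largest real part of the eigenvalues of J - k^2 * D.
        return np.linalg.eigvals(J - (k ** 2) * D).real.max()
    return growth(0.0) < 0 and any(growth(k) > 0 for k in k_values)

# Classic activator-inhibitor Jacobian with differential diffusion.
J = np.array([[1.0, -1.0],
              [2.0, -1.5]])   # stable without diffusion (tr < 0, det > 0)
D = np.diag([1.0, 20.0])      # inhibitor diffuses much faster
ks = np.linspace(0.05, 2.0, 50)
print(turing_unstable(J, D, ks))  # → True
```

With equal diffusion coefficients (D proportional to the identity) this two-node network cannot pattern, which is exactly the limitation the cell-autonomous networks in the study overcome.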

  18. Transcript analysis of 1003 novel yeast genes using high-throughput northern hybridizations.

    Science.gov (United States)

    Brown, A J; Planta, R J; Restuhadi, F; Bailey, D A; Butler, P R; Cadahia, J L; Cerdan, M E; De Jonge, M; Gardner, D C; Gent, M E; Hayes, A; Kolen, C P; Lombardia, L J; Murad, A M; Oliver, R A; Sefton, M; Thevelein, J M; Tournu, H; van Delft, Y J; Verbart, D J; Winderickx, J; Oliver, S G

    2001-06-15

    The expression of 1008 open reading frames (ORFs) from the yeast Saccharomyces cerevisiae has been examined under eight different physiological conditions, using classical northern analysis. These northern data have been compared with publicly available data from a microarray analysis of the diauxic transition in S. cerevisiae. The results demonstrate the importance of comparing biologically equivalent situations and of the standardization of data normalization procedures. We have also used our northern data to identify co-regulated gene clusters and define the putative target sites of transcriptional activators responsible for their control. Clusters containing genes of known function identify target sites of known activators. In contrast, clusters comprised solely of genes of unknown function usually define novel putative target sites. Finally, we have examined possible global controls on gene expression. It was discovered that ORFs that are highly expressed following a nutritional upshift tend to employ favoured codons, whereas those overexpressed in starvation conditions do not. These results are interpreted in terms of a model in which competition between mRNA molecules for translational capacity selects for codons translated by abundant tRNAs.

  19. High throughput nanoparticle tracking analysis for monitoring outer membrane vesicle production.

    Science.gov (United States)

    Gerritzen, Matthias J H; Martens, Dirk E; Wijffels, René H; Stork, Michiel

    2017-01-01

    Outer membrane vesicles (OMVs) are spherical membrane nanoparticles released by Gram-negative bacteria. OMVs can be quantified in complex matrices by nanoparticle tracking analysis (NTA). NTA can be performed in static mode or with continuous sample flow, which allows more particles to be analyzed in a shorter time frame. Flow measurements must be performed manually despite the availability of a sample changer on the NanoSight system. Here we present a method for automated measurements in flow mode. OMV quantification in flow mode results in lower variance in particle quantification (coefficient of variation (CV) of 6%, versus a CV of 14% for static measurements). Sizing of OMVs was expected to be less favorable in flow mode due to the increased movement of the particles. However, we observed a CV of 3% in flow mode and a CV of 8% in static measurements. Flow rates of up to 5 µL/min yielded correct size and particle measurements; however, particle concentration was slightly lower than in static measurements. The automated method was used to assess OMV release by batch cultures of Neisseria meningitidis. The bacteria released more OMVs in stationary growth phase, while the size of the vesicles remained constant throughout the culture. Taken together, this study shows that automated measurements in flow mode can be established with advanced scripting to reduce the workload for the user.
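The coefficient of variation used to compare the two acquisition modes is simply the standard deviation expressed as a percentage of the mean. A minimal sketch; the replicate particle counts below are illustrative values, not the study's data:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: sample standard deviation as % of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate OMV concentrations (particles/mL) from repeated NTA runs
flow = [9.6e10, 1.02e11, 9.9e10, 1.05e11, 9.8e10]
print(round(cv_percent(flow), 1))  # → 3.5
```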

  20. High throughput phenotypic analysis of Mycobacterium tuberculosis and Mycobacterium bovis strains' metabolism using biolog phenotype microarrays.

    Directory of Open Access Journals (Sweden)

    Bhagwati Khatri

    Full Text Available Tuberculosis is a human and animal disease of major importance worldwide. Genetically, the closely related strains within the Mycobacterium tuberculosis complex which cause disease are well-characterized, but there is an urgent need to better understand their phenotypes. To search rapidly for metabolic differences, a working method using Biolog Phenotype MicroArray analysis was developed. Of 380 substrates surveyed, 71 permitted tetrazolium dye reduction, the readout over 7 days in the method. By looking for ≥5-fold differences in dye reduction, 12 substrates differentiated M. tuberculosis H37Rv and Mycobacterium bovis AF2122/97. H37Rv and a Beijing strain of M. tuberculosis could also be distinguished in this way, as could field strains of M. bovis; even pairs of strains within one spoligotype could be distinguished by 2 to 3 substrates. Cluster analysis gave three clear groups: H37Rv, Beijing, and all the M. bovis strains. The substrates used agreed well with prior knowledge, though an unexpected finding that AF2122/97 gave greater dye reduction than H37Rv with hexoses was investigated further, in culture flasks, revealing that hexoses and Tween 80 were synergistic for growth and used simultaneously rather than in a diauxic fashion. Potential new substrates for growth media were revealed, too, most promisingly N-acetyl glucosamine. Osmotic and pH arrays divided the mycobacteria into two groups with different salt tolerance, though in contrast to the substrate arrays the groups did not entirely correlate with taxonomic differences. More interestingly, these arrays suggested differences between the amines used by the M. tuberculosis complex and enteric bacteria in acid tolerance, with some hydrophobic amino acids being highly effective. In contrast, γ-aminobutyrate, used in the enteric bacteria, had no effect in the mycobacteria. This study proved the principle that Phenotype MicroArrays can be used with slow-growing pathogenic mycobacteria
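The ≥5-fold screening criterion reduces to a simple symmetric fold-change filter over the endpoint dye-reduction values. A sketch with hypothetical readings; the substrate names and numbers below are illustrative only:

```python
def discriminating_substrates(strain_a, strain_b, fold=5.0):
    """Return substrates whose dye-reduction signal differs by >= `fold`
    between two strains, in either direction."""
    hits = []
    for substrate in strain_a:
        a, b = strain_a[substrate], strain_b[substrate]
        # Multiply rather than divide to avoid division by zero
        if a >= fold * b or b >= fold * a:
            hits.append(substrate)
    return hits

# Hypothetical endpoint dye-reduction values (arbitrary units)
h37rv  = {"glucose": 120.0, "pyruvate": 15.0, "glycerol": 80.0}
af2122 = {"glucose": 700.0, "pyruvate": 14.0, "glycerol": 20.0}
print(discriminating_substrates(h37rv, af2122))  # → ['glucose']
```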

  1. Quantitative neuroanatomy of all Purkinje cells with light sheet microscopy and high-throughput image analysis

    Directory of Open Access Journals (Sweden)

    Ludovico Silvestri

    2015-05-01

    Full Text Available Characterizing the cytoarchitecture of the mammalian central nervous system on a brain-wide scale is becoming a compelling need in neuroscience. For example, realistic modeling of brain activity requires the definition of quantitative features of large neuronal populations in the whole brain. Quantitative anatomical maps will also be crucial for classifying the cytoarchitectonic abnormalities associated with neuronal pathologies in a highly reproducible and reliable manner. In this paper, we apply recent advances in optical microscopy and image analysis to characterize the spatial distribution of Purkinje cells across the whole cerebellum. Light sheet microscopy was used to image with micron-scale resolution a fixed and cleared cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are fluorescently labeled. A fast and scalable algorithm for fully automated cell identification was applied to the image to extract the positions of all the fluorescent Purkinje cells. This vectorized representation of the cell population allows a thorough characterization of the complex three-dimensional distribution of the neurons, highlighting the presence of gaps inside the lamellar organization of Purkinje cells, whose density is believed to play a significant role in autism spectrum disorders. Furthermore, clustering analysis of the localized somata permits dividing the whole cerebellum into groups of Purkinje cells with high spatial correlation, suggesting new possibilities of anatomical partition. The quantitative approach presented here can be extended to study the distribution of different types of cells in many brain regions and across the whole encephalon, providing a robust base for building realistic computational models of the brain, and for unbiased morphological tissue screening in the presence of pathologies and/or drug treatments.

  2. MCAM: multiple clustering analysis methodology for deriving hypotheses and insights from high-throughput proteomic datasets.

    Directory of Open Access Journals (Sweden)

    Kristen M Naegle

    2011-07-01

    Full Text Available Advances in proteomic technologies continue to substantially accelerate capability for generating experimental data on protein levels, states, and activities in biological samples. For example, studies on receptor tyrosine kinase signaling networks can now capture the phosphorylation state of hundreds to thousands of proteins across multiple conditions. However, little is known about the function of many of these protein modifications, or the enzymes responsible for modifying them. To address this challenge, we have developed an approach that enhances the power of clustering techniques to infer functional and regulatory meaning of protein states in cell signaling networks. We have created a new computational framework for applying clustering to biological data in order to overcome the typical dependence on specific a priori assumptions and expert knowledge concerning the technical aspects of clustering. Multiple clustering analysis methodology ('MCAM') employs an array of diverse data transformations, distance metrics, set sizes, and clustering algorithms, in a combinatorial fashion, to create a suite of clustering sets. These sets are then evaluated based on their ability to produce biological insights through statistical enrichment of metadata relating to knowledge concerning protein functions, kinase substrates, and sequence motifs. We applied MCAM to a set of dynamic phosphorylation measurements of the ERBB network to explore the relationships between algorithmic parameters and the biological meaning that could be inferred and report on interesting biological predictions. Further, we applied MCAM to multiple phosphoproteomic datasets for the ERBB network, which allowed us to compare independent and incomplete overlapping measurements of phosphorylation sites in the network. We report specific and global differences of the ERBB network stimulated with different ligands and with changes in HER2 expression. Overall, we offer MCAM as a broadly
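The combinatorial core of MCAM, enumerating data transformations crossed with clustering parameters to produce a suite of clusterings, can be sketched with a toy one-dimensional k-means. The data, transformations, and parameter grid below are hypothetical stand-ins, and the metadata-enrichment scoring step that MCAM uses to rank the suite is omitted:

```python
import itertools
import math
import random

def kmeans(points, k, iters=30, seed=0):
    """Minimal Lloyd's k-means on 1-D values; returns a label per point."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(p - centers[j])) for p in points]
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centers[j] = sum(members) / len(members)
    return labels

def zscore(xs):
    m = sum(xs) / len(xs)
    s = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - m) / s for x in xs]

transforms = {
    "raw": lambda xs: xs,
    "log": lambda xs: [math.log1p(x) for x in xs],
    "zscore": zscore,
}

# Hypothetical phosphosite intensities: two well-separated groups of four
data = [1.0, 1.1, 0.9, 1.2, 5.0, 5.2, 4.8, 5.1]

# Enumerate the (transformation, cluster count) grid: one clustering per setting
suite = {}
for name, k in itertools.product(transforms, [2, 3]):
    suite[(name, k)] = kmeans(transforms[name](data), k)
print(len(suite))  # → 6
```

In MCAM proper, each member of such a suite would then be scored by statistical enrichment against protein-function and kinase-substrate metadata.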

  3. MCAM: Multiple Clustering Analysis Methodology for Deriving Hypotheses and Insights from High-Throughput Proteomic Datasets

    Science.gov (United States)

    Naegle, Kristen M.; Welsch, Roy E.; Yaffe, Michael B.; White, Forest M.; Lauffenburger, Douglas A.

    2011-01-01

    Advances in proteomic technologies continue to substantially accelerate capability for generating experimental data on protein levels, states, and activities in biological samples. For example, studies on receptor tyrosine kinase signaling networks can now capture the phosphorylation state of hundreds to thousands of proteins across multiple conditions. However, little is known about the function of many of these protein modifications, or the enzymes responsible for modifying them. To address this challenge, we have developed an approach that enhances the power of clustering techniques to infer functional and regulatory meaning of protein states in cell signaling networks. We have created a new computational framework for applying clustering to biological data in order to overcome the typical dependence on specific a priori assumptions and expert knowledge concerning the technical aspects of clustering. Multiple clustering analysis methodology (‘MCAM’) employs an array of diverse data transformations, distance metrics, set sizes, and clustering algorithms, in a combinatorial fashion, to create a suite of clustering sets. These sets are then evaluated based on their ability to produce biological insights through statistical enrichment of metadata relating to knowledge concerning protein functions, kinase substrates, and sequence motifs. We applied MCAM to a set of dynamic phosphorylation measurements of the ERBB network to explore the relationships between algorithmic parameters and the biological meaning that could be inferred and report on interesting biological predictions. Further, we applied MCAM to multiple phosphoproteomic datasets for the ERBB network, which allowed us to compare independent and incomplete overlapping measurements of phosphorylation sites in the network. We report specific and global differences of the ERBB network stimulated with different ligands and with changes in HER2 expression. Overall, we offer MCAM as a broadly

  4. High throughput phenotypic analysis of Mycobacterium tuberculosis and Mycobacterium bovis strains' metabolism using biolog phenotype microarrays.

    Science.gov (United States)

    Khatri, Bhagwati; Fielder, Mark; Jones, Gareth; Newell, William; Abu-Oun, Manal; Wheeler, Paul R

    2013-01-01

    Tuberculosis is a human and animal disease of major importance worldwide. Genetically, the closely related strains within the Mycobacterium tuberculosis complex which cause disease are well-characterized, but there is an urgent need to better understand their phenotypes. To search rapidly for metabolic differences, a working method using Biolog Phenotype MicroArray analysis was developed. Of 380 substrates surveyed, 71 permitted tetrazolium dye reduction, the readout over 7 days in the method. By looking for ≥5-fold differences in dye reduction, 12 substrates differentiated M. tuberculosis H37Rv and Mycobacterium bovis AF2122/97. H37Rv and a Beijing strain of M. tuberculosis could also be distinguished in this way, as could field strains of M. bovis; even pairs of strains within one spoligotype could be distinguished by 2 to 3 substrates. Cluster analysis gave three clear groups: H37Rv, Beijing, and all the M. bovis strains. The substrates used agreed well with prior knowledge, though an unexpected finding that AF2122/97 gave greater dye reduction than H37Rv with hexoses was investigated further, in culture flasks, revealing that hexoses and Tween 80 were synergistic for growth and used simultaneously rather than in a diauxic fashion. Potential new substrates for growth media were revealed, too, most promisingly N-acetyl glucosamine. Osmotic and pH arrays divided the mycobacteria into two groups with different salt tolerance, though in contrast to the substrate arrays the groups did not entirely correlate with taxonomic differences. More interestingly, these arrays suggested differences between the amines used by the M. tuberculosis complex and enteric bacteria in acid tolerance, with some hydrophobic amino acids being highly effective. In contrast, γ-aminobutyrate, used in the enteric bacteria, had no effect in the mycobacteria. This study proved the principle that Phenotype MicroArrays can be used with slow-growing pathogenic mycobacteria and already has

  5. A novel approach for transcription factor analysis using SELEX with high-throughput sequencing (TFAST).

    Directory of Open Access Journals (Sweden)

    Daniel J Reiss

    Full Text Available BACKGROUND: In previous work, we designed a modified aptamer-free SELEX-seq protocol (afSELEX-seq) for the discovery of transcription factor binding sites. Here, we present original software, TFAST, designed to analyze afSELEX-seq data, validated against our previously generated afSELEX-seq dataset and a model dataset. TFAST is designed with a simple graphical interface (Java) so that it can be installed and executed without extensive expertise in bioinformatics. TFAST completes analysis within minutes on most personal computers. METHODOLOGY: Once afSELEX-seq data are aligned to a target genome, TFAST identifies peaks and, uniquely, compares peak characteristics between cycles. TFAST generates a hierarchical report of graded peaks, their associated genomic sequences, binding site length predictions, and dummy sequences. PRINCIPAL FINDINGS: Including additional cycles of afSELEX-seq improved TFAST's ability to selectively identify peaks, leading to 7,274, 4,255, and 2,628 peaks identified in two-, three-, and four-cycle afSELEX-seq. Inter-round analysis by TFAST identified 457 peaks as the strongest candidates for true binding sites. Separating peaks by TFAST into classes of worst, second-best, and best candidate peaks revealed a trend of increasing significance (e-values 4.5 × 10^-12, 2.9 × 10^-46, and 1.2 × 10^-73) and informational content (11.0, 11.9, and 12.5 bits over 15 bp) of discovered motifs within each respective class. TFAST also predicted a binding site length (28 bp) consistent with non-computational, experimentally derived results for the transcription factor PapX (22 to 29 bp). CONCLUSIONS/SIGNIFICANCE: TFAST offers a novel and intuitive approach for determining DNA binding sites of proteins subjected to afSELEX-seq. Here, we demonstrate that TFAST, using afSELEX-seq data, rapidly and accurately predicted sequence length and motif for a putative transcription factor's binding site.
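The motif information content reported here (bits over 15 bp) follows the standard log2 formula for DNA under a uniform background: 2 + Σ_b p_b log2 p_b per position, summed over positions. A sketch with a hypothetical 3-position matrix:

```python
import math

def information_content(pwm):
    """Total information content (bits) of a DNA position weight matrix,
    assuming a uniform ACGT background: sum over positions of
    2 + sum_b p_b * log2(p_b)."""
    total = 0.0
    for column in pwm:  # each column maps base -> probability
        total += 2.0 + sum(p * math.log2(p) for p in column.values() if p > 0)
    return total

# Toy 3-position motif: fully conserved, half-conserved, uninformative
pwm = [
    {"A": 1.0},
    {"A": 0.5, "T": 0.5},
    {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25},
]
print(information_content(pwm))  # → 3.0
```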

  6. High throughput genetic analysis of congenital myasthenic syndromes using resequencing microarrays.

    Directory of Open Access Journals (Sweden)

    Lisa Denning

    Full Text Available BACKGROUND: The use of resequencing microarrays for screening multiple, candidate disease loci is a promising alternative to conventional capillary sequencing. We describe the performance of a custom resequencing microarray for mutational analysis of Congenital Myasthenic Syndromes (CMSs), a group of disorders in which the normal process of neuromuscular transmission is impaired. METHODOLOGY/PRINCIPAL FINDINGS: Our microarray was designed to assay the exons and flanking intronic regions of 8 genes linked to CMSs. A total of 31 microarrays were hybridized with genomic DNA from either individuals with known CMS mutations or from healthy controls. We estimated an overall microarray call rate of 93.61%, and we found the percentage agreement between the microarray and capillary sequencing techniques to be 99.95%. In addition, our microarray exhibited 100% specificity and 99.99% reproducibility. Finally, the microarray detected 22 out of the 23 known missense mutations, but it failed to detect any of the 7 known insertion and deletion (indel) mutations, indicating an overall sensitivity of 73.33% and a sensitivity with respect to missense mutations of 95.65%. CONCLUSIONS/SIGNIFICANCE: Overall, our microarray prototype exhibited strong performance and proved highly efficient for screening genes associated with CMSs. Until indels can be efficiently assayed with this technology, however, we recommend using resequencing microarrays for screening CMS mutations only after common indels have first been assayed by capillary sequencing.
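The reported sensitivities are plain detection fractions, and the abstract's numbers can be reproduced directly: 22 of 23 missense mutations plus 0 of 7 indels gives 22/30 overall. A sketch:

```python
def sensitivity(detected, total):
    """Fraction of true mutations called, as a percentage."""
    return 100.0 * detected / total

# Numbers from the abstract: 22/23 missense detected, 0/7 indels detected
missense = sensitivity(22, 23)
overall = sensitivity(22 + 0, 23 + 7)
print(round(missense, 2), round(overall, 2))  # → 95.65 73.33
```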

  7. Differentiation and identification of filamentous fungi by high-throughput FTIR spectroscopic analysis of mycelia.

    Science.gov (United States)

    Lecellier, A; Mounier, J; Gaydou, V; Castrec, L; Barbier, G; Ablain, W; Manfait, M; Toubas, D; Sockalingum, G D

    2014-01-03

    Routine identification of fungi based on phenotypic and genotypic methods can be fastidious and time-consuming. In this context, there is a constant need for new approaches allowing the rapid identification of molds. Fourier-transform infrared (FTIR) spectroscopy is one such promising method. The objective of this work was to evaluate the potential of FTIR spectroscopy for the early differentiation and identification of filamentous fungi. One hundred and thirty-one strains, identified using DNA sequencing, were analyzed using FTIR spectroscopy of mycelia obtained after a culture time of 48 h, reduced compared with current conventional methods. Partial least squares discriminant analysis was used as a chemometric method to analyze the spectral data and to identify the fungal strains from the phylum down to the species level. Calibration models were constructed using 106 strains pertaining to 14 different genera and 32 species and were used to identify 25 fungal strains in a blind manner. Correct identification rates of 98.97% and 98.77% were achieved at the genus and species levels, respectively. FTIR spectroscopy, with its high discriminating power and rapidity, therefore shows strong promise for routine fungal identification. Upgrading of our database is ongoing to test the technique's robustness.

  8. Quantitative proteomic analysis for high-throughput screening of differential glycoproteins in hepatocellular carcinoma serum

    Institute of Scientific and Technical Information of China (English)

    Hua-Jun Gao; Ya-Jing Chen; Duo Zuo; Ming-Ming Xiao; Ying Li; Hua Guo; Ning Zhang; Rui-Bing Chen

    2015-01-01

    Objective: Hepatocellular carcinoma (HCC) is a leading cause of cancer-related deaths. Novel serum biomarkers are required to increase the sensitivity and specificity of serum screening for early HCC diagnosis. This study employed a quantitative proteomic strategy to analyze the differential expression of serum glycoproteins between HCC and normal control serum samples. Methods: Lectin affinity chromatography (LAC) was used to enrich glycoproteins from the serum samples. Quantitative mass spectrometric analysis combined with stable isotope dimethyl labeling and 2D liquid chromatography (LC) separations were performed to examine the differential levels of the detected proteins between HCC and control serum samples. Western blot was used to analyze the differential expression levels of the three serum proteins. Results: A total of 2,280 protein groups were identified in the serum samples from HCC patients by using the 2D LC-MS/MS method. Up to 36 proteins were up-regulated in the HCC serum, whereas 19 proteins were down-regulated. Three differential glycoproteins, namely, fibrinogen gamma chain (FGG), FOS-like antigen 2 (FOSL2), and α-1,6-mannosylglycoprotein 6-β-N-acetylglucosaminyltransferase B (MGAT5B) were validated by Western blot. All these three proteins were up-regulated in the HCC serum samples. Conclusion: A quantitative glycoproteomic method was established and proven useful to determine potential novel biomarkers for HCC.

  9. Quantitative proteomic analysis for high-throughput screening of differential glycoproteins in hepatocellular carcinoma serum

    Science.gov (United States)

    Gao, Hua-Jun; Chen, Ya-Jing; Zuo, Duo; Xiao, Ming-Ming; Li, Ying; Guo, Hua; Zhang, Ning; Chen, Rui-Bing

    2015-01-01

    Objective Hepatocellular carcinoma (HCC) is a leading cause of cancer-related deaths. Novel serum biomarkers are required to increase the sensitivity and specificity of serum screening for early HCC diagnosis. This study employed a quantitative proteomic strategy to analyze the differential expression of serum glycoproteins between HCC and normal control serum samples. Methods Lectin affinity chromatography (LAC) was used to enrich glycoproteins from the serum samples. Quantitative mass spectrometric analysis combined with stable isotope dimethyl labeling and 2D liquid chromatography (LC) separations were performed to examine the differential levels of the detected proteins between HCC and control serum samples. Western blot was used to analyze the differential expression levels of the three serum proteins. Results A total of 2,280 protein groups were identified in the serum samples from HCC patients by using the 2D LC-MS/MS method. Up to 36 proteins were up-regulated in the HCC serum, whereas 19 proteins were down-regulated. Three differential glycoproteins, namely, fibrinogen gamma chain (FGG), FOS-like antigen 2 (FOSL2), and α-1,6-mannosylglycoprotein 6-β-N-acetylglucosaminyltransferase B (MGAT5B) were validated by Western blot. All these three proteins were up-regulated in the HCC serum samples. Conclusion A quantitative glycoproteomic method was established and proven useful to determine potential novel biomarkers for HCC. PMID:26487969

  10. Rapid generation of single-tumor spheroids for high-throughput cell function and toxicity analysis.

    Science.gov (United States)

    Ivascu, Andrea; Kubbies, Manfred

    2006-12-01

    Spheroids are widely used in biology because they provide an in vitro 3-dimensional (3D) model to study proliferation, cell death, differentiation, and metabolism of cells in tumors and the response of tumors to radiotherapy and chemotherapy. Existing methods of generating spheroids are limited by size heterogeneity, long cultivation times, or poor mechanical accessibility for higher-throughput applications. The authors present a rapid method to generate single spheroids in suspension culture in individual wells. A defined number of cells ranging from 1000 to 20,000 were seeded into wells of poly-HEMA-coated, 96-well, round- or conical-bottom plates in standard medium and centrifuged for 10 min at 1000 g. This procedure generates single spheroids in each well within a 24-h culture time with homogeneous sizes, morphologies, and stratification of proliferating cells in the rim and dying cells in the core region. Because a large number of tumor cell lines form only loose aggregates when cultured in 3D, the authors also performed a screen for medium additives to achieve a switch from aggregate to spheroid morphology. Small quantities of the basement membrane extract Matrigel, added to the culture medium prior to centrifugation, most effectively induced compact spheroid formation. The compact spheroid morphology is evident as early as 24 h after centrifugation in a true suspension culture. Twenty tumor cell lines of different lineages have been used to successfully generate compact, single spheroids with homogeneous size in 96-well plates that are easily accessible for subsequent functional analysis.

  11. High-Throughput, Automated Protein A Purification Platform with Multiattribute LC-MS Analysis for Advanced Cell Culture Process Monitoring.

    Science.gov (United States)

    Dong, Jia; Migliore, Nicole; Mehrman, Steven J; Cunningham, John; Lewis, Michael J; Hu, Ping

    2016-09-06

    The levels of many product related variants observed during the production of monoclonal antibodies are dependent on control of the manufacturing process, especially the cell culture process. However, it is difficult to characterize samples pulled from the bioreactor due to the low levels of product during the early stages of the process and the high levels of interfering reagents. Furthermore, analytical results are often not available for several days, which slows the process development cycle and prevents "real time" adjustments to the manufacturing process. To reduce the delay and enhance our ability to achieve quality targets, we have developed a low-volume, high-throughput, and high-content analytical platform for at-line product quality analysis. This workflow includes an automated, 96-well plate protein A purification step to isolate antibody product from the cell culture fermentation broth, followed by rapid, multiattribute LC-MS analysis. We have demonstrated quantitative correlations between particular process parameters with the levels of glycosylated and glycated species in a series of small scale experiments, but the platform could be used to monitor other attributes and applied across the biopharmaceutical industry.

  12. High throughput sequencing analysis of RNA libraries reveals the influences of initial library and PCR methods on SELEX efficiency.

    Science.gov (United States)

    Takahashi, Mayumi; Wu, Xiwei; Ho, Michelle; Chomchan, Pritsana; Rossi, John J; Burnett, John C; Zhou, Jiehua

    2016-09-22

    The systematic evolution of ligands by exponential enrichment (SELEX) technique is a powerful and effective aptamer-selection procedure. However, modifications to the process can dramatically improve selection efficiency and aptamer performance. For example, droplet digital PCR (ddPCR) has recently been incorporated into SELEX selection protocols to putatively reduce the propagation of byproducts and avoid the selection bias that results from differences in the PCR efficiency of sequences within the random library. However, a detailed, parallel comparison of the efficacy of conventional solution PCR versus the ddPCR modification in the RNA aptamer-selection process is needed to understand effects on overall SELEX performance. In the present study, we took advantage of powerful high throughput sequencing technology and bioinformatics analysis coupled with SELEX (HT-SELEX) to thoroughly investigate the effects of the initial library and PCR method on RNA aptamer identification. Our analysis revealed that distinct "biased sequences" and nucleotide compositions existed in the initial, unselected libraries purchased from two different manufacturers and that the fate of the "biased sequences" was target-dependent during selection. Our comparison of solution PCR- and ddPCR-driven HT-SELEX demonstrated that the PCR method affected not only the nucleotide composition of the enriched sequences, but also the overall SELEX efficiency and aptamer efficacy.
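The nucleotide-composition comparisons between initial libraries amount to pooled per-base frequencies over the reads. A minimal sketch; the toy reads below are illustrative, not the study's libraries:

```python
from collections import Counter

def base_composition(reads):
    """Pooled nucleotide frequencies across a set of sequencing reads."""
    counts = Counter()
    for read in reads:
        counts.update(read)  # Counter.update accepts an iterable of characters
    total = sum(counts.values())
    return {base: counts[base] / total for base in "ACGT"}

# Toy reads standing in for two initial, unselected libraries
lib_a = ["ACGTACGT", "AAGGTTCC"]
lib_b = ["GGGGCCCC", "GGCCGGCC"]
print(base_composition(lib_a))  # balanced: 0.25 each
print(base_composition(lib_b))  # G/C-biased: no A or T
```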

  13. High-Throughput Single-Cell Derived Sphere Formation for Cancer Stem-Like Cell Identification and Analysis

    Science.gov (United States)

    Chen, Yu-Chih; Ingram, Patrick N.; Fouladdel, Shamileh; McDermott, Sean P.; Azizi, Ebrahim; Wicha, Max S.; Yoon, Euisik

    2016-06-01

    Considerable evidence suggests that many malignancies are driven by a cellular compartment that displays stem cell properties. Cancer stem-like cells (CSCs) can be identified by expression of cell surface markers or enzymatic activity, but these methods are limited by phenotypic heterogeneity and plasticity of CSCs. An alternative phenotypic methodology based on in-vitro sphere formation has been developed, but it is typically labor-intensive and low-throughput. In this work, we present a 1,024-microchamber microfluidic platform for single-cell derived sphere formation. Utilizing a hydrodynamic capturing scheme, more than 70% of the microchambers capture only one cell, allowing for monitoring of sphere formation from heterogeneous cancer cell populations for identification of CSCs. Single-cell derived spheres can be retrieved and dissociated for single-cell analysis using a custom 96-gene panel to probe heterogeneity within the clonal CSC spheres. This microfluidic platform provides reliable and high-throughput sphere formation for CSC identification and downstream clonal analysis.
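The >70% single-cell occupancy is best appreciated against the random (Poisson) loading baseline, where the single-cell fraction λe^(−λ) peaks at only about 36.8%. A quick check of that baseline:

```python
import math

def poisson_single_cell(mean_cells_per_chamber):
    """P(exactly one cell per chamber) under random loading: lambda * exp(-lambda)."""
    lam = mean_cells_per_chamber
    return lam * math.exp(-lam)

# Random loading peaks at lambda = 1 with ~36.8% single-cell chambers,
# well below the >70% reported for the hydrodynamic capturing scheme.
print(round(poisson_single_cell(1.0), 3))  # → 0.368
```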

  14. Bayesian cost-effectiveness analysis with the R package BCEA

    CERN Document Server

    Baio, Gianluca; Heath, Anna

    2017-01-01

    The book provides a description of the process of health economic evaluation and modelling for cost-effectiveness analysis, particularly from the perspective of a Bayesian statistical approach. Some relevant theory and introductory concepts are presented using practical examples and two running case studies. The book also describes in detail how to perform health economic evaluations using the R package BCEA (Bayesian Cost-Effectiveness Analysis). BCEA can be used to post-process the results of a Bayesian cost-effectiveness model and perform advanced analyses producing standardised and highly customisable outputs. It presents all the features of the package, including its many functions and their practical application, as well as its user-friendly web interface. The book is a valuable resource for statisticians and practitioners working in the field of health economics wanting to simplify and standardise their workflow, for example in the preparation of dossiers in support of marketing authorisation, or acade...

  15. High throughput Sr isotope analysis using an automated column chemistry system

    Science.gov (United States)

    Mackey, G. N.; Fernandez, D.

    2011-12-01

    A new method has been developed for rapidly measuring 87Sr/86Sr isotope ratios using an autosampler that automates column chemistry for Sr purification. The autosampler, a SC2 DX with FAST2 valve block, produced by Elemental Scientific, Inc., utilizes a pair of six-way valves, a vacuum, and a peristaltic pump to load a sample from an autosampler tube onto the Eichrom Sr Resin in the separation column. The autosampler then elutes the sample from the column directly into the spray chamber of the mass spectrometer. Measurements are made on a Thermo-Finnigan Neptune ICP-MS. Sample-blank pairs require approximately 30 minutes for analysis. Normal throughput for the system is 24 samples and 11 standards per day. Adjustment of the pump speed allows for rapid loading of the column followed by a 3-minute data acquisition period with no fractionation of the Sr being eluted from the column. All data are blank-, interference-, and normalization-corrected online using 86Sr/88Sr = 0.1194. Analytical precision on a typical 66 ng/g analysis is ±0.00003 (2σ SE). Reproducibility of the SRM987 Sr standard (66 ng/g) over the course of a typical sequence is ±0.00004 (2σ SD, n=11). For comparison, offline column separation of the SRM987 Sr standard (66 ng/g) was conducted and measured using the same instrument method, yielding a reproducibility of ±0.00004 (2σ SD, n=7). The long-term average of the SRM987 standard (10-200 ng/g) utilizing the online column chemistry method is 0.71027 ± 0.00010 (2σ SD, n=239). A small memory effect has been measured by alternating spiked samples (87Sr/86Sr = 0.67465) with the SRM987 standard. The bias measured in this test (87Sr/86Sr +0.00006) slightly exceeds the 2σ standard reproducibility for a typical run with sample and standard concentrations near 66 ng/g, but is within the 2σ long-term reproducibility of the method. The optimal concentration range for the offline column chemistry system is 50-250 ng/g Sr. Sample concentrations above 250
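The online normalization described here (86Sr/88Sr = 0.1194) is conventionally applied with the exponential mass-fractionation law. A sketch of that correction; the isotope masses are approximate values assumed for illustration, and the function is a hypothetical helper, not the lab's code:

```python
import math

# Approximate Sr isotope masses (u); assumed values for illustration only
M86, M87, M88 = 85.9093, 86.9089, 87.9056
TRUE_86_88 = 0.1194  # canonical normalization ratio used in the abstract

def correct_87_86(meas_87_86, meas_86_88):
    """Exponential-law mass-bias correction of a measured 87Sr/86Sr ratio,
    normalized to 86Sr/88Sr = 0.1194."""
    beta = math.log(TRUE_86_88 / meas_86_88) / math.log(M86 / M88)
    return meas_87_86 * (M87 / M86) ** beta

# With no instrumental bias (86/88 measured at exactly 0.1194), beta = 0 and
# the measured ratio passes through unchanged.
print(correct_87_86(0.71027, 0.1194))  # → 0.71027
```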

  16. Live imaging of muscles in Drosophila metamorphosis: Towards high-throughput gene identification and function analysis.

    Science.gov (United States)

    Puah, Wee Choo; Wasser, Martin

    2016-03-01

    Time-lapse microscopy in developmental biology is an emerging tool for functional genomics. Phenotypic effects of gene perturbations can be studied non-invasively at multiple time points in chronological order. During metamorphosis of Drosophila melanogaster, time-lapse microscopy using fluorescent reporters allows visualization of alternative fates of larval muscles, which are a model for the study of genes related to muscle wasting. While doomed muscles enter hormone-induced programmed cell death, a smaller population of persistent muscles survives to adulthood and undergoes morphological remodeling that involves atrophy in early, and hypertrophy in late pupation. We developed a method that combines in vivo imaging, targeted gene perturbation and image analysis to identify and characterize genes involved in muscle development. Macrozoom microscopy helps to screen for interesting muscle phenotypes, while confocal microscopy in multiple locations over 4-5 days produces time-lapse images that are used to quantify changes in cell morphology. Performing a similar investigation using fixed pupal tissues would be too time-consuming and therefore impractical. We describe three applications of our pipeline. First, we show how quantitative microscopy can track and measure morphological changes of muscle throughout metamorphosis and analyze genes involved in atrophy. Second, our assay can help to identify genes that either promote or prevent histolysis of abdominal muscles. Third, we apply our approach to test new fluorescent proteins as live markers for muscle development. We describe mKO2 tagged Cysteine proteinase 1 (Cp1) and Troponin-I (TnI) as examples of proteins showing developmental changes in subcellular localization. Finally, we discuss strategies to improve throughput of our pipeline to permit genome-wide screens in the future.

  17. Region Templates: Data Representation and Management for High-Throughput Image Analysis.

    Science.gov (United States)

    Teodoro, George; Pan, Tony; Kurc, Tahsin; Kong, Jun; Cooper, Lee; Klasky, Scott; Saltz, Joel

    2014-12-01

    We introduce a region template abstraction and framework for the efficient storage, management and processing of common data types in analysis of large datasets of high resolution images on clusters of hybrid computing nodes. The region template abstraction provides a generic container template for common data structures, such as points, arrays, regions, and object sets, within a spatial and temporal bounding box. It allows for different data management strategies and I/O implementations, while providing a homogeneous, unified interface to applications for data storage and retrieval. A region template application is represented as a hierarchical dataflow in which each computing stage may be represented as another dataflow of finer-grain tasks. The execution of the application is coordinated by a runtime system that implements optimizations for hybrid machines, including performance-aware scheduling for maximizing the utilization of computing devices and techniques to reduce the impact of data transfers between CPUs and GPUs. An experimental evaluation on a state-of-the-art hybrid cluster using a microscopy imaging application shows that the abstraction adds negligible overhead (about 3%) and achieves good scalability and high data transfer rates. Optimizations in a high speed disk based storage implementation of the abstraction to support asynchronous data transfers and computation result in an application performance gain of about 1.13×. Finally, a processing rate of 11,730 4K×4K tiles per minute was achieved for the microscopy imaging application on a cluster with 100 nodes (300 GPUs and 1,200 CPU cores). This computation rate enables studies with very large datasets.
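As a rough illustration of the abstraction (not the authors' actual runtime implementation), a region template can be thought of as a container of named data objects held within one spatial and temporal bounding box, behind a uniform get/put interface. A minimal Python sketch, with all names my own:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Tuple

@dataclass
class RegionTemplate:
    """Minimal stand-in for a region template: named data objects (points,
    arrays, masks, object sets) stored under one spatio-temporal bounding
    box, behind a homogeneous storage interface."""
    bbox: Tuple[int, int, int, int]      # (x_min, y_min, x_max, y_max)
    t_range: Tuple[int, int] = (0, 0)    # (t_start, t_end)
    _objects: Dict[str, Any] = field(default_factory=dict)

    def put(self, name: str, obj: Any) -> None:
        # In the real framework the storage backend (memory, disk, remote)
        # is pluggable; here it is just a dict.
        self._objects[name] = obj

    def get(self, name: str) -> Any:
        return self._objects[name]

    def contains(self, x: int, y: int) -> bool:
        x0, y0, x1, y1 = self.bbox
        return x0 <= x <= x1 and y0 <= y <= y1
```

Application stages would exchange such containers rather than raw arrays, which is what lets the runtime swap data-management strategies without touching application code.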

  18. High-Throughput Genetic Analysis and Combinatorial Chiral Separations Based on Capillary Electrophoresis

    Energy Technology Data Exchange (ETDEWEB)

    Wenwan Zhong

    2003-08-05

    Capillary electrophoresis (CE) offers many advantages over conventional analytical methods, such as speed, simplicity, high resolution, low cost, and small sample consumption, especially for the separation of enantiomers. However, chiral method development can still be time consuming and tedious. They designed a comprehensive enantioseparation protocol employing neutral and sulfated cyclodextrins as chiral selectors for common basic, neutral, and acidic compounds with a 96-capillary array system. By using only four judiciously chosen separation buffers, successful enantioseparations were achieved for 49 out of 54 test compounds spanning a large variety of pKa values and structures. Therefore, unknown compounds can be screened in this manner to identify optimal enantioselective conditions in just one run. In addition to superior separation efficiency for small molecules, CE is also the most powerful technique for DNA separations. Using the same multiplexed capillary system with UV absorption detection, the sequence of a short DNA template can be acquired without any dye labels. Two internal standards were utilized to adjust for migration time variations among capillaries, so that the four electropherograms for the A, T, C, G Sanger reactions can be aligned and base calling can be completed with a high level of confidence. The CE separation of DNA can be applied to study differential gene expression as well. Combined with pattern recognition techniques, small variations among electropherograms obtained by the separation of cDNA fragments produced from the total RNA samples of different human tissues can be revealed. These variations reflect the differences in total RNA expression among tissues. Thus, this CE-based approach can serve as an alternative to DNA array techniques in gene expression analysis.
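The two-internal-standard adjustment described above amounts to a linear rescaling of migration times so that the standard peaks in every capillary land on common reference positions. A minimal sketch (the function name and two-point linear form are assumptions, not the thesis's code):

```python
def align_migration_times(times, std_obs, std_ref):
    """Linearly rescale migration times so the two internal-standard peaks
    observed at std_obs = (t1, t2) land on reference positions
    std_ref = (r1, r2): t' = a*t + b."""
    (t1, t2), (r1, r2) = std_obs, std_ref
    a = (r2 - r1) / (t2 - t1)   # stretch factor between capillaries
    b = r1 - a * t1             # offset so the first standard coincides
    return [a * t + b for t in times]
```

After applying this mapping to each of the four Sanger-reaction electropherograms, peaks from the same base position coincide and base calling can proceed by comparing aligned traces.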

  19. Single particle analysis integrated with microscopy: a high-throughput approach for reconstructing icosahedral particles.

    Science.gov (United States)

    Yan, Xiaodong; Cardone, Giovanni; Zhang, Xing; Zhou, Z Hong; Baker, Timothy S

    2014-04-01

    In cryo-electron microscopy and single particle analysis, data acquisition and image processing are generally carried out in sequential steps and computation of a three-dimensional reconstruction only begins once all the micrographs have been acquired. We are developing an integrated system for processing images of icosahedral particles during microscopy to provide reconstructed density maps in real-time at the highest possible resolution. The system is designed as a combination of pipelines to run in parallel on a computer cluster and analyzes micrographs as they are acquired, handling automatically all the processing steps from defocus estimation and particle picking to origin/orientation determination. An ab initio model is determined independently from the first micrographs collected, and new models are generated as more particles become available. As a proof of concept, we simulated data acquisition sessions using three sets of micrographs of good to excellent quality that were previously recorded from different icosahedral viruses. Results show that the processing of single micrographs can keep pace with an acquisition rate of about two images per minute. The reconstructed density map improves steadily during the image acquisition phase and its quality at the end of data collection is only moderately inferior to that obtained by expert users who processed semi-automatically all the micrographs after the acquisition. The current prototype demonstrates the advantages of integrating three-dimensional image processing with microscopy, which include an ability to monitor acquisition in terms of the final structure and to predict how much data and microscope resources are needed to achieve a desired resolution.

  20. High-throughput quantitative analysis with cell growth kinetic curves for low copy number mutant cells.

    Science.gov (United States)

    Xing, James Z; Gabos, Stephan; Huang, Biao; Pan, Tianhong; Huang, Min; Chen, Jie

    2012-10-01

    The mutation rate in cells induced by environmental genotoxic hazards is very low and difficult to detect using traditional cell counting assays. The established genetic toxicity tests currently recognized by regulatory authorities, such as conventional Ames and hypoxanthine guanine phosphoribosyl-transferase (HPRT) assays, are not well suited for higher-throughput screening as they require large amounts of test compounds and are very time consuming. In this study, we developed a novel cell-based assay for quantitative analysis of low copy numbers of cells carrying an HPRT mutation induced by an environmental mutagen. The HPRT gene mutant cells induced by the mutagen were selected by 6-thioguanine (6-TG) and the cells' kinetic growth curves were monitored by a real-time cell electronic sensor (RT-CES) system. When a threshold is set at a certain cell index (CI) level, samples with different initial mutant cell copy numbers take different amounts of time for their growth (or CI accumulation) to cross this threshold. The more cells that are initially seeded in the test well, the faster the CI accumulates and the shorter the time required to cross the threshold. Therefore, the culture time required to cross the threshold corresponds to the original number of cells in the sample. A mutant cell growth time threshold (MT) value can be calculated for each sample to predict the number of original mutant cells. For mutagenesis determination, the RT-CES assay displayed equal sensitivity (p > 0.05) and comparable coefficient-of-variation values, with good correlation to conventional HPRT mutagenic assays. Most importantly, the RT-CES mutation assay has a higher throughput than conventional cellular assays.
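The MT idea can be made concrete in three steps: find when each growth curve crosses the CI threshold, fit MT against log2 of the seeded copy number, and invert the fit for unknown samples. The sketch below is an illustrative reimplementation under an ideal exponential-growth assumption, not the authors' software:

```python
import math

def threshold_time(times, ci_values, threshold):
    """Return the (interpolated) time at which a growth curve first
    crosses the chosen cell-index threshold, or None if it never does."""
    for i in range(1, len(times)):
        c0, c1 = ci_values[i - 1], ci_values[i]
        if c0 < threshold <= c1:
            t0, t1 = times[i - 1], times[i]
            return t0 + (t1 - t0) * (threshold - c0) / (c1 - c0)
    return None

def fit_mt_curve(seeded_copies, mts):
    """Least-squares fit of MT against log2(initial copies); for ideal
    exponential growth, MT = intercept - doubling_time * log2(N0)."""
    xs = [math.log2(n) for n in seeded_copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(mts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, mts))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def predict_copies(mt, slope, intercept):
    """Invert the calibration to estimate the initial mutant cell count."""
    return 2 ** ((mt - intercept) / slope)
```

A standard curve built from wells seeded with known mutant cell numbers calibrates the slope and intercept; unknown samples are then read off from their MT values.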

  1. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles.

    Directory of Open Access Journals (Sweden)

    Hong Zhou

    Full Text Available Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrated the usefulness of a hydrogel based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable to measure fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased

  2. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles.

    Science.gov (United States)

    Zhou, Hong; Malik, Malika Amattullah; Arab, Aarthi; Hill, Matthew Thomas; Shikanov, Ariella

    2015-01-01

    Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrated the usefulness of a hydrogel based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable to measure fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased survival rate in
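The image-analysis step is described as thresholding brightfield micrographs to binary images and measuring the clear degradation area. A hedged NumPy equivalent of that idea (fixed threshold and area reported as a pixel fraction are both simplifications of mine, not the authors' MATLAB® code):

```python
import numpy as np

def degradation_area_fraction(img, threshold):
    """Binarize a 2-D grayscale micrograph and report the fraction of
    pixels brighter than `threshold` -- a stand-in for the clear fibrin
    degradation circle around an encapsulated follicle."""
    binary = img > threshold   # bright pixels = degraded (clear) fibrin
    return binary.mean()       # mean of a boolean array = area fraction
```

Tracking this fraction per follicle over culture days gives the degradation readout that the abstract correlates with follicle survival and growth.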

  3. Plasmidome analysis of ESBL-producing Escherichia coli using conventional typing and high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Alma Brolund

    Full Text Available Infections caused by Extended-spectrum β-lactamase (ESBL)-producing E. coli are an emerging global problem, threatening the effectiveness of the extensively used β-lactam antibiotics. ESBL dissemination is facilitated by plasmids, transposons, and other mobile elements. We have characterized the plasmid content of ESBL-producing E. coli from human urinary tract infections. Ten diverse isolates were selected; they had unrelated pulsed-field gel electrophoresis (PFGE) types (<90% similarity), were from geographically dispersed locations and had diverging antibiotic resistance profiles. Three isolates belonged to the globally disseminated sequence type ST131. ESBL genes of the CTX-M-1 and CTX-M-9 phylogroups were identified in all ten isolates. The plasmid content (plasmidome) of each strain was analyzed using a combination of molecular methods and high-throughput sequencing. Hidden Markov Model-based analysis of unassembled sequencing reads was used to analyze the genetic diversity of the plasmid samples and to detect resistance genes. Each isolate contained between two and eight distinct plasmids, and at least 22 large plasmids were identified overall. The plasmids were variants of pUTI89, pKF3-70, pEK499, pKF3-140, p1ESCUM, pEK204, pHK17a, p083CORR, R64, pLF82, pSFO157, and R721. In addition, small cryptic high-copy-number plasmids were frequent, containing one to seven open reading frames per plasmid. Three clustered groups of such small cryptic plasmids could be distinguished based on sequence similarity. Extrachromosomal prophages were found in three isolates. Two of them resembled the E. coli P1 phage and one was previously unknown. The present study confirms plasmid multiplicity in multi-resistant E. coli. We conclude that high-throughput sequencing successfully provides information on the extrachromosomal gene content and can be used to generate a genetic fingerprint of possible use in epidemiology.
This could be a valuable tool for

  4. MSP-HTPrimer: a high-throughput primer design tool to improve assay design for DNA methylation analysis in epigenetics.

    Science.gov (United States)

    Pandey, Ram Vinay; Pulverer, Walter; Kallmeyer, Rainer; Beikircher, Gabriel; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas

    2016-01-01

    Bisulfite (BS) conversion-based and methylation-sensitive restriction enzyme (MSRE)-based PCR methods have been the most commonly used techniques for locus-specific DNA methylation analysis. However, both methods have advantages and limitations. Thus, an integrated approach would be extremely useful to quantify DNA methylation status with high sensitivity and specificity. Designing specific and optimized primers for target regions is the most critical and challenging step in obtaining adequate DNA methylation results using PCR-based methods. Currently, no integrated, optimized, and high-throughput methylation-specific primer design software is available for both BS- and MSRE-based methods; an integrated, accurate, and easy-to-use methylation-specific primer design pipeline would therefore be very useful. We have developed a new web-based pipeline, called MSP-HTPrimer, to design primer pairs for MSP, BSP, pyrosequencing, COBRA, and MSRE assays on both genomic strands. First, our pipeline converts all target sequences into bisulfite-treated templates for both the forward and reverse strand and designs all possible primer pairs, followed by filtering for single nucleotide polymorphisms (SNPs) and known repeat regions. Next, each primer pair is annotated with the upstream and downstream RefSeq genes, CpG island, and cut sites (for COBRA and MSRE). Finally, MSP-HTPrimer selects specific primers from both strands based on custom and user-defined hierarchical selection criteria. MSP-HTPrimer produces a primer pair summary output table in TXT and HTML format for display and UCSC custom tracks for the resulting primer pairs in GTF format. MSP-HTPrimer is an integrated, web-based, and high-throughput pipeline, has no limitation on the number and size of target sequences, and designs MSP, BSP, pyrosequencing, COBRA, and MSRE assays. It is the only pipeline which automatically designs primers on both genomic
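The first pipeline step, generating bisulfite-converted templates for both strands, can be sketched as follows. This assumes the common convention that CpG cytosines are treated as methylated (and therefore unconverted) while all other cytosines become thymine; the function names are illustrative, not MSP-HTPrimer's API:

```python
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def bisulfite_convert(seq):
    """Forward-strand bisulfite template: unmethylated C -> T, except C in
    a CpG context, which is treated as methylated and left unchanged."""
    out = []
    for i, base in enumerate(seq):
        if base == "C" and not (i + 1 < len(seq) and seq[i + 1] == "G"):
            out.append("T")   # non-CpG cytosine is deaminated to uracil/T
        else:
            out.append(base)  # CpG cytosine (or any other base) is kept
    return "".join(out)

def both_strand_templates(seq):
    """Converted templates for the forward strand and the reverse
    complement, as a design pipeline would generate before enumerating
    candidate primer pairs on both strands."""
    rev = seq.translate(COMPLEMENT)[::-1]
    return bisulfite_convert(seq), bisulfite_convert(rev)
```

Candidate primers would then be enumerated against these converted templates and filtered for SNPs and repeats, as the abstract describes.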

  6. pBaSysBioII: an integrative plasmid generating gfp transcriptional fusions for high-throughput analysis of gene expression in Bacillus subtilis

    NARCIS (Netherlands)

    Botella, Eric; Fogg, Mark; Jules, Matthieu; Piersma, Sjouke; Doherty, Geoff; Hansen, Annette; Denham, Emma. L.; Le Chat, Ludovic; Veiga, Patrick; Bailey, Kirra; Lewis, Peter J.; van Dijl, Jan Maarten; Aymerich, Stephane; Wilkinson, Anthony J.; Devine, Kevin M.

    2010-01-01

    Plasmid pBaSysBioII was constructed for high-throughput analysis of gene expression in Bacillus subtilis. It is an integrative plasmid with a ligation-independent cloning (LIC) site, allowing the generation of transcriptional gfpmut3 fusions with desired promoters. Integration is by a Campbell-type

  7. High-Throughput Analysis of Enzymatic Hydrolysis of Biodegradable Polyesters by Monitoring Cohydrolysis of a Polyester-Embedded Fluorogenic Probe.

    Science.gov (United States)

    Zumstein, Michael Thomas; Kohler, Hans-Peter E; McNeill, Kristopher; Sander, Michael

    2017-02-14

    Biodegradable polyesters have the potential to replace nondegradable, persistent polymers in numerous applications and thereby alleviate plastic accumulation in the environment. Herein, we present an analytical approach to study enzymatic hydrolysis of polyesters, the key step in their overall biodegradation process. The approach is based on embedding fluorescein dilaurate (FDL), a fluorogenic ester substrate, into the polyester matrix and on monitoring the enzymatic cohydrolysis of FDL to fluorescein during enzymatic hydrolysis of the polyester. We validated the approach against established techniques using FDL-containing poly(butylene adipate) films and Fusarium solani cutinase (FsC). Implemented on a microplate reader platform, the FDL-based approach enabled sensitive and high-throughput analysis of the enzymatic hydrolysis of eight aliphatic polyesters by two fungal esterases (FsC and Rhizopus oryzae lipase) at different temperatures. While hydrolysis rates for both enzymes increased with decreasing differences between the polyester melting temperatures and the experimental temperatures, this trend was more pronounced for the lipase than the cutinase. These trends in rates could be ascribed to a combination of temperature-dependent polyester chain flexibility and accessibility of the enzyme active site. The work highlights the capability of the FDL-based approach to be utilized in both screening and mechanistic studies of enzymatic polyester hydrolysis.
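On a microplate reader, the readout reduces to extracting a rate from each fluorescence trace. A minimal initial-slope estimate is sketched below; this fitting scheme is an assumption of mine, as the abstract does not specify how rates were derived from the traces:

```python
def initial_rate(times, fluorescence, n_points=5):
    """Least-squares slope over the first n_points of a fluorescence
    trace: a simple estimate of the FDL cohydrolysis rate."""
    xs, ys = times[:n_points], fluorescence[:n_points]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Ordinary least-squares slope of fluorescence vs. time.
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

Comparing such rates across polyesters, enzymes and temperatures gives the trends the abstract reports against the melting-temperature difference.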

  8. High throughput multiple locus variable number of tandem repeat analysis (MLVA) of Staphylococcus aureus from human, animal and food sources.

    Science.gov (United States)

    Sobral, Daniel; Schwarz, Stefan; Bergonier, Dominique; Brisabois, Anne; Feßler, Andrea T; Gilbert, Florence B; Kadlec, Kristina; Lebeau, Benoit; Loisy-Hamon, Fabienne; Treilles, Michaël; Pourcel, Christine; Vergnaud, Gilles

    2012-01-01

    Staphylococcus aureus is a major human pathogen, a relevant pathogen in veterinary medicine, and a major cause of food poisoning. Epidemiological investigation tools are needed to establish surveillance of S. aureus strains in humans, animals and food. In this study, we investigated 145 S. aureus isolates recovered from various animal species, disease conditions, food products and food poisoning events. Multiple Locus Variable Number of Tandem Repeat (VNTR) analysis (MLVA), known to be highly efficient for the genotyping of human S. aureus isolates, was used and shown to be equally well suited for the typing of animal S. aureus isolates. MLVA was improved by using sixteen VNTR loci amplified in two multiplex PCRs and analyzed by capillary electrophoresis, ensuring a high throughput and high discriminatory power. The isolates were assigned to twelve known clonal complexes (CCs) and a few singletons. Half of the test collection belonged to four CCs (CC9, CC97, CC133, CC398) previously described as mostly associated with animals. The remaining eight CCs (CC1, CC5, CC8, CC15, CC25, CC30, CC45, CC51), representing 46% of the animal isolates, are common in humans. Interestingly, isolates responsible for food poisoning show a CC distribution signature typical of human isolates and strikingly different from animal isolates, suggesting a predominantly human origin.
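Converting a capillary-electrophoresis amplicon size into a VNTR repeat copy number follows the usual MLVA arithmetic: subtract the flanking sequence length and divide by the repeat-unit length. The helper below is illustrative, and the sizes in the check are placeholders, not the study's actual locus parameters:

```python
def repeat_copies(fragment_size, flank_size, repeat_unit):
    """VNTR copy number from an amplicon size:
    copies = (fragment - flanking) / repeat-unit length, rounded to the
    nearest integer to absorb small sizing error."""
    return round((fragment_size - flank_size) / repeat_unit)
```

The vector of copy numbers over all sixteen loci is the MLVA profile used to assign isolates to clonal complexes.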

  9. High throughput multiple locus variable number of tandem repeat analysis (MLVA) of Staphylococcus aureus from human, animal and food sources.

    Directory of Open Access Journals (Sweden)

    Daniel Sobral

    Full Text Available Staphylococcus aureus is a major human pathogen, a relevant pathogen in veterinary medicine, and a major cause of food poisoning. Epidemiological investigation tools are needed to establish surveillance of S. aureus strains in humans, animals and food. In this study, we investigated 145 S. aureus isolates recovered from various animal species, disease conditions, food products and food poisoning events. Multiple Locus Variable Number of Tandem Repeat (VNTR) analysis (MLVA), known to be highly efficient for the genotyping of human S. aureus isolates, was used and shown to be equally well suited for the typing of animal S. aureus isolates. MLVA was improved by using sixteen VNTR loci amplified in two multiplex PCRs and analyzed by capillary electrophoresis, ensuring a high throughput and high discriminatory power. The isolates were assigned to twelve known clonal complexes (CCs) and a few singletons. Half of the test collection belonged to four CCs (CC9, CC97, CC133, CC398) previously described as mostly associated with animals. The remaining eight CCs (CC1, CC5, CC8, CC15, CC25, CC30, CC45, CC51), representing 46% of the animal isolates, are common in humans. Interestingly, isolates responsible for food poisoning show a CC distribution signature typical of human isolates and strikingly different from animal isolates, suggesting a predominantly human origin.

  10. High-throughput analysis of 19 endogenous androgenic steroids by ultra-performance convergence chromatography tandem mass spectrometry.

    Science.gov (United States)

    Quanson, Jonathan L; Stander, Marietjie A; Pretorius, Elzette; Jenkinson, Carl; Taylor, Angela E; Storbeck, Karl-Heinz

    2016-09-15

    11-Oxygenated steroids such as 11-ketotestosterone and 11-ketodihydrotestosterone have recently been shown to play a putative role in the development and progression of castration resistant prostate cancer. In this study we report on the development of a high-throughput ultra-performance convergence chromatography tandem mass spectrometry (UPC(2)-MS/MS) method for the analysis of thirteen 11-oxygenated and six canonical C19 steroids isolated from a cell culture matrix. Using an Acquity UPC(2) BEH 2-EP column we found that UPC(2) resulted in superior selectivity, increased chromatographic efficiency and a scattered elution order when compared to conventional reverse phase ultra-performance liquid chromatography (UPLC). Furthermore, there was a significant improvement in sensitivity (5-50 times). The lower limits of quantification ranged between 0.01 and 10 ng mL(-1), while the upper limit of quantification was 100 ng mL(-1) for all steroids. Accuracy, precision, intra-day variation, recovery, matrix effects and process efficiency were all evaluated and found to be within acceptable limits. Taken together, we show that the increased power of UPC(2)-MS/MS allows the analyst to complete in vitro assays at biologically relevant concentrations for the first time and, in so doing, to determine the routes of steroid metabolism, which is vital for studies of androgen-responsive cancers such as prostate cancer and could highlight new mechanisms of disease progression and new targets for cancer therapy. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. In situ analysis and structural elucidation of sainfoin (Onobrychis viciifolia) tannins for high-throughput germplasm screening.

    Science.gov (United States)

    Gea, An; Stringano, Elisabetta; Brown, Ron H; Mueller-Harvey, Irene

    2011-01-26

    A rapid thiolytic degradation and cleanup procedure was developed for analyzing tannins directly in chlorophyll-containing sainfoin (Onobrychis viciifolia) plants. The technique proved suitable for complex tannin mixtures containing catechin, epicatechin, gallocatechin, and epigallocatechin flavan-3-ol units. The reaction time was standardized at 60 min to minimize the loss of structural information as a result of epimerization and degradation of terminal flavan-3-ol units. The results were evaluated by separate analysis of extractable and unextractable tannins, which accounted for 63.6-113.7% of the in situ plant tannins. It is of note that 70% aqueous acetone extracted tannins with a lower mean degree of polymerization (mDP) than was found for tannins analyzed in situ; extractable tannins had mDP values 4 to 29 units lower. The method was validated by comparing results from individual and mixed sample sets. The tannin composition of different sainfoin accessions covered a range of mDP values from 16 to 83, procyanidin/prodelphinidin (PC/PD) ratios from 19.2/80.8 to 45.6/54.4, and cis/trans ratios from 74.1/25.9 to 88.0/12.0. This is the first high-throughput screening method that is suitable for analyzing condensed tannin contents and structural composition directly in green plant tissue.
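The reported summary statistics follow directly from the quantified thiolysis products: mDP is total flavan-3-ol units over terminal units, while PC/PD and cis/trans are percentage pairs. A small sketch (input names are mine; all quantities assumed to be molar amounts):

```python
def tannin_summary(terminal, extension, pc_units, pd_units, cis_units, trans_units):
    """Condensed-tannin summary from thiolysis data:
    mDP = (terminal + extension) / terminal;
    PC/PD and cis/trans returned as percentage pairs summing to 100."""
    mdp = (terminal + extension) / terminal
    pc = 100 * pc_units / (pc_units + pd_units)
    cis = 100 * cis_units / (cis_units + trans_units)
    return mdp, (pc, 100 - pc), (cis, 100 - cis)
```

For example, one terminal unit per nineteen extension units corresponds to an mDP of 20, within the 16-83 range reported for the sainfoin accessions.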

  12. Very fast capillary electrophoresis with electrochemical detection for high-throughput analysis using short, vertically aligned capillaries.

    Science.gov (United States)

    Mark, Jonas Josef Peter; Piccinelli, Paolo; Matysik, Frank-Michael

    2014-09-01

    A method for conducting fast and efficient capillary electrophoresis (CE) based on short separation capillaries in vertical alignment was developed. The strategy enables high-throughput analysis from small sample vials (low-microliter to nanoliter range). The system consists of a lab-made miniaturized autosampling unit and an amperometric end-column detection (AD) cell. The device enables a throughput of up to 200 separations per hour. CE-AD separations of a dye model system in capillaries of only 4 to 7.5 cm length with inner diameters (ID) of 10 or 15 μm were carried out under conditions of very high electric field strengths (up to 3.0 kV/cm) with high separation efficiency (half peak widths below 0.2 s) in less than 3.5 s migration time. A non-aqueous background electrolyte, consisting of 10 mM ammonium acetate and 1 M acetic acid in acetonitrile, was used. The practical suitability of the system was evaluated by applying it to the determination of dyes in overhead projector pens.

  13. [High-throughput analysis of bacterial community of transition zone in littoral wetland of Wuliangsuhai eutrophic lake].

    Science.gov (United States)

    Li, Jingyu; Du, Ruifang; Zhao, Ji

    2015-05-04

    We studied the composition, abundance and diversity of soil bacterial communities across the transition zone of a eutrophic lakeside wetland, from sediments to soils. Total DNA was extracted following a sediment DNA extraction protocol, and high-throughput pyrosequencing of the 16S rRNA gene was used to characterize bacterial community composition, abundance and diversity. Soil physicochemical properties were measured by standard methods to analyze their effects on the bacterial community. Bacterial community composition and relative abundance differed markedly across the transition zone of the littoral wetland. The dominant groups at the phylum level included Proteobacteria, Bacteroidetes, Chloroflexi, Actinobacteria, Planctomycetes and Gemmatimonadetes. Bacterial diversity indices increased gradually toward the land side, particularly within the phylum Proteobacteria and the genus Sulfurimonas. Correlation analysis indicated that the combination of total phosphorus, total water-soluble salt and ammonium had the most significant effect on overall bacterial community structure, and a Mantel test showed this correlation to be statistically significant (R = 0.8857, P = 0.037). The bacterial community structure of the transition zone thus varies considerably in the littoral wetland of the eutrophic Wuliangsuhai Lake, where Sulfurimonas may play important roles in the biogeochemical cycles of the lake sediments.
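Diversity indices of the kind compared across the transition zone are typically computed from per-sample taxon counts. A minimal Shannon-index sketch (the abstract does not state which index formula was used, so this choice is an assumption):

```python
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxa with nonzero counts,
    where p_i is the relative abundance of taxon i in one sample."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)
```

For four equally abundant taxa the index equals ln(4), the maximum for four taxa; uneven communities score lower, which is how a landward increase in diversity would show up numerically.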

  14. A high-throughput, simultaneous analysis of carotenoids, chlorophylls and tocopherol using sub two micron core shell technology columns.

    Science.gov (United States)

    Chebrolu, Kranthi K; Yousef, Gad G; Park, Ryan; Tanimura, Yoshinori; Brown, Allan F

    2015-09-15

    A high-throughput, robust and reliable method for the simultaneous analysis of five carotenoids, four chlorophylls and one tocopherol was developed for rapid screening of large sample populations to facilitate molecular biology and plant breeding. Separation was achieved for 10 known analytes and four unknown carotenoids in a significantly reduced run time of 10 min. The identity of the 10 analytes was confirmed by their UV-Vis absorption spectra. Quantification of tocopherol, carotenoids and chlorophylls was performed at 290 nm, 460 nm and 650 nm, respectively. In this report, two sub-two-micron-particle core-shell columns, Kinetex from Phenomenex (1.7 μm particle size, 12% carbon load) and Cortecs from Waters (1.6 μm particle size, 6.6% carbon load), were investigated and their separation efficiencies were evaluated. Peak resolutions were >1.5 for all analytes except chlorophyll-a' with the Cortecs column. The ruggedness of the method was evaluated on two identical but separate instruments, which produced CVs < 2% in peak retention times for nine out of 10 analytes.
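
    The resolution criterion used above (Rs > 1.5 for baseline separation) is computed from the retention times and peak widths of adjacent peaks. A small sketch using the standard baseline-width formula (the retention data below are invented for illustration, not values from this method):

```python
def resolution(t1, t2, w1, w2):
    """Chromatographic resolution from retention times and baseline
    peak widths of two adjacent peaks: Rs = 2 * (t2 - t1) / (w1 + w2)."""
    return 2.0 * (t2 - t1) / (w1 + w2)

# Hypothetical adjacent peaks at 4.20 and 4.50 min, each with a
# 0.15 min baseline width: Rs = 2 * 0.30 / 0.30 = 2.0 (baseline resolved).
print(round(resolution(4.20, 4.50, 0.15, 0.15), 2))  # 2.0
```

    With half-height widths instead of baseline widths, the equivalent expression is Rs = 1.18 * (t2 - t1) / (w_half1 + w_half2).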

  15. High-throughput metagenomic analysis of petroleum-contaminated soil microbiome reveals the versatility in xenobiotic aromatics metabolism.

    Science.gov (United States)

    Bao, Yun-Juan; Xu, Zixiang; Li, Yang; Yao, Zhi; Sun, Jibin; Song, Hui

    2017-06-01

    Petroleum-contaminated soil is one of the most studied soil ecosystems because of its rich microbial resources for hydrocarbon degradation and its broad applications in bioremediation. However, our understanding of the genomic properties and functional traits of the soil microbiome is limited. In this study, we used high-throughput metagenomic sequencing to comprehensively study the microbial community from petroleum-contaminated soils near the Tianjin Dagang oilfield in eastern China. The analysis reveals that the soil metagenome is characterized by a high level of community diversity and metabolic versatility. The metagenome community is dominated by γ-Proteobacteria and α-Proteobacteria, which are key players in petroleum hydrocarbon degradation. The functional study demonstrates over-represented enzyme groups and pathways involved in the degradation of a broad set of xenobiotic aromatic compounds, including toluene, xylene, chlorobenzoate, aminobenzoate, DDT, methylnaphthalene, and bisphenol. A composite metabolic network is proposed for the identified pathways, thus consolidating our identification of the pathways. Overall, the data demonstrate the great potential of the studied soil microbiome for xenobiotic aromatics degradation. The results not only establish a rich reservoir for novel enzyme discovery but also suggest putative applications in bioremediation. Copyright © 2016. Published by Elsevier B.V.

  16. Systematic analysis of protein subcellular localization and interaction using high-throughput transient transformation of Arabidopsis seedlings.

    Science.gov (United States)

    Marion, Jessica; Bach, Lien; Bellec, Yannick; Meyer, Christian; Gissot, Lionel; Faure, Jean-Denis

    2008-10-01

    The functional genomics approach requires systematic analysis of protein subcellular distribution and interaction networks, preferably by optimizing experimental simplicity and physiological significance. Here, we present an efficient in planta transient transformation system that allows single or multiple expression of constructs containing various fluorescent protein tags in Arabidopsis cotyledons. The optimized protocol is based on vacuum infiltration of agrobacteria directly into young Arabidopsis seedlings. We demonstrate that Arabidopsis epidermal cells show a subcellular distribution of reference markers similar to that in tobacco epidermal cells, and can be used for co-localization or bi-molecular fluorescent complementation studies. We then used this new system to investigate the subcellular distribution of enzymes involved in sphingolipid metabolism. In contrast to transformation systems using tobacco epidermal cells or cultured Arabidopsis cells, our system provides the opportunity to take advantage of the extensive collections of mutant and transgenic lines available in Arabidopsis. The fact that this assay uses conventional binary vectors and a conventional Agrobacterium strain, and is compatible with a large variety of fluorescent tags, makes it a versatile tool for construct screening and characterization before stable transformation. Transient expression in Arabidopsis seedlings is thus a fast and simple method that requires minimum handling and potentially allows medium- to high-throughput analyses of fusion proteins harboring fluorescent tags in a whole-plant cellular context.

  17. Cost Effectiveness Analysis of Optimal Malaria Control Strategies in Kenya

    Directory of Open Access Journals (Sweden)

    Gabriel Otieno

    2016-03-01

    Full Text Available Malaria remains a leading cause of mortality and morbidity among children under five and pregnant women in sub-Saharan Africa, but it is preventable and controllable provided current recommended interventions are properly implemented. Better utilization of malaria intervention strategies will ensure value for money and produce health improvements in the most cost-effective way. The purpose of the value-for-money drive is to develop a better understanding (and better articulation) of costs and results so that more informed, evidence-based choices can be made. Cost-effectiveness analysis is carried out to inform decision makers on where to allocate resources for malaria interventions. This study carries out a cost-effectiveness analysis of one or all possible combinations of the optimal malaria control strategies (Insecticide Treated Bednets—ITNs, Treatment, Indoor Residual Spray—IRS and Intermittent Preventive Treatment for Pregnant Women—IPTp) for four different transmission settings in order to assess the extent to which the intervention strategies are beneficial and cost effective. For the four different transmission settings in Kenya, the optimal solutions for the 15 strategies and their associated effectiveness were computed. Cost-effectiveness analysis using the Incremental Cost Effectiveness Ratio (ICER) was done after ranking the strategies in order of increasing effectiveness (total infections averted). The findings show that for endemic regions the combination of ITNs, IRS, and IPTp was the most cost-effective of all the combined strategies developed in this study for malaria disease control and prevention; for epidemic-prone areas, it is the combination of treatment and IRS; for seasonal areas, the use of ITNs plus treatment; and for low-risk areas, the use of treatment only. Malaria transmission in Kenya can be minimized through tailor-made intervention strategies for malaria control
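
    The ICER ranking described above divides the incremental cost of each strategy by its incremental effectiveness after ordering the strategies by total infections averted and discarding dominated options. A simplified sketch with invented cost and effectiveness numbers (the real values come from the study's optimal-control model; extended dominance is not handled here):

```python
def icer_ranking(strategies):
    """strategies: (name, total_cost, infections_averted) tuples with
    distinct effectiveness values. Orders by effectiveness, drops
    strongly dominated strategies (costlier yet no more effective),
    then computes each ICER relative to the next-best remaining one."""
    ranked = sorted(strategies, key=lambda s: s[2])
    frontier = []
    for name, cost, eff in ranked:
        # a predecessor that costs as much or more is dominated,
        # since the current strategy is at least as effective
        while frontier and frontier[-1][1] >= cost:
            frontier.pop()
        frontier.append((name, cost, eff))
    table, prev_cost, prev_eff = [], 0.0, 0.0
    for name, cost, eff in frontier:
        icer = (cost - prev_cost) / (eff - prev_eff)
        table.append((name, round(icer, 4)))
        prev_cost, prev_eff = cost, eff
    return table

# Illustrative numbers only (cost in arbitrary units, effectiveness
# as total infections averted):
demo = [("Treatment only", 10.0, 100.0),
        ("ITNs + Treatment", 25.0, 180.0),
        ("IRS + Treatment", 30.0, 150.0),   # dominated in this toy data
        ("ITNs + IRS + IPTp", 60.0, 260.0)]
for name, icer in icer_ranking(demo):
    print(name, icer)
```

    In a full analysis one would also remove strategies subject to extended dominance (those whose ICER exceeds that of a more effective option) before reading decisions off the table.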

  18. Live Cell Bioluminescence Imaging in Temporal Reaction of G Protein-Coupled Receptor for High-Throughput Screening and Analysis.

    Science.gov (United States)

    Hattori, Mitsuru; Ozawa, Takeaki

    2016-01-01

    G protein-coupled receptors (GPCRs) are important therapeutic targets. Many screening methods have been established to identify novel agents acting on GPCR signaling in a high-throughput manner. However, information on the temporal reaction of a GPCR with specific ligands remains scarce. We recently developed a bioluminescence method for quantitative detection of the interaction between a GPCR and β-arrestin using split luciferase complementation. By monitoring the time-course variation of these interactions, this new imaging system contributes to the accurate evaluation of drugs for GPCRs in a high-throughput manner.

  19. Identification and characterization of microRNAs in Eucheuma denticulatum by high-throughput sequencing and bioinformatics analysis.

    Science.gov (United States)

    Gao, Fan; Nan, Fangru; Feng, Jia; Lv, Junping; Liu, Qi; Xie, Shulian

    2016-01-01

    Eucheuma denticulatum, an economically and industrially important red alga, is a valuable marine resource. Although microRNAs (miRNAs) play an essential role in post-transcriptional gene regulation, no research has been conducted to identify and characterize miRNAs in E. denticulatum. In this study, we identified 134 miRNAs (133 conserved miRNAs and one novel miRNA) from 2,997,135 small-RNA reads by high-throughput sequencing combined with bioinformatics analysis. BLAST searching against miRBase uncovered 126 potential miRNA families. The conservation and diversity of the predicted miRNA families across different plant species were analyzed by comparative alignment and homology searching. Totals of 4 and 13 randomly selected miRNAs were validated by northern blotting and stem-loop reverse transcription PCR, respectively, demonstrating the reliability of the miRNA sequencing data. Altogether, 871 potential target genes were predicted using psRobot and TargetFinder. Target gene classification and enrichment were conducted based on Gene Ontology analysis. The functions of target gene products and associated metabolic pathways were predicted by Kyoto Encyclopedia of Genes and Genomes pathway analysis. A Cytoscape network was constructed to explore the interrelationships among miRNAs, miRNA target genes and target genes. The large number of miRNAs with diverse target genes will be important for further understanding some essential biological processes in E. denticulatum. The uncovered information can serve as an important reference for the protection and utilization of this unique red alga in the future.

  20. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yonghua [Iowa State Univ., Ames, IA (United States)

    2000-01-01

    Sample preparation has been one of the major bottlenecks for many high-throughput analyses. The purpose of this research was to develop new sample preparation and integration approaches for DNA sequencing, PCR-based DNA analysis and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser-induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks into DNA capillary-array sequencers. Protocols for directly sequencing plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or a heating block. Upon heating, the plasmids were released while chromosomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR-based DNA analysis, the author demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical samples without purification. After PCR using cheek cell, blood or HIV-1 gag DNA, the reaction mixtures were injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection, and the elimination of purification of the DNA sample before or after the PCR reaction, will make this approach an

  1. Metabolic profiling of recombinant Escherichia coli cultivations based on high-throughput FT-MIR spectroscopic analysis.

    Science.gov (United States)

    Sales, Kevin C; Rosa, Filipa; Cunha, Bernardo R; Sampaio, Pedro N; Lopes, Marta B; Calado, Cecília R C

    2016-10-03

    Escherichia coli is one of the most used host microorganisms for the production of recombinant products, such as heterologous proteins and plasmids. However, genetic, physiological and environmental factors influence plasmid replication and cloned gene expression in a highly complex way. To control and optimize the performance of the recombinant expression system, it is very important to understand this complexity. Therefore, the development of rapid, highly sensitive and economic analytical methodologies, which enable the simultaneous characterization of heterologous product synthesis and physiological cell behavior under a variety of culture conditions, is highly desirable. To that end, the metabolic profile of recombinant E. coli cultures producing the pVAX-lacZ model plasmid was analyzed by rapid, economic and high-throughput Fourier Transform Mid-Infrared (FT-MIR) spectroscopy. The main goal of the present work is to show how simultaneous multivariate data analysis by principal component analysis (PCA) and direct spectral analysis can represent a very interesting tool to monitor E. coli culture processes and acquire relevant information according to current quality regulatory guidelines. While PCA allowed capturing the energetic metabolic state of the cells, e.g. by identifying different C-source consumption phases, direct FT-MIR spectral analysis yielded valuable biochemical and metabolic information along the cell culture, e.g. on lipids, RNA, protein synthesis and turnover metabolism. The information obtained by multivariate data analysis and direct spectral analysis complements each other and may contribute to understanding the complex interrelationships between recombinant cell metabolism and the bioprocess environment, towards the design of more economic and robust processes according to the Quality by Design framework. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 2016.
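
    As an illustration of the PCA step, the first principal component of a mean-centered spectral matrix can be found by power iteration on the covariance structure; projecting each spectrum onto it gives a score that can track, for example, C-source consumption phases. A self-contained sketch on synthetic "spectra" (real FT-MIR data would be absorbance vectors over many wavenumbers, and a library PCA routine would normally be used):

```python
def first_pc(X, iters=300):
    """Return (loading vector, scores) for the first principal
    component of X (rows = samples, cols = spectral variables),
    via power iteration on C = Xc^T Xc without forming C."""
    n, p = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(p)]
    Xc = [[row[j] - means[j] for j in range(p)] for row in X]
    v = [1.0] * p
    for _ in range(iters):
        s = [sum(Xc[i][j] * v[j] for j in range(p)) for i in range(n)]  # Xc v
        w = [sum(Xc[i][j] * s[i] for i in range(n)) for j in range(p)]  # Xc^T (Xc v)
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    scores = [sum(Xc[i][j] * v[j] for j in range(p)) for i in range(n)]
    return v, scores

# Synthetic spectra: one band (variable 0) grows over the culture while
# the others stay flat, so PC1 loads mainly on variable 0 and the
# scores increase monotonically with culture time.
spectra = [[0.10, 0.50, 0.30],
           [0.40, 0.52, 0.29],
           [0.70, 0.49, 0.31],
           [1.00, 0.51, 0.30]]
loadings, scores = first_pc(spectra)
print(abs(loadings[0]) > abs(loadings[1]))  # True
```

    Plotting such PC1 scores against culture time is one simple way to visualize the phase transitions that the abstract describes.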

  2. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    Energy Technology Data Exchange (ETDEWEB)

    Yonghua Zhang

    2002-05-27

    Sample preparation has been one of the major bottlenecks for many high throughput analyses. The purpose of this research was to develop new sample preparation and integration approach for DNA sequencing, PCR based DNA analysis and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks to DNA capillary-array sequencers. protocols for directly sequencing the plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or heating block. Upon heating, the plasmids were released while chromsomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR based DNA analysis, they demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical sample without purification. After PCR reaction using cheek cell, blood or HIV-1 gag DNA, the reaction mixtures was injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection, and the elimination of purification of DNA sample before or after PCR reaction, will make this approach an

  3. High-throughput analysis for preparation, processing and analysis of TiO₂ coatings on steel by chemical solution deposition

    Energy Technology Data Exchange (ETDEWEB)

    Cuadrado Gil, Marcos, E-mail: Marcos.cuadradogil@ugent.be [SCRIPTS - Department of Inorganic and Physical Chemistry, Ghent University, Krijgslaan 281 (S3) (Belgium); Van Driessche, Isabel, E-mail: Isabel.VanDriessche@ugent.be [SCRIPTS - Department of Inorganic and Physical Chemistry, Ghent University, Krijgslaan 281 (S3) (Belgium); Van Gils, Sake, E-mail: Sake.Vangils@arcelormittal.com [OCAS - ArcelorMittal Gent R and D Centre, Pres. J.F. Kennedylaan 3, Zelzate B-9060 (Belgium); Lommens, Petra, E-mail: Petra.Lommens@ugent.be [SCRIPTS - Department of Inorganic and Physical Chemistry, Ghent University, Krijgslaan 281 (S3) (Belgium); Castelein, Pieter, E-mail: Pieter.Castelein@flamac.be [Flamac - A Division of SIM, Technologiepark 903, Zwijnaarde 9052 (Belgium); De Buysser, Klaartje, E-mail: Klaartje.DeBuysser@ugent.be [SCRIPTS - Department of Inorganic and Physical Chemistry, Ghent University, Krijgslaan 281 (S3) (Belgium)

    2012-11-05

    Highlights: • High-throughput preparation of TiO₂ aqueous precursors. • Analysis of stability and surface tension. • Deposition of TiO₂ coatings. - Abstract: A high-throughput preparation, processing and analysis of titania coatings prepared by chemical solution deposition from water-based precursors at low temperature (≈250 °C) on two different types of steel substrates (Aluzinc® and bright annealed) is presented. The use of high-throughput equipment allows fast preparation of multiple samples, saving time, energy and material, and helps to test the scalability of the process. The process itself includes the use of IR curing for aqueous ceramic precursors and the possibility of using UV irradiation before the final sintering step. The IR curing method permits a much faster curing step compared to normal high-temperature treatments in traditional convection devices (i.e., tube furnaces). The formulations, also prepared with high-throughput equipment, are found to be stable in the operational pH range of the substrates (6.5-8.5). Titanium alkoxides themselves lack stability in purely water-based environments, but the presence of the different organic complexing agents protects them from hydrolysis and precipitation reactions. The wetting interaction between the substrates and the various formulations is studied by determination of the surface free energy of the substrates and the polar and dispersive components of the surface tension of the solutions. The mild temperature program used for preparation of the coatings, however, does not lead to the formation of pure crystalline material, necessary for the desired photocatalytic and super-hydrophilic behavior of these coatings. Nevertheless, some activity can be reported for these amorphous coatings by monitoring the discoloration of methylene blue in water under UV irradiation.
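
    Surface free energy determinations of the kind mentioned above commonly follow the Owens-Wendt approach: for each probe liquid, γ_l(1 + cos θ)/2 = √(γ_s^d γ_l^d) + √(γ_s^p γ_l^p), which is linear in √(γ_s^d) and √(γ_s^p), so contact angles of two liquids with known components suffice. A sketch using commonly tabulated values for water and diiodomethane (the contact angles below are synthetic, generated from an assumed substrate rather than measured; it is not stated that this specific method was used in the study):

```python
from math import cos, acos, degrees, radians, sqrt

# Commonly tabulated probe-liquid components in mN/m: (total, dispersive, polar)
WATER = (72.8, 21.8, 51.0)
DIIODOMETHANE = (50.8, 50.8, 0.0)

def owens_wendt(theta1_deg, theta2_deg, liq1=WATER, liq2=DIIODOMETHANE):
    """Solve the 2x2 linear system for sqrt(gamma_s^d) and sqrt(gamma_s^p)
    from two contact angles; returns (dispersive, polar) in mN/m."""
    rows = []
    for theta, (g, gd, gp) in ((theta1_deg, liq1), (theta2_deg, liq2)):
        rhs = g * (1.0 + cos(radians(theta))) / 2.0
        rows.append((sqrt(gd), sqrt(gp), rhs))
    (a1, b1, c1), (a2, b2, c2) = rows
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det   # sqrt(gamma_s^d)
    y = (a1 * c2 - a2 * c1) / det   # sqrt(gamma_s^p)
    return x * x, y * y

# Round-trip check: forward-compute angles from an assumed substrate
# (30 mN/m dispersive, 10 mN/m polar), then recover the components.
t_w = degrees(acos(-1.0 + 2 * (sqrt(30 * 21.8) + sqrt(10 * 51.0)) / 72.8))
t_d = degrees(acos(-1.0 + 2 * sqrt(30 * 50.8) / 50.8))
print(tuple(round(v, 1) for v in owens_wendt(t_w, t_d)))  # (30.0, 10.0)
```

    The dispersive-only diiodomethane angle pins down γ_s^d directly, after which the water angle fixes the polar component.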

  4. Plant seed species identification from chemical fingerprints: a high-throughput application of direct analysis in real time mass spectrometry.

    Science.gov (United States)

    Lesiak, Ashton D; Cody, Robert B; Dane, A John; Musah, Rabi A

    2015-09-01

    Plant species identification based on the morphological features of plant parts is a well-established science in botany. However, species identification from seeds has largely been unexplored, despite the fact that the seeds contain all of the genetic information that distinguishes one plant from another. Using seeds of genus Datura plants, we show here that the mass spectrum-derived chemical fingerprints for seeds of the same species are similar. On the other hand, seeds from different species within the same genus display distinct chemical signatures, even though they may contain similar characteristic biomarkers. The intraspecies chemical signature similarities on the one hand, and interspecies fingerprint differences on the other, can be processed by multivariate statistical analysis methods to enable rapid species-level identification and differentiation. The chemical fingerprints can be acquired rapidly and in a high-throughput manner by direct analysis in real time mass spectrometry (DART-MS) analysis of the seeds in their native form, without use of a solvent extract. Importantly, knowledge of the identity of the detected molecules is not required for species level identification. However, confirmation of the presence within the seeds of various characteristic tropane and other alkaloids, including atropine, scopolamine, scopoline, tropine, tropinone, and tyramine, was accomplished by comparison of the in-source collision-induced dissociation (CID) fragmentation patterns of authentic standards, to the fragmentation patterns observed in the seeds when analyzed under similar in-source CID conditions. The advantages, applications, and implications of the chemometric processing of DART-MS derived seed chemical signatures for species level identification and differentiation are discussed.

  5. An economical and effective high-throughput DNA extraction protocol for molecular marker analysis in honey bees

    Science.gov (United States)

    Extraction of DNA from tissue samples can be expensive both in time and monetary resources and can often require handling and disposal of hazardous chemicals. We have developed a high throughput protocol for extracting DNA from honey bees that is of a high enough quality and quantity to enable hundr...

  6. Intestinal microbiota in healthy U.S. young children and adults--a high throughput microarray analysis.

    Directory of Open Access Journals (Sweden)

    Tamar Ringel-Kulka

    Full Text Available It is generally believed that the infant's microbiota is established during the first 1-2 years of life. However, there is scarce data on its characterization and its comparison to the adult-like microbiota in consecutive years.To characterize and compare the intestinal microbiota in healthy young children (1-4 years and healthy adults from the North Carolina region in the U.S. using high-throughput bacterial phylogenetic microarray analysis.Detailed characterization and comparison of the intestinal microbiota of healthy children aged 1-4 years old (n = 28 and healthy adults of 21-60 years (n = 23 was carried out using the Human Intestinal Tract Chip (HITChip phylogenetic microarray targeting the V1 and V6 regions of 16S rRNA and quantitative PCR.The HITChip microarray data indicate that Actinobacteria, Bacilli, Clostridium cluster IV and Bacteroidetes are the predominant phylum-like groups that exhibit differences between young children and adults. The phylum-like group Clostridium cluster XIVa was equally predominant in young children and adults and is thus considered to be established at an early age. The genus-like level show significant 3.6 fold (higher or lower differences in the abundance of 26 genera between young children and adults. Young U.S. children have a significantly 3.5-fold higher abundance of Bifidobacterium species than the adults from the same location. However, the microbiota of young children is less diverse than that of adults.We show that the establishment of an adult-like intestinal microbiota occurs at a later age than previously reported. Characterizing the microbiota and its development in the early years of life may help identify 'windows of opportunity' for interventional strategies that may promote health and prevent or mitigate disease processes.

  7. Metabolomic and high-throughput sequencing analysis – modern approach for the assessment of biodeterioration of materials from historic buildings

    Directory of Open Access Journals (Sweden)

    Beata Gutarowska

    2015-09-01

    Full Text Available Preservation of cultural heritage is of paramount importance worldwide. Microbial colonization of construction materials, such as wood, brick, mortar and stone in historic buildings, can lead to severe deterioration. The aim of the present study was to give modern insight into the phylogenetic diversity and activated metabolic pathways of microbial communities colonizing historic objects located in the former Auschwitz II-Birkenau concentration and extermination camp in Oświęcim, Poland. For this purpose we combined molecular, microscopic and chemical methods. Selected specimens were examined using Field Emission Scanning Electron Microscopy (FESEM), metabolomic analysis and high-throughput Illumina sequencing. FESEM imaging revealed the presence of complex microbial communities comprising diatoms, fungi and bacteria, mainly cyanobacteria and actinobacteria, on sample surfaces. Microbial diversity of the brick specimens appeared higher than that of the wood and was dominated by algae and cyanobacteria, while wood was mainly colonized by fungi. DNA sequences documented the presence of 15 bacterial phyla representing 99 genera, including Halomonas, Halorhodospira, Salinisphaera, Salinibacterium, Rubrobacter, Streptomyces and Arthrobacter, and 9 fungal classes represented by 113 genera, including Cladosporium, Acremonium, Alternaria, Engyodontium, Penicillium, Rhizopus and Aureobasidium. Most of the identified sequences were characteristic of organisms implicated in the deterioration of wood and brick. Metabolomic data indicated the activation of numerous metabolic pathways, including those regulating the production of primary and secondary metabolites, for example, metabolites associated with the production of antibiotics, organic acids and deterioration of organic compounds. The study demonstrated that a combination of electron microscopy imaging with metabolomic and genomic techniques allows linking the phylogenetic information and metabolic profiles of

  8. Methodological considerations in the analysis of cost effectiveness in dentistry.

    Science.gov (United States)

    Antczak-Bouckoms, A A; Tulloch, J F; White, B A; Capilouto, E I

    1989-01-01

    Cost-effectiveness analysis is a technique applied with increasing frequency to help make rational decisions in health care resource allocation. This article reviews the ten general principles of cost-effectiveness analysis outlined by the Office of Technology Assessment of the US Congress and describes a model for such analyses used widely in medicine, but only recently applied in dentistry. The imperative for the formulation of the best current information on both the effectiveness of dental practices and their costs is made more urgent because of the now universally recognized belief that resources available to meet the demands for health care are limited. Today's environment requires critical allocation decisions within categorical health problems, across diseases, or relative to other health problems. If important health benefits or cost savings are to be realized, then these analytic approaches must become widely understood, accepted, and appropriately applied by key decision makers in the dental health sector.

  9. Metagenomic analysis and functional characterization of the biogas microbiome using high throughput shotgun sequencing and a novel binning strategy

    DEFF Research Database (Denmark)

    Campanaro, Stefano; Treu, Laura; Kougias, Panagiotis

    2016-01-01

    Biogas production is an economically attractive technology that has gained momentum worldwide over the past years. Biogas is produced by a biologically mediated process, widely known as "anaerobic digestion." This process is performed by a specialized and complex microbial community, in which...... dissect the bioma involved in anaerobic digestion by means of high throughput Illumina sequencing (~51 gigabases of sequence data), disclosing nearly one million genes and extracting 106 microbial genomes by a novel strategy combining two binning processes. Microbial phylogeny and putative taxonomy...... on the phylogenetic and functional characterization of the microbial community populating biogas reactors. By applying for the first time high-throughput sequencing and a novel binning strategy, the identified genes were anchored to single genomes providing a clear understanding of their metabolic pathways...

  10. High-throughput DNA sequence analysis reveals stable engraftment of gut microbiota following transplantation of previously frozen fecal bacteria

    OpenAIRE

    Hamilton, Matthew J.; Weingarden, Alexa R.; Unno, Tatsuya; Khoruts, Alexander; Michael J Sadowsky

    2013-01-01

    Fecal microbiota transplantation (FMT) is becoming a more widely used technology for treatment of recurrent Clostridum difficile infection (CDI). While previous treatments used fresh fecal slurries as a source of microbiota for FMT, we recently reported the successful use of standardized, partially purified and frozen fecal microbiota to treat CDI. Here we report that high-throughput 16S rRNA gene sequencing showed stable engraftment of gut microbiota following FMT using frozen fecal bacteria...

  11. Cytomegalovirus Destruction of Focal Adhesions Revealed in a High-Throughput Western Blot Analysis of Cellular Protein Expression

    OpenAIRE

    Stanton, Richard James; McSharry, Brian Patrick; Rickards, Carole Ruth; Wang, Edward Chung Yern; Tomasec, Peter; Wilkinson, Gavin William Grahame

    2007-01-01

    Human cytomegalovirus (HCMV) systematically manages the expression of cellular functions, rather than exerting the global shutoff of host cell protein synthesis commonly observed with other herpesviruses during the lytic cycle. While microarray technology has provided remarkable insights into viral control of the cellular transcriptome, HCMV is known to encode multiple mechanisms for posttranscriptional and posttranslation regulation of cellular gene expression. High-throughput Western blotti...

  12. High-Throughput Sequencing and Metagenomics: Moving Forward in the Culture-Independent Analysis of Food Microbial Ecology

    OpenAIRE

    Ercolini, Danilo

    2013-01-01

    Following recent trends in environmental microbiology, food microbiology has benefited from the advances in molecular biology and adopted novel strategies to detect, identify, and monitor microbes in food. An in-depth study of the microbial diversity in food can now be achieved by using high-throughput sequencing (HTS) approaches after direct nucleic acid extraction from the sample to be studied. In this review, the workflow of applying culture-independent HTS to food matrices is described. T...

  13. Droplet-based light-sheet fluorescence microscopy for high-throughput sample preparation, 3-D imaging and quantitative analysis on a chip.

    Science.gov (United States)

    Jiang, Hao; Zhu, Tingting; Zhang, Hao; Nie, Jun; Guan, Zeyi; Ho, Chi-Ming; Liu, Sheng; Fei, Peng

    2017-06-27

    We report a novel fusion of droplet microfluidics and light-sheet microscopy to achieve high-throughput sample compartmentalization, manipulation and three-dimensional imaging on a chip. This optofluidic device, characterized by orthogonal plane illumination and rapid liquid handling, is compact and cost-effective, and capable of preparing sample droplets with tunable size, frequency and ingredients. Each droplet flowing through the device's imaging region is self-scanned by a laser sheet, three-dimensionally reconstructed and quantitatively analysed. This simple and robust platform combines fast 3-D imaging with efficient sample preparation and at the same time eliminates the need for a complicated mechanical scan. Achieving 500 measurements per second and screening over 30 samples per minute, it shows great potential for various lab-on-a-chip biological studies, such as embryo sorting and cell growth assays.

  14. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hui Su

    2001-05-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm² for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection.

  15. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm² for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection.

  16. High-Throughput Method for Automated Colony and Cell Counting by Digital Image Analysis Based on Edge Detection.

    Directory of Open Access Journals (Sweden)

    Priya Choudhry

    Full Text Available Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. The subjective and time-intensive nature of manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro Cell Colony Edge and a CellProfiler Pipeline Cell Colony Counting, and compared them to other open-source digital methods and manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods in speed, accuracy, and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony forming assays, and cellular assays.
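    The core of any automated colony counter is labeling distinct foreground objects in a segmented image. A minimal sketch of that labeling step, assuming a pre-thresholded 0/1 grid and 4-connectivity, is shown below; it is a generic illustration, not the Cell Colony Edge macro or CellProfiler pipeline themselves, which operate on grayscale images with edge detection and size filters.

    ```python
    from collections import deque

    def count_colonies(grid):
        """Count connected components of 1s (4-connectivity) in a binary image.

        Stand-in for the object-labeling step that follows thresholding or
        edge detection in automated colony counting.
        """
        rows, cols = len(grid), len(grid[0])
        seen = [[False] * cols for _ in range(rows)]
        count = 0
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] == 1 and not seen[r][c]:
                    count += 1                      # new, unvisited object
                    q = deque([(r, c)])
                    seen[r][c] = True
                    while q:                        # flood-fill its pixels
                        y, x = q.popleft()
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and grid[ny][nx] == 1 and not seen[ny][nx]):
                                seen[ny][nx] = True
                                q.append((ny, nx))
        return count

    image = [
        [1, 1, 0, 0, 1],
        [1, 0, 0, 0, 1],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 1, 0],
    ]
    print(count_colonies(image))  # → 3
    ```

    Real pipelines add an area filter on each component to reject debris before counting.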

  17. Cost-effectiveness analysis in Chagas' disease vectors control interventions

    Directory of Open Access Journals (Sweden)

    A. M. Oliveira Filho

    1989-01-01

    Full Text Available After a large-scale field trial performed in central Brazil envisaging the control of Chagas' disease vectors in an endemic area colonized by Triatoma infestans and T. sordida, the cost-effectiveness analysis for each insecticide/formulation was performed. It considered the operational costs and the prices of insecticides and formulations, related to the activity and persistence of each one. The end point was considered to be less than 90% of domiciliary units (house + annexes) free of infestation. The results showed good cost-effectiveness for a slow-release emulsifiable suspension (SRES) based on PVA and containing malathion as active ingredient, as well as for the pyrethroids tested in this assay: cyfluthrin, cypermethrin, deltamethrin and permethrin.

  18. Power and sample size in cost-effectiveness analysis.

    Science.gov (United States)

    Laska, E M; Meisner, M; Siegel, C

    1999-01-01

    For resource allocation under a constrained budget, optimal decision rules for mutually exclusive programs require that the treatment with the highest incremental cost-effectiveness ratio (ICER) below a willingness-to-pay (WTP) criterion be funded. This is equivalent to determining the treatment with the smallest net health cost. The designer of a cost-effectiveness study needs to select a sample size so that the power to reject the null hypothesis, the equality of the net health costs of two treatments, is high. A recently published formula derived under normal distribution theory overstates sample-size requirements. Using net health costs, the authors present simple methods for power analysis based on conventional normal and on nonparametric statistical theory.
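    The net-health-cost framing above leads to a standard two-arm, normal-theory sample-size formula on the net monetary benefit (NMB) scale: the detectable difference is delta = WTP × ΔE − ΔC. A minimal sketch follows; it is not the authors' exact method, and all numeric inputs are hypothetical.

    ```python
    import math
    from statistics import NormalDist

    def nmb_sample_size(delta_e, delta_c, wtp, sd_nmb, alpha=0.05, power=0.80):
        """Per-arm sample size to detect a difference in net monetary benefit.

        delta = wtp * delta_e - delta_c is the incremental NMB; sd_nmb is the
        per-patient SD of NMB, assumed equal in both arms (an assumption of
        this sketch, as is the simple normal-theory formula).
        """
        delta = wtp * delta_e - delta_c
        z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided test
        z_b = NormalDist().inv_cdf(power)
        return math.ceil(2 * ((z_a + z_b) * sd_nmb / delta) ** 2)

    # Hypothetical trial: +0.1 QALY, +$2,000 cost, WTP $50,000/QALY,
    # per-patient NMB standard deviation $20,000.
    n = nmb_sample_size(0.1, 2000, 50000, 20000)
    print(n)  # → 698 patients per arm with these illustrative inputs
    ```

    Note how sensitive n is to delta: halving the WTP here would shrink delta from $3,000 to $500 and inflate the required sample size 36-fold.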

  19. Above Bonneville Passage and Propagation Cost Effectiveness Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Paulsen, C.M.; Hyman, J.B.; Wernstedt, K.

    1993-05-01

    We have developed several models to evaluate the cost-effectiveness of alternative strategies to mitigate hydrosystem impacts on salmon and steelhead, and applied these models to areas of the Columbia River Basin. Our latest application evaluates the cost-effectiveness of proposed strategies that target mainstem survival (e.g., predator control, increases in water velocity) and subbasin propagation (e.g., habitat improvements, screening, hatchery production increases) for chinook salmon and steelhead stocks, in the portion of the Columbia Basin bounded by Bonneville, Chief Joseph, Dworshak, and Hells Canyon dams. At its core the analysis primarily considers financial cost and biological effectiveness, but we have included other attributes that may be of concern to the region.

  20. Identification of microRNAs in the Toxigenic Dinoflagellate Alexandrium catenella by High-Throughput Illumina Sequencing and Bioinformatic Analysis.

    Directory of Open Access Journals (Sweden)

    Huili Geng

    Full Text Available Micro-ribonucleic acids (miRNAs) are a large group of endogenous, tiny, non-coding RNAs consisting of 19-25 nucleotides that regulate gene expression at either the transcriptional or post-transcriptional level by mediating gene silencing in eukaryotes. They are considered to be important regulators that affect growth, development, and response to various stresses in plants. Alexandrium catenella is an important marine toxic phytoplankton species that can cause harmful algal blooms (HABs). To date, identification and function analysis of miRNAs in A. catenella remain largely unexamined. In this study, high-throughput sequencing was performed on A. catenella to identify and quantitatively profile the repertoire of small RNAs from two different growth phases. A total of 38,092,056 and 32,969,156 raw reads were obtained from the two small RNA libraries, respectively. In total, 88 mature miRNAs belonging to 32 miRNA families were identified. Significant differences were found in the member number, expression level of various families, and expression abundance of each member within a family. A total of 15 potentially novel miRNAs were identified. Comparative profiling showed that 12 known miRNAs exhibited differential expression between the lag phase and the logarithmic phase. Real-time quantitative RT-PCR (qPCR) was performed to confirm the expression of two differentially expressed miRNAs: one up-regulated novel miRNA (aca-miR-3p-456915) and one down-regulated conserved miRNA (tae-miR159a). The expression trend of the qPCR assay was generally consistent with the deep sequencing result. Target predictions of the 12 differentially expressed miRNAs resulted in 1813 target genes. Gene Ontology (GO) analysis and Kyoto Encyclopedia of Genes and Genomes pathway database (KEGG) annotations revealed that some miRNAs were associated with growth and developmental processes of the alga. These results provide insights into the roles that miRNAs play in

  1. High-throughput analysis of candidate imprinted genes and allele-specific gene expression in the human term placenta

    Directory of Open Access Journals (Sweden)

    Clark Taane G

    2010-04-01

    Full Text Available Abstract Background Imprinted genes show expression from one parental allele only and are important for development and behaviour. This extreme mode of allelic imbalance has been described for approximately 56 human genes. Imprinting status is often disrupted in cancer and dysmorphic syndromes. More subtle variation of gene expression, that is not parent-of-origin specific, termed 'allele-specific gene expression' (ASE) is more common and may give rise to milder phenotypic differences. Using two allele-specific high-throughput technologies alongside bioinformatics predictions, normal term human placenta was screened to find new imprinted genes and to ascertain the extent of ASE in this tissue. Results Twenty-three family trios of placental cDNA, placental genomic DNA (gDNA) and gDNA from both parents were tested for 130 candidate genes with the Sequenom MassArray system. Six genes were found differentially expressed but none imprinted. The Illumina ASE BeadArray platform was then used to test 1536 SNPs in 932 genes. The array was enriched for the human orthologues of 124 mouse candidate genes from bioinformatics predictions and 10 human candidate imprinted genes from EST database mining. After quality control pruning, a total of 261 informative SNPs (214 genes) remained for analysis. Imprinting with maternal expression was demonstrated for the lymphocyte imprinted gene ZNF331 in human placenta. Two potential differentially methylated regions (DMRs) were found in the vicinity of ZNF331. None of the bioinformatically predicted candidates tested showed imprinting except for a skewed allelic expression in a parent-specific manner observed for PHACTR2, a neighbour of the imprinted PLAGL1 gene. ASE was detected for two or more individuals in 39 candidate genes (18%). Conclusions Both Sequenom and Illumina assays were sensitive enough to study imprinting and strong allelic bias. 
Previous bioinformatics approaches were not predictive of new imprinted genes

  2. High-content, high-throughput analysis of cell cycle perturbations induced by the HSP90 inhibitor XL888.

    Directory of Open Access Journals (Sweden)

    Susan K Lyman

    Full Text Available BACKGROUND: Many proteins that are dysregulated or mutated in cancer cells rely on the molecular chaperone HSP90 for their proper folding and activity, which has led to considerable interest in HSP90 as a cancer drug target. The diverse array of HSP90 client proteins encompasses oncogenic drivers, cell cycle components, and a variety of regulatory factors, so inhibition of HSP90 perturbs multiple cellular processes, including mitogenic signaling and cell cycle control. Although many reports have investigated HSP90 inhibition in the context of the cell cycle, no large-scale studies have examined potential correlations between cell genotype and the cell cycle phenotypes of HSP90 inhibition. METHODOLOGY/PRINCIPAL FINDINGS: To address this question, we developed a novel high-content, high-throughput cell cycle assay and profiled the effects of two distinct small molecule HSP90 inhibitors (XL888 and 17-AAG [17-allylamino-17-demethoxygeldanamycin]) in a large, genetically diverse panel of cancer cell lines. The cell cycle phenotypes of both inhibitors were strikingly similar and fell into three classes: accumulation in M-phase, G2-phase, or G1-phase. Accumulation in M-phase was the most prominent phenotype and notably, was also correlated with TP53 mutant status. We additionally observed unexpected complexity in the response of the cell cycle-associated client PLK1 to HSP90 inhibition, and we suggest that inhibitor-induced PLK1 depletion may contribute to the striking metaphase arrest phenotype seen in many of the M-arrested cell lines. CONCLUSIONS/SIGNIFICANCE: Our analysis of the cell cycle phenotypes induced by HSP90 inhibition in 25 cancer cell lines revealed that the phenotypic response was highly dependent on cellular genotype as well as on the concentration of HSP90 inhibitor and the time of treatment. M-phase arrest correlated with the presence of TP53 mutations, while G2 or G1 arrest was more commonly seen in cells bearing wt TP53. We draw

  3. Metagenomic analysis of taxa associated with Lutzomyia longipalpis, vector of visceral leishmaniasis, using an unbiased high-throughput approach.

    Directory of Open Access Journals (Sweden)

    Christina B McCarthy

    2011-09-01

    Full Text Available BACKGROUND: Leishmaniasis is one of the most diverse and complex of all vector-borne diseases worldwide. It is caused by parasites of the genus Leishmania, obligate intramacrophage protists characterised by diversity and complexity. Its most severe form is visceral leishmaniasis (VL), a systemic disease that is fatal if left untreated. In Latin America VL is caused by Leishmania infantum chagasi and transmitted by Lutzomyia longipalpis. This phlebotomine sandfly is only found in the New World, from Mexico to Argentina. In South America, migration and urbanisation have largely contributed to the increase of VL as a public health problem. Moreover, the first VL outbreak was recently reported in Argentina, which has already caused 7 deaths and 83 reported cases. METHODOLOGY/PRINCIPAL FINDINGS: An inventory of the microbiota associated with insect vectors, especially of wild specimens, would aid in the development of novel strategies for controlling insect vectors. Given the recent VL outbreak in Argentina and the compelling need to develop appropriate control strategies, this study focused on wild male and female Lu. longipalpis from an Argentine endemic (Posadas, Misiones) and a Brazilian non-endemic (Lapinha Cave, Minas Gerais) VL location. Previous studies on wild and laboratory reared female Lu. longipalpis have described gut bacteria using standard bacteriological methods. In this study, total RNA was extracted from the insects and submitted to high-throughput pyrosequencing. The analysis revealed the presence of sequences from bacteria, fungi, protist parasites, plants and metazoans. CONCLUSIONS/SIGNIFICANCE: This is the first time an unbiased and comprehensive metagenomic approach has been used to survey taxa associated with an infectious disease vector. The identification of gregarines suggested they are a possible efficient control method under natural conditions. 
Ongoing studies are determining the significance of the associated taxa found

  4. Cost-effectiveness analysis of rotavirus vaccination in Argentina.

    Science.gov (United States)

    Urueña, Analía; Pippo, Tomás; Betelu, María Sol; Virgilio, Federico; Hernández, Laura; Giglio, Norberto; Gentile, Ángela; Diosque, Máximo; Vizzotti, Carla

    2015-05-07

    Rotavirus is a leading cause of severe diarrhea in children under 5. In Argentina, the most affected regions are the Northeast and Northwest, where hospitalizations and deaths are more frequent. This study estimated the cost-effectiveness of adding either of the two licensed rotavirus vaccines to the routine immunization schedule. The integrated TRIVAC vaccine cost-effectiveness model from the Pan American Health Organization's ProVac Initiative (Version 2.0) was used to assess health benefits, costs savings, life-years gained (LYGs), DALYs averted, and cost/DALY averted of vaccinating 10 successive cohorts, from the health care system and societal perspectives. Two doses of monovalent (RV1) rotavirus vaccine and three doses of pentavalent (RV5) rotavirus vaccine were each compared to a scenario assuming no vaccination. The price/dose was US$ 7.50 and US$ 5.15 for RV1 and RV5, respectively. We ran both a national and sub-national analysis, discounting all costs and benefits 3% annually. Our base case results were compared to a range of alternative univariate and multivariate scenarios. The number of LYGs was 5962 and 6440 for RV1 and RV5, respectively. The cost/DALY averted when compared to no vaccination from the health care system and societal perspective was: US$ 3870 and US$ 1802 for RV1, and US$ 2414 and US$ 358 for RV5, respectively. Equivalent figures for the Northeast were US$ 1470 and US$ 636 for RV1, and US$ 913 and US$ 80 for RV5. Therefore, rotavirus vaccination was more cost-effective in the Northeast compared to the whole country; and, in the Northwest, health service's costs saved outweighed the cost of introducing the vaccine. Vaccination with either vaccine compared to no vaccination was highly cost-effective based on WHO guidelines and Argentina's 2011 per capita GDP of US$ 9090. Key variables influencing results were vaccine efficacy, annual loss of efficacy, relative coverage of deaths, vaccine price, and discount rate. Compared to no
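    The cost/DALY figures above follow the usual incremental arithmetic: net program cost (program cost minus health-care costs averted) divided by DALYs averted. A minimal sketch, with purely illustrative numbers rather than TRIVAC model outputs:

    ```python
    def cost_per_daly_averted(program_cost, treatment_cost_averted, dalys_averted):
        """Incremental cost-effectiveness ratio versus no vaccination.

        Net cost = program cost minus health-care costs averted; a negative
        result means the program is cost-saving, as reported for the
        Northwest region.  All inputs are hypothetical.
        """
        return (program_cost - treatment_cost_averted) / dalys_averted

    # Hypothetical cohort: $12M program, $7M hospitalizations averted,
    # 2,000 DALYs averted.
    icer = cost_per_daly_averted(12_000_000, 7_000_000, 2_000)
    print(icer)  # → 2500.0 US$/DALY averted
    ```

    Under the WHO rule of thumb cited in the abstract, an ICER below the per capita GDP (US$ 9,090 for Argentina in 2011) would count as highly cost-effective.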

  5. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hui Su

    2001-05-25

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, the author introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, they demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm² for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study single cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, they have identified individual peaks in the fluorescence electropherograms as serotonin released from the granular core on contact with the surrounding fluid.

  6. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, the author introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, they demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 × 250 subunits/cm² for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study single cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, they have identified individual peaks in the fluorescence electropherograms as serotonin released from the granular core on contact with the surrounding fluid.

  7. Using Cost-Effectiveness Analysis to Address Health Equity Concerns.

    Science.gov (United States)

    Cookson, Richard; Mirelman, Andrew J; Griffin, Susan; Asaria, Miqdad; Dawkins, Bryony; Norheim, Ole Frithjof; Verguet, Stéphane; J Culyer, Anthony

    2017-02-01

    This article serves as a guide to using cost-effectiveness analysis (CEA) to address health equity concerns. We first introduce the "equity impact plane," a tool for considering trade-offs between improving total health (the objective underpinning conventional CEA) and equity objectives, such as reducing social inequality in health or prioritizing the severely ill. Improving total health may clash with reducing social inequality in health, for example, when effective delivery of services to disadvantaged communities requires additional costs. Who gains and who loses from a cost-increasing health program depends on differences among people in terms of health risks, uptake, quality, adherence, capacity to benefit, and, crucially, who bears the opportunity costs of diverting scarce resources from other uses. We describe two main ways of using CEA to address health equity concerns: 1) equity impact analysis, which quantifies the distribution of costs and effects by equity-relevant variables, such as socioeconomic status, location, ethnicity, sex, and severity of illness; and 2) equity trade-off analysis, which quantifies trade-offs between improving total health and other equity objectives. One way to analyze equity trade-offs is to count the cost of fairer but less cost-effective options in terms of health forgone. Another method is to explore how much concern for equity is required to choose fairer but less cost-effective options using equity weights or parameters. We hope this article will help the health technology assessment community navigate the practical options now available for conducting equity-informative CEA that gives policymakers a better understanding of equity impacts and trade-offs.
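    The "health forgone" accounting mentioned above can be sketched in one line: the extra spending on the fairer option, divided by the opportunity-cost threshold k, estimates the health displaced elsewhere in the system; subtracting the option's own health gain leaves the net QALY loss purchased for the equity gain. The function and all numbers below are illustrative, not from the article.

    ```python
    def health_forgone(delta_cost, delta_effect, threshold):
        """QALYs forgone by funding a fairer but less cost-effective option.

        delta_cost / threshold estimates health displaced elsewhere when the
        budget absorbs the extra cost; subtracting delta_effect gives the net
        QALY loss.  Inputs are hypothetical.
        """
        return delta_cost / threshold - delta_effect

    # Fairer outreach program: costs $300,000 more, gains 2 extra QALYs,
    # opportunity-cost threshold $50,000/QALY.
    loss = health_forgone(300_000, 2.0, 50_000)
    print(loss)  # → 4.0 QALYs forgone for the equity gain
    ```

    A decision maker who still prefers the fairer option is implicitly valuing its equity gain at more than 4 QALYs, which is one way to elicit the equity weights the article describes.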

  8. FLASH Assembly of TALENs Enables High-Throughput Genome Editing

    OpenAIRE

    Reyon, Deepak; Tsai, Shengdar Q.; Khayter, Cyd; Foden, Jennifer A.; Sander, Jeffry D.; Joung, J. Keith

    2012-01-01

    Engineered transcription activator-like effector nucleases (TALENs) have shown promise as facile and broadly applicable genome editing tools. However, no publicly available high-throughput method for constructing TALENs has been published and large-scale assessments of the success rate and targeting range of the technology remain lacking. Here we describe the Fast Ligation-based Automatable Solid-phase High-throughput (FLASH) platform, a rapid and cost-effective method we developed to enable ...

  9. Cost effectiveness analysis of hemiarthroplasty and total shoulder arthroplasty.

    Science.gov (United States)

    Mather, Richard C; Watters, Tyler S; Orlando, Lori A; Bolognesi, Michael P; Moorman, Claude T

    2010-04-01

    Total shoulder arthroplasty (TSA) and hemiarthroplasty (HA) are two viable surgical treatment options for glenohumeral osteoarthritis. Recent systematic reviews and randomized trials suggest that TSA, while more costly initially, may have superior outcomes with regard to pain, function and quality of life with lower revision rates. This study compared the cost-effectiveness of TSA with HA. A Markov decision model was constructed for a cost-utility analysis of TSA compared to HA in a cohort of 64-year-old patients. Outcome probabilities and effectiveness were derived from the literature. Costs were estimated from the societal perspective using the national average Medicare reimbursement for the procedures in 2008 US dollars. Effectiveness was expressed in quality-adjusted life years (QALYs) gained. Principal outcome measures were average incremental costs, incremental effectiveness, incremental QALYs, and net health benefits. In the base case, HA resulted in a lower number of average QALYs gained at a higher average cost to society and was, therefore, dominated by the TSA strategy for the treatment of glenohumeral osteoarthritis. The cost-effectiveness ratios for TSA and HA were $957/QALY and $1,194/QALY, respectively. Sensitivity analysis revealed that if the utility of TSA is equal to that of HA, or its revision rate is lower, TSA continues to be a dominant strategy. Total shoulder arthroplasty with a cemented glenoid is a cost-effective procedure, resulting in greater utility for the patient at a lower overall cost to the payer. These findings suggest that TSA is the preferred treatment for certain populations from both a patient and payer perspective.
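    A Markov cost-utility model of the kind described above tracks a cohort through health states cycle by cycle, accumulating discounted QALYs and costs per strategy. The toy sketch below uses a single alive/dead structure with a revision-cost term and entirely made-up parameters; it is not the published TSA-vs-HA model.

    ```python
    def markov_cohort(utility, annual_cost, revision_prob, revision_cost,
                      death_prob, cycles=20, discount=0.03):
        """Minimal Markov cohort model: accumulate discounted QALYs and costs.

        One-year cycles; each cycle the surviving fraction accrues utility and
        expected costs (maintenance plus probabilistic revision), then is
        thinned by mortality.  All parameters below are hypothetical.
        """
        alive = 1.0
        qalys = costs = 0.0
        for t in range(cycles):
            d = (1 + discount) ** t                      # discount factor
            qalys += alive * utility / d
            costs += alive * (annual_cost + revision_prob * revision_cost) / d
            alive *= (1 - death_prob)                    # transition to death
        return qalys, costs

    # Hypothetical strategies: TSA (higher utility, lower revision rate) vs HA.
    tsa = markov_cohort(utility=0.85, annual_cost=200, revision_prob=0.01,
                        revision_cost=15_000, death_prob=0.02)
    ha = markov_cohort(utility=0.80, annual_cost=200, revision_prob=0.03,
                       revision_cost=15_000, death_prob=0.02)

    # Incremental ratio; here TSA gains QALYs at lower cost, so it dominates.
    icer = (tsa[1] - ha[1]) / (tsa[0] - ha[0])
    ```

    With these inputs TSA is both cheaper and more effective, mirroring the "dominant strategy" result in the base case of the abstract.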

  10. Strengthening Cost-Effectiveness Analysis for Public Health Policy.

    Science.gov (United States)

    Russell, Louise B; Sinha, Anushua

    2016-05-01

    Although the U.S. spends more on medical care than any country in the world, Americans live shorter lives than the citizens of other high-income countries. Many important opportunities to improve this record lie outside the health sector and involve improving the conditions in which Americans live and work: safe design and maintenance of roads, bridges, train tracks, and airports; control of environmental pollutants; occupational safety; healthy buildings; a safe and healthy food supply; safe manufacture of consumer products; a healthy social environment; and others. Faced with the overwhelming array of possibilities, U.S. decision makers need help identifying those that can contribute the most to health. Cost-effectiveness analysis is designed to serve that purpose, but has mainly been used to assess interventions within the health sector. This paper briefly reviews the objective of cost-effectiveness analysis and its methodologic evolution and discusses the issues that arise when it is used to evaluate interventions that fall outside the health sector under three headings: structuring the analysis, quantifying/measuring benefits and costs, and valuing benefits and costs.

  11. Product Chemistry and Process Efficiency of Biomass Torrefaction, Pyrolysis and Gasification Studied by High-Throughput Techniques and Multivariate Analysis

    Science.gov (United States)

    Xiao, Li

    Despite great enthusiasm and sustained effort in the development of renewable energy from biomass, the commercialization and scale-up of biofuel production still face challenges. New ideas and facilities are being tested around the world, targeting reduced cost and improved product value. Cutting-edge technologies involving analytical chemistry, statistical analysis, industrial engineering, computer simulation, and mathematical modeling keep integrating modern elements into this classic research. One of the challenges of commercializing biofuel production is the complexity of the chemical composition of biomass feedstocks and their products. Because of this, feedstock selection and process optimization cannot be conducted efficiently. This dissertation attempts to further evaluate the biomass thermal decomposition process using both traditional methods and an advanced technique (pyrolysis molecular beam mass spectrometry). Focus has been placed on generating a database of thermal decomposition products from biomass at different temperatures, finding the relationship between traditional methods and advanced techniques, evaluating process efficiency and optimizing reaction conditions, comparing typically utilized biomass feedstocks, and searching for innovative species for economically viable feedstock preparation concepts. Lab-scale quartz tube reactors and 80-μL stainless steel sample cups coupled with an auto-sampling system were utilized to simulate the complicated reactions that occur in real fluidized-bed or entrained-flow reactors. The two main high-throughput analytical techniques used are near-infrared spectroscopy (NIR) and pyrolysis molecular beam mass spectrometry (Py-MBMS). Mass balance, carbon balance, and product distributions are presented in detail. Thermal decomposition temperatures ranged from 200°C to 950°C. Feedstocks used in the study include typical hardwood and softwood (red oak, white oak, yellow poplar, loblolly pine

  12. Cost-effectiveness analysis in markets with high fixed costs.

    Science.gov (United States)

    Cutler, David M; Ericson, Keith M Marzilli

    2010-01-01

    We consider how to conduct cost-effectiveness analysis when the social cost of a resource differs from the posted price. From the social perspective, the true cost of a medical intervention is the marginal cost of delivering another unit of a treatment, plus the social cost (deadweight loss) of raising the revenue to fund the treatment. We focus on pharmaceutical prices, which have high markups over marginal cost due to the monopoly power granted to pharmaceutical companies when drugs are under patent. We find that the social cost of a branded drug is approximately one-half the market price when the treatment is paid for by a public insurance plan and one-third the market price for mandated coverage by private insurance. We illustrate the importance of correctly accounting for social costs using two examples: coverage for statin drugs and approval for a drug to treat kidney cancer (sorafenib). In each case, we show that the correct social perspective for cost-effectiveness analysis would be more lenient than researcher recommendations.
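    The adjustment the authors describe amounts to scaling the drug's posted price by a social-cost fraction before computing the ICER. The fractions (about one-half for public insurance, one-third for mandated private coverage) come from the abstract; the drug price, other costs and QALY gain below are hypothetical.

    ```python
    def social_icer(drug_price, other_costs, qalys_gained, social_fraction):
        """ICER using the drug's social cost rather than its posted price.

        social_fraction ~ 0.5 for public insurance, ~ 1/3 for mandated private
        coverage (per the abstract); all other inputs are illustrative.
        """
        return (drug_price * social_fraction + other_costs) / qalys_gained

    # Hypothetical branded drug: $40,000 price, $5,000 other costs, 0.5 QALY gain.
    nominal = social_icer(40_000, 5_000, 0.5, 1.0)   # → 90000.0 $/QALY at posted price
    public = social_icer(40_000, 5_000, 0.5, 0.5)    # → 50000.0 $/QALY at social cost
    ```

    The same treatment can flip from "not cost-effective" to "cost-effective" against a fixed threshold once the markup over marginal cost is treated as a transfer rather than a resource cost.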

  13. REVASCULARIZATION FOR FEMOROPOPLITEAL DISEASE - A DECISION AND COST-EFFECTIVENESS ANALYSIS

    NARCIS (Netherlands)

    HUNINK, MGM; WONG, JB; DONALDSON, MC; MEYEROVITZ, MF; DEVRIES, J; HARRINGTON, DP

    1995-01-01

    Objective.-To evaluate the relative benefits and cost-effectiveness of revascularization for femoropopliteal disease using percutaneous transluminal angioplasty or bypass surgery. Design.-Decision analysis using a multistate transition simulation model (Markov process) and cost-effectiveness analysis.

  14. Accurate CpG and non-CpG cytosine methylation analysis by high-throughput locus-specific pyrosequencing in plants.

    Science.gov (United States)

    How-Kit, Alexandre; Daunay, Antoine; Mazaleyrat, Nicolas; Busato, Florence; Daviaud, Christian; Teyssier, Emeline; Deleuze, Jean-François; Gallusci, Philippe; Tost, Jörg

    2015-07-01

    Pyrosequencing permits accurate quantification of DNA methylation of specific regions, where the proportion of the C/T polymorphism induced by sodium bisulfite treatment of DNA reflects the DNA methylation level. The commercially available high-throughput locus-specific pyrosequencing instruments allow for the simultaneous analysis of 96 samples, but restrict the DNA methylation analysis to CpG dinucleotide sites, which can be limiting in many biological systems. In contrast to mammals, where DNA methylation occurs nearly exclusively on CpG dinucleotides, plant genomes harbor DNA methylation also in other sequence contexts, including CHG and CHH motifs, which cannot be evaluated by these pyrosequencing instruments due to software limitations. Here, we present a complete pipeline for accurate CpG and non-CpG cytosine methylation analysis at single-base resolution using high-throughput locus-specific pyrosequencing. The devised approach includes the design and validation of PCR amplification on bisulfite-treated DNA and pyrosequencing assays, as well as the quantification of the methylation level at every cytosine from the raw peak intensities of the Pyrograms by two newly developed Visual Basic Applications. Our method gives accurate and reproducible results, as exemplified by the cytosine methylation analysis of the promoter regions of two tomato genes (NOR and CNR) encoding transcription regulators of fruit ripening during different stages of fruit development. Our results confirmed a significant and temporally coordinated loss of DNA methylation on specific cytosines during the early stages of fruit development in both promoters, as previously shown by WGBS. This manuscript thus describes the first high-throughput locus-specific DNA methylation analysis in plants using pyrosequencing.
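
    The core quantification step, recovering a per-cytosine methylation level from the bisulfite-induced C/T peak intensities, can be sketched as below. The function name and the peak intensities are hypothetical; the sketch assumes the C peak reports unconverted (methylated) cytosines and the T peak bisulfite-converted (unmethylated) ones on the sequenced strand.

```python
def methylation_level(c_peak, t_peak):
    """Fraction methylated at one cytosine: C peak intensity divided
    by the total C+T signal at that position."""
    total = c_peak + t_peak
    if total == 0:
        raise ValueError("no signal at this position")
    return c_peak / total

# Hypothetical raw peak intensities for cytosines in three contexts:
positions = [("CpG", 820.0, 180.0), ("CHG", 350.0, 650.0), ("CHH", 40.0, 960.0)]
for context, c, t in positions:
    print(context, round(100 * methylation_level(c, t), 1))  # percent
```

    Applying the same ratio to every cytosine in the read, rather than only CpG positions, is what lifts the CpG-only restriction of the vendor software.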

  15. Transcriptome analysis of the variations between autotetraploid Paulownia tomentosa and its diploid using high-throughput sequencing.

    Science.gov (United States)

    Fan, Guoqiang; Wang, Limin; Deng, Minjie; Niu, Suyan; Zhao, Zhenli; Xu, Enkai; Cao, Xibin; Zhang, Xiaoshen

    2015-08-01

    Timber properties of autotetraploid Paulownia tomentosa are heritable with whole-genome duplication, but the molecular mechanisms behind the predominant characteristics remain unclear. To illuminate the genetic basis, high-throughput sequencing technology was used to identify the related unigenes. In total, 2677 unigenes were found to be significantly differentially expressed in autotetraploid P. tomentosa. Among these, 30 photosynthesis-related, 21 transcription factor-related, and 22 lignin-related differentially expressed unigenes were detected, and the roles of peroxidase in lignin biosynthesis, MYB DNA-binding proteins, and WRKY proteins associated with the regulation of relevant hormones are extensively discussed. The results provide transcriptome data that may offer a new perspective on the polyploidy mechanism in plants with long growth cycles and assist future Paulownia breeding.

  16. High-throughput sequencing and metagenomics: moving forward in the culture-independent analysis of food microbial ecology.

    Science.gov (United States)

    Ercolini, Danilo

    2013-05-01

    Following recent trends in environmental microbiology, food microbiology has benefited from the advances in molecular biology and adopted novel strategies to detect, identify, and monitor microbes in food. An in-depth study of the microbial diversity in food can now be achieved by using high-throughput sequencing (HTS) approaches after direct nucleic acid extraction from the sample to be studied. In this review, the workflow of applying culture-independent HTS to food matrices is described. The current scenario and future perspectives of HTS uses to study food microbiota are presented, and the decision-making process leading to the best choice of working conditions to fulfill the specific needs of food research is described.

  17. Leishmania genome analysis and high-throughput immunological screening identifies tuzin as a novel vaccine candidate against visceral leishmaniasis.

    Science.gov (United States)

    Lakshmi, Bhavana Sethu; Wang, Ruobing; Madhubala, Rentala

    2014-06-24

    Leishmaniasis is a neglected tropical disease caused by Leishmania species. It is a major health concern affecting 88 countries and threatening 350 million people globally. Unfortunately, there are no vaccines, and there are limitations associated with the current therapeutic regimens for leishmaniasis. The emerging cases of drug resistance further aggravate the situation, demanding rapid drug and vaccine development. The genome sequence of Leishmania provides access to novel genes that hold potential as chemotherapeutic targets or vaccine candidates. In this study, we selected 19 antigenic genes from about 8000 common Leishmania genes based on the Leishmania major and Leishmania infantum genome information available in the pathogen databases. Potential vaccine candidates thus identified were screened using an in vitro high-throughput immunological platform developed in the laboratory. Four candidate genes, coding for tuzin, flagellar glycoprotein-like protein (FGP), phospholipase A1-like protein (PLA1), and potassium voltage-gated channel protein (K VOLT), showed a predominant protective Th1 response over a disease-exacerbating Th2 response. We report the immunogenic properties and protective efficacy of one of the four antigens, tuzin, as a DNA vaccine against Leishmania donovani challenge. Our results show that administration of tuzin DNA protected BALB/c mice against L. donovani challenge and that protective immunity was associated with higher levels of IFN-γ and IL-12 production in comparison to IL-4 and IL-10. Our study presents a simple approach to rapidly identify potential vaccine candidates using the exhaustive information stored in the genome and an in vitro high-throughput immunological platform.

  18. Data Management for High-Throughput Genomics

    CERN Document Server

    Roehm, Uwe

    2009-01-01

    Today's sequencing technology allows sequencing an individual genome within a few weeks for a fraction of the costs of the original Human Genome project. Genomics labs are faced with dozens of TB of data per week that have to be automatically processed and made available to scientists for further analysis. This paper explores the potential and the limitations of using relational database systems as the data processing platform for high-throughput genomics. In particular, we are interested in the storage management for high-throughput sequence data and in leveraging SQL and user-defined functions for data analysis inside a database system. We give an overview of a database design for high-throughput genomics, how we used a SQL Server database in some unconventional ways to prototype this scenario, and we will discuss some initial findings about the scalability and performance of such a more database-centric approach.
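
    The idea of pushing analysis into the database with SQL plus user-defined functions can be sketched with Python's built-in sqlite3 module (a stand-in here; the paper prototypes with SQL Server). The gc_content UDF and the reads table are illustrative assumptions, not the paper's schema.

```python
import sqlite3

def gc_content(seq):
    """User-defined function: fraction of G/C bases in a read."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

con = sqlite3.connect(":memory:")
# Register the Python function so it is callable from SQL.
con.create_function("gc_content", 1, gc_content)
con.execute("CREATE TABLE reads (id INTEGER PRIMARY KEY, seq TEXT)")
con.executemany("INSERT INTO reads (seq) VALUES (?)",
                [("ACGTACGT",), ("GGGCCC",), ("ATATATAT",)])
# The analysis runs inside the database: filter reads by GC content in SQL
# instead of exporting them to an external script.
rows = con.execute(
    "SELECT id, gc_content(seq) FROM reads WHERE gc_content(seq) > 0.6"
).fetchall()
print(rows)  # [(2, 1.0)]
```

    The design choice being illustrated is locality: the sequence data never leaves the storage engine, which is what makes the database-centric approach attractive at terabytes-per-week scale.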

  19. Portable kit for high-throughput analysis of polycyclic aromatic hydrocarbons using surface enhanced Raman scattering after dispersive liquid-liquid microextraction.

    Science.gov (United States)

    Zhang, Min; Zhang, Xiaoli; Qu, Baofeng; Zhan, Jinhua

    2017-12-01

    In this work, a portable kit was developed for convenient high-throughput trace analysis of polycyclic aromatic hydrocarbons (PAHs) using surface-enhanced Raman scattering (SERS) after a dispersive liquid-liquid microextraction (DLLME) process. The portable kit contains three sealed reagent tubes (labeled T1, T2, and T3), a self-made well plate, and a portable Raman spectrometer. Reagent tube T1 contains a mixture of disperser solvent and extraction solvent, enabling a 2 min DLLME sample pretreatment. Quick injection of the solvents in tube T1 into the sample containing PAHs immediately formed a cloudy solution consisting of fine droplets of extraction solvent dispersed throughout the aqueous phase. The enrichment factor was found to be 29.6. T2 and T3 contain methanol and 1-propanethiol-modified silver nanoparticles (PTH-Ag NPs), respectively. The liquid in tube T3 was used to enhance the Raman signal of analytes on the self-made high-throughput micro reactor. A linear relationship between the concentration of pyrene and the relative Raman peak intensity was obtained (R² = 0.993). The detection limit was 0.50 µg L⁻¹ for pyrene. The RSD of the high-throughput analysis of 12 samples was calculated as 4.8%. The ability of the DLLME-SERS technique to extract PAH isomers from water samples was investigated. The performance of DLLME-SERS in the recovery of pyrene from lake, spring, and drinking water was also studied. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. DDBJ read annotation pipeline: a cloud computing-based pipeline for high-throughput analysis of next-generation sequencing data.

    Science.gov (United States)

    Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu

    2013-08-01

    High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for a high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing by NIG supercomputers currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads of the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/.

  1. Computational Analysis and In silico Predictive Modeling for Inhibitors of PhoP Regulon in S. typhi on High-Throughput Screening Bioassay Dataset.

    Science.gov (United States)

    Kaur, Harleen; Ahmad, Mohd; Scaria, Vinod

    2016-03-01

    Multidrug-resistant Salmonella enterica serotype Typhi is emerging in pandemic proportions throughout the world, so there is a need to speed the discovery of novel molecules with different modes of action, less prone to resistance formation, that could be used as drugs for the treatment of salmonellosis, particularly typhoid fever. The PhoP regulon is well studied and has been shown to be a critical regulator of the expression of a number of genes required for the intracellular survival of S. enterica and the pathophysiology of diseases such as typhoid. The evident roles of the two-component PhoP/PhoQ-regulated products in Salmonella virulence have motivated attempts to target them therapeutically. Although the discovery of biologically active compounds for the treatment of typhoid relies on hit-finding procedures, high-throughput screening technology alone is expensive and time consuming when performed at large scale. Recent advances in combinatorial chemistry and contemporary compound synthesis make ever more compounds available, driving ample growth of diverse compound libraries, but the time and effort required to screen these unfocused, massive, and diverse libraries has been only slightly reduced in recent years. Hence, there is demand to improve hit quality and the success rate of high-throughput screening through focused compound libraries biased toward the particular target, and a convenient, inexpensive method is still needed to prioritize the molecules taken into biological screens. In this context, in silico methods such as machine learning are widely applicable techniques used to build computational models for high-throughput virtual screens that prioritize molecules for further study. Furthermore, in computational analysis, we extended our study to identify the common enriched
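
    A minimal, library-free sketch of the kind of ligand-based prioritization described above, assuming set-based fragment fingerprints and Tanimoto similarity to known actives. The compound names and feature ids are invented, and real virtual screens would use learned models and richer descriptors.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto similarity between two set-based fingerprints."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

def rank_library(actives, library):
    """Score each library compound by its best similarity to any known
    active, then rank high-to-low: a simple prioritization step that
    focuses an unfocused library before biological screening."""
    scored = [(max(tanimoto(fp, a) for a in actives), name)
              for name, fp in library.items()]
    return sorted(scored, reverse=True)

# Hypothetical fragment-based fingerprints (sets of feature ids).
actives = [frozenset({1, 2, 3, 5}), frozenset({2, 3, 7})]
library = {
    "cmpd_A": frozenset({1, 2, 3}),     # close to the first active
    "cmpd_B": frozenset({8, 9}),        # unrelated scaffold
    "cmpd_C": frozenset({2, 3, 7, 9}),  # close to the second active
}
for score, name in rank_library(actives, library):
    print(name, round(score, 2))
```

    Only the top-ranked fraction of the library would proceed to the wet-lab bioassay, which is where the time and cost savings claimed for in silico prioritization come from.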

  2. High throughput analysis at microscale: performance of ionKey/MS with Xevo G2-XS QTof under rapid gradient conditions

    Directory of Open Access Journals (Sweden)

    Yun Wang Alelyunas

    2015-10-01

    Full Text Available In this paper, high-throughput analysis under 3-minute rapid gradient conditions is described using the ionKey/MS™ System with an integrated ACQUITY UPLC® M-Class System and the Xevo® G2-XS QTof Mass Spectrometer. Extensive testing of representative small molecules and a peptide shows that the system is well tolerated and exhibits excellent reproducibility and linear response. The iKey™ HSS T3 Separation Device used is robust, withstanding ~2200 injections of prepared human plasma with excellent peak shape and system pressure profile. A 99% solvent savings was realized when compared with an analytical system using a 2.1 mm column at flow rates ranging from 0.6 mL/min to 1.5 mL/min. These data, coupled with examples from the literature, illustrate that the ionKey/MS System with the Xevo G2-XS QTof can be used as a full-service platform for high-throughput and high-sensitivity analysis to support all phases of drug discovery and development.
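
    The reported ~99% solvent savings follows directly from the flow-rate ratio. A small sketch, assuming a typical microflow rate of 3 µL/min for the iKey-scale separation (a value not stated in this abstract) against the quoted 0.6-1.5 mL/min analytical range:

```python
def solvent_savings(micro_flow_ul_min, analytical_flow_ml_min):
    """Percent solvent saved by moving from an analytical-scale flow
    to a microflow separation, assuming equal run times."""
    analytical_ul_min = analytical_flow_ml_min * 1000.0  # mL/min -> uL/min
    return 100.0 * (1.0 - micro_flow_ul_min / analytical_ul_min)

# Assumed 3 uL/min microflow against the abstract's analytical range:
for flow_ml_min in (0.6, 1.0, 1.5):
    print(flow_ml_min, round(solvent_savings(3.0, flow_ml_min), 1))
```

    At every point in the quoted range the savings come out at or above 99%, consistent with the figure reported in the abstract.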

  3. High-throughput sequencing-based analysis of endogenetic fungal communities inhabiting the Chinese Cordyceps reveals unexpectedly high fungal diversity.

    Science.gov (United States)

    Xia, Fei; Chen, Xin; Guo, Meng-Yuan; Bai, Xiao-Hui; Liu, Yan; Shen, Guang-Rong; Li, Yu-Ling; Lin, Juan; Zhou, Xuan-Wei

    2016-09-14

    Chinese Cordyceps, known in Chinese as "DongChong XiaCao", is a parasitic complex of a fungus (Ophiocordyceps sinensis) and a caterpillar. The current study explored the endogenetic fungal communities inhabiting Chinese Cordyceps. Samples were collected from five different geographical regions of Qinghai and Tibet, and the nuclear ribosomal internal transcribed spacer-1 sequences from each sample were obtained using Illumina high-throughput sequencing. The results showed that Ascomycota was the dominant fungal phylum in Chinese Cordyceps and its soil microhabitat across the different sampling regions. Among the Ascomycota, 65 genera were identified, and the abundant operational taxonomic units showed the strongest sequence similarity to Ophiocordyceps, Verticillium, Pseudallescheria, Candida, and Ilyonectria. Not surprisingly, the genus Ophiocordyceps was the largest among the fungal communities identified in the fruiting bodies and external mycelial cortices of Chinese Cordyceps. In addition, fungal communities in the soil microhabitats clustered separately from those of the external mycelial cortices and fruiting bodies of Chinese Cordyceps from different sampling regions. There was no significant structural difference in the fungal communities between the fruiting bodies and external mycelial cortices of Chinese Cordyceps. This study revealed an unexpectedly high diversity of fungal communities inhabiting the Chinese Cordyceps and its microhabitats.

  4. High-throughput sequence-based analysis of the bacterial composition of kefir and an associated kefir grain.

    Science.gov (United States)

    Dobson, Alleson; O'Sullivan, Orla; Cotter, Paul D; Ross, Paul; Hill, Colin

    2011-07-01

    Lacticin 3147 is a two-peptide broad-spectrum lantibiotic produced by Lactococcus lactis DPC3147 shown to inhibit a number of clinically relevant Gram-positive pathogens. Initially isolated from an Irish kefir grain, lacticin 3147 is one of the most extensively studied lantibiotics to date. In this study, the bacterial diversity of the Irish kefir grain from which L. lactis DPC3147 was originally isolated was investigated for the first time using a high-throughput parallel sequencing strategy. A total of 17 416 unique V4 variable regions of the 16S rRNA gene were analysed from both the kefir starter grain and its derivative kefir-fermented milk. Firmicutes (which includes the lactic acid bacteria) was the dominant phylum, accounting for > 92% of sequences. Within the Firmicutes, dramatic differences in abundance were observed when the starter grain and kefir milk fermentate were compared. The kefir grain-associated bacterial community was largely composed of the Lactobacillaceae family, while Streptococcaceae (primarily Lactococcus spp.) was the dominant family within the kefir milk fermentate. Sequencing data confirmed previous findings that the microbiota of kefir milk and the starter grain are quite different, while also establishing that the microbial diversity of the starter grain is not uniform, with a greater level of diversity associated with the interior of the kefir starter grain compared with the exterior. © 2011 Teagasc Food Research Centre, Moorepark. FEMS Microbiology Letters © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd.

  5. Taxonomic analysis of the microbial community in stored sugar beets using high-throughput sequencing of different marker genes.

    Science.gov (United States)

    Liebe, Sebastian; Wibberg, Daniel; Winkler, Anika; Pühler, Alfred; Schlüter, Andreas; Varrelmann, Mark

    2016-02-01

    Post-harvest colonization of sugar beets accompanied by rot development is a serious problem due to sugar losses and negative impacts on processing quality. Studies on the microbial community associated with rot development and the factors shaping its structure are missing. Therefore, high-throughput sequencing was applied to describe the influence of environment, plant genotype, and storage temperature (8°C and 20°C) on three different communities in stored sugar beets, namely fungi (internal transcribed spacers 1 and 2), Fusarium spp. (elongation factor-1α gene fragment), and oomycetes (internal transcribed spacer 1). The composition of the fungal community changed during storage, influenced mostly by the storage temperature followed by a weak environmental effect. Botrytis cinerea was the prevalent species at 8°C, whereas members of the fungal genera Fusarium and Penicillium became dominant at 20°C. This shift was independent of the plant genotype. Species richness within the genus Fusarium also increased during storage at both temperatures, whereas the oomycete community did not change. Moreover, oomycete species were absent after storage at 20°C. The results of the present study clearly show that rot development during sugar beet storage is associated with pathogens well known as causal agents of post-harvest diseases in many other crops.

  6. High-throughput DNA sequence analysis reveals stable engraftment of gut microbiota following transplantation of previously frozen fecal bacteria.

    Science.gov (United States)

    Hamilton, Matthew J; Weingarden, Alexa R; Unno, Tatsuya; Khoruts, Alexander; Sadowsky, Michael J

    2013-01-01

    Fecal microbiota transplantation (FMT) is becoming a more widely used technology for treatment of recurrent Clostridium difficile infection (CDI). While previous treatments used fresh fecal slurries as a source of microbiota for FMT, we recently reported the successful use of standardized, partially purified, and frozen fecal microbiota to treat CDI. Here we report that high-throughput 16S rRNA gene sequencing showed stable engraftment of gut microbiota following FMT using frozen fecal bacteria from a healthy donor. Similar bacterial taxa were found in post-transplantation samples obtained from the recipients and in donor samples, but the relative abundance varied considerably between patients and time points. Post-FMT samples from patients showed an increase in the abundance of Firmicutes and Bacteroidetes, representing 75-80% of the total sequence reads, while Proteobacteria and Actinobacteria were less abundant. These results indicate that previously frozen fecal microbiota from a healthy donor can be used to effectively treat recurrent CDI, resulting in restoration of the structure of the gut microbiota and clearance of Clostridium difficile.

  7. YODA: Software to facilitate high-throughput analysis of chronological life span, growth rate, and survival in budding yeast

    Directory of Open Access Journals (Sweden)

    Murakami Christopher J

    2010-03-01

    Full Text Available Abstract Background The budding yeast Saccharomyces cerevisiae is one of the most widely studied model organisms in aging-related science. Although several genetic modifiers of yeast longevity have been identified, the utility of this system for longevity studies has been limited by a lack of high-throughput assays for quantitatively measuring survival of individual yeast cells during aging. Results Here we describe the Yeast Outgrowth Data Analyzer (YODA), an automated system for analyzing population survival of yeast cells based on the kinetics of outgrowth measured by optical density over time. YODA has been designed specifically for quantification of yeast chronological life span, but can also be used to quantify growth rate and survival of yeast cells in response to a variety of different conditions, including temperature, nutritional composition of the growth media, and chemical treatments. YODA is optimized for use with a Bioscreen C MBR shaker/incubator/plate reader, but is also amenable to use with any standard plate reader or spectrophotometer. Conclusions We estimate that use of YODA as described here reduces the effort and resources required to measure chronological life span and analyze the resulting data by at least 15-fold.
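
    Survival measurement from outgrowth kinetics rests on a simple observation: a culture with fewer viable cells reaches a given optical density later, and a delay of one doubling time corresponds to a two-fold drop in viability. A minimal sketch with synthetic outgrowth curves follows; the threshold, doubling time, and curves are illustrative assumptions, not YODA's actual implementation.

```python
def time_to_threshold(times, ods, threshold=0.3):
    """First time at which the outgrowth curve crosses the OD
    threshold (linear interpolation between samples)."""
    for (t0, od0), (t1, od1) in zip(zip(times, ods), zip(times[1:], ods[1:])):
        if od0 < threshold <= od1:
            return t0 + (t1 - t0) * (threshold - od0) / (od1 - od0)
    raise ValueError("curve never reaches threshold")

def survival_fraction(delay_h, doubling_time_h):
    """Half the viable cells means the threshold is reached one
    doubling time later, so survival = 2**(-delay / doubling_time)."""
    return 2.0 ** (-delay_h / doubling_time_h)

# Synthetic exponential outgrowth: the aged culture starts with 4x
# fewer viable cells, so its curve is shifted right in time.
times = [i * 0.5 for i in range(40)]             # hours
young = [0.01 * 2 ** (t / 1.5) for t in times]   # doubling time 1.5 h
aged = [0.0025 * 2 ** (t / 1.5) for t in times]  # 4x fewer viable cells

delay = time_to_threshold(times, aged) - time_to_threshold(times, young)
print(round(survival_fraction(delay, 1.5), 2))  # ~0.25
```

    The delay comes out at two doubling times, so the inferred survival fraction is 0.25, matching the 4x reduction built into the synthetic data.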

  8. Automated, high-throughput, motility analysis in Caenorhabditis elegans and parasitic nematodes: Applications in the search for new anthelmintics

    Directory of Open Access Journals (Sweden)

    Steven D. Buckingham

    2014-12-01

    Full Text Available The scale of the damage worldwide to human health, animal health, and agricultural crops resulting from parasitic nematodes, together with the paucity of treatments and the threat of developing resistance to the limited set of widely deployed chemical tools, underlines the urgent need to develop novel drugs and chemicals to control nematode parasites. Robust chemical screens that can be automated are a key part of that discovery process. Hitherto, the automated analysis of nematode behaviours has been a bottleneck in the chemical discovery process. As the measurement of nematode motility can provide a direct scalar readout of the activity of the neuromuscular system and an indirect measure of the health of the animal, this omission is acute. Motility offers a useful assay for high-throughput, phenotypic drug/chemical screening, and several recent developments have helped realise, at least in part, the potential of nematode-based drug screening. Here we review the challenges encountered in automating nematode motility and some important developments in the application of machine vision, statistical imaging, and tracking approaches that enable the automated characterisation of nematode movement. Such developments facilitate automated screening for new drugs and chemicals aimed at controlling human and animal nematode parasites (anthelmintics) and plant nematode parasites (nematicides).
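
    A machine-vision motility readout of the sort discussed above can be as simple as the mean frame-to-frame pixel change in a well. The sketch below uses tiny synthetic 1-D "frames" as stand-ins for real well images; the function and data are illustrative only.

```python
def motility_index(frames):
    """Mean absolute pixel difference between consecutive frames:
    a scalar readout of movement, near zero for a paralysed animal."""
    diffs = []
    for prev, cur in zip(frames, frames[1:]):
        d = sum(abs(a - b) for a, b in zip(prev, cur)) / len(prev)
        diffs.append(d)
    return sum(diffs) / len(diffs)

def frame(pos, width=16):
    """Synthetic 1-D frame: a bright 'worm' pixel on a dark background."""
    return [255 if i == pos else 0 for i in range(width)]

moving = [frame(p) for p in range(5)]   # worm shifts one pixel per frame
still = [frame(3) for _ in range(5)]    # worm never moves
print(motility_index(moving), motility_index(still))
```

    In a screen, wells whose index collapses toward zero after compound exposure would be flagged as candidate anthelmintic hits.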

  9. Building blocks for the development of an interface for high-throughput thin layer chromatography/ambient mass spectrometric analysis: a green methodology.

    Science.gov (United States)

    Cheng, Sy-Chyi; Huang, Min-Zong; Wu, Li-Chieh; Chou, Chih-Chiang; Cheng, Chu-Nian; Jhang, Siou-Sian; Shiea, Jentaie

    2012-07-17

    Interfacing thin layer chromatography (TLC) with ambient mass spectrometry (AMS) has been an important area of analytical chemistry because of its capability to rapidly separate and characterize the chemical compounds. In this study, we have developed a high-throughput TLC-AMS system using building blocks to deal, deliver, and collect the TLC plate through an electrospray-assisted laser desorption ionization (ELDI) source. This is the first demonstration of the use of building blocks to construct and test the TLC-MS interfacing system. With the advantages of being readily available, cheap, reusable, and extremely easy to modify without consuming any material or reagent, the use of building blocks to develop the TLC-AMS interface is undoubtedly a green methodology. The TLC plate delivery system consists of a storage box, plate dealing component, conveyer, light sensor, and plate collecting box. During a TLC-AMS analysis, the TLC plate was sent to the conveyer from a stack of TLC plates placed in the storage box. As the TLC plate passed through the ELDI source, the chemical compounds separated on the plate would be desorbed by laser desorption and subsequently postionized by electrospray ionization. The samples, including a mixture of synthetic dyes and extracts of pharmaceutical drugs, were analyzed to demonstrate the capability of this TLC-ELDI/MS system for high-throughput analysis.

  10. High-Throughput and Low-Cost Analysis of Trace Volatile Phthalates in Seafood by Online Coupling of Monolithic Capillary Adsorbent with GC-MS.

    Science.gov (United States)

    Insuan, Wimonrut; Khawmodjod, Phatchara; Whitlow, Harry J; Soonthondecha, Peerapong; Malem, Fairda; Chienthavorn, Orapin

    2016-04-27

    A simple, sensitive, and high-throughput method was developed for the determination of six volatile phthalate esters (dimethyl phthalate (DMP), diethyl phthalate (DEP), dibutyl phthalate (DBP), benzylbutyl phthalate (BBP), di(2-ethylhexyl) phthalate (DEHP), and di-n-octyl phthalate (DnOP)) in seafood samples by using a monolithic capillary adsorbent coupled online to a gas chromatography-mass spectrometry (GC-MS) system. The freeze-dried samples were subjected to ultrasonication with hexane, followed by vortex mixing. The liquid extract was quantitatively determined by direct application to the online silica monolith capillary adsorbent coupled with a gas chromatograph with mass spectrometric detection. Method validation in seafood matrix gave recoveries of 72.8-85.4% and a detection limit of 6.8-10.0 ng g⁻¹ for bivalve samples. The monolith capillary could be reused up to six times for trapping coextracted matrix, allowing high-throughput analysis at the parts-per-billion level. When compared with the Food and Environment Research Agency (FERA) method, no significant difference in the results was observed, confirming that the method is valid and applicable for the routine analysis of phthalates in seafood samples in food and environmental laboratories.

  11. Direct analysis in real time - high resolution mass spectrometry (DART-HRMS): a high throughput strategy for identification and quantification of anabolic steroid esters.

    Science.gov (United States)

    Doué, Mickael; Dervilly-Pinel, Gaud; Pouponneau, Karinne; Monteau, Fabrice; Le Bizec, Bruno

    2015-07-01

    High-throughput screening is essential for doping, forensic, and food safety laboratories. While hyphenated chromatography-mass spectrometry (MS) remains the approach of choice, recent ambient MS techniques, such as direct analysis in real time (DART), offer more rapid and more versatile strategies and are thus gaining in popularity. In this study, the potential of DART hyphenated with Orbitrap-MS for fast identification and quantification of 21 anabolic steroid esters has been evaluated. Direct analysis in high-resolution scan mode allowed steroid ester screening by accurate mass measurement (resolution = 60 000, with low mass error). Linearity (R² > 0.99), dynamic range (from 1 to 1000 ng mL⁻¹), bias (<10%), sensitivity (1 ng mL⁻¹), and repeatability and reproducibility (RSD < 20%) were evaluated as similar to those obtained with hyphenated chromatography-mass spectrometry techniques. This innovative high-throughput approach was successfully applied to the characterization of oily commercial preparations, and thus fits the needs of the competent authorities in the fight against forbidden or counterfeited substances.

  12. Correcting misdiagnoses of asthma: a cost effectiveness analysis

    Directory of Open Access Journals (Sweden)

    Vandemheen Katherine

    2011-05-01

    Full Text Available Abstract Background The prevalence of physician-diagnosed asthma has risen over the past three decades, and misdiagnosis of asthma is potentially common. Objective: To determine whether a secondary screening program to establish a correct diagnosis of asthma in those who report a physician diagnosis of asthma is cost effective. Method Randomly selected subjects with physician-diagnosed asthma from 8 Canadian cities were studied with an extensive diagnostic algorithm to rule in, or rule out, a correct diagnosis of asthma. Subjects in whom the diagnosis of asthma was excluded were followed up for 6 months, and data on asthma medications and health care utilization were obtained. Economic analysis was performed to estimate the incremental lifetime costs associated with secondary screening of previously diagnosed asthmatic subjects. Analysis was from the perspective of the Canadian healthcare system and is reported in Canadian dollars. Results Of 540 randomly selected patients with physician-diagnosed asthma, 150 (28%; 95% CI 19-37%) did not have asthma when objectively studied. 71% of these misdiagnosed patients were on some asthma medications. Incorporating the incremental cost of secondary screening for the diagnosis of asthma, we found that the average cost savings per 100 individuals screened was $35,141 (95% CI $4,588-$69,278). Conclusion Cost savings primarily resulted from lifetime costs of medication use averted in those who had been misdiagnosed.

  13. Temporal dynamics of soil microbial communities under different moisture regimes: high-throughput sequencing and bioinformatics analysis

    Science.gov (United States)

    Semenov, Mikhail; Zhuravleva, Anna; Semenov, Vyacheslav; Yevdokimov, Ilya; Larionova, Alla

    2017-04-01

    Recent climate scenarios predict not only continued global warming but also an increased frequency and intensity of extreme climatic events, such as strong changes in temperature and precipitation regimes. Microorganisms are well known to respond to changes in environmental conditions more sensitively than soil chemical and physical parameters do. In this study, we determined the shifts in soil microbial community structure, as well as indicative taxa, in soils under three moisture regimes using high-throughput Illumina sequencing and a range of bioinformatics approaches for the assessment of sequence data. Incubation experiments were performed in soil-filled (Greyic Phaeozems Albic) rhizoboxes with maize and without plants. Three contrasting moisture regimes were simulated: 1) optimal wetting (OW), watering 2-3 times per week to maintain soil moisture at 20-25% by weight; 2) periodic wetting (PW), with alternating periods of wetting and drought; and 3) constant insufficient wetting (IW), in which soil moisture of 12% by weight was permanently maintained. Sampled fresh soils were homogenized, and the total DNA of three replicates was extracted using the FastDNA® SPIN kit for Soil. DNA replicates were combined in a pooled sample, and the pooled DNA was used for PCR with specific primers for the 16S V3 and V4 regions. In order to compare variability between different samples and between replicates within a single sample, some DNA replicates were also treated separately. The products were purified and submitted to Illumina MiSeq sequencing. Sequence data were evaluated by alpha-diversity (Chao1 and Shannon H' diversity indices), beta-diversity (UniFrac and Bray-Curtis dissimilarity), heatmap, tagcloud, and bar-plot analyses using the MiSeq Reporter Metagenomics Workflow and R packages (phyloseq, vegan, tagcloud). The Shannon index varied in a rather narrow range (4.4-4.9), with the lowest values for microbial communities under the PW treatment. The Chao1 index varied from 385 to 480, being a more flexible
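    The two alpha-diversity indices reported above have simple closed forms. As a minimal sketch (toy OTU counts, not the study's data), Shannon H' and Chao1 can be computed from an abundance vector like this:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over OTU relative abundances."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def chao1_index(counts):
    """Chao1 richness estimate: S_obs + F1*(F1-1) / (2*(F2+1)),
    where F1 and F2 are the numbers of singleton and doubleton OTUs."""
    s_obs = sum(1 for c in counts if c > 0)
    f1 = sum(1 for c in counts if c == 1)
    f2 = sum(1 for c in counts if c == 2)
    return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

otu_counts = [120, 80, 40, 10, 5, 1, 1, 1, 2, 2]  # toy OTU abundance table
print(round(shannon_index(otu_counts), 3))
print(round(chao1_index(otu_counts), 3))  # -> 11.0 for this toy table
```

    Chao1 adds a correction for unseen rare taxa based on singleton and doubleton counts, which is why it reacts more strongly to rare OTUs than Shannon H' does.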

  14. The logic of EGFR/ErbB signaling: theoretical properties and analysis of high-throughput data.

    Science.gov (United States)

    Samaga, Regina; Saez-Rodriguez, Julio; Alexopoulos, Leonidas G; Sorger, Peter K; Klamt, Steffen

    2009-08-01

    The epidermal growth factor receptor (EGFR) signaling pathway is probably the best-studied receptor system in mammalian cells, and it has also become a popular example for applying mathematical modeling to cellular signaling networks. Dynamic models have the highest explanatory and predictive potential; however, the lack of kinetic information restricts current models of EGFR signaling to smaller sub-networks. This work aims to provide a large-scale qualitative model that comprises the main and also the side routes of EGFR/ErbB signaling and that still enables one to derive important functional properties and predictions. Using a recently introduced logical modeling framework, we first examined general topological properties and the qualitative stimulus-response behavior of the network. With species equivalence classes, we introduce a new technique for logical networks that reveals sets of nodes strongly coupled in their behavior. We also analyzed a model variant which explicitly accounts for uncertainties regarding the logical combination of signals in the model. The predictive power of this model is still high, indicating highly redundant sub-structures in the network. Finally, one key advance of this work is the introduction of new techniques for assessing high-throughput data with logical models (and their underlying interaction graph). By employing these techniques for phospho-proteomic data from primary hepatocytes and the HepG2 cell line, we demonstrate that our approach enables one to uncover inconsistencies between experimental results and our current qualitative knowledge and to generate new hypotheses and conclusions. Our results strongly suggest that the Rac/Cdc42-induced p38 and JNK cascades are independent of PI3K in both primary hepatocytes and HepG2 cells. Furthermore, we detected that the activation of JNK in response to neuregulin follows a PI3K-dependent signaling pathway.
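    The qualitative stimulus-response analysis described above evaluates a logical (Boolean) network without kinetic parameters. A minimal sketch, using a toy wiring that is NOT the published EGFR/ErbB model (all node names and gates here are illustrative assumptions):

```python
# Hypothetical miniature logical model: each node is a Boolean function of its
# inputs, updated synchronously until the network state stops changing.
def run_to_fixed_point(rules, state, max_steps=100):
    """Synchronously update all nodes until a fixed point is reached."""
    for _ in range(max_steps):
        new_state = {node: rule(state) for node, rule in rules.items()}
        if new_state == state:
            return state
        state = new_state
    raise RuntimeError("no fixed point reached (oscillation?)")

# Toy EGFR-like wiring (illustrative only, not the paper's model):
rules = {
    "EGF":  lambda s: s["EGF"],                     # input, held constant
    "EGFR": lambda s: s["EGF"],
    "PI3K": lambda s: s["EGFR"],
    "ERK":  lambda s: s["EGFR"],
    "JNK":  lambda s: s["EGFR"] and not s["PI3K"],  # an uncertain logic gate
}

start = {"EGF": True, "EGFR": False, "PI3K": False, "ERK": False, "JNK": False}
print(run_to_fixed_point(rules, start))  # ERK on, JNK off at steady state
```

    Synchronous updating to a fixed point is only one of several update schemes; the paper's framework additionally analyzes the underlying interaction graph and uncertain gates, which this sketch does not cover.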

  15. A 96-well screen filter plate for high-throughput biological sample preparation and LC-MS/MS analysis.

    Science.gov (United States)

    Peng, Sean X; Cousineau, Martin; Juzwin, Stephen J; Ritchie, David M

    2006-01-01

    A novel 96-well screen filter plate (patent pending) has been invented to eliminate a time-consuming and labor-intensive step in preparation of in vivo study samples--to remove blood or plasma clots. These clots plug the pipet tips during a manual or automated sample-transfer step causing inaccurate pipetting or total pipetting failure. Traditionally, these blood and plasma clots are removed by picking them out manually one by one from each sample tube before any sample transfer can be made. This has significantly slowed the sample preparation process and has become a bottleneck for automated high-throughput sample preparation using robotic liquid handlers. Our novel screen filter plate was developed to solve this problem. The 96-well screen filter plate consists of 96 stainless steel wire-mesh screen tubes connected to the 96 openings of a top plate so that the screen filter plate can be readily inserted into a 96-well sample storage plate. Upon insertion, the blood and plasma clots are excluded from entering the screen tube while clear sample solutions flow freely into it. In this way, sample transfer can be easily completed by either manual or automated pipetting methods. In this report, three structurally diverse compounds were selected to evaluate and validate the use of the screen filter plate. The plasma samples of these compounds were transferred and processed in the presence and absence of the screen filter plate and then analyzed by LC-MS/MS methods. Our results showed a good agreement between the samples prepared with and without the screen filter plate, demonstrating the utility and efficiency of this novel device for preparation of blood and plasma samples. The device is simple, easy to use, and reusable. It can be employed for sample preparation of other biological fluids that contain floating particulates or aggregates.

  16. Analysis of Saccharina japonica transcriptome using the high-throughput DNA sequencing technique and its vanadium-dependent haloperoxidase gene

    Institute of Scientific and Technical Information of China (English)

    LIANG Xiayuan; WANG Xumin; CHI Shan; WU Shuangxiu; SUN Jing; LIU Cui; CHEN Shengping; YU Jun; LIU Tao

    2014-01-01

    Saccharina is one of the most important cold-water marine brown algal genera. In this study we analyzed the transcriptome of S. japonica, sequenced as part of the 1 000 Plants (OneKP) Project, by using a next-generation high-throughput DNA sequencing technique. About 5.16 GB of raw data were generated, and 65 536 scaffolds with an average length of 454 bp were assembled with the SOAPdenovo assembly method. In total, 19 040 unigenes were identified by BLAST; 25 734 scaffolds were clustered into 37 Gene Ontology functional groups; 6 760 scaffolds were classified into 25 COG categories; and 2 665 scaffolds were assigned to 306 KEGG pathways. The majority of the unigenes exhibited more similarity to algae, including brown algae and diatoms, than to cyanobacteria or plants. Saccharina japonica has an outstanding capability to accumulate halogens such as Br and I from seawater via halogenation processes. We identified 42 different vanadium-dependent haloperoxidases (vHPO) in the S. japonica transcriptome data, including 5 segments of vanadium-dependent iodoperoxidase (vIPO) and 37 segments of vanadium-dependent bromoperoxidase (vBPO). Detailed analyses of the identified full-length S. japonica vBPO1 and S. japonica vBPO2 revealed the importance of vBPO among species of brown algae and the strong relationship between marine algal vBPOs and vIPOs. This study will enhance our understanding of the biological characteristics and economic value of S. japonica.

  17. An UPLC-MS/MS method for highly sensitive high-throughput analysis of phytohormones in plant tissues

    Directory of Open Access Journals (Sweden)

    Balcke Gerd Ulrich

    2012-11-01

    Full Text Available Abstract Background Phytohormones are the key metabolites participating in the regulation of multiple functions of the plant organism. Among them, jasmonates, as well as abscisic and salicylic acids, are responsible for triggering and modulating plant reactions targeted against pathogens and herbivores, as well as resistance to abiotic stress (drought, UV-irradiation and mechanical wounding). These factors induce dramatic changes in phytohormone biosynthesis and transport, leading to rapid local and systemic stress responses. Understanding of the underlying mechanisms is of principal interest for scientists working in various areas of plant biology. However, highly sensitive, precise and high-throughput methods for quantification of these phytohormones in small samples of plant tissues are still missing. Results Here we present an LC-MS/MS method for fast and highly sensitive determination of jasmonates, abscisic and salicylic acids. A single-step sample preparation procedure based on mixed-mode solid phase extraction was efficiently combined with essential improvements in mobile phase composition, yielding higher efficiency of chromatographic separation and MS-sensitivity. This strategy resulted in a dramatic increase in overall sensitivity, allowing successful determination of phytohormones in small (less than 50 mg of fresh weight) tissue samples. The method was fully validated in terms of analyte recovery, sensitivity, linearity and precision. Additionally, it was cross-validated with a well-established GC-MS-based procedure and its applicability to a variety of plant species and organs was verified. Conclusion The method can be applied for the analysis of target phytohormones in small tissue samples obtained from any plant species and/or plant part, relying on any commercially available (even less sensitive) tandem mass spectrometry instrumentation.

  18. High-throughput pseudovirion-based neutralization assay for analysis of natural and vaccine-induced antibodies against human papillomaviruses.

    Directory of Open Access Journals (Sweden)

    Peter Sehr

    Full Text Available A highly sensitive, automated, purely add-on, high-throughput pseudovirion-based neutralization assay (HT-PBNA) with excellent repeatability and run-to-run reproducibility was developed for human papillomavirus types (HPV) 16, 18, 31, 45, 52, 58 and bovine papillomavirus type 1. Preparation of 384-well assay plates with serially diluted sera and the actual cell-based assay are separated in time; therefore, batches of up to one hundred assay plates can be processed sequentially. A mean coefficient of variation (CV) of 13% was obtained for anti-HPV 16 and HPV 18 titers for a standard serum tested in a total of 58 repeats on individual plates in seven independent runs. Natural antibody response was analyzed in 35 sera from patients with HPV 16 DNA-positive cervical intraepithelial neoplasia grade 2+ lesions. The new HT-PBNA is based on Gaussia luciferase, with increased sensitivity compared to the previously described manual PBNA (manPBNA) based on secreted alkaline phosphatase as reporter. Titers obtained with the HT-PBNA were generally higher than titers obtained with the manPBNA. A good linear correlation (R(2) = 0.7) was found between HT-PBNA titers and anti-HPV 16 L1 antibody levels determined by a Luminex bead-based GST-capture assay for these 35 sera, with a Kappa value of 0.72 and only 3 discordant sera in the low-titer range. In addition to natural low-titer antibody responses, the high sensitivity of the HT-PBNA also allows detection of cross-neutralizing antibodies induced by commercial HPV L1 vaccines and experimental L2 vaccines. When analyzing the WHO international standards for HPV 16 and 18, we determined an analytical sensitivity of 0.864 and 1.105 mIU, respectively.

  19. The logic of EGFR/ErbB signaling: theoretical properties and analysis of high-throughput data.

    Directory of Open Access Journals (Sweden)

    Regina Samaga

    2009-08-01

    Full Text Available The epidermal growth factor receptor (EGFR) signaling pathway is probably the best-studied receptor system in mammalian cells, and it has also become a popular example for applying mathematical modeling to cellular signaling networks. Dynamic models have the highest explanatory and predictive potential; however, the lack of kinetic information restricts current models of EGFR signaling to smaller sub-networks. This work aims to provide a large-scale qualitative model that comprises the main and also the side routes of EGFR/ErbB signaling and that still enables one to derive important functional properties and predictions. Using a recently introduced logical modeling framework, we first examined general topological properties and the qualitative stimulus-response behavior of the network. With species equivalence classes, we introduce a new technique for logical networks that reveals sets of nodes strongly coupled in their behavior. We also analyzed a model variant which explicitly accounts for uncertainties regarding the logical combination of signals in the model. The predictive power of this model is still high, indicating highly redundant sub-structures in the network. Finally, one key advance of this work is the introduction of new techniques for assessing high-throughput data with logical models (and their underlying interaction graph). By employing these techniques for phospho-proteomic data from primary hepatocytes and the HepG2 cell line, we demonstrate that our approach enables one to uncover inconsistencies between experimental results and our current qualitative knowledge and to generate new hypotheses and conclusions. Our results strongly suggest that the Rac/Cdc42-induced p38 and JNK cascades are independent of PI3K in both primary hepatocytes and HepG2 cells. Furthermore, we detected that the activation of JNK in response to neuregulin follows a PI3K-dependent signaling pathway.

  20. Cost-Effectiveness Analysis in Practice: Interventions to Improve High School Completion

    Science.gov (United States)

    Hollands, Fiona; Bowden, A. Brooks; Belfield, Clive; Levin, Henry M.; Cheng, Henan; Shand, Robert; Pan, Yilin; Hanisch-Cerda, Barbara

    2014-01-01

    In this article, we perform cost-effectiveness analysis on interventions that improve the rate of high school completion. Using the What Works Clearinghouse to select effective interventions, we calculate cost-effectiveness ratios for five youth interventions. We document wide variation in cost-effectiveness ratios between programs and between…
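    A cost-effectiveness ratio of this kind is simply the incremental cost divided by the incremental effect (here, additional graduates). A minimal sketch with hypothetical program figures (illustrative only, not the What Works Clearinghouse estimates):

```python
def cost_effectiveness_ratio(total_cost, extra_completers):
    """Cost per additional high school completion produced by an intervention."""
    return total_cost / extra_completers

# Hypothetical program figures: (total cost in $, additional graduates)
programs = {
    "Program A": (500_000, 40),
    "Program B": (250_000, 10),
}
for name, (cost, grads) in programs.items():
    print(name, cost_effectiveness_ratio(cost, grads))
# Program A -> 12500.0 $/graduate, Program B -> 25000.0 $/graduate
```

    The wide variation the authors document corresponds to exactly this ratio differing severalfold between programs with similar total budgets.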

  1. Leveraging the power of high performance computing for next generation sequencing data analysis: tricks and twists from a high throughput exome workflow.

    Science.gov (United States)

    Kawalia, Amit; Motameny, Susanne; Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in a rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files.

  2. Leveraging the power of high performance computing for next generation sequencing data analysis: tricks and twists from a high throughput exome workflow.

    Directory of Open Access Journals (Sweden)

    Amit Kawalia

    Full Text Available Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in a rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files.

  3. Renal transplantation vs hemodialysis: Cost-effectiveness analysis

    Directory of Open Access Journals (Sweden)

    Perović Saša

    2009-01-01

    Full Text Available Background/Aim. Chronic renal insufficiency (CRI), diabetes, hypertension, and autosomal dominant polycystic kidney disease (ADPKD) are the main reasons for starting dialysis treatment in patients with kidney failure. At present, dialysis treatments are performed in about 4,100 patients at 46 institutions in Serbia, of which 90% are hemodialyses. In end-stage renal disease (ESRD), the only definitive option is kidney transplantation. The basic aim of this research was to compare the ratio of costs and effects (cost-effectiveness analysis, CEA) of hemodialysis and kidney transplantation in patients with ESRD. Methods. As the main outcome of treatment in patients from both groups, quality of life measured by the validated McGill Questionnaire was used. The study included 150 patients in total, divided into two groups. The study group consisted of 50 patients with kidney transplantation performed at the Clinical Center of Serbia, and the control group consisted of 100 patients on hemodialysis at the Clinical Center of Serbia, Clinical Hospital Center Zemun, Clinical Hospital Center 'Zvezdara', Clinical Center Kragujevac and Health Center 'Studenica', Kraljevo, comparable with the study group with respect to sex, age and length of treatment. Results. The effect of kidney transplantation relative to hemodialysis as the selected treatment is expressed as an incremental ratio of costs and effects (incremental cost-effectiveness ratio, ICER). The strategy of kidney transplantation is far more profitable, representing a saving of EUR 132,256.25 per quality-adjusted life year (QALY) gained over a period of 10 years. According to all aspects of quality of life (physical symptoms and problems, physical well-being, psychological symptoms, existential well-being and support), the difference is statistically significant in favor of transplant patients. Conclusion. The costs
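    The ICER used above compares the two strategies as extra cost per extra QALY. A minimal sketch with hypothetical figures (not the study's data); note that a strategy that is both cheaper and more effective, as transplantation is reported to be here, yields a negative ICER, i.e. a saving per QALY gained:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY.
    Negative with higher QALYs means the new strategy dominates
    (cheaper AND more effective)."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical 10-year figures (illustrative only, not the study's numbers):
cost_tx, qaly_tx = 80_000.0, 8.0      # transplantation
cost_hd, qaly_hd = 300_000.0, 6.0     # hemodialysis
print(icer(cost_tx, qaly_tx, cost_hd, qaly_hd))  # -> -110000.0 (saving per QALY)
```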

  4. Cost-Effectiveness Analysis of Infrapopliteal Drug-Eluting Stents

    Energy Technology Data Exchange (ETDEWEB)

    Katsanos, Konstantinos, E-mail: katsanos@med.upatras.gr; Karnabatidis, Dimitris; Diamantopoulos, Athanasios; Spiliopoulos, Stavros; Siablis, Dimitris [Patras University Hospital, Department of Interventional Radiology, School of Medicine (Greece)

    2013-02-15

    Introduction: There are no cost-utility data about below-the-knee placement of drug-eluting stents. The authors determined the cost-effectiveness of infrapopliteal drug-eluting stents for critical limb ischemia (CLI) treatment. The event-free individual survival outcomes, defined by the absence of any major events, including death, major amputation, and target limb repeat procedures, were reconstructed on the basis of two published infrapopliteal series. The first included spot bail-out use of sirolimus-eluting stents versus bare metal stents after suboptimal balloon angioplasty (Bail-out SES). The second was full-lesion primary everolimus-eluting stenting versus plain balloon angioplasty with bail-out bare metal stenting as necessary (Primary EES). The number-needed-to-treat (NNT) to avoid one major event and incremental cost-effectiveness ratios (ICERs) were calculated for a 3-year postprocedural period for both strategies. Overall event-free survival was significantly improved in both strategies (hazard ratio (HR) [confidence interval (CI)]: 0.68 [0.41-1.12] in Bail-out SES and HR [CI]: 0.53 [0.29-0.99] in Primary EES). Event-free survival gain per patient was 0.89 (range, 0.11-3.0) years in Bail-out SES, with an NNT of 4.6 (CI: 2.5-25.6) and a corresponding ICER of €6,518 (range, €1,685-10,112). Survival gain was 0.91 (range, 0.25-3.0) years in Primary EES, with an NNT of 2.7 (CI: 1.7-5.8) and an ICER of €11,581 (range, €4,945-21,428) per event-free life-year gained. Two-way sensitivity analysis showed that stented lesion length >10 cm and/or a DES list price >€1,000 were associated with the least economically favorable scenario in both strategies. Both strategies of bail-out SES and primary EES placement in the infrapopliteal arteries for CLI treatment exhibit single-digit NNTs and relatively low corresponding ICERs.
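    The NNT and the ICER per event-free life-year reported above can be sketched as follows, with hypothetical event rates and costs (illustrative assumptions, not the study's raw data):

```python
def number_needed_to_treat(event_rate_control, event_rate_treated):
    """NNT = 1 / absolute risk reduction: patients treated per event avoided."""
    arr = event_rate_control - event_rate_treated
    return 1.0 / arr

def icer_per_event_free_year(extra_cost_per_patient, survival_gain_years):
    """Incremental cost per event-free life-year gained."""
    return extra_cost_per_patient / survival_gain_years

# Hypothetical 3-year major-event rates and costs (illustrative only):
nnt = number_needed_to_treat(0.60, 0.38)
print(round(nnt, 1))                           # ~4.5 patients per event avoided
print(icer_per_event_free_year(5_800.0, 0.89)) # euros per event-free year gained
```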

  5. Analysis of nuclear organization with TANGO, software for high-throughput quantitative analysis of 3D fluorescence microscopy images.

    Science.gov (United States)

    Ollion, Jean; Cochennec, Julien; Loll, François; Escudé, Christophe; Boudier, Thomas

    2015-01-01

    The cell nucleus is a highly organized cellular organelle that contains the genome. An important step toward understanding the relationships between genome positioning and genome function is to extract quantitative data from three-dimensional (3D) fluorescence imaging. However, such approaches are limited by the requirement for processing and analyzing large sets of images. Here we present a practical approach using TANGO (Tools for Analysis of Nuclear Genome Organization), an image analysis tool dedicated to the study of nuclear architecture. TANGO is a generic tool able to process large sets of images, allowing quantitative study of nuclear organization. In this chapter, a practical description of the software is given in order to provide an overview of its different concepts and functionalities. This description is illustrated with a precise example that can be performed step-by-step on experimental data provided on the website http://biophysique.mnhn.fr/tango/HomePage.

  6. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High-throughput modifications have been developed during recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High-throughput methods save time and money, but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high-throughput modifications now available vary widely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  7. Hospitalization for pelvic inflammatory disease: a cost-effectiveness analysis.

    Science.gov (United States)

    Smith, Kenneth J; Ness, Roberta B; Roberts, Mark S

    2007-02-01

    Nulliparous women are frequently hospitalized for treatment of pelvic inflammatory disease (PID). The goal of this study was to determine the economic feasibility of hospitalizing adolescents and young women for PID. The authors constructed a Markov decision model estimating the cost-effectiveness of hospitalization compared with outpatient therapy for mild to moderate PID in adolescents and young women, calculating costs per quality-adjusted life-year (QALY) gained under various assumptions about the effects of hospitalization on complications. If hospitalization decreases PID complications by 10%, 20%, or 30%, the cost/QALY gained is $145,000, $67,400, or $42,400, respectively, compared with outpatient therapy. Assumptions about the effects of hospitalization on the development of chronic pelvic pain heavily weight the analysis; costs/QALY gained by hospitalization increase considerably if chronic pain is unaffected. Hospitalization for PID treatment to possibly preserve fertility in nulliparous young women and adolescents is unlikely to be economically reasonable even if substantial improvements in PID complication rates are assumed.
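    The cost/QALY figures above come from a Markov model whose outputs are not simply proportional to the complication reduction. As a deliberately simplified linear illustration only (all numbers hypothetical, not the model's inputs), the underlying ratio is:

```python
def cost_per_qaly(extra_cost, qaly_gained):
    """Cost per quality-adjusted life-year gained by the more expensive strategy."""
    return extra_cost / qaly_gained

# Hypothetical numbers (illustrative only): added cost of inpatient care per
# patient, and QALY gains assumed proportional to complication reduction.
extra_cost_hospitalization = 1_450.0
for reduction, qaly_gain in [(0.10, 0.010), (0.20, 0.020), (0.30, 0.030)]:
    ratio = cost_per_qaly(extra_cost_hospitalization, qaly_gain)
    print(f"{reduction:.0%} complication reduction -> ${ratio:,.0f}/QALY")
```

    In the real model the chronic-pelvic-pain assumption shifts the QALY denominator, which is why the published ratios fall faster than linearly as the assumed benefit grows.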

  8. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  9. Cost-Effectiveness Analysis of Family Planning Services Offered by ...

    African Journals Online (AJOL)

    USER

    Keywords: Mobile clinics; Static clinic; Family planning; Cost-effectiveness. Résumé ... revealed surprisingly low use of mobile clinic services ... provider point of view. Cost data ..... this is an even more attractive strategy than tying free IUDs to ...

  10. Systemic cost-effectiveness analysis of food hazard reduction

    DEFF Research Database (Denmark)

    Jensen, Jørgen Dejgård; Lawson, Lartey Godwin; Lund, Mogens

    2015-01-01

    stage are considered. Cost analyses are conducted for different risk reduction targets and for three alternative scenarios concerning the acceptable range of interventions. Results demonstrate that using a system-wide policy approach to risk reduction can be more cost-effective than a policy focusing...

  11. [Cost-effectiveness analysis of professional oral hygiene].

    Science.gov (United States)

    Olesov, E E; Shaĭmieva, N I; Kononenko, V I; Bersanov, R U; Monakova, N E

    2014-01-01

    Periodontal status and oral hygiene indexes were studied in 125 young employees of the Kurchatov Institute. The dynamics of oral hygiene values were assessed after professional oral hygiene in persons with unsatisfactory oral hygiene at the baseline examination. Comparison with the same values in the absence of professional oral hygiene procedures allowed calculation of a cost-effectiveness rate for biannual professional oral hygiene.

  12. Automation of High-Throughput Mass Spectrometry-Based Plasma N-Glycome Analysis with Linkage-Specific Sialic Acid Esterification.

    Science.gov (United States)

    Bladergroen, Marco R; Reiding, Karli R; Hipgrave Ederveen, Agnes L; Vreeker, Gerda C M; Clerc, Florent; Holst, Stephanie; Bondt, Albert; Wuhrer, Manfred; van der Burgt, Yuri E M

    2015-09-04

    Glycosylation is a post-translational modification of key importance with heterogeneous structural characteristics. Previously, we have developed a robust, high-throughput MALDI-TOF-MS method for the comprehensive profiling of human plasma N-glycans. In this approach, sialic acid residues are derivatized with linkage-specificity, namely the ethylation of α2,6-linked sialic acid residues with parallel lactone formation of α2,3-linked sialic acids. In the current study, this procedure was used as a starting point for the automation of all steps on a liquid-handling robot system. This resulted in a time-efficient and fully standardized procedure with throughput times of 2.5 h for a first set of 96 samples and approximately 1 h extra for each additional sample plate. The mass analysis of the thus-obtained glycans was highly reproducible in terms of relative quantification, with improved interday repeatability as compared to that of manual processing.

  13. A high-throughput three-dimensional cell migration assay for toxicity screening with mobile device-based macroscopic image analysis

    Science.gov (United States)

    Timm, David M.; Chen, Jianbo; Sing, David; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Raphael, Robert M.; Dehghani, Mehdi; Rosenblatt, Kevin P.; Killian, T. C.; Tseng, Hubert; Souza, Glauco R.

    2013-10-01

    There is a growing demand for in vitro assays for toxicity screening in three-dimensional (3D) environments. In this study, 3D cell culture using magnetic levitation was used to create an assay in which cells were patterned into 3D rings that close over time. The rate of closure was determined from time-lapse images taken with a mobile device and related to drug concentration. Rings of human embryonic kidney cells (HEK293) and tracheal smooth muscle cells (SMCs) were tested with ibuprofen and sodium dodecyl sulfate (SDS). Ring closure correlated with the viability and migration of cells in two dimensions (2D). Images taken using a mobile device were similar in analysis to images taken with a microscope. Ring closure may serve as a promising label-free and quantitative assay for high-throughput toxicity screening in 3D cultures.
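    The closure rate mentioned above can be estimated as the slope of ring diameter versus time. A minimal least-squares sketch with toy measurements (illustrative assumptions, not the paper's data or pipeline):

```python
# Least-squares slope of ring diameter vs time gives a closure rate that can be
# compared across drug concentrations; more negative slope = faster closure.
def closure_rate(times_h, diameters_mm):
    """Slope (mm/h) of an ordinary least-squares line fit."""
    n = len(times_h)
    mean_t = sum(times_h) / n
    mean_d = sum(diameters_mm) / n
    num = sum((t - mean_t) * (d - mean_d) for t, d in zip(times_h, diameters_mm))
    den = sum((t - mean_t) ** 2 for t in times_h)
    return num / den

times = [0, 1, 2, 3, 4]                    # hours after ring patterning
diam_control = [5.0, 4.2, 3.4, 2.6, 1.8]   # untreated rings close quickly
diam_treated = [5.0, 4.8, 4.6, 4.4, 4.2]   # toxicant slows migration
print(closure_rate(times, diam_control))   # ≈ -0.8 mm/h
print(closure_rate(times, diam_treated))   # ≈ -0.2 mm/h
```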

  14. Real-time and high-throughput analysis of mitochondrial metabolic states in living cells using genetically encoded NAD(+)/NADH sensors.

    Science.gov (United States)

    Zhao, Yuzheng; Yang, Yi

    2016-11-01

    Mitochondria are central organelles that regulate cellular bioenergetics, biosynthesis, and signaling processes. NADH, a key player in cell metabolism, is often considered as a marker of mitochondrial function. However, traditional methods for NADH measurements are either destructive or unable to distinguish between NADH and NADPH. In contrast to traditional methods, genetically encoded NADH sensors can be used for the real-time tracking and quantitative measurement of subcellular NADH levels in living cells. Therefore, these sensors provide innovative tools and address the limitations of current techniques. We herein summarize the properties of different types of recently developed NADH biosensors, discuss their advantages and disadvantages, and focus on the high-throughput analysis of mitochondrial function by using highly responsive NAD(+)/NADH sensors.

  15. Automated vector selection of SIVQ and parallel computing integration in MATLAB™: Innovations supporting large-scale and high-throughput image analysis studies

    Directory of Open Access Journals (Sweden)

    Jerome Cheng

    2011-01-01

    Full Text Available Introduction: Spatially invariant vector quantization (SIVQ) is a texture- and color-based image matching algorithm that queries the image space through the use of ring vectors. In prior studies, the selection of one or more optimal vectors for a particular feature of interest required a manual process, with the user initially stochastically selecting candidate vectors and subsequently testing them upon other regions of the image to verify the vector's sensitivity and specificity properties (typically by reviewing a resultant heat map). In carrying out the prior efforts, the SIVQ algorithm was noted to exhibit highly scalable computational properties, where each region of analysis can take place independently of others, making a compelling case for the exploration of its deployment on high-throughput computing platforms, with the hypothesis that such an exercise will result in performance gains that scale linearly with increasing processor count. Methods: An automated process was developed for the selection of optimal ring vectors to serve as the predicate matching operator in defining histopathological features of interest. Briefly, candidate vectors were generated from every possible coordinate origin within a user-defined vector selection area (VSA) and subsequently compared against user-identified positive and negative "ground truth" regions on the same image. Each vector from the VSA was assessed for its goodness-of-fit to both the positive and negative areas via the use of the receiver operating characteristic (ROC) transfer function, with each assessment resulting in an associated area-under-the-curve (AUC) figure of merit. Results: Use of the above-mentioned automated vector selection process was demonstrated in two use cases: first, to identify malignant colonic epithelium, and second, to identify soft tissue sarcoma. For both examples, a very satisfactory optimized vector was identified, as defined by the AUC metric. Finally, as an
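    The AUC figure of merit for each candidate vector can be computed rank-wise from its match scores on the positive and negative ground-truth regions (the rank formulation is equivalent to the Mann-Whitney U statistic). A minimal sketch, not the SIVQ implementation itself:

```python
# AUC via pairwise comparison: the fraction of (positive, negative) score pairs
# in which the positive region scores higher, counting ties as half.
def auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Match scores of one candidate vector on labeled regions (toy numbers):
positives = [0.91, 0.85, 0.78, 0.88]
negatives = [0.40, 0.55, 0.62, 0.80]
print(auc(positives, negatives))  # -> 0.9375
```

    In the automated process described above, this score would be computed for every candidate origin in the VSA, and the vector with the highest AUC retained.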

  16. Optical tissue clearing improves usability of optical coherence tomography (OCT) for high-throughput analysis of the internal structure and 3D morphology of small biological objects such as vertebrate embryos

    DEFF Research Database (Denmark)

    Thrane, Lars; Jørgensen, Thomas Martini; Männer, Jörg

    2014-01-01

    Developmental biology studies frequently require rapid analysis of the morphology of a large number of embryos (high-throughput analysis). Conventional microscopic analysis is time-consuming and, therefore, not well suited for high-throughput analysis. OCT facilitates rapid generation of optical...... significantly reduces the light scattering and, thereby, improves the usability of OCT for high-throughput analysis of embryonic morphology....

  17. Uncovering leaf rust responsive miRNAs in wheat (Triticum aestivum L.) using high-throughput sequencing and prediction of their targets through degradome analysis.

    Science.gov (United States)

    Kumar, Dhananjay; Dutta, Summi; Singh, Dharmendra; Prabhu, Kumble Vinod; Kumar, Manish; Mukhopadhyay, Kunal

    2017-01-01

    Deep sequencing identified 497 conserved and 559 novel miRNAs in wheat, while degradome analysis revealed 701 target genes. qRT-PCR demonstrated differential expression of miRNAs during stages of leaf rust progression. Bread wheat (Triticum aestivum L.) is an important cereal food crop feeding 30% of the world population. A major threat to wheat production is rust epidemics. This study was targeted towards the identification and functional characterization of micro(mi)RNAs and their target genes in wheat in response to leaf rust ingression. High-throughput sequencing was used for transcriptome-wide identification of miRNAs and their expression profiling in response to leaf rust using mock- and pathogen-inoculated resistant and susceptible near-isogenic wheat plants. A total of 1056 mature miRNAs were identified, of which 497 miRNAs were conserved and 559 miRNAs were novel. The pathogen-inoculated resistant plants manifested more miRNAs than the pathogen-inoculated susceptible plants. The miRNA counts increased in the susceptible isoline due to leaf rust; conversely, the counts decreased in the resistant isoline in response to pathogenesis, illustrating precise spatial tuning of miRNAs during compatible and incompatible interactions. Stem-loop quantitative real-time PCR was used to profile 10 highly differentially expressed miRNAs obtained from the high-throughput sequencing data. The spatio-temporal profiling validated the differential expression of miRNAs between the isolines as well as in response to pathogen infection. Degradome analysis provided 701 predicted target genes associated with defense response, signal transduction, development, metabolism, and transcriptional regulation. The obtained results indicate that wheat isolines employ diverse arrays of miRNAs that modulate their target genes during compatible and incompatible interactions. Our findings contribute to increased knowledge of the roles of miRNAs in wheat-leaf rust interactions and could help in rust
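
The stem-loop qRT-PCR profiling mentioned above is conventionally quantified with the comparative 2^-ΔΔCt method. A minimal sketch, with invented Ct values (the study's actual measurements are not reproduced here):

```python
# Sketch: relative miRNA expression by the 2^-ddCt method, the usual readout
# for stem-loop qRT-PCR. All Ct values below are illustrative.

def fold_change(ct_target_trt, ct_ref_trt, ct_target_ctrl, ct_ref_ctrl):
    """2^-ddCt: target Ct normalized to a reference gene, treated vs. control."""
    ddct = (ct_target_trt - ct_ref_trt) - (ct_target_ctrl - ct_ref_ctrl)
    return 2 ** -ddct

# A miRNA whose Ct drops by 2 cycles after infection (reference unchanged)
# is ~4-fold upregulated.
print(fold_change(24.0, 18.0, 26.0, 18.0))  # 4.0
```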

  18. High-Throughput Analysis With 96-Capillary Array Electrophoresis and Integrated Sample Preparation for DNA Sequencing Based on Laser Induced Fluorescence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Gang Xue

    2001-12-31

    The purpose of this research was to improve the fluorescence detection for multiplexed capillary array electrophoresis, extend its use beyond genomic analysis, and develop an integrated micro-sample preparation system for high-throughput DNA sequencing. The authors first demonstrated multiplexed capillary zone electrophoresis (CZE) and micellar electrokinetic chromatography (MEKC) separations in a 96-capillary array system with laser-induced fluorescence detection. Migration times of four kinds of fluoresceins and six polyaromatic hydrocarbons (PAHs) are normalized to one of the capillaries using two internal standards. The relative standard deviations (RSD) after normalization are 0.6-1.4% for the fluoresceins and 0.1-1.5% for the PAHs. Quantitative calibration of the separations based on peak areas is also performed, again with substantial improvement over the raw data. This opens up the possibility of performing massively parallel separations for high-throughput chemical analysis for process monitoring, combinatorial synthesis, and clinical diagnosis. The authors further improved the fluorescence detection by step laser scanning. A computer-controlled galvanometer scanner is adapted for scanning a focused laser beam across a 96-capillary array for laser-induced fluorescence detection. The signal at a single photomultiplier tube is temporally sorted to distinguish among the capillaries. The limit of detection for fluorescein is 3 × 10^-11 M (S/N = 3) for 5 mW of total laser power scanned at 4 Hz. The observed cross-talk among capillaries is 0.2%. Advantages include the efficient utilization of light due to the high duty cycle of step scanning, good detection performance due to the reduction of stray light, ruggedness due to the small mass of the galvanometer mirror, low cost due to the simplicity of components, and flexibility due to the independent paths for excitation and emission.

  19. High-Throughput Analysis With 96-Capillary Array Electrophoresis and Integrated Sample Preparation for DNA Sequencing Based on Laser Induced Fluorescence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Gang [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    The purpose of this research was to improve the fluorescence detection for multiplexed capillary array electrophoresis, extend its use beyond genomic analysis, and develop an integrated micro-sample preparation system for high-throughput DNA sequencing. The authors first demonstrated multiplexed capillary zone electrophoresis (CZE) and micellar electrokinetic chromatography (MEKC) separations in a 96-capillary array system with laser-induced fluorescence detection. Migration times of four kinds of fluoresceins and six polyaromatic hydrocarbons (PAHs) are normalized to one of the capillaries using two internal standards. The relative standard deviations (RSD) after normalization are 0.6-1.4% for the fluoresceins and 0.1-1.5% for the PAHs. Quantitative calibration of the separations based on peak areas is also performed, again with substantial improvement over the raw data. This opens up the possibility of performing massively parallel separations for high-throughput chemical analysis for process monitoring, combinatorial synthesis, and clinical diagnosis. The authors further improved the fluorescence detection by step laser scanning. A computer-controlled galvanometer scanner is adapted for scanning a focused laser beam across a 96-capillary array for laser-induced fluorescence detection. The signal at a single photomultiplier tube is temporally sorted to distinguish among the capillaries. The limit of detection for fluorescein is 3 × 10^-11 M (S/N = 3) for 5 mW of total laser power scanned at 4 Hz. The observed cross-talk among capillaries is 0.2%. Advantages include the efficient utilization of light due to the high duty cycle of step scanning, good detection performance due to the reduction of stray light, ruggedness due to the small mass of the galvanometer mirror, low cost due to the simplicity of components, and flexibility due to the independent paths for excitation and emission.
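
The two-standard migration-time normalization described in both records amounts to a linear remapping of each capillary's time axis onto a reference capillary's. The sketch below assumes a simple linear model between the two internal-standard times; the actual correction used in the work may differ:

```python
# Sketch: normalize migration times across capillaries using two internal
# standards (all migration times below are invented, not from the study).

def normalize(times, std1, std2, ref_std1, ref_std2):
    """Linearly map this capillary's time axis onto the reference capillary's,
    anchored at the two internal-standard migration times seen in each."""
    scale = (ref_std2 - ref_std1) / (std2 - std1)
    return [ref_std1 + (t - std1) * scale for t in times]

# A capillary running ~5% slower than the reference: normalization removes
# the systematic offset so peaks align across all 96 capillaries.
ref = normalize([10.5, 21.0], 10.5, 21.0, 10.0, 20.0)
print([round(t, 6) for t in ref])  # [10.0, 20.0]
```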

  20. Reference genes for high-throughput quantitative reverse transcription-PCR analysis of gene expression in organs and tissues of Eucalyptus grown in various environmental conditions.

    Science.gov (United States)

    Cassan-Wang, Hua; Soler, Marçal; Yu, Hong; Camargo, Eduardo Leal O; Carocha, Victor; Ladouce, Nathalie; Savelli, Bruno; Paiva, Jorge A P; Leplé, Jean-Charles; Grima-Pettenati, Jacqueline

    2012-12-01

    Interest in the genomics of Eucalyptus has skyrocketed thanks to the recent sequencing of the genome of Eucalyptus grandis and to a growing number of large-scale transcriptomic studies. Quantitative reverse transcription-PCR (RT-PCR) is the method of choice for gene expression analysis and can now also be used as a high-throughput method. The selection of appropriate internal controls is becoming of utmost importance to ensure accurate expression results in Eucalyptus. To this end, we selected 21 candidate reference genes and used high-throughput microfluidic dynamic arrays to assess their expression among a large panel of developmental and environmental conditions with a special focus on wood-forming tissues. We analyzed the expression stability of these genes by using three distinct statistical algorithms (geNorm, NormFinder and ΔCt), and used principal component analysis to compare methods and rankings. We showed that the most stable genes identified depended not only on the panel of biological samples considered but also on the statistical method used. We then developed a comprehensive integration of the rankings generated by the three methods and identified the optimal reference genes for 17 distinct experimental sets covering 13 organs and tissues, as well as various developmental and environmental conditions. The expression patterns of Eucalyptus master genes EgMYB1 and EgMYB2 experimentally validated our selection. Our findings provide an important resource for the selection of appropriate reference genes for accurate and reliable normalization of gene expression data in the organs and tissues of Eucalyptus trees grown in a range of conditions including abiotic stresses.
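
Of the three stability algorithms compared above, the comparative ΔCt method is the simplest to sketch: a candidate whose Ct difference against every other candidate stays constant across samples is a stable reference. Gene names and Ct values below are invented:

```python
# Sketch of the comparative delta-Ct reference-gene stability ranking:
# for each gene, average the standard deviation of its Ct difference
# against every other candidate across samples; lower = more stable.
from itertools import combinations
from statistics import stdev

def delta_ct_stability(ct):  # ct: {gene: [Ct per sample]}
    genes = list(ct)
    sd = {g: [] for g in genes}
    for a, b in combinations(genes, 2):
        s = stdev(x - y for x, y in zip(ct[a], ct[b]))
        sd[a].append(s)
        sd[b].append(s)
    return sorted((sum(v) / len(v), g) for g, v in sd.items())

ct = {"geneA": [20.0, 20.1, 19.9],   # steady -> stable
      "geneB": [22.0, 22.1, 21.9],   # tracks geneA -> stable
      "geneC": [25.0, 27.0, 23.0]}   # erratic -> unstable
print(delta_ct_stability(ct)[0][1])  # most stable candidate
```

As the abstract notes, rankings from ΔCt, geNorm, and NormFinder can disagree, which is why the authors integrate all three before choosing reference genes per experimental set.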

  1. An Automated High-Throughput Metabolic Stability Assay Using an Integrated High-Resolution Accurate Mass Method and Automated Data Analysis Software

    Science.gov (United States)

    Shah, Pranav; Kerns, Edward; Nguyen, Dac-Trung; Obach, R. Scott; Wang, Amy Q.; Zakharov, Alexey; McKew, John; Simeonov, Anton; Hop, Cornelis E. C. A.

    2016-01-01

    Advancement of in silico tools would be enabled by the availability of data for metabolic reaction rates and intrinsic clearance (CLint) of a diverse compound structure data set by specific metabolic enzymes. Our goal is to measure CLint for a large set of compounds with each major human cytochrome P450 (P450) isozyme. To achieve our goal, it is of utmost importance to develop an automated, robust, sensitive, high-throughput metabolic stability assay that can efficiently handle large compound sets. The substrate depletion method [in vitro half-life (t1/2) method] was chosen to determine CLint. The assay (384-well format) consisted of three parts: 1) a robotic system for incubation and sample cleanup; 2) two integrated ultraperformance liquid chromatography/mass spectrometry (UPLC/MS) platforms to determine the percent remaining of parent compound; and 3) an automated data analysis system. The CYP3A4 assay was evaluated using two long-t1/2 compounds, carbamazepine and antipyrine (t1/2 > 30 minutes); one moderate-t1/2 compound, ketoconazole (10 < t1/2 < 30 minutes); and two short-t1/2 compounds, loperamide and buspirone (t1/2 < 10 minutes). Interday and intraday precision and accuracy of the assay were within an acceptable range (∼12%) for the linear range observed. Using this assay, CYP3A4 CLint and t1/2 values for more than 3000 compounds were measured. This high-throughput, automated, and robust assay allows for rapid metabolic stability screening of large compound sets and enables advanced computational modeling for individual human P450 isozymes. PMID:27417180
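
The substrate-depletion calculation behind the assay is a log-linear fit: the slope of ln(% parent remaining) versus time gives the depletion rate constant k, from which t1/2 = ln 2 / k and CLint = k x (incubation volume / protein amount). The incubation volume and protein figures below are assumptions for illustration, not the paper's conditions:

```python
import math

# Sketch of the substrate-depletion (in vitro t1/2) calculation: fit
# ln(% parent remaining) vs. time by least squares, then derive t1/2.

def half_life(minutes, pct_remaining):
    """Least-squares slope of ln(%remaining) vs. time -> (k, t1/2 in min)."""
    n, xs = len(minutes), minutes
    ys = [math.log(p) for p in pct_remaining]
    mx, my = sum(xs) / n, sum(ys) / n
    k = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return k, math.log(2) / k

# Synthetic depletion data generated with a true t1/2 of 10 minutes.
k, t_half = half_life([0, 5, 15, 30], [100.0, 70.7, 35.4, 12.5])
cl_int = k * 1000 / 0.5  # uL/min/mg, assuming 1 mL incubation, 0.5 mg protein
print(round(t_half, 1))  # 10.0 -> falls in the "moderate" 10-30 min bin
```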

  2. Cost-Effectiveness Analysis of Various Methods of Instruction in Developmental Mathematics.

    Science.gov (United States)

    Carman, Robert A.

    This paper critically examined existing applications of cost-effectiveness analysis in education, particularly the study of instructional effectiveness in the community college. Various schemes for measuring costs of instruction, such as cost-benefit analysis, cost-effectiveness analysis, and planning-programming-budgeting systems…

  3. Cost-Effectiveness Analysis of Early Reading Programs: A Demonstration with Recommendations for Future Research

    Science.gov (United States)

    Hollands, Fiona M.; Kieffer, Michael J.; Shand, Robert; Pan, Yilin; Cheng, Henan; Levin, Henry M.

    2016-01-01

    We review the value of cost-effectiveness analysis for evaluation and decision making with respect to educational programs and discuss its application to early reading interventions. We describe the conditions for a rigorous cost-effectiveness analysis and illustrate the challenges of applying the method in practice, providing examples of programs…

  4. Transcriptome-Wide Analysis of Botrytis elliptica Responsive microRNAs and Their Targets in Lilium Regale Wilson by High-Throughput Sequencing and Degradome Analysis

    Directory of Open Access Journals (Sweden)

    Xue Gao

    2017-05-01

    Full Text Available MicroRNAs, as master regulators of gene expression, have been widely identified and play crucial roles in plant-pathogen interactions. A fatal pathogen, Botrytis elliptica, causes a serious foliar disease of lily, which reduces production because of the high susceptibility of most cultivated species. However, the miRNAs related to Botrytis infection of lily, and the miRNA-mediated gene regulatory networks providing resistance to B. elliptica in lily, remain largely unexplored. To systematically dissect B. elliptica-responsive miRNAs and their target genes, three small RNA libraries were constructed from the leaves of Lilium regale, a promising Chinese wild Lilium species, which had been subjected to mock B. elliptica treatment or B. elliptica infection for 6 and 24 h. By high-throughput sequencing, 71 known miRNAs belonging to 47 conserved families and 24 novel miRNAs were identified, of which 18 miRNAs were downregulated and 13 were upregulated in response to B. elliptica. Moreover, based on the lily mRNA transcriptome, 22 targets for 9 known and 1 novel miRNA were identified by the degradome sequencing approach. Most target genes for the B. elliptica-responsive miRNAs were involved in metabolic processes, with a few encoding different transcription factors, including ELONGATION FACTOR 1 ALPHA (EF1a) and TEOSINTE BRANCHED1/CYCLOIDEA/PROLIFERATING CELL FACTOR 2 (TCP2). Furthermore, the expression patterns of a set of B. elliptica-responsive miRNAs and their targets were validated by quantitative real-time PCR. This study represents the first transcriptome-based analysis of miRNAs responsive to B. elliptica and their targets in lily. The results reveal the possible regulatory roles of miRNAs and their targets in the lily-B. elliptica interaction, which will extend our understanding of the mechanisms of this disease in lily.

  5. High-throughput single-cell analysis of low copy number β-galactosidase by a laboratory-built high-sensitivity flow cytometer.

    Science.gov (United States)

    Yang, Lingling; Huang, Tianxun; Zhu, Shaobin; Zhou, Yingxing; Jiang, Yunbin; Wang, Shuo; Chen, Yuqing; Wu, Lina; Yan, Xiaomei

    2013-10-15

    Single-cell analysis is vital in providing insights into the heterogeneity in molecular content and phenotypic characteristics of complex or clonal cell populations. As many essential proteins and most transcription factors are produced at a low copy number, analytical tools with superior sensitivity that enable the analysis of low-abundance proteins in single cells are in high demand. β-galactosidase (β-gal) has been the standard cellular reporter for gene expression in both prokaryotic and eukaryotic cells. Here we report the development of a high-throughput method for the single-cell analysis of low copy number β-gal proteins using a laboratory-built high-sensitivity flow cytometer (HSFCM). Upon fluorescence staining with a fluorogenic substrate, quantitative measurements of the basal and near-basal expression of β-gal in single Escherichia coli BL21(DE3) cells were demonstrated. The statistical distribution can be determined quickly by analyzing thousands of individual cells in 1-2 min, which reveals the heterogeneous expression pattern that is otherwise masked by ensemble analysis. Combined with the quantitative fluorometric assay and the rapid bacterial enumeration by HSFCM, the β-gal expression distribution profile could be converted from arbitrary fluorescence units to protein copy numbers per cell. The sensitivity and speed of the HSFCM offer great capability in the quantitative analysis of low-abundance proteins in single cells, which would help gain deeper insight into the heterogeneity and fundamental biological processes in microbial populations.
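
The conversion from arbitrary fluorescence units to copies per cell follows from combining the bulk fluorometric assay (total copies) with the cytometer's cell count, as the abstract describes. A minimal sketch, with all numbers invented:

```python
# Sketch: convert per-cell fluorescence (arbitrary units) to protein copies
# per cell by calibrating the population mean against a bulk fluorometric
# assay plus the cytometer's cell count. All figures are illustrative.

def copies_per_cell(cell_au, bulk_total_copies, n_cells):
    """Rescale single-cell a.u. so the population mean matches the bulk mean."""
    mean_au = sum(cell_au) / len(cell_au)
    copies_per_au = bulk_total_copies / n_cells / mean_au
    return [a * copies_per_au for a in cell_au]

# 10,000 enumerated cells carrying 2e6 total copies in the bulk assay:
# the per-cell distribution is rescaled from a.u. into copy numbers.
print(copies_per_cell([50.0, 150.0], 2.0e6, 10_000))  # [100.0, 300.0]
```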

  6. State-of-the-art technologies for rapid and high-throughput sample preparation and analysis of N-glycans from antibodies.

    Science.gov (United States)

    Aich, Udayanath; Lakbub, Jude; Liu, Aston

    2016-06-01

    Glycosylation is a post-translational modification (PTM) that occurs during production of many protein-based biologic drugs and can have a profound impact on their biological, clinical, and pharmacological properties. Quality by design, process optimization, and advances in manufacturing technology create a demand for robust, sensitive, and accurate profiling and quantification of antibody glycosylation. Potential drawbacks in antibody glycosylation profiling include the high hands-on time required for sample preparation and the several hours needed for data acquisition and analysis. Rapid and high-throughput (HTP) N-glycan profiling and characterization, along with automation of sample preparation and analysis, are essential for extensive antibody glycosylation analysis because of the substantial improvement in turnaround time. The first part of this review article focuses on recent progress in rapid and HTP sample preparation and analysis of antibody glycosylation. Subsequently, the article covers a brief overview of various separation and mass spectrometric methods for the rapid and HTP analysis of N-glycans in antibodies. Finally, we discuss recent developments in process analytical technologies for the screening and quantification of N-glycans in antibodies.

  7. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into high-dimensional neuroinformatic representations containing on the order of 10^3-10^4 discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high-throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high-throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  8. Development of a quality, high throughput DNA analysis procedure for skeletal samples to assist with the identification of victims from the World Trade Center attacks.

    Science.gov (United States)

    Holland, Mitchell M; Cave, Christopher A; Holland, Charity A; Bille, Todd W

    2003-06-01

    The attacks on the World Trade Center (WTC) Towers on September 11, 2001, represented the single largest terrorist-related mass fatality incident in the history of the United States. More than 2,700 individuals of varied racial and ethnic background lost their lives that day. Through the efforts of thousands of citizens, including recovery workers, medical examiners, and forensic scientists, the identification of approximately 1,500 victims had been accomplished through June 2003 (the majority of these identifications were made within the first 8-12 months). The principal role of The Bode Technology Group (Bode) in this process was to develop a quality, high throughput DNA extraction and short tandem repeat (STR) analysis procedure for skeletal elements, and to provide STR profiles to the Office of the Chief Medical Examiner (OCME) in New York City to be used for identification of the victims. A high throughput process was developed to include electronic accessioning of samples, so that the numbering system of the OCME was maintained; rapid preparation and sampling of skeletal fragments to allow for the processing of more than 250 fragments per day; use of a 96-well format for sample extraction, DNA quantification, and STR analysis; and use of the Applied Biosystems 3100 and 3700 instrumentation to develop STR profiles. Given the highly degraded nature of the skeletal remains received by Bode, an advanced DNA extraction procedure was developed to increase the quantity of DNA recovery and reduce the co-purification of polymerase chain reaction (PCR) amplification inhibitors. In addition, two new STR multiplexes were developed specifically for this project, which reduced the amplicon size of the STR loci, and therefore, enhanced the ability to obtain results from the most challenged of samples. In all, the procedures developed allowed for the analysis of more than 1,000 skeletal samples each week. Approximately 13,000 skeletal fragments were analyzed at least once

  9. A case study for cloud based high throughput analysis of NGS data using the globus genomics system.

    Science.gov (United States)

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha

    2015-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data, requiring a powerful computational infrastructure, high-quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte-scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the "Globus Genomics" system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably, and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel, and it also helps meet the scale-out analysis needs of modern translational genomics research.

  10. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    Directory of Open Access Journals (Sweden)

    Krithika Bhuvaneshwar

    2015-01-01

    Full Text Available Next generation sequencing (NGS) technologies produce massive amounts of data, requiring a powerful computational infrastructure, high-quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte-scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably, and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel, and it also helps meet the scale-out analysis needs of modern translational genomics research.

  11. [Cost-effectiveness analysis on colorectal cancer screening program].

    Science.gov (United States)

    Huang, Q C; Ye, D; Jiang, X Y; Li, Q L; Yao, K Y; Wang, J B; Jin, M J; Chen, K

    2017-01-10

    Objective: To evaluate the cost-effectiveness of a colorectal cancer screening program in different age groups from the perspective of health economics. Methods: The screening compliance rates and detection rates in different age groups were calculated using data from the colorectal cancer screening program in Jiashan county, Zhejiang province. Differences in indicators among age groups were analyzed with the χ² test or trend χ² test. The ratios of cost to the number of cases detected were calculated according to cost statistics. Results: The detection rates of immunochemical fecal occult blood test (iFOBT) positivity, advanced adenoma, colorectal cancer, and early-stage cancer increased with age, while the early-diagnosis rates were negatively associated with age. After excluding the younger age groups, the cost per case detected among individuals aged >50 years could be reduced by 15%-30%. Conclusion: From a health economics perspective, it is beneficial to start colorectal cancer screening at age 50 to improve the efficiency of screening.
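
The cost-per-case comparison behind the 15%-30% figure can be sketched arithmetically. Every number below is invented for illustration and is not from the Jiashan data:

```python
# Sketch: cost per case detected across screening age strata, and the
# fractional saving from restricting screening to ages >= 50.

def cost_per_case(total_cost, cases_detected):
    return total_cost / cases_detected

strata = {  # age group: (total screening cost, cases detected) -- invented
    "40-49": (120_000.0, 8),
    "50-59": (150_000.0, 25),
    "60-74": (180_000.0, 42),
}
all_ages = cost_per_case(sum(c for c, _ in strata.values()),
                         sum(n for _, n in strata.values()))
fifty_up = cost_per_case(strata["50-59"][0] + strata["60-74"][0],
                         strata["50-59"][1] + strata["60-74"][1])
saving = 1 - fifty_up / all_ages  # fractional cost-per-case reduction
print(round(saving, 2))  # 0.18 -> in the 15%-30% range the abstract reports
```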

  12. NIR and Py-mbms coupled with multivariate data analysis as a high-throughput biomass characterization technique: a review.

    Science.gov (United States)

    Xiao, Li; Wei, Hui; Himmel, Michael E; Jameel, Hasan; Kelley, Stephen S

    2014-01-01

    Optimizing the use of lignocellulosic biomass as the feedstock for renewable energy production is currently being developed globally. Biomass is a complex mixture of cellulose, hemicelluloses, lignins, extractives, and proteins, as well as inorganic salts. Cell wall compositional analysis for biomass characterization is laborious and time-consuming. To characterize biomass quickly and efficiently, several high-throughput technologies have been successfully developed. Among them, near infrared spectroscopy (NIR) and pyrolysis-molecular beam mass spectrometry (Py-mbms) are complementary tools capable of evaluating a large number of raw or modified biomass samples in a short period of time. NIR shows vibrations associated with specific chemical structures, whereas Py-mbms depicts the full range of fragments from the decomposition of biomass. Both NIR vibrations and Py-mbms peaks are assigned to possible chemical functional groups and molecular structures, providing complementary chemical insight into biomaterials. However, it is challenging to interpret the informative results because of the large number of overlapping bands or decomposition fragments contained in the spectra. To improve the efficiency of data analysis, multivariate analysis tools have been adapted to define the significant correlations among data variables, so that the large number of bands/peaks can be replaced by a small number of reconstructed variables representing the original variation. Reconstructed data variables are used for sample comparison (principal component analysis) and for building regression models (partial least squares regression) between biomass chemical structures and properties of interest. In this review, the important biomass chemical structures measured by NIR and Py-mbms are summarized. The advantages and disadvantages of conventional data analysis methods and multivariate data analysis methods are introduced, compared, and evaluated. This review

  13. NIR and Py-mbms coupled with multivariate data analysis as a high-throughput biomass characterization technique : a review

    Directory of Open Access Journals (Sweden)

    Li eXiao

    2014-08-01

    Full Text Available Optimizing the use of lignocellulosic biomass as the feedstock for renewable energy production is currently being developed globally. Biomass is a complex mixture of cellulose, hemicelluloses, lignins, extractives, and proteins, as well as inorganic salts. Cell wall compositional analysis for biomass characterization is laborious and time-consuming. To characterize biomass quickly and efficiently, several high-throughput technologies have been successfully developed. Among them, near infrared spectroscopy (NIR) and pyrolysis-molecular beam mass spectrometry (Py-mbms) are complementary tools capable of evaluating a large number of raw or modified biomass samples in a short period of time. NIR shows vibrations associated with specific chemical structures, whereas Py-mbms depicts the full range of fragments from the decomposition of biomass. Both NIR vibrations and Py-mbms peaks are assigned to possible chemical functional groups and molecular structures, providing complementary chemical insight into biomaterials. However, it is challenging to interpret the informative results because of the large number of overlapping bands or decomposition fragments contained in the spectra. To improve the efficiency of data analysis, multivariate analysis tools have been adapted to define the significant correlations among data variables, so that the large number of bands/peaks can be replaced by a small number of reconstructed variables representing the original variation. Reconstructed data variables are used for sample comparison (principal component analysis) and for building regression models (partial least squares regression) between biomass chemical structures and properties of interest. In this review, the important biomass chemical structures measured by NIR and Py-mbms are summarized. The advantages and disadvantages of conventional data analysis methods and multivariate data analysis methods are introduced, compared, and evaluated.
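
The dimensionality reduction both records describe, replacing many correlated bands with a few reconstructed variables, is what PCA does. A self-contained sketch using power iteration on tiny synthetic "spectra" (not real NIR or Py-mbms data):

```python
import random

# Sketch: replace overlapping spectral bands with one principal component.
# Power iteration on the (implicit) covariance matrix finds the first PC.

def first_pc(rows, iters=200):
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    X = [[r[j] - means[j] for j in range(d)] for r in rows]  # mean-center
    v = [1.0] * d
    for _ in range(iters):
        t = [sum(x[j] * v[j] for j in range(d)) for x in X]            # X v
        w = [sum(t[i] * X[i][j] for i in range(n)) for j in range(d)]  # X'X v
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    scores = [sum(x[j] * v[j] for j in range(d)) for x in X]
    return v, scores  # loadings, per-sample scores

random.seed(0)
# Bands 1 and 2 co-vary (band2 ~ 2 x band1); band 3 is a constant baseline.
spectra = [[i + random.gauss(0, 0.05), 2 * i + random.gauss(0, 0.05), 0.1]
           for i in (0, 1, 2, 3)]
loadings, scores = first_pc(spectra)
```

The loadings recover the 1:2 covariation of the two correlated bands and assign zero weight to the constant baseline, so the four samples can be compared along a single score axis instead of three raw bands.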

  14. Systematic Analysis of the Association between Gut Flora and Obesity through High-Throughput Sequencing and Bioinformatics Approaches

    Directory of Open Access Journals (Sweden)

    Chih-Min Chiu

    2014-01-01

    Full Text Available Eighty-one stool samples from Taiwanese individuals were collected for analysis of the association between the gut flora and obesity. The supervised analysis showed that the most abundant genera of bacteria in normal samples (from people with a body mass index (BMI) ≤ 24) were Bacteroides (27.7%), Prevotella (19.4%), Escherichia (12%), Phascolarctobacterium (3.9%), and Eubacterium (3.5%). The most abundant genera of bacteria in case samples (BMI ≥ 27) were Bacteroides (29%), Prevotella (21%), Escherichia (7.4%), Megamonas (5.1%), and Phascolarctobacterium (3.8%). A principal coordinate analysis (PCoA) demonstrated that normal samples were clustered more compactly than case samples. An unsupervised analysis demonstrated that bacterial communities in the gut were clustered into two main groups: N-like and OB-like groups. Remarkably, most normal samples (78%) were clustered in the N-like group, and most case samples (81%) were clustered in the OB-like group (Fisher’s P value = 1.61E-07). The results showed that bacterial communities in the gut were highly associated with obesity. This is the first study in Taiwan to investigate the association between human gut flora and obesity, and the results provide new insights into the correlation of bacteria with the rising trend in obesity.
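
The cluster-versus-BMI association is testable with a 2x2 Fisher's exact test, which the hypergeometric distribution makes easy to sketch. The counts below are rough reconstructions from the reported percentages (not the study's exact table), so the resulting p-value only approximates the reported 1.61E-07:

```python
from math import comb

# Sketch: two-sided 2x2 Fisher's exact test for cluster membership vs. BMI
# group. Table layout: [[a, b], [c, d]] with rows = BMI group, cols = cluster.

def fisher_exact(a, b, c, d):
    """Sum the probabilities of all same-margin tables whose probability
    does not exceed the observed table's (the usual two-sided definition)."""
    n, r1, c1 = a + b + c + d, a + b, a + c
    def p_of(x):  # hypergeometric probability that the top-left cell equals x
        return comb(c1, x) * comb(n - c1, r1 - x) / comb(n, r1)
    p_obs = p_of(a)
    lo, hi = max(0, r1 + c1 - n), min(r1, c1)
    return sum(p_of(x) for x in range(lo, hi + 1) if p_of(x) <= p_obs + 1e-12)

# Approximate counts: 32 of 41 normal-BMI samples in the N-like group
# vs. 8 of 40 high-BMI samples.
p = fisher_exact(32, 9, 8, 32)
print(p < 1e-5)  # True: cluster membership is strongly associated with BMI
```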

  15. High-throughput sequencing and pathway analysis reveal alteration of the pituitary transcriptome by 17α-ethynylestradiol (EE2) in female coho salmon, Oncorhynchus kisutch

    Energy Technology Data Exchange (ETDEWEB)

    Harding, Louisa B. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Schultz, Irvin R. [Battelle, Marine Sciences Laboratory – Pacific Northwest National Laboratory, 1529 West Sequim Bay Road, Sequim, WA 98382 (United States); Goetz, Giles W. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Luckenbach, J. Adam [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Young, Graham [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Goetz, Frederick W. [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, Manchester Research Station, P.O. Box 130, Manchester, WA 98353 (United States); Swanson, Penny, E-mail: penny.swanson@noaa.gov [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States)

    2013-10-15

    Highlights: •Studied impacts of ethynylestradiol (EE2) exposure on salmon pituitary transcriptome. •High-throughput sequencing, RNAseq, and pathway analysis were performed. •EE2 altered mRNAs for genes in circadian rhythm, GnRH, and TGFβ signaling pathways. •LH and FSH beta subunit mRNAs were most highly up- and down-regulated by EE2, respectively. •Estrogens may alter processes associated with reproductive timing in salmon. -- Abstract: Considerable research has been done on the effects of endocrine disrupting chemicals (EDCs) on reproduction and gene expression in the brain, liver and gonads of teleost fish, but information on impacts on the pituitary gland is still limited despite its central role in regulating reproduction. The aim of this study was to further our understanding of the potential effects of natural and synthetic estrogens on the brain–pituitary–gonad axis in fish by determining the effects of 17α-ethynylestradiol (EE2) on the pituitary transcriptome. We exposed sub-adult coho salmon (Oncorhynchus kisutch) to 0 or 12 ng EE2/L for up to 6 weeks and effects on the pituitary transcriptome of females were assessed using high-throughput Illumina® sequencing, RNA-Seq and pathway analysis. After 1 or 6 weeks, 218 and 670 contiguous sequences (contigs) respectively, were differentially expressed in pituitaries of EE2-exposed fish relative to control. Two of the most highly up- and down-regulated contigs were luteinizing hormone β subunit (241-fold and 395-fold at 1 and 6 weeks, respectively) and follicle-stimulating hormone β subunit (−3.4-fold at 6 weeks). Additional contigs related to gonadotropin synthesis and release were differentially expressed in EE2-exposed fish relative to controls. These included contigs involved in gonadotropin releasing hormone (GNRH) and transforming growth factor-β signaling. There was an over-representation of significantly affected contigs in 33 and 18 canonical pathways at 1 and 6 weeks, respectively.
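Fold changes like the 241-fold up- and −3.4-fold down-regulation above are commonly reported as signed ratios: the plain ratio for up-regulation, and the negative reciprocal for down-regulation. A minimal sketch of that convention (the convention itself is an assumption here, as the abstract does not spell it out):

```python
def signed_fold_change(mean_treated, mean_control):
    """Signed fold change: ratio if >= 1, otherwise negative reciprocal."""
    ratio = mean_treated / mean_control
    return ratio if ratio >= 1 else -1.0 / ratio

up = signed_fold_change(241.0, 1.0)    # strong up-regulation, reported as 241
down = signed_fold_change(1.0, 3.4)    # down-regulation, reported as -3.4
```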

  16. Protocol: A high-throughput DNA extraction system suitable for conifers

    Directory of Open Access Journals (Sweden)

    Rajora Om P

    2008-08-01

    Full Text Available Abstract Background High throughput DNA isolation from plants is a major bottleneck for most studies requiring large sample sizes. A variety of protocols have been developed for DNA isolation from plants. However, many species, including conifers, have high contents of secondary metabolites that interfere with the extraction process or the subsequent analysis steps. Here, we describe a procedure for high-throughput DNA isolation from conifers. Results We have developed a high-throughput DNA extraction protocol for conifers using an automated liquid handler and modifying the Qiagen MagAttract Plant Kit protocol. The modifications involve a change to the buffer system and improving the protocol so that it almost doubles the number of samples processed per kit, which significantly reduces the overall costs. We describe two versions of the protocol: one for medium-throughput (MTP) and another for high-throughput (HTP) DNA isolation. The HTP version works from start to end in the industry-standard 96-well format, while the MTP version provides higher DNA yields per sample processed. We have successfully used the protocol for DNA extraction and genotyping of thousands of individuals of several spruce species and a pine species. Conclusion A high-throughput system for DNA extraction from conifer needles and seeds has been developed and validated. The quality of the isolated DNA was comparable with that obtained from two commonly used methods: the silica-spin column and the classic CTAB protocol. Our protocol provides a fully automatable and cost effective solution for processing large numbers of conifer samples.

  17. High-Throughput Live-Cell Microscopy Analysis of Association Between Chromosome Domains and the Nucleolus in S. cerevisiae.

    Science.gov (United States)

    Wang, Renjie; Normand, Christophe; Gadal, Olivier

    2016-01-01

    Spatial organization of the genome has important impacts on all aspects of chromosome biology, including transcription, replication, and DNA repair. Frequent interactions of some chromosome domains with specific nuclear compartments, such as the nucleolus, are now well documented using genome-scale methods. However, direct measurement of distance and interaction frequency between loci requires microscopic observation of specific genomic domains and the nucleolus, followed by image analysis to allow quantification. The fluorescent repressor operator system (FROS) is an invaluable method to fluorescently tag DNA sequences and investigate chromosome position and dynamics in living cells. This chapter describes a combination of methods to define the motion and region of confinement of a locus relative to the nucleolus in the cell nucleus, from fluorescence acquisition to automated image analysis using two dedicated pipelines.
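Quantifying locus motion from FROS tracking typically starts from the mean squared displacement (MSD) as a function of time lag; a confined locus shows an MSD plateau whose level reflects the radius of confinement. A minimal MSD sketch for a 2D trajectory (illustrative only; the chapter's dedicated pipelines are not reproduced here):

```python
def msd(track, tau):
    """Mean squared displacement at lag tau for a list of (x, y) positions."""
    if tau < 1 or tau >= len(track):
        raise ValueError("lag must satisfy 1 <= tau < len(track)")
    disp = [
        (track[i + tau][0] - track[i][0]) ** 2
        + (track[i + tau][1] - track[i][1]) ** 2
        for i in range(len(track) - tau)
    ]
    return sum(disp) / len(disp)

# Directed motion (step size 1 per frame): MSD grows as tau**2,
# whereas a confined locus would plateau at large tau
track = [(float(i), 0.0) for i in range(10)]
curve = [msd(track, t) for t in (1, 2, 3)]
```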

  18. Robust-LongSAGE (RL-SAGE): an improved LongSAGE method for high-throughput transcriptome analysis.

    Science.gov (United States)

    Gowda, Malali; Wang, Guo-Liang

    2008-01-01

    Serial analysis of gene expression (SAGE) is a powerful technique for large-scale transcriptome analysis in eukaryotes. However, technical difficulties in the SAGE library construction, such as low concatemer cloning efficiency, small concatemer size, and a high level of empty clones, have prohibited its widespread use as a routine technique for expression profiling in many laboratories. We recently improved the LongSAGE library construction method considerably and developed a modified version called Robust-LongSAGE, or RL-SAGE. In RL-SAGE, concatemer cloning efficiency and clone insert size were increased significantly. About 20 PCR reactions are sufficient to make a library with more than 150,000 clones. Using RL-SAGE, we have made 10 libraries of rice, maize, and the rice blast fungus Magnaporthe grisea.

  19. A high-throughput SNP marker system for parental polymorphism screening, and diversity analysis in common bean (Phaseolus vulgaris L.).

    Science.gov (United States)

    Blair, Matthew W; Cortés, Andrés J; Penmetsa, R Varma; Farmer, Andrew; Carrasquilla-Garcia, Noelia; Cook, Doug R

    2013-02-01

    Single nucleotide polymorphism (SNP) detection has become a marker system of choice, because of the high abundance of source polymorphisms and the ease with which allele calls are automated. Various technologies exist for the evaluation of SNP loci and previously we validated two medium throughput technologies. In this study, our goal was to utilize a 768 feature, Illumina GoldenGate assay for common bean (Phaseolus vulgaris L.) developed from conserved legume gene sequences and to use the new technology for (1) the evaluation of parental polymorphisms in a mini-core set of common bean accessions and (2) the analysis of genetic diversity in the crop. A total of 736 SNPs were scored on 236 diverse common bean genotypes with the GoldenGate array. Missing data and heterozygosity levels were low and 94 % of the SNPs were scorable. With the evaluation of the parental polymorphism genotypes, we estimated the utility of the SNP markers in mapping for inter-genepool and intra-genepool populations, the latter being of lower polymorphism than the former. When we performed the diversity analysis with the diverse genotypes, we found the Illumina GoldenGate SNPs to provide evaluations equivalent to previous gene-based SNP markers, but less fine distinctions than previous microsatellite marker analysis. We did find, however, that the gene-based SNPs in the GoldenGate array had some utility in race structure analysis despite the low polymorphism. Furthermore, the SNPs detected high heterozygosity in wild accessions, which was probably a reflection of ascertainment bias. The Illumina SNPs were shown to be effective in distinguishing between the genepools, and therefore were most useful in saturation of inter-genepool genetic maps. The implications of these results for breeding in common bean are discussed as well as the advantages and disadvantages of the GoldenGate system for SNP detection.
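Diversity comparisons like those above are often summarized per SNP as expected heterozygosity, He = 2p(1−p) for a biallelic locus with allele frequency p, averaged over loci. A minimal sketch of that statistic (illustrative; not the authors' exact analysis pipeline):

```python
def mean_expected_heterozygosity(allele_freqs):
    """Average He = 2p(1-p) across biallelic SNP loci.

    allele_freqs: list of allele frequencies, one per locus (0 <= p <= 1).
    """
    return sum(2 * p * (1 - p) for p in allele_freqs) / len(allele_freqs)

# He is maximal (0.5) at p = 0.5 and zero for a fixed allele
he_max = mean_expected_heterozygosity([0.5])
he_mixed = mean_expected_heterozygosity([0.5, 0.0])
```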

  20. Detecting robust gene signature through integrated analysis of multiple types of high-throughput data in liver cancer

    Institute of Scientific and Technical Information of China (English)

    Xin-yu ZHANG; Tian-tian LI; Xiang-jun LIU

    2007-01-01

    Aim: To investigate the robust gene signature in liver cancer, we applied an integrated approach to perform a joint analysis of a highly diverse collection of liver cancer genome-wide datasets, including genomic alterations and transcription profiles. Methods: 1-class Significance Analysis of Microarrays coupled with a ranking score method was used to identify the robust gene signature in liver tumor tissue. Results: In total, 1 625 051 gene expression measurements from 16 public microarrays, 2 pairs of serial analyses of gene expression experiments, and 252 loss of heterozygosity reports obtained from 568 publications were used in this integrated study. The resulting robust gene signature included 90 genes, which may be of great importance to liver cancer research. A system assessment analysis revealed that our integrative method had an accuracy of 92% and a correlation coefficient value of 0.88. Conclusion: The system assessment results indicated that our method had the ability to integrate datasets from various types of sources and to elicit more accurate results, which can be very useful in the study of liver cancer.
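The ranking-score integration described above can be sketched as mean-rank aggregation: each dataset ranks genes by its own score, and genes are ordered by their average rank across datasets. This is a simplified stand-in for the authors' method, with hypothetical gene names and scores:

```python
def mean_rank_aggregate(scores_by_dataset):
    """Order genes by their average rank across datasets.

    scores_by_dataset: list of dicts mapping gene -> score
    (higher score = more strongly differential in that dataset).
    """
    ranks = {}
    for scores in scores_by_dataset:
        ordered = sorted(scores, key=scores.get, reverse=True)
        for rank, gene in enumerate(ordered, start=1):
            ranks.setdefault(gene, []).append(rank)
    # genes with the lowest (best) mean rank come first
    return sorted(ranks, key=lambda g: sum(ranks[g]) / len(ranks[g]))

signature = mean_rank_aggregate([
    {"A": 9.0, "B": 5.0, "C": 1.0},   # dataset 1 scores (hypothetical)
    {"A": 8.0, "B": 7.0, "C": 2.0},   # dataset 2 scores (hypothetical)
])
```

A gene that ranks highly in every dataset survives to the top of the aggregate list, which is the sense in which such a signature is "robust".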

  1. Cheetah: software for high-throughput reduction and analysis of serial femtosecond X-ray diffraction data.

    Science.gov (United States)

    Barty, Anton; Kirian, Richard A; Maia, Filipe R N C; Hantke, Max; Yoon, Chun Hong; White, Thomas A; Chapman, Henry

    2014-06-01

    The emerging technique of serial X-ray diffraction, in which diffraction data are collected from samples flowing across a pulsed X-ray source at repetition rates of 100 Hz or higher, has necessitated the development of new software in order to handle the large data volumes produced. Sorting of data according to different criteria and rapid filtering of events to retain only diffraction patterns of interest results in significant reductions in data volume, thereby simplifying subsequent data analysis and management tasks. Meanwhile, the generation of reduced data in the form of virtual powder patterns, radial stacks, histograms and other metadata creates data set summaries for analysis and overall experiment evaluation. Rapid data reduction early in the analysis pipeline is proving to be an essential first step in serial imaging experiments, prompting the authors to make the tool described in this article available to the general community. Originally developed for experiments at X-ray free-electron lasers, the software is based on a modular facility-independent library to promote portability between different experiments and is available under version 3 or later of the GNU General Public License.

  2. An Improved Method for Measuring Quantitative Resistance to the Wheat Pathogen Zymoseptoria tritici Using High-Throughput Automated Image Analysis.

    Science.gov (United States)

    Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A

    2016-07-01

    Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
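The percent-leaf-area phenotype above comes from thresholding scanned leaf images, done in the paper with a batch ImageJ macro. A minimal Python analogue on a toy grayscale image, treating darker pixels as lesion (the threshold value is illustrative, not the paper's):

```python
def percent_lesion_area(gray_image, threshold=128):
    """Percent of pixels darker than threshold in a 2D grayscale image (0-255).

    gray_image: list of rows, each row a list of pixel intensities.
    """
    total = sum(len(row) for row in gray_image)
    lesion = sum(1 for row in gray_image for px in row if px < threshold)
    return 100.0 * lesion / total

# Toy 2x2 "leaf": one dark (lesion) pixel out of four
area = percent_lesion_area([[0, 255], [255, 255]])
```

Real pipelines add segmentation of the leaf outline and pycnidia detection on top of this basic thresholding step.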

  3. Minimization of carryover for high-throughput liquid chromatography with tandem mass spectrometry analysis of 14 mycotoxins in corn grits.

    Science.gov (United States)

    Tamura, Masayoshi; Matsumoto, Keiko; Watanabe, Jun; Iida, Junko; Nagatomi, Yasushi; Mochizuki, Naoki

    2014-07-01

    A method for the simultaneous analysis of 14 mycotoxins with the minimization of carryover was developed. Our verification experiments suggested that the carryover occurred due to the chelation of fumonisins with the metal. To wash the fumonisins from the metal, the inner surface of the injection needle was rinsed with 10 mM trisodium citrate and 1% formic acid in water/methanol/acetonitrile/isopropanol after each injection, and the analysis was performed on a metal-free Mastro C18 column. This approach remarkably minimized the carryover of fumonisins. Fourteen mycotoxins in samples were extracted with 2% acetic acid in water/acetonitrile and a quick, easy, cheap, effective, rugged, and safe extraction kit, purified on a MultiSep 229 Ochra, and then quantified by liquid chromatography with tandem mass spectrometry. Determinations performed using this method produced a linearity greater than 0.99 and recoveries ranging from 72.6 to 117.4%, with good intraday precision from 4.0 to 12.4%, and interday precision from 6.5 to 17.0%. The limits of detection ranged from 0.01 to 0.71 μg/kg, demonstrating that a highly sensitive method for the simultaneous analysis of mycotoxins over a wide range of concentrations was achieved with minimal carryover. When 12 samples of commercially available corn grits were analyzed with this method, deoxynivalenol, fumonisin B1, fumonisin B2, fumonisin B3, and zearalenone were present most frequently.
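Validation figures like the recoveries (72.6 to 117.4%) and intraday precisions above reduce to simple statistics on spiked replicate measurements. A sketch with hypothetical replicate values, not the study's data:

```python
import statistics

def recovery_and_rsd(measured, spiked):
    """Mean recovery (%) and relative standard deviation (%) for spiked replicates.

    measured: list of measured concentrations from replicate spiked samples;
    spiked: the known spiked concentration (same units).
    """
    recoveries = [100.0 * m / spiked for m in measured]
    mean = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean
    return mean, rsd

# Three hypothetical replicates spiked at 10 ug/kg
mean_rec, rsd = recovery_and_rsd([9.0, 10.0, 11.0], spiked=10.0)
```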

  4. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have been in rapid progress. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  5. High-throughput GC-ECD analysis of PCBs in food by accelerated solvent extraction. Method validation

    Energy Technology Data Exchange (ETDEWEB)

    Piersanti, A.; Fioroni, L.; Paoloni, A.; Tavoloni, T.; Pecorelli, I.; Galarini, R. [Istituto Zooprofilattico Sperimentale dell' Umbria e delle Marche, Perugia (Italy)

    2004-09-15

    In the year 2000 the determination of PCBs in food commodities was introduced in the Italian national residue control plan, in which government labs were requested to estimate the total PCB content as the sum of the seven most representative congeners. Later on, in 2001, it was decided that a more appropriate estimation of the total PCBs was possible through analysis of eighteen rather than seven congeners. Therefore the need for simple and validated analytical methods arose. In this work a method for the analysis of the 18 PCB congeners (T₃CB-28, T₄CB-52, P₅CB-95, P₅CB-99, P₅CB-101, P₅CB-105, P₅CB-110, P₅CB-118, H₆CB-138, H₆CB-146, H₆CB-149, H₆CB-151, H₆CB-153, H₇CB-170, H₇CB-177, H₇CB-180, H₇CB-183, H₇CB-187) is reported. This was set up taking into account the advantages of automated, highly efficient Accelerated Solvent Extraction together with the good purification achieved by a one-step acidic-extrelut/silica chromatography. The instrumental analysis is performed by capillary GC equipped with an ECD detector. An in-house validation study was carried out on swine muscle, assessing the method's performance in terms of limit of detection, response linearity range, trueness and precision.

  6. High-Throughput Analysis of Age-Dependent Protein Changes in Layer II/III of the Human Orbitofrontal Cortex

    Science.gov (United States)

    Kapadia, Fenika

    Studies on the orbitofrontal cortex (OFC) during normal aging have shown a decline in cognitive functions, a loss of spines/synapses in layer III and gene expression changes related to neural communication. Biological changes during the course of normal aging are summarized into 9 hallmarks based on aging in peripheral tissue. Whether these hallmarks apply to non-dividing brain tissue is not known. Therefore, we opted to perform large-scale proteomic profiling of the OFC layer II/III during normal aging from 15 young and 18 old male subjects. MaxQuant was utilized for label-free quantification and statistical analysis by the Random Intercept Model (RIM) identified 118 differentially expressed (DE) age-related proteins. Altered neural communication was the most represented hallmark of aging (54% of DE proteins), highlighting the importance of communication in the brain. Functional analysis showed enrichment in GABA/glutamate signaling and pro-inflammatory responses. The former may contribute to alterations in excitation/inhibition, leading to cognitive decline during aging.

  7. Universal HIV screening of pregnant women in England : cost effectiveness analysis

    NARCIS (Netherlands)

    Postma, Maarten; Beck, E J; Mandalia, S; Sherr, L; Walters, M D; Houweling, H; Jager, Johannes C

    1999-01-01

    OBJECTIVE: To estimate the cost effectiveness of universal, voluntary HIV screening of pregnant women in England. DESIGN: Cost effectiveness analysis. Cost estimates of caring for HIV positive children were based on the stage of HIV infection and calculated using data obtained from a London hospital

  8. Implementing a Cost Effectiveness Analyzer for Web-Supported Academic Instruction: A Campus Wide Analysis

    Science.gov (United States)

    Cohen, Anat; Nachmias, Rafi

    2009-01-01

    This paper describes the implementation of a quantitative cost effectiveness analyzer for Web-supported academic instruction that was developed in Tel Aviv University during a long term study. The paper presents the cost effectiveness analysis of Tel Aviv University campus. Cost and benefit of 3,453 courses were analyzed, exemplifying campus-wide…

  10. The genetic diversity and evolution of field pea (Pisum) studied by high throughput retrotransposon based insertion polymorphism (RBIP) marker analysis

    Directory of Open Access Journals (Sweden)

    Smýkal Petr

    2010-02-01

    Full Text Available Abstract Background The genetic diversity of crop species is the result of natural selection on the wild progenitor and human intervention by ancient and modern farmers and breeders. The genomes of modern cultivars, old cultivated landraces, ecotypes and wild relatives reflect the effects of these forces and provide insights into germplasm structural diversity, the geographical dimension to species diversity and the process of domestication of wild organisms. This issue is also of great practical importance for crop improvement because wild germplasm represents a rich potential source of useful under-exploited alleles or allele combinations. The aim of the present study was to analyse a major Pisum germplasm collection to gain a broad understanding of the diversity and evolution of Pisum and provide a new rational framework for designing germplasm core collections of the genus. Results 3020 Pisum germplasm samples from the John Innes Pisum germplasm collection were genotyped for 45 retrotransposon based insertion polymorphism (RBIP) markers by the Tagged Array Marker (TAM) method. The data set was stored in a purpose-built Germinate relational database and analysed by both principal coordinate analysis and a nested application of the Structure program, which yielded substantially similar but complementary views of the diversity of the genus Pisum. Structure revealed three Groups (1-3) corresponding approximately to landrace, cultivar and wild Pisum respectively, which were resolved by nested Structure analysis into 14 Sub-Groups, many of which correlate with taxonomic sub-divisions of Pisum, domestication related phenotypic traits and/or restricted geographical locations. Genetic distances calculated between these Sub-Groups are broadly supported by principal coordinate analysis and these, together with the trait and geographical data, were used to infer a detailed model for the domestication of Pisum. 
Conclusions These data provide a clear picture

  11. Foundations of data-intensive science: Technology and practice for high throughput, widely distributed, data management and analysis systems

    Science.gov (United States)

    Johnston, William; Ernst, M.; Dart, E.; Tierney, B.

    2014-04-01

    Today's large-scale science projects involve world-wide collaborations that depend on moving massive amounts of data from an instrument to potentially thousands of computing and storage systems at hundreds of collaborating institutions to accomplish their science. This is true for ATLAS and CMS at the LHC, and it is true for the climate sciences, Belle-II at the KEK collider, genome sciences, the SKA radio telescope, and ITER, the international fusion energy experiment. DOE's Office of Science has been collecting science discipline and instrument requirements for network-based data management and analysis for more than a decade. As a result, certain key issues are seen across essentially all science disciplines that rely on the network for significant data transfer, even if the data quantities are modest compared to projects like the LHC experiments. These issues are what this talk will address; to wit: 1. Optical signal transport advances enabling 100 Gb/s circuits that span the globe on optical fiber, with each fiber carrying 100 such channels; 2. Network router and switch requirements to support high-speed international data transfer; 3. Data transport (TCP is still the norm) requirements to support high-speed international data transfer (e.g. error-free transmission); 4. Network monitoring and testing techniques and infrastructure to maintain the required error-free operation of the many R&E networks involved in international collaborations; 5. Operating system evolution to support very high-speed network I/O; 6. New network architectures and services in the LAN (campus) and WAN networks to support data-intensive science; 7. Data movement and management techniques and software that can maximize the throughput on the network connections between distributed data handling systems; and 8. New approaches to widely distributed workflow systems that can support the data movement and analysis required by the science. All of these areas must be addressed to enable large
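The emphasis above on error-free transmission is often motivated with the Mathis et al. approximation for steady-state TCP Reno throughput, rate ≈ (MSS/RTT)·√(3/2)/√p for loss rate p: even tiny loss caps a long-RTT flow far below a 100 Gb/s circuit. A sketch of that back-of-the-envelope calculation (parameter values are illustrative):

```python
import math

def mathis_throughput_bps(mss_bytes, rtt_seconds, loss_rate):
    """Approximate steady-state TCP Reno throughput in bits/s (Mathis model)."""
    return (mss_bytes * 8 / rtt_seconds) * math.sqrt(1.5) / math.sqrt(loss_rate)

# A transcontinental flow: 1460-byte MSS, 100 ms RTT, one loss per million packets
rate = mathis_throughput_bps(mss_bytes=1460, rtt_seconds=0.100, loss_rate=1e-6)
# rate is on the order of 1.4e8 bits/s (~143 Mb/s), nowhere near 100 Gb/s
```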

  12. [High Throughput Screening Analysis of Preservatives and Sweeteners in Carbonated Beverages Based on Improved Standard Addition Method].

    Science.gov (United States)

    Wang, Su-fang; Liu, Yun; Gong, Li-hua; Dong, Chun-hong; Fu, De-xue; Wang, Guo-qing

    2016-02-01

    Simulated water samples of 3 kinds of preservatives and 4 kinds of sweeteners were formulated using orthogonal design. Kernel independent component analysis (KICA) was used to process the UV spectra of the simulated water samples and of beverages spiked with different amounts of the additive standards; the independent components (ICs), i.e. the UV spectral profiles of the additives, and the ICs' coefficient matrices were then used to establish a UV-KICA-SVR prediction model of the simulated preservative and sweetener solutions using support vector regression (SVR) analysis. Standard-added beverage samples were obtained by adding different amounts of additives to carbonated beverages; their UV spectra were processed by KICA to obtain the IC information representing the additives and the other sample matrix components. The sample background can be subtracted by removing the corresponding IC, and the remaining ICs' coefficient matrices were used to estimate the amounts of the additives in the standard-added beverage samples based on the UV-KICA-SVR model; the intercept of the linear regression of predicted amounts against added amounts in the standard-added samples is the additive content of the raw beverage. By using a chemometric "blind source separation" method to extract the IC information of the tested additives and the sample matrix in the beverage, and using SVR modeling to improve the traditional standard addition method, a new method was proposed for screening preservatives and sweeteners in carbonated beverages. The proposed UV-KICA-SVR method can determine 3 kinds of preservatives and 4 kinds of sweeteners in carbonated beverages with limits of detection (LOD) in the range 0.2-1.0 mg·L⁻¹, comparable to those of the traditional high performance liquid chromatography (HPLC) method.
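The last step of the improved standard addition described above reduces to an ordinary least-squares fit of predicted amount versus added amount, with the intercept giving the content of the raw beverage. A minimal OLS sketch (the numeric values are hypothetical):

```python
def ols_fit(x, y):
    """Slope and intercept of the least-squares line y = slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

added = [0.0, 1.0, 2.0, 3.0]        # mg/L of standard added (hypothetical)
predicted = [0.5, 1.5, 2.5, 3.5]    # mg/L predicted by the model (hypothetical)
slope, native_content = ols_fit(added, predicted)
# intercept (0.5 mg/L here) is the additive content of the raw beverage
```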

  13. High-throughput analysis by SP-LDI-MS for fast identification of adulterations in commercial balsamic vinegars

    Energy Technology Data Exchange (ETDEWEB)

    Guerreiro, Tatiane Melina; Oliveira, Diogo Noin de; Ferreira, Mônica Siqueira; Catharino, Rodrigo Ramos, E-mail: rrc@fcm.unicamp.br

    2014-08-01

    Highlights: • Rapid identification of adulteration in balsamic vinegars. • Minimal sample preparation. • No matrix required for assisting laser desorption/ionization. • Fast sample discrimination by multivariate data analysis. - Abstract: Balsamic vinegar (BV) is a typical and valuable Italian product, appreciated worldwide thanks to its characteristic flavors and potential health benefits. Several studies have been conducted to assess physicochemical and microbial compositions of BV, as well as its beneficial properties. Due to highly-disseminated claims of antioxidant, antihypertensive and antiglycemic properties, BV is a known target for frauds and adulterations. For that matter, product authentication, certifying its origin (region or country) and thus the processing conditions, is becoming a growing concern. Striving for fraud reduction as well as quality and safety assurance, reliable analytical strategies to rapidly evaluate BV quality are very interesting, also from an economical point of view. This work employs silica plate laser desorption/ionization mass spectrometry (SP-LDI-MS) for fast chemical profiling of commercial BV samples with protected geographical indication (PGI) and for identification of samples adulterated with low-priced vinegars, namely apple, alcohol and red/white wine vinegars.

  14. Characterization of transcriptional networks in blood stem and progenitor cells using high-throughput single-cell gene expression analysis.

    Science.gov (United States)

    Moignard, Victoria; Macaulay, Iain C; Swiers, Gemma; Buettner, Florian; Schütte, Judith; Calero-Nieto, Fernando J; Kinston, Sarah; Joshi, Anagha; Hannah, Rebecca; Theis, Fabian J; Jacobsen, Sten Eirik; de Bruijn, Marella F; Göttgens, Berthold

    2013-04-01

    Cellular decision-making is mediated by a complex interplay of external stimuli with the intracellular environment, in particular transcription factor regulatory networks. Here we have determined the expression of a network of 18 key haematopoietic transcription factors in 597 single primary blood stem and progenitor cells isolated from mouse bone marrow. We demonstrate that different stem/progenitor populations are characterized by distinctive transcription factor expression states, and through comprehensive bioinformatic analysis reveal positively and negatively correlated transcription factor pairings, including previously unrecognized relationships between Gata2, Gfi1 and Gfi1b. Validation using transcriptional and transgenic assays confirmed direct regulatory interactions consistent with a regulatory triad in immature blood stem cells, where Gata2 may function to modulate cross-inhibition between Gfi1 and Gfi1b. Single-cell expression profiling therefore identifies network states and allows reconstruction of network hierarchies involved in controlling stem cell fate choices, and provides a blueprint for studying both normal development and human disease.
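The positively and negatively correlated transcription-factor pairings above rest on pairwise correlation of expression values across single cells. A minimal Pearson correlation sketch (toy vectors, not the study's measurements):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical expression of two factors across four cells
r_pos = pearson([1, 2, 3, 4], [2, 4, 6, 8])   # co-expressed pair
r_neg = pearson([1, 2, 3, 4], [8, 6, 4, 2])   # cross-inhibition-like pattern
```

In practice such correlations are computed for every factor pair (here 18 factors, 153 pairs) and thresholded to build the network.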

  15. Development of a simple fluorescence-based microplate method for the high-throughput analysis of proline in wine samples.

    Science.gov (United States)

    Robert-Peillard, Fabien; Boudenne, Jean-Luc; Coulomb, Bruno

    2014-05-01

    This paper presents a simple, accurate and multi-sample method for the determination of proline in wines using a 96-well microplate technique. Proline is the most abundant amino acid in wine and is an important parameter related to wine characteristics or maturation processes of grape. In the current study, an improved application of the general method based on sodium hypochlorite oxidation and o-phthaldialdehyde (OPA)-thiol spectrofluorometric detection is described. The main interfering compounds for specific proline detection in wines are strongly reduced by selective reaction with OPA in a preliminary step under well-defined pH conditions. Application of the protocol after a 500-fold dilution of wine samples provides a working range between 0.02 and 2.90 g L⁻¹, with a limit of detection of 7.50 mg L⁻¹. Comparison and validation on real wine samples by ion-exchange chromatography prove that this procedure yields accurate results. The simplicity of the protocol, which requires no centrifugation or filtration, organic solvents or high temperatures, enables its full implementation in plastic microplates and efficient application for routine analysis of proline in wines. Copyright © 2013 Elsevier Ltd. All rights reserved.
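Converting well fluorescence back to proline concentration in the original wine combines a linear calibration with the 500-fold dilution noted above. A sketch with hypothetical calibration parameters (the slope and intercept below are invented for illustration, not taken from the paper):

```python
def proline_g_per_l(fluorescence, slope, intercept, dilution=500):
    """Proline concentration in the original wine from well fluorescence.

    Assumes a hypothetical linear calibration in the diluted well:
    signal = slope * concentration + intercept.
    """
    return (fluorescence - intercept) / slope * dilution

# Hypothetical calibration: slope 100 signal units per (g/L), intercept 5 units
conc = proline_g_per_l(fluorescence=5.4, slope=100.0, intercept=5.0)
# 2.0 g/L, inside the stated 0.02-2.90 g/L working range
```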

  16. Multiplexed labeling of viable cells for high-throughput analysis of glycine receptor function using flow cytometry.

    Science.gov (United States)

    Gilbert, Daniel F; Wilson, John C; Nink, Virginia; Lynch, Joseph W; Osborne, Geoffrey W

    2009-05-01

    Flow cytometry is an important drug discovery tool because it permits high-content multiparameter analysis of individual cells. A recent method dramatically enhanced screening throughput by multiplexing many discrete fixed cell populations; however, that method is not suited to assays requiring functional cellular responses. HEK293 cells were transfected with unique mutant glycine receptors. Mutant receptor expression was confirmed by coexpression of yellow fluorescent protein (YFP). Commercially available cell-permeant dyes were used to label each glycine receptor-expressing mutant line with a unique optical code. All encoded cell lines were combined in a single tube and analyzed on a flow cytometer simultaneously before and after the addition of glycine receptor agonist. We decoded multiplexed cells that expressed functionally distinct glycine receptor chloride channels and analyzed responses to glycine in terms of chloride-sensitive YFP expression. Data provided by flow cytometry can thus be used to discriminate between functional and nonfunctional mutations in the glycine receptor, a process accelerated by the use of multiplexing. Further, these data correlate with data generated using a microscopy-based technique. The present study demonstrates multiplexed labeling of live cells, enabling cell populations to be subjected to further cell culture and experimentation, and compares the results with those obtained using live cell microscopy. (c) 2009 International Society for Advancement of Cytometry.

  17. Adaptation of a fragment analysis technique to an automated high-throughput multicapillary electrophoresis device for the precise qualitative and quantitative characterization of microbial communities.

    Science.gov (United States)

    Trotha, René; Reichl, Udo; Thies, Frank L; Sperling, Danuta; König, Wolfgang; König, Brigitte

    2002-04-01

    The analysis of microbial communities is of increasing importance in the life sciences and bioengineering. Traditional investigation techniques such as culture or cloning methods suffer from many disadvantages: they cannot give a complete qualitative and quantitative view of the total population of microorganisms, their interactions with each other, or their interactions with their environment. The determination of static or dynamic balances among microorganisms is consequently of fast-growing interest. The generation of species-specific, fluorescently labeled 16S ribosomal DNA (rDNA) fragments by the terminal restriction fragment length polymorphism (T-RFLP) technique is a suitable tool to overcome the problems of other methods. For the separation of these fragments, polyacrylamide gel sequencers have until now been preferred over capillary sequencers using linear polymers because of their higher electrophoretic resolution and therefore sizing accuracy. However, modern capillary sequencers, especially multicapillary sequencers, offer a higher grade of automation and the increased throughput necessary for the investigation of complex communities in long-term studies. We therefore adapted a T-RFLP technique to an automated high-throughput multicapillary electrophoresis device (ABI 3100 Genetic Analyzer) with regard to a precise qualitative and quantitative characterization of microbial communities.

  18. Comparative analysis of transcriptomes in aerial stems and roots of Ephedra sinica based on high-throughput mRNA sequencing

    Directory of Open Access Journals (Sweden)

    Taketo Okada

    2016-12-01

    Full Text Available Ephedra plants are taxonomically classified as gymnosperms and are medicinally important as the botanical origin of crude drugs and as bioresources that contain pharmacologically active chemicals. Here we show a comparative analysis of the transcriptomes of aerial stems and roots of Ephedra sinica based on high-throughput mRNA sequencing by RNA-Seq. De novo assembly of short cDNA sequence reads generated 23,358, 13,373, and 28,579 contigs longer than 200 bases from aerial stems, roots, and both aerial stems and roots combined, respectively. The presumed functions encoded by these contig sequences were annotated by BLAST (blastx). Subsequently, these contigs were classified based on gene ontology slims, Enzyme Commission numbers, and the InterPro database. Furthermore, comparative gene expression analysis was performed between aerial stems and roots. These transcriptome analyses revealed differences and similarities between the transcriptomes of aerial stems and roots in E. sinica. Deep transcriptome sequencing of Ephedra should open the door to molecular biological studies based on the entire transcriptome, tissue- or organ-specific transcriptomes, or targeted genes of interest.

  19. Association analysis of toluene exposure time with high-throughput mRNA expressions and methylation patterns using in vivo samples.

    Science.gov (United States)

    Hong, Ji Young; Yu, So Yeon; Kim, Seol Young; Ahn, Jeong Jin; Kim, Youngjoo; Kim, Gi Won; Son, Sang Wook; Park, Jong-Tae; Hwang, Seung Yong

    2016-04-01

    The emission of volatile organic compounds (VOCs) resulting from outdoor air pollution can contribute to major public health problems. However, there has been limited research on the health effects in humans from the inhalation of VOCs. Therefore, this study conducted an in vivo analysis of the effects of toluene, one of the most commonly used chemicals in many industries, on gene expression and methylation over time using the high-throughput technique of microarray analysis. We separated participants into three groups (control, short-term exposure, and long-term exposure) to investigate the influence of toluene exposure time on gene expression. We then comprehensively analyzed and investigated the correlation between variations in gene expression and the occurrence of methylation. Twenty-six genes were upregulated and hypomethylated, while 32 genes were downregulated and hypermethylated. The pathways of these genes were confirmed to be associated with cell survival and the immune system. Based on our findings, these genes can help predict the effects of time-dependent exposure to toluene on human health. Thus, observations from our data may have implications for the identification of biomarkers of toluene exposure. Copyright © 2015. Published by Elsevier Inc.

  20. Automated processing of label-free Raman microscope images of macrophage cells with standardized regression for high-throughput analysis.

    Science.gov (United States)

    Milewski, Robert J; Kumagai, Yutaro; Fujita, Katsumasa; Standley, Daron M; Smith, Nicholas I

    2010-11-19

    Macrophages represent the front lines of our immune system; they recognize and engulf pathogens or foreign particles, thus initiating the immune response. Imaging macrophages presents unique challenges, as most optical techniques require labeling or staining of the cellular compartments in order to resolve organelles, and such stains or labels have the potential to perturb the cell, particularly in cases where incomplete information exists regarding the precise cellular reaction under observation. Label-free imaging techniques such as Raman microscopy are thus valuable tools for studying the transformations that occur in immune cells upon activation, both on the molecular and organelle levels. Due to extremely low signal levels, however, Raman microscopy requires sophisticated image processing techniques for noise reduction and signal extraction. To date, efficient, automated algorithms for resolving sub-cellular features in noisy, multi-dimensional image sets have not been explored extensively. We show that hybrid z-score normalization and standardized regression (Z-LSR) can highlight the spectral differences within the cell and provide image contrast dependent on spectral content. In contrast to typical Raman imaging processing methods using multivariate analysis, such as singular value decomposition (SVD), our implementation of the Z-LSR method can operate in near real-time. In spite of its computational simplicity, Z-LSR can automatically remove background and bias in the signal, improve the resolution of spatially distributed spectral differences and enable sub-cellular features to be resolved in Raman microscopy images of mouse macrophage cells. Significantly, the Z-LSR processed images automatically exhibited subcellular architectures whereas SVD, in general, requires human assistance in selecting the components of interest. The computational efficiency of Z-LSR enables automated resolution of sub-cellular features in large Raman microscopy data sets without
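    The Z-LSR approach described above can be illustrated with a minimal sketch: z-score-normalize each pixel spectrum, then use each pixel's least-squares regression coefficient against a reference spectrum as its contrast value. The choice of the image-mean z-spectrum as reference, and all names here, are assumptions; the paper's exact formulation may differ.

```python
# Minimal sketch of the Z-LSR idea on a Raman hyperspectral image cube.
import numpy as np

def z_lsr_contrast(cube):
    """cube: (rows, cols, bands) intensity array -> (rows, cols) contrast map."""
    rows, cols, bands = cube.shape
    spectra = cube.reshape(-1, bands).astype(float)
    # z-score each pixel spectrum along the spectral axis
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    sd[sd == 0] = 1.0                      # guard flat (background) pixels
    z = (spectra - mu) / sd
    ref = z.mean(axis=0)                   # assumed reference: mean z-spectrum
    # least-squares slope of each pixel's z-spectrum on the reference
    coef = z @ ref / (ref @ ref)
    return coef.reshape(rows, cols)

rng = np.random.default_rng(0)
img = rng.random((4, 5, 64))               # toy 4x5 image with 64 Raman bands
print(z_lsr_contrast(img).shape)           # (4, 5)
```

    Unlike SVD, which decomposes the full data set and then needs a human to pick components, each pixel here gets a single scalar from one dot product, which is why this style of processing can run in near real-time.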

  1. Analysis of the Repertoire Features of TCR Beta Chain CDR3 in Human by High-Throughput Sequencing

    Directory of Open Access Journals (Sweden)

    Xianliang Hou

    2016-07-01

    Full Text Available Background/Aims: To ward off a wide variety of pathogens, the human adaptive immune system harbors a vast array of T-cell receptors, collectively referred to as the TCR repertoire. Assessment of the repertoire features of the TCR is vital for a deeper understanding of immune behaviour and the immune response. Methods: In this study, we used a combination of multiplex PCR, Illumina sequencing and IMGT (ImMunoGeneTics)/HighV-QUEST for a standardized analysis of the repertoire features of the TCR beta chain in the blood of healthy individuals, including the repertoire features of public TCR complementarity-determining region 3 (CDR3) sequences, highly expanded clones, and long TCR CDR3 sequences. Results: We found that public CDR3 sequences and high-frequency sequences shared the same characteristics: both had fewer nucleotide additions and shorter CDR3 lengths, making them closer to the germline sequence. Moreover, our studies provided evidence that public amino acid sequences are produced by multiple nucleotide sequences. Notably, there was skewed VDJ segment usage in long CDR3 sequences; the expression levels of 10 TRβV segments, 7 TRβJ segments and 2 TRβD segments differed significantly between long and short CDR3 sequences. Moreover, we identified that extensive N additions and increased D gene usage contribute to TCR CDR3 length, and observed distinct usage frequencies of amino acids in long CDR3 sequences compared to short CDR3 sequences. Conclusions: Some repertoire features could be observed in the public sequences, highly abundant clones, and long TCR CDR3 sequences, which might be helpful for further study of immune behavior and the immune response.

  2. The OncoFinder algorithm for minimizing the errors introduced by the high-throughput methods of transcriptome analysis

    Directory of Open Access Journals (Sweden)

    Anton A. Buzdin

    2014-08-01

    Full Text Available The diversity of installed sequencing and microarray equipment makes it increasingly difficult to compare and analyze gene expression datasets obtained using the different methods. Many applications requiring high quality and low error rates cannot make use of available data using traditional analytical approaches. Recently, we proposed a new concept for signalome-wide analysis of functional changes in intracellular pathways, termed OncoFinder, a bioinformatic tool for quantitative estimation of signaling pathway activation (SPA). We also developed methods to compare gene expression data obtained using multiple platforms and to minimize the error rates by mapping the gene expression data onto known and custom signaling pathways. This technique makes it possible, for the first time, to analyze the functional features of intracellular regulation on a mathematical basis. In this study we show that the OncoFinder method significantly reduces the errors introduced by transcriptome-wide experimental techniques. We compared the gene expression data for the same biological samples obtained by both next-generation sequencing (NGS) and microarray methods. For these different techniques we demonstrate that there is virtually no correlation between the gene expression values for all datasets analyzed (R2 < 0.1). In contrast, when the OncoFinder algorithm is applied to the data, we observe clear-cut correlations between the NGS and microarray gene expression datasets. The signaling pathway activation profiles obtained using NGS and microarray techniques were almost identical for the same biological samples, allowing for platform-agnostic analytical applications. We conclude that this feature of OncoFinder enables the functional states of transcriptomes and interactomes to be characterized more accurately than before, which makes OncoFinder a method of choice for many applications including genetics, physiology, biomedicine and
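    A simplified sketch of a signaling pathway activation score in the spirit of the approach described above: sum, over the genes of a pathway, each gene's activator/repressor role times the log of its case-to-normal expression ratio. The published OncoFinder method uses additional gene-specific weights omitted here, and the role values and expression data below are invented for illustration.

```python
import math

# Simplified pathway activation score: role is +1 for pathway activators
# and -1 for repressors; the score aggregates log10 case-to-normal ratios.
# (This is a sketch of the general idea, not the published formula.)

def pathway_activation(case, normal, roles):
    score = 0.0
    for gene, role in roles.items():
        cnr = case[gene] / normal[gene]   # case-to-normal expression ratio
        score += role * math.log10(cnr)
    return score

case   = {"A": 200.0, "B": 50.0}          # invented expression values
normal = {"A": 100.0, "B": 100.0}
roles  = {"A": +1, "B": -1}               # A activates, B represses the pathway
print(round(pathway_activation(case, normal, roles), 4))  # 0.6021
```

    Because each gene enters only through the log of an expression ratio, platform-specific scale differences between NGS and microarray measurements tend to cancel, which is one intuition for why pathway-level scores can correlate across platforms even when raw gene values do not.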

  3. BR 07-1 DEVELOPMENT OF THE CELL MICROARRAY FOR HIGH-THROUGHPUT ANALYSIS OF GUT MICROBIOTA.

    Science.gov (United States)

    Hong, Seong-Tshool

    2016-09-01

    The human intestine contains a massive and complex microbial community called the gut microbiota. A typical human carries 100 trillion microbes in his/her body, 10 times the number of host cells, i.e. the whole number of human cells. The combined microbial genome constituting the gut microbiota well exceeds our own human genome. The microbial composition of the gut microbiota and its role in disease have become a booming area of research, presenting a new paradigm of opportunities for modern medicine. Recent evidence showed that the gut microbiota acts as a very important determining factor in the development of almost all complex diseases such as primary hypertension, obesity, depression, diabetes, autism, asthma, bowel diseases, rheumatoid arthritis, systemic lupus erythematosus, Crohn's disease, Parkinson's disease, Alzheimer's disease, epilepsy, schizophrenia, etc. In spite of the significant role of the gut microbiota in the development of complex diseases, the elucidation of the mechanistic pathways by which the gut microbiota contributes to complex diseases is not moving forward as expected. Current methods to identify alterations of the gut microbiota in patients and healthy controls are essentially based on metagenomic sequencing of DNA samples extracted from feces using next-generation sequencing machines. Although the metagenomic sequencing approaches proved the association of the gut microbiota with various complex diseases, those methods failed to accurately pinpoint the etiological agents in the gut microbiota. The metagenomic sequencing approaches are not only difficult to use for identifying the etiological agent of complex diseases at the species level, but also require complex bioinformatic analyses and are expensive. To overcome the current challenges in the analysis of the gut microbiota, we developed a novel cell microarray to analyze the constituent microbial organisms of the gut microbiota accurately and rapidly using a drop of blood.

  4. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  5. Targeted high-throughput sequencing of tagged nucleic acid samples

    OpenAIRE

    Meyer, M.; Stenzel, U.; Myles, S.; Prüfer, K.; Hofreiter, M.

    2007-01-01

    High-throughput 454 DNA sequencing technology allows much faster and more cost-effective sequencing than traditional Sanger sequencing. However, the technology imposes inherent limitations on the number of samples that can be processed in parallel. Here we introduce parallel tagged sequencing (PTS), a simple, inexpensive and flexible barcoding technique that can be used for the parallel sequencing of any number and type of double-stranded nucleic acid samples. We demonstrate that PTS is particularly...

  6. Genome-wide identification and comparative analysis of grafting-responsive mRNA in watermelon grafted onto bottle gourd and squash rootstocks by high-throughput sequencing.

    Science.gov (United States)

    Liu, Na; Yang, Jinghua; Fu, Xinxing; Zhang, Li; Tang, Kai; Guy, Kateta Malangisha; Hu, Zhongyuan; Guo, Shaogui; Xu, Yong; Zhang, Mingfang

    2016-04-01

    Grafting is an important agricultural technique widely used to improve plant growth, yield, and adaptation to biotic or abiotic stresses. However, the molecular mechanisms underlying grafting-induced physiological processes remain unclear. Watermelon (Citrullus lanatus L.) is an important horticultural crop worldwide. The grafting technique is commonly used in watermelon production to improve its tolerance to stresses, especially the soil-borne fusarium wilt disease. In the present study, we used high-throughput sequencing to perform a genome-wide transcript analysis of scions from watermelon grafted onto bottle gourd and squash rootstocks. Our transcriptome and digital gene expression (DGE) profiling data provided insights into the molecular aspects of gene regulation in grafted watermelon. Compared with self-grafted watermelon, 787 and 3485 genes were differentially expressed in watermelon grafted onto bottle gourd and squash rootstocks, respectively. These genes were associated with primary and secondary metabolism, hormone signaling, transcription factors, transporters, and response to stimuli. Grafting led to changes in the expression of these genes, suggesting that they may play important roles in mediating the physiological processes of grafted seedlings. The potential roles of the grafting-responsive mRNAs in diverse biological and metabolic processes are discussed. The data obtained in this study provide an excellent resource for unraveling the mechanisms of candidate gene function in diverse biological processes and in environmental adaptation in a graft system.

  7. BiForce Toolbox: powerful high-throughput computational analysis of gene-gene interactions in genome-wide association studies.

    Science.gov (United States)

    Gyenesei, Attila; Moody, Jonathan; Laiho, Asta; Semple, Colin A M; Haley, Chris S; Wei, Wen-Hua

    2012-07-01

    Genome-wide association studies (GWAS) have discovered many loci associated with common diseases and quantitative traits. However, most GWAS have not studied the gene-gene interactions (epistasis) that could be important in complex trait genetics. A major challenge in analysing epistasis in GWAS is the enormous computational demand of analysing billions of SNP combinations. Several methods have been developed recently to address this, some requiring computers equipped with particular graphics processing units, most restricted to binary disease traits, and all poorly suited to general usage on the most widely used operating systems. We have developed the BiForce Toolbox to address the demand for high-throughput analysis of pairwise epistasis in GWAS of quantitative and disease traits across all commonly used computer systems. BiForce Toolbox is a stand-alone Java program that integrates bitwise computing with multithreaded parallelization and thus allows rapid full pairwise genome scans via a graphical user interface or the command line. Furthermore, BiForce Toolbox incorporates additional tests of interactions involving SNPs with significant marginal effects, potentially increasing the power of detection of epistasis. BiForce Toolbox is easy to use and has been applied in multiple studies of epistasis in large GWAS data sets, identifying interesting interaction signals and pathways.
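    The bitwise computing that makes full pairwise scans tractable can be sketched as follows (a Python illustration of the general trick; BiForce itself is a Java program and its internal encoding may differ): represent each SNP's genotypes as three bit masks over individuals, then build the 3x3 joint-genotype contingency table for a SNP pair with bitwise AND plus popcount instead of looping over individuals.

```python
# Bitwise genotype encoding for fast pairwise epistasis contingency tables.

def encode_snp(genotypes):
    """genotypes: list of 0/1/2 per individual -> three bit masks,
    one per genotype class, with bit i set if individual i has that genotype."""
    masks = [0, 0, 0]
    for i, g in enumerate(genotypes):
        masks[g] |= 1 << i
    return masks

def joint_table(masks_a, masks_b):
    """3x3 joint-genotype counts via bitwise AND and popcount."""
    return [[bin(ma & mb).count("1") for mb in masks_b] for ma in masks_a]

snp1 = encode_snp([0, 1, 2, 1, 0, 2])   # toy genotypes for 6 individuals
snp2 = encode_snp([0, 1, 1, 2, 0, 2])
table = joint_table(snp1, snp2)
print(table)                 # row = SNP1 genotype, col = SNP2 genotype
print(sum(map(sum, table)))  # 6 individuals in total
```

    Each AND-plus-popcount processes one machine word of individuals at a time, which is what turns billions of SNP-pair tests from a per-individual loop into a handful of word-level operations per pair.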

  8. Direct atomic-level observation and chemical analysis of ZnSe synthesized by in situ high-throughput reactive fiber drawing.

    Science.gov (United States)

    Hou, Chong; Jia, Xiaoting; Wei, Lei; Stolyarov, Alexander M; Shapira, Ofer; Joannopoulos, John D; Fink, Yoel

    2013-03-13

    We demonstrate a high-throughput method for synthesizing zinc selenide (ZnSe) in situ during fiber drawing. Central to this method is a thermally activated chemical reaction occurring across multiple interfaces between alternately layered elemental zinc- (Zn-) and selenium- (Se-) rich films embedded in a preform and drawn into meters of fiber at a temperature well below the melting temperature of either Zn or ZnSe. By depositing 50 nm thick layers of Zn interleaved between 1 μm thick Se layers, a controlled breakup of the Zn sheet is achieved, thereby enabling a complete and controlled chemical reaction. The thermodynamics and kinetics of this synthesis process are studied using thermogravimetric analysis and differential scanning calorimetry, and the in-fiber compound is analyzed by a multiplicity of materials characterization tools, including transmission electron microscopy, Raman microscopy, energy-dispersive X-ray spectroscopy, and X-ray diffraction, all resulting in unambiguous identification of ZnSe as the compound produced from the reactive fiber draw. Furthermore, we characterize the in-fiber ZnSe/Se97S3 heterojunction to demonstrate the prospect of ZnSe-based fiber optoelectronic devices. The ability to synthesize new compounds during fiber drawing at nanometer scale precision and to characterize them at the atomic-level extends the architecture and materials selection compatible with multimaterial fiber drawing, thus paving the way toward more complex and sophisticated functionality.

  9. High-throughput pyrosequencing analysis of bacteria relevant to cometabolic and metabolic degradation of ibuprofen in horizontal subsurface flow constructed wetlands.

    Science.gov (United States)

    Li, Yifei; Wu, Bing; Zhu, Guibing; Liu, Yu; Ng, Wun Jern; Appan, Adhityan; Tan, Soon Keat

    2016-08-15

    The potential toxicity of pharmaceutical residues, including ibuprofen, to aquatic vertebrates and invertebrates has attracted growing attention to pharmaceutical pollution control using constructed wetlands, but insight into the relevant microbial degradation mechanisms is lacking. This study investigated the bacteria associated with the cometabolic and metabolic degradation of ibuprofen in a horizontal subsurface flow constructed wetland system by high-throughput pyrosequencing analysis. The ibuprofen degradation dynamics, bacterial diversity and evenness, and bacterial community structure in a planted bed with Typha angustifolia and an unplanted bed (control) were compared. The results showed that the plants promoted the microbial degradation of ibuprofen, especially in the downstream zones of the wetland. However, in the upstream one-third zone of the wetland, the presence of plants did not significantly enhance ibuprofen degradation, probably because the cometabolic activity of certain non-ibuprofen-degrading microorganisms contributed much more than the plants. By analyzing bacterial characteristics, we found that: (1) the aerobic species of the family Flavobacteriaceae, the family Methylococcaceae and the genus Methylocystis, and the anaerobic species of the family Spirochaetaceae and the genus Clostridium_sensu_stricto were the most likely bacteria involved in the cometabolic degradation of ibuprofen; and (2) the family Rhodocyclaceae and the genus Ignavibacterium, closely related to the plants, appeared to be associated with the metabolic degradation of ibuprofen.

  10. High-Throughput Analysis of Methylmalonic Acid in Serum, Plasma, and Urine by LC-MS/MS. Method for Analyzing Isomers Without Chromatographic Separation.

    Science.gov (United States)

    Kushnir, Mark M; Nelson, Gordon J; Frank, Elizabeth L; Rockwood, Alan L

    2016-01-01

    Measurement of methylmalonic acid (MMA) plays an important role in the diagnosis of vitamin B12 deficiency. Vitamin B12 is an essential cofactor for the enzymatic carbon rearrangement of methylmalonyl-CoA (MMA-CoA) to succinyl-CoA (SA-CoA), and the lack of vitamin B12 leads to elevated concentrations of MMA. The presence of succinic acid (SA) complicates the analysis because the mass spectra of MMA and SA are indistinguishable when analyzed in negative ion mode, and the peaks are difficult to resolve chromatographically. We developed a method for the selective analysis of MMA that exploits the significant difference in fragmentation patterns of di-butyl derivatives of the isomers MMA and SA in a tandem mass spectrometer when analyzed in positive ion mode. Tandem mass spectra of di-butyl derivatives of MMA and SA are very distinct; this allows selective analysis of MMA in the presence of SA. The instrumental analysis is performed using liquid chromatography-tandem mass spectrometry (LC-MS/MS) in positive ion mode, which, in combination with selective extraction of acidic compounds, is highly selective for organic acids with multiple carboxyl groups (dicarboxylic, tricarboxylic, etc.). In this method organic acids with a single carboxyl group are virtually undetectable in the mass spectrometer; the only organic acid, other than MMA, that is detected by this method is its isomer, SA. Quantitative measurement of MMA in this method is performed using a deconvolution algorithm, which mathematically resolves the signal corresponding to MMA and does not require chromatographic resolution of the MMA and SA peaks. Because of its high selectivity, the method utilizes isocratic chromatographic separation; reconditioning and re-equilibration of the chromatographic column between injections is unnecessary. The above features of the method allow high-throughput analysis of MMA with an analysis cycle time of 1 min.
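    The deconvolution step can be illustrated with a hedged sketch: if two MS/MS product-ion channels respond to MMA and SA with known response factors (calibrated from pure standards), the measured intensities are a linear combination of the two amounts, and a 2x2 linear solve recovers both without chromatographic separation. The response-factor values below are invented, and the paper's actual deconvolution algorithm may differ in detail.

```python
# Resolving two co-eluting isomers from two detection channels by solving
# a 2x2 linear system: [y1, y2] = R @ [mma, sa].

def deconvolve(y1, y2, R):
    """R: 2x2 response matrix, rows = channels, cols = (MMA, SA)."""
    (a, b), (c, d) = R
    det = a * d - b * c
    mma = (d * y1 - b * y2) / det
    sa = (a * y2 - c * y1) / det
    return mma, sa

# Invented response factors: channel 1 mostly sees MMA, channel 2 mostly SA,
# each with a small cross-contribution from the other isomer.
R = [(1.0, 0.2),
     (0.1, 1.0)]
mma, sa = deconvolve(1.4, 2.1, R)   # two measured channel intensities
print(round(mma, 3), round(sa, 3))  # 1.0 2.0
```

    Because the algebra, not the column, separates the isomers, the chromatography only has to deliver the pair in one peak, which is what allows the isocratic 1 min cycle time.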

  11. Inferential literacy for experimental high-throughput biology.

    Science.gov (United States)

    Miron, Mathieu; Nadon, Robert

    2006-02-01

    Many biologists believe that data analysis expertise lags behind the capacity for producing high-throughput data. One view within the bioinformatics community is that biological scientists need to develop algorithmic skills to meet the demands of the new technologies. In this article, we argue that the broader concept of inferential literacy, which includes understanding of data characteristics, experimental design and statistical analysis, in addition to computation, more adequately encompasses what is needed for efficient progress in high-throughput biology.

  12. An efficient sample preparation method for high-throughput analysis of 15(S)-8-iso-PGF2α in plasma and urine by enzyme immunoassay.

    Science.gov (United States)

    Bielecki, A; Saravanabhavan, G; Blais, E; Vincent, R; Kumarathasan, P

    2012-01-01

    Although several methods have been reported for the analysis of the oxidative stress marker 15(S)-8-iso-prostaglandin-F2alpha (8-iso-PGF2α) in biological fluids, they either involve extensive sample preparation and costly technology or require high sample volumes. This study presents a sample preparation method that uses a low sample volume for 8-iso-PGF2α analysis in plasma and urine by an enzyme immunoassay (EIA). In brief, 8-iso-PGF2α in deproteinized plasma or native urine is complexed with an antibody and then captured by molecular weight cut-off filtration. This method was compared with two other sample preparation methods typically used in the analysis of 8-iso-PGF2α by EIA: Cayman's affinity column purification method and solid-phase extraction on C-18. The immunoaffinity purification method described here was superior to the other two sample preparation methods and yielded recovery values of 99.8 and 54.1% for 8-iso-PGF2α in plasma and urine, respectively. Analytical precision (relative standard deviation) was ±5% for plasma and ±15% for urine. The analysis of healthy human plasma and urine yielded basal 8-iso-PGF2α levels of 31.8 ± 5.5 pg/mL and 2.9 ± 2.0 ng/mg creatinine, respectively. The robustness and analytical performance of this method make it a promising tool for high-throughput screening of biological samples for 8-iso-PGF2α.

  13. Immunoglobulin G (IgG) Fab glycosylation analysis using a new mass spectrometric high-throughput profiling method reveals pregnancy-associated changes.

    Science.gov (United States)

    Bondt, Albert; Rombouts, Yoann; Selman, Maurice H J; Hensbergen, Paul J; Reiding, Karli R; Hazes, Johanna M W; Dolhain, Radboud J E M; Wuhrer, Manfred

    2014-11-01

    The N-linked glycosylation of the constant fragment (Fc) of immunoglobulin G has been shown to change during pathological and physiological events and to strongly influence antibody inflammatory properties. In contrast, little is known about Fab-linked N-glycosylation, carried by ∼ 20% of IgG. Here we present a high-throughput workflow to analyze Fab and Fc glycosylation of polyclonal IgG purified from 5 μl of serum. We were able to detect and quantify 37 different N-glycans by means of MALDI-TOF-MS analysis in reflectron positive mode using a novel linkage-specific derivatization of sialic acid. This method was applied to 174 samples of a pregnancy cohort to reveal Fab glycosylation features and their change with pregnancy. Data analysis revealed marked differences between Fab and Fc glycosylation, especially in the levels of galactosylation and sialylation, incidence of bisecting GlcNAc, and presence of high mannose structures, which were all higher in the Fab portion than the Fc, whereas Fc showed higher levels of fucosylation. Additionally, we observed several changes during pregnancy and after delivery. Fab N-glycan sialylation was increased and bisection was decreased relative to postpartum time points, and nearly complete galactosylation of Fab glycans was observed throughout. Fc glycosylation changes were similar to results described before, with increased galactosylation and sialylation and decreased bisection during pregnancy. We expect that the parallel analysis of IgG Fab and Fc, as set up in this paper, will be important for unraveling roles of these glycans in (auto)immunity, which may be mediated via recognition by human lectins or modulation of antigen binding.

  14. Line-edge quality optimization of electron beam resist for high-throughput character projection exposure utilizing atomic force microscope analysis

    Science.gov (United States)

    Ikeno, Rimon; Mita, Yoshio; Asada, Kunihiro

    2017-04-01

    High-throughput electron-beam lithography (EBL) by character projection (CP) and variable-shaped beam (VSB) methods is a promising technique for low-to-medium-volume device fabrication with regularly arranged layouts, such as standard-cell logic and memory arrays. However, non-VLSI applications like MEMS and MOEMS may not fully benefit from the CP method because of their wide variety of layout figures, including curved and oblique edges. In addition, the stepwise shapes that appear on such irregular edges with VSB exposure often result in intolerable edge roughness, which may degrade the performance of the fabricated devices. In our previous study, we proposed a general EBL methodology for such applications utilizing a combination of CP and VSB methods, and demonstrated its capability for electron beam (EB) shot reduction and edge-quality improvement using a leading-edge EB exposure tool, the ADVANTEST F7000S-VD02, and high-resolution hydrogen silsesquioxane resist. Both scanning electron microscope and atomic force microscope observations were used to analyze the quality of the resist edge profiles and to determine the influence of the control parameters used in the exposure-data preparation process. In this study, we carried out a detailed Fourier analysis of the captured edge profiles and successfully distinguished the systematic undulation caused by the exposed CP character profiles from random roughness components. Such precise edge-roughness analysis helps our EBL methodology maintain both line-edge quality and exposure throughput by optimizing the control parameters in the layout data conversion.

  15. A cost-effectiveness analysis of two different antimicrobial stewardship programs.

    Science.gov (United States)

    Okumura, Lucas Miyake; Riveros, Bruno Salgado; Gomes-da-Silva, Monica Maria; Veroneze, Izelandia

    2016-01-01

    There is a lack of formal economic analyses assessing the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study of two different antimicrobial stewardship strategies. A 30-day Markov model was developed to analyze the cost-effectiveness of a Bundled Antimicrobial Stewardship Program implemented in a university hospital in Brazil. Clinical data were derived from a historical cohort that compared two different antimicrobial stewardship strategies, with 30-day mortality as the main outcome. Selected costs included workload, cost of defined daily doses, length of stay, and the laboratory and imaging resources used to diagnose infections. Data were analyzed by deterministic and probabilistic sensitivity analysis to assess the model's robustness, with a tornado diagram and a cost-effectiveness acceptability curve. The Bundled Strategy was more expensive (cost difference US$ 2119.70) but more efficient (US$ 27,549.15 vs US$ 29,011.46). Deterministic and probabilistic sensitivity analyses suggested that the critical variables did not alter the final incremental cost-effectiveness ratio. The Bundled Strategy had a higher probability of being cost-effective, which was endorsed by the cost-effectiveness acceptability curve. As health systems call for efficient technologies, this study concludes that the Bundled Antimicrobial Stewardship Program was more cost-effective, meaning that stewardship strategies with such characteristics are of special interest from a societal and clinical perspective.
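
    Comparisons like this reduce to the standard incremental cost-effectiveness ratio, ICER = ΔCost/ΔEffect. A minimal sketch; the cost difference echoes the abstract, but the mortality inputs are invented for illustration and are not the study's model parameters:

    ```python
    def icer(cost_new, cost_old, effect_new, effect_old):
        """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
        return (cost_new - cost_old) / (effect_new - effect_old)

    # Illustrative inputs: a bundled strategy costing $2119.70 more per patient
    # while raising 30-day survival by 5 percentage points (hypothetical).
    ratio = icer(cost_new=10119.70, cost_old=8000.00,
                 effect_new=0.90, effect_old=0.85)
    print(round(ratio, 2))  # -> 42394.0 dollars per additional survivor
    ```

    A strategy is then judged cost-effective when this ratio falls below the decision maker's willingness-to-pay threshold, which is what the acceptability curve summarizes across thresholds.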

  16. Cost effectiveness analysis of strategies for tuberculosis control in developing countries

    NARCIS (Netherlands)

    K. Floyd (Katherine); C. Dye; R.M.P.M. Baltussen (Rob)

    2005-01-01

    textabstractOBJECTIVE: To assess the costs and health effects of tuberculosis control interventions in Africa and South East Asia in the context of the millennium development goals. DESIGN: Cost effectiveness analysis based on an epidemiological model. SETTING: Analyses undertaken

  17. Antibiotic prophylaxis for haematogenous bacterial arthritis in patients with joint disease: a cost effectiveness analysis

    NARCIS (Netherlands)

    P. Krijnen (Pieta); C.J. Kaandorp; E.W. Steyerberg (Ewout); D. van Schaardenburg (Dirkjan); H.J. Moens; J.D.F. Habbema (Dik)

    2001-01-01

    textabstractOBJECTIVE: To assess the cost effectiveness of antibiotic prophylaxis for haematogenous bacterial arthritis in patients with joint disease. METHODS: In a decision analysis, data from a prospective study on bacterial arthritis in 4907 patients with joint dise

  18. C. elegans in high-throughput drug discovery

    OpenAIRE

    O’Reilly, Linda P.; Cliff J Luke; Perlmutter, David H.; Silverman, Gary A.; Pak, Stephen C.

    2013-01-01

    C. elegans has proven to be a useful model organism for investigating molecular and cellular aspects of numerous human diseases. More recently, investigators have explored the use of this organism as a tool for drug discovery. Although earlier drug screens were labor-intensive and low in throughput, recent advances in high-throughput liquid workflows, imaging platforms and data analysis software have made C. elegans a viable option for automated high-throughput drug screens. This review will ...

  19. High-throughput RNA sequencing-based virome analysis of 50 lymphoma cell lines from the Cancer Cell Line Encyclopedia project.

    Science.gov (United States)

    Cao, Subing; Strong, Michael J; Wang, Xia; Moss, Walter N; Concha, Monica; Lin, Zhen; O'Grady, Tina; Baddoo, Melody; Fewell, Claire; Renne, Rolf; Flemington, Erik K

    2015-01-01

    Using high-throughput RNA sequencing data from 50 common lymphoma cell culture models from the Cancer Cell Line Encyclopedia project, we performed an unbiased global interrogation for the presence of a panel of 740 viruses and strains known to infect human and other mammalian cells. This confirmed previously identified infections by Epstein-Barr virus (EBV), Kaposi's sarcoma herpesvirus (KSHV), and human T-lymphotropic virus type 1 (HTLV-1). In addition, we found a previously unreported infection of one cell line (DEL) with a murine leukemia virus (MuLV). High expression of MuLV transcripts was observed in DEL cells, and we identified four transcriptionally active integration sites, one in the TNFRSF6B gene. We also found low levels of MuLV reads in a number of other cell lines and provide evidence suggesting cross-contamination during sequencing. Analysis of HTLV-1 integrations in two cell lines, HuT 102 and MJ, identified 14 and 66 transcriptionally active integration sites, respectively, with potentially activating integrations in immune regulatory genes, including interleukin-15 (IL-15), IL-6ST, STAT5B, HIVEP1, and IL-9R. Although KSHV and EBV do not typically integrate into the genome, we investigated a previously identified integration of EBV into the BACH2 locus in Raji cells. This analysis identified a BACH2 disruption mechanism involving splice-donor sequestration. Through viral gene expression analysis, we detected expression of stable intronic RNAs from the EBV BamHI W repeats that may be part of long transcripts spanning the repeat region. We also observed transcripts at the EBV vIL-10 locus exclusively in the Hodgkin's lymphoma cell line Hs 611.T, whose expression was uncoupled from other lytic genes. Assessment of the KSHV viral transcriptome in BCP-1 cells showed expression of the viral immune regulators K2/vIL-6, K4/vIL-8-like vCCL1, and K5/E2-ubiquitin ligase 1 that was significantly higher than expression of
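
    The unbiased screening step (matching RNA-seq reads against a panel of viral genomes) can be caricatured as k-mer lookup against a reference index. The sequences and names below are toy data invented for illustration; production pipelines use full read aligners rather than exact k-mer matching:

    ```python
    def build_kmer_index(references, k=8):
        """Map each k-mer to the set of reference (virus) names containing it."""
        index = {}
        for name, seq in references.items():
            for i in range(len(seq) - k + 1):
                index.setdefault(seq[i:i + k], set()).add(name)
        return index

    def screen_reads(reads, index, k=8, min_hits=2):
        """Count reads supporting each reference (>= min_hits shared k-mers)."""
        counts = {}
        for read in reads:
            hits = {}
            for i in range(len(read) - k + 1):
                for name in index.get(read[i:i + k], ()):
                    hits[name] = hits.get(name, 0) + 1
            for name, n in hits.items():
                if n >= min_hits:
                    counts[name] = counts.get(name, 0) + 1
        return counts

    # Toy reference genomes and reads (all sequences are made up).
    refs = {"EBV-like": "ACGTACGTGGTTAACCGGATCCTTGACA",
            "MuLV-like": "TTGCAAGGCTTCGGAATTCCGGTACGAA"}
    idx = build_kmer_index(refs)
    reads = ["ACGTACGTGGTTAACC", "TTGCAAGGCTTCGGAA", "CCCCCCCCCCCCCCCC"]
    print(screen_reads(reads, idx))  # the third read matches neither reference
    ```

    Relative read counts per reference, normalized for genome length and sequencing depth, are what distinguish a genuine infection from low-level cross-contamination.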

  20. Comparing Usage and Cost- Effectiveness Analysis of English Printed and Electronic Books for University of Tehran

    OpenAIRE

    Davoud Haseli; Nader Naghshineh; fatemeh Fahimnia

    2014-01-01

    Libraries operate in a competitive environment and essentially need to prove their benefits to stakeholders, continuously evaluating and comparing the advantages of printed and electronic resources. For this purpose, economic evaluation methods such as cost-effectiveness analysis are among the best approaches, because they comprehensively study both the use and the cost of library resources. The purpose of this study is to examine the use and cost-effectiveness of English printed and ebo...

  1. COST-EFFECTIVENESS ANALYSIS OF ANTI-DIABETIC THERAPY IN A UNIVERSITY TEACHING HOSPITAL

    OpenAIRE

    Giwa Abdulganiyu; Tayo Fola

    2014-01-01

    Purpose: To conduct a cost-effectiveness analysis of anti-diabetic therapy in a University Teaching Hospital in 2010. Methods: A retrospective review of selected case-notes was conducted. The World Health Organization Defined Daily Dose method of evaluating drug use, combined with a probability method for estimating the potential effectiveness of anti-diabetic therapeutic options from the literature, was employed to determine the cost-effectiveness of each anti-diabetic therapeutic option identified from anti-diabetic dru...

  2. Automated High Throughput Drug Target Crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, through fully automated crystallization and rapid data collection, to highly efficient structure determination methods. A thoroughly designed automation technology platform, supported by a powerful informatics infrastructure, forms the basis for optimal workflow implementation and for the data mining and analysis tools needed to generate new leads from experimental protein drug target structures.

  3. Associations between the human intestinal microbiota, Lactobacillus rhamnosus GG and serum lipids indicated by integrated analysis of high-throughput profiling data

    Directory of Open Access Journals (Sweden)

    Leo Lahti

    2013-02-01

    Full Text Available Accumulating evidence indicates that the intestinal microbiota regulates our physiology and metabolism. Bacteria marketed as probiotics confer health benefits that may arise from their ability to affect the microbiota. Here, high-throughput screening of the intestinal microbiota was carried out and integrated with serum lipidomic profiling data to study the impact of probiotic intervention on the intestinal ecosystem and to explore the associations between the intestinal bacteria and serum lipids. We performed a comprehensive intestinal microbiota analysis using a phylogenetic microarray before and after Lactobacillus rhamnosus GG intervention. While a specific increase in the L. rhamnosus-related bacteria was observed during the intervention, no other changes in the composition or stability of the microbiota were detected. After the intervention, lactobacilli returned to their initial levels. As previously reported, the serum lipid profiles also remained unaltered during the intervention. Based on a high-resolution microbiota analysis, intake of L. rhamnosus GG did not modify the composition of the intestinal ecosystem in healthy adults, indicating that probiotics confer their health effects by other mechanisms. The most prevalent association between the gut microbiota and lipid profiles was a strong positive correlation between uncultured phylotypes of the Ruminococcus gnavus group and polyunsaturated serum triglycerides of dietary origin. Moreover, a positive correlation was detected between serum cholesterol and Collinsella (Coriobacteriaceae). These associations, identified with the spectrometric lipidome profiling, were corroborated by enzymatically determined cholesterol and triglyceride levels. Actinomycetaceae correlated negatively with triglycerides of highly unsaturated fatty acids, while a set of Proteobacteria showed negative correlation with ether phosphatidylcholines. Our results suggest that several members of the Firmicutes

  4. Associations between the human intestinal microbiota, Lactobacillus rhamnosus GG and serum lipids indicated by integrated analysis of high-throughput profiling data.

    Science.gov (United States)

    Lahti, Leo; Salonen, Anne; Kekkonen, Riina A; Salojärvi, Jarkko; Jalanka-Tuovinen, Jonna; Palva, Airi; Orešič, Matej; de Vos, Willem M

    2013-01-01

    Accumulating evidence indicates that the intestinal microbiota regulates our physiology and metabolism. Bacteria marketed as probiotics confer health benefits that may arise from their ability to affect the microbiota. Here, high-throughput screening of the intestinal microbiota was carried out and integrated with serum lipidomic profiling data to study the impact of probiotic intervention on the intestinal ecosystem and to explore the associations between the intestinal bacteria and serum lipids. We performed a comprehensive intestinal microbiota analysis using a phylogenetic microarray before and after Lactobacillus rhamnosus GG intervention. While a specific increase in the L. rhamnosus-related bacteria was observed during the intervention, no other changes in the composition or stability of the microbiota were detected. After the intervention, lactobacilli returned to their initial levels. As previously reported, the serum lipid profiles also remained unaltered during the intervention. Based on a high-resolution microbiota analysis, intake of L. rhamnosus GG did not modify the composition of the intestinal ecosystem in healthy adults, indicating that probiotics confer their health effects by other mechanisms. The most prevalent association between the gut microbiota and lipid profiles was a strong positive correlation between uncultured phylotypes of the Ruminococcus gnavus group and polyunsaturated serum triglycerides of dietary origin. Moreover, a positive correlation was detected between serum cholesterol and Collinsella (Coriobacteriaceae). These associations, identified with the spectrometric lipidome profiling, were corroborated by enzymatically determined cholesterol and triglyceride levels. Actinomycetaceae correlated negatively with triglycerides of highly unsaturated fatty acids, while a set of Proteobacteria showed negative correlation with ether phosphatidylcholines. Our results suggest that several members of the Firmicutes, Actinobacteria and

  5. Improvement of High-throughput Genotype Analysis After Implementation of a Dual-curve Sybr Green I-based Quantification and Normalization Procedure

    Science.gov (United States)

    The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...

  6. Analysis of the effects of five factors relevant to in vitro chondrogenesis of human mesenchymal stem cells using factorial design and high throughput mRNA-profiling.

    Directory of Open Access Journals (Sweden)

    Rune B Jakobsen

    Full Text Available The in vitro process of chondrogenic differentiation of mesenchymal stem cells for tissue engineering has been shown to require three-dimensional culture along with the addition of differentiation factors to the culture medium. In general, this leads to a phenotype lacking some of the cardinal features of native articular chondrocytes and their extracellular matrix. The factors used vary but regularly include members of the transforming growth factor β superfamily and dexamethasone, sometimes in conjunction with fibroblast growth factor 2 and insulin-like growth factor 1; however, the use of soluble factors to induce chondrogenesis has largely been studied on a single-factor basis. In the present study we combined a factorial quality-by-design experiment with high-throughput mRNA profiling of a customized chondrogenesis-related gene set as a tool to study in vitro chondrogenesis of human bone-marrow-derived mesenchymal stem cells in alginate. 48 different combinations of transforming growth factor β 1, 2 and 3, bone morphogenetic protein 2, 4 and 6, dexamethasone, insulin-like growth factor 1, fibroblast growth factor 2, and cell seeding density were included in the experiment. The analysis revealed that the best of the tested differentiation cocktails included transforming growth factor β 1 and dexamethasone. Dexamethasone acted in synergy with transforming growth factor β 1 by increasing many chondrogenic markers while directly downregulating expression of the pro-osteogenic gene osteocalcin. However, all factors beneficial to the expression of desirable hyaline cartilage markers also induced undesirable molecules, indicating that perfect chondrogenic differentiation is not achievable with current differentiation protocols.
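
    The factorial logic of such an experiment can be illustrated with a two-level full factorial design, where each factor's main effect is the mean response at its high setting minus the mean at its low setting. The response function below is invented for illustration; it is not the paper's fitted model:

    ```python
    from itertools import product

    def main_effects(factors, response):
        """Main effect of each factor in a 2-level full factorial design:
        mean response at the +1 level minus mean response at the -1 level."""
        runs = list(product([-1, 1], repeat=len(factors)))
        effects = {}
        for j, name in enumerate(factors):
            hi = [response(r) for r in runs if r[j] == 1]
            lo = [response(r) for r in runs if r[j] == -1]
            effects[name] = sum(hi) / len(hi) - sum(lo) / len(lo)
        return effects

    # Hypothetical response surface: TGF-beta1 and dexamethasone raise a
    # chondrogenic marker, with a synergy term; FGF2 has no main effect.
    def marker(run):
        tgfb1, dex, fgf2 = run
        return 10 + 3 * tgfb1 + 2 * dex + 1.5 * tgfb1 * dex

    effects = main_effects(["TGFB1", "DEX", "FGF2"], marker)
    print(effects)  # -> {'TGFB1': 6.0, 'DEX': 4.0, 'FGF2': 0.0}
    ```

    The full factorial enumerates every factor combination, so interaction terms (such as the TGF-β1/dexamethasone synergy reported above) can be estimated the same way from products of factor columns.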

  7. A comparison of sorptive extraction techniques coupled to a new quantitative, sensitive, high throughput GC-MS/MS method for methoxypyrazine analysis in wine.

    Science.gov (United States)

    Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E

    2016-02-01

    Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng L(-1) range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines, IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine), have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng L(-1) in wine for IBMP), highly sensitive analytical methods are necessary to quantify methoxypyrazines at trace levels. Here we were able to achieve resolution of IBMP, as well as IPMP, EMP, and SBMP, from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques, HS-SPME (headspace solid-phase microextraction), SBSE (stir-bar sorptive extraction), and HSSE (headspace sorptive extraction), were validated and compared. A 30 min extraction time was used for the HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQs) at or below 1 ng L(-1) for all four methoxypyrazines analyzed, i.e., LOQs at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program.

  8. High-Throughput Analysis of the T Cell Receptor Beta Chain Repertoire in PBMCs from Chronic Hepatitis B Patients with HBeAg Seroconversion

    Directory of Open Access Journals (Sweden)

    Yachao Qu

    2016-01-01

    Full Text Available T lymphocytes are the most important immune cells that affect both the development and treatment of hepatitis B. We used high-throughput sequencing to determine the diversity in the V and J regions of the TCRβ chain in 4 chronic hepatitis B patients before and after HBeAg seroconversion. Here, we demonstrate that the 4 patients expressed Vβ12-4 at the highest frequencies of 10.6%, 9.2%, 17.5%, and 7.5%, and Vβ28 was the second most common, with frequencies of 7.8%, 6.7%, 5.3%, and 10.9%, respectively. No significant changes were observed following seroconversion. With regard to the Jβ gene, Jβ2-1 was the most commonly expressed in the 4 patients at frequencies of 5.8%, 6.5%, 11.3%, and 7.3%, respectively. Analysis of the V-J region genes revealed several differences, including significant increases in the expression levels of V7-2-01-J2-1, V12-4-J1-1, and V28-1-J1-5 and a decrease in that of V19-01-J2-3. These results illustrate the presence of biased TCRVβ and Jβ gene expression in the chronic hepatitis B patients. TRBVβ12-4, Vβ28, Jβ2-1, V7-2-01-J2-1, V12-4-J1-1, and V28-1-J1-5 may be associated with the development and treatment of CHB.

  9. High-Throughput Analysis of the T Cell Receptor Beta Chain Repertoire in PBMCs from Chronic Hepatitis B Patients with HBeAg Seroconversion

    Science.gov (United States)

    Huang, Yong; Liu, Di; Huang, Yinuo; Zhang, Zhiyi

    2016-01-01

    T lymphocytes are the most important immune cells that affect both the development and treatment of hepatitis B. We used high-throughput sequencing to determine the diversity in the V and J regions of the TCRβ chain in 4 chronic hepatitis B patients before and after HBeAg seroconversion. Here, we demonstrate that the 4 patients expressed Vβ12-4 at the highest frequencies of 10.6%, 9.2%, 17.5%, and 7.5%, and Vβ28 was the second most common, with frequencies of 7.8%, 6.7%, 5.3%, and 10.9%, respectively. No significant changes were observed following seroconversion. With regard to the Jβ gene, Jβ2-1 was the most commonly expressed in the 4 patients at frequencies of 5.8%, 6.5%, 11.3%, and 7.3%, respectively. Analysis of the V-J region genes revealed several differences, including significant increases in the expression levels of V7-2-01-J2-1, V12-4-J1-1, and V28-1-J1-5 and a decrease in that of V19-01-J2-3. These results illustrate the presence of biased TCRVβ and Jβ gene expression in the chronic hepatitis B patients. TRBVβ12-4, Vβ28, Jβ2-1, V7-2-01-J2-1, V12-4-J1-1, and V28-1-J1-5 may be associated with the development and treatment of CHB. PMID:27818694

  10. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data [version 3; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Damien Correia

    2016-12-01

    Full Text Available The detection and characterization of emerging infectious agents is a continuing public health concern. High-Throughput Sequencing (HTS) or Next-Generation Sequencing (NGS) technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS), solely to provide perspective on the available information. We developed an easily deployable web interface facilitating the management and bioinformatics analysis of metagenomic data samples. It was engineered to run associated, dedicated Galaxy workflows for the detection and eventual classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows and facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allowing users to determine their relative abundance and associate them with the most closely related organism or pathogen. The user-friendly Django-based interface associates the users' input data and its metadata with a set of bio-IT resources (a Galaxy instance, and both sufficient storage and grid computing power). Galaxy is used to handle and analyze the user's input data, from loading, through indexing, mapping and assembly, to database searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy's main features. Metadata about samples and runs, as well as the workflow results, are stored in the LIMS. For metagenomic classification and exploration purposes, we show, as a proof of concept, that integration

  11. High-Throughput Single-Particle Analysis of Metal-Enhanced Fluorescence in Free Solution Using Ag@SiO2 Core-Shell Nanoparticles.

    Science.gov (United States)

    Yan, Ya; Meng, Lingyan; Zhang, Wenqiang; Zheng, Yan; Wang, Shuo; Ren, Bin; Yang, Zhilin; Yan, Xiaomei

    2017-09-22

    Metal-enhanced fluorescence (MEF) based on localized surface plasmon resonance (LSPR) is an effective strategy to increase detection sensitivity in biotechnology and biomedicine. Because plasmonic nanoparticles are intrinsically heterogeneous, high-throughput single-particle analysis of MEF in free solution is highly demanded for the mechanistic understanding and control of this nanoscale process. Here, we report the application of a laboratory-built high-sensitivity flow cytometer (HSFCM) to investigate the fluorescence-enhancing effect of individual plasmonic nanoparticles on nearby fluorophore molecules. Ag@SiO2 core-shell nanoparticles were used as the model system; these comprised a silver core, a silica shell, and an FITC-doped thin outer layer of silica. FITC-doped silica nanoparticles of the same particle size but without a silver core were used as the counterparts. Both the side scattering and fluorescence signals of single nanoparticles in suspension were measured simultaneously by the HSFCM at a speed of thousands of particles per minute. The roles of silver core size (40-100 nm) and fluorophore-metal distance (5-30 nm) were systematically examined. A fluorescence enhancement factor exceeding 30 was observed at a silver core size of 70 nm and a silica shell thickness of 5 nm. Compared with ensemble-averaged spectrofluorometric measurements, our experimental observations at the single-particle level were well supported by finite-difference time-domain (FDTD) calculations. This allows us to achieve a fundamental understanding of MEF, which is important for the design and control of plasmonic nanostructures for efficient fluorescence enhancement.

  12. Mapping whole-brain activity with cellular resolution by light-sheet microscopy and high-throughput image analysis (Conference Presentation)

    Science.gov (United States)

    Silvestri, Ludovico; Rudinskiy, Nikita; Paciscopi, Marco; Müllenbroich, Marie Caroline; Costantini, Irene; Sacconi, Leonardo; Frasconi, Paolo; Hyman, Bradley T.; Pavone, Francesco S.

    2016-03-01

    Mapping neuronal activity patterns across the whole brain with cellular resolution is a challenging task for state-of-the-art imaging methods. Indeed, despite a number of technological efforts, quantitative cellular-resolution activation maps of the whole brain have not yet been obtained. Many techniques are limited by coarse resolution or by a narrow field of view. High-throughput imaging methods, such as light-sheet microscopy, can be used to image large specimens with high resolution and in reasonable times. However, the bottleneck is then moved from image acquisition to image analysis, since many terabytes of data have to be processed to extract meaningful information. Here, we present a full experimental pipeline to quantify neuronal activity in the entire mouse brain with cellular resolution, based on a combination of genetics, optics and computer science. We used a transgenic mouse strain (Arc-dVenus mouse) in which neurons that have been active in the hours before brain fixation are fluorescently labelled. Samples were cleared with CLARITY and imaged with a custom-made confocal light-sheet microscope. To automatically localize fluorescent cells in the large images produced, we used a novel computational approach called semantic deconvolution. The combined approach presented here allows quantification of the number of Arc-expressing neurons throughout the whole mouse brain. When applied to cohorts of mice subjected to different stimuli and/or environmental conditions, this method helps find correlations in activity between different neuronal populations, opening the possibility of inferring a sort of brain-wide 'functional connectivity' with cellular resolution.

  13. Cost-effectiveness analysis of interventions for migraine in four low- and middle-income countries

    OpenAIRE

    Linde, Mattias; Steiner, Timothy J.; Chisholm, Dan

    2015-01-01

    Background: Evidence of the cost and effects of interventions for reducing the global burden of migraine remains scarce. Our objective was to estimate the population-level cost-effectiveness of evidence-based migraine interventions and their contributions towards reducing current burden in low- and middle-income countries. Methods: Using a standard WHO approach to cost-effectiveness analysis (CHOICE), we modelled core set intervention strategies for migraine, taking account of cov...

  14. Cost-effectiveness analysis of interventions for migraine in four low- and middle-income countries

    OpenAIRE

    Linde, Mattias; Steiner, Timothy J.; Chisholm, Dan

    2015-01-01

    Background Evidence of the cost and effects of interventions for reducing the global burden of migraine remains scarce. Our objective was to estimate the population-level cost-effectiveness of evidence-based migraine interventions and their contributions towards reducing current burden in low- and middle-income countries. Methods Using a standard WHO approach to cost-effectiveness analysis (CHOICE), we modelled core set intervention strategies for migraine, taking account of coverage and effi...

  15. Bayesian Variable Selection in Cost-Effectiveness Analysis

    Directory of Open Access Journals (Sweden)

    Miguel A. Negrín

    2010-04-01

    Full Text Available Linear regression models are often used to represent the cost and effectiveness of medical treatment. The covariates used may include sociodemographic variables, such as age, gender or race; clinical variables, such as initial health status, years of treatment or the existence of concomitant illnesses; and a binary variable indicating the treatment received. However, most studies estimate only one model, which usually includes all the covariates. This procedure ignores the question of uncertainty in model selection. In this paper, we examine four alternative Bayesian variable selection methods that have been proposed. In this analysis, we estimate the inclusion probability of each covariate in the real model conditional on the data. Variable selection can be useful for estimating incremental effectiveness and incremental cost, through Bayesian model averaging, as well as for subgroup analysis.
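
    The inclusion probabilities described above can be approximated outside a full Bayesian fit by weighting each candidate model with its BIC (w_m ∝ exp(-BIC_m/2), a standard approximation to the posterior model probability) and summing the weights of the models containing each covariate. The model space and BIC values below are hypothetical:

    ```python
    import math

    def inclusion_probabilities(models):
        """Posterior inclusion probability per covariate from per-model BICs,
        using the approximation w_m proportional to exp(-BIC_m / 2)."""
        best = min(bic for _, bic in models)
        weights = [math.exp(-(bic - best) / 2) for _, bic in models]
        total = sum(weights)
        probs = {}
        for (covariates, _), w in zip(models, weights):
            for c in covariates:
                probs[c] = probs.get(c, 0.0) + w / total
        return probs

    # Hypothetical model space: each entry is (covariates included, BIC).
    models = [((), 120.0),
              (("age",), 110.0),
              (("treatment",), 104.0),
              (("age", "treatment"), 102.0)]
    print(inclusion_probabilities(models))  # "treatment" dominates
    ```

    Bayesian model averaging then estimates incremental cost and effectiveness as the same weighted average over models, rather than conditioning on a single selected specification.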

  16. A Cost-Effectiveness Analysis of the Swedish Universal Parenting Program All Children in Focus.

    Directory of Open Access Journals (Sweden)

    Malin Ulfsdotter

    Full Text Available There are few health economic evaluations of parenting programs with quality-adjusted life-years (QALYs) as the outcome measure. The objective of this study was, therefore, to conduct a cost-effectiveness analysis of the universal parenting program All Children in Focus (ABC). The goals were to estimate the costs of program implementation, investigate the health effects of the program, and examine its cost-effectiveness. A cost-effectiveness analysis was conducted. Costs included setup costs and operating costs. A parent-proxy Visual Analog Scale was used to measure QALYs in children, whereas the General Health Questionnaire-12 was used for parents. A societal perspective was adopted, and the incremental cost-effectiveness ratio was calculated. To account for uncertainty in the estimate, the probability of cost-effectiveness was investigated, and sensitivity analyses were used to account for the uncertainty in cost data. The cost was € 326.3 per parent, of which € 53.7 represented setup costs under the assumption that group leaders on average run 10 groups, and € 272.6 represented operating costs. For health effects, the QALY gain was 0.0042 per child and 0.0027 per parent. These gains resulted in an incremental cost-effectiveness ratio for the base case of € 47 290 per gained QALY. The sensitivity analyses resulted in ratios from € 41 739 to € 55 072. With the common Swedish threshold value of € 55 000 per QALY, the probability of the ABC program being cost-effective was 50.8 percent. Our analysis of the ABC program demonstrates cost-effectiveness ratios below or just above the QALY threshold in Sweden. However, due to considerable uncertainty in the data, the health economic rationale for implementation should be further studied, considering a longer time perspective, effects on siblings, and validated measurement techniques, before full-scale implementation.
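
    The reported base-case ratio follows directly from the abstract's own figures: the incremental cost per parent divided by the combined QALY gain of child and parent.

    ```python
    cost_per_parent = 326.3        # EUR: setup (53.7) + operating (272.6) costs
    qaly_gain = 0.0042 + 0.0027    # QALY gain per child plus per parent
    icer = cost_per_parent / qaly_gain
    print(round(icer))  # -> 47290 EUR per QALY, matching the reported base case
    ```

    Comparing this ratio against the stated Swedish threshold of € 55 000 per QALY is what yields the borderline cost-effectiveness conclusion.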

  17. Feasibility of a cost-effective, video analysis software-based mobility protocol for objective spine kinematics and gait metrics: a proof of concept study.

    Science.gov (United States)

    Paul, Justin C; Petrizzo, Anthony; Rizzo, John-Ross; Bianco, Kristina; Maier, Stephen; Errico, Thomas J; Lafage, Virginie

    2015-03-01

    The purpose of this study was to investigate the potential of a high-throughput, easily implemented, cost-effective, video analysis software-based mobility protocol to quantify spine kinematics. This prospective cohort study of clinical biomechanics implemented 2-dimensional (2D) image processing at a tertiary-care academic institution. Ten healthy, able-bodied volunteers were recruited for 2D videography of gait and functional motion. The reliability of a 2D video analysis software program for gait and range-of-motion metrics was evaluated over 2 independent experimental sessions, assessing inter-trial, inter-session, and inter-rater reliability. Healthy volunteers were evaluated for simple forward and side bending, rotation, treadmill stride length, and more complex sit-to-stand tasks. Based on established intraclass correlation coefficient thresholds, reliability was good to excellent for simple forward and side bending, rotation, and stride length, as well as for the more complex sit-to-stand tasks. In conclusion, a cost-effective, 2D, video analysis software-based mobility protocol represents a feasible and clinically useful approach for objective spine kinematics and gait metrics. As the complication rate of operative management in the setting of spinal deformity is weighed against functional performance and quality-of-life measures, an objective analysis tool, in combination with an appropriate protocol, will aid in clinical assessments and lead to an increased evidence base for management options and decision algorithms.
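
    The reliability statistic behind such "good to excellent" ratings can be computed from repeated measurements. A minimal sketch of the one-way random-effects intraclass correlation, ICC(1,1); the abstract does not specify which ICC form was used, and the measurements below are invented:

    ```python
    def icc_oneway(ratings):
        """One-way random-effects intraclass correlation, ICC(1,1).
        `ratings` is a list of subjects, each a list of k repeated measurements."""
        k = len(ratings[0])
        n = len(ratings)
        grand = sum(sum(r) for r in ratings) / (n * k)
        subject_means = [sum(r) / k for r in ratings]
        msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
        msw = (sum((x - m) ** 2
                   for r, m in zip(ratings, subject_means) for x in r)
               / (n * (k - 1)))
        return (msb - msw) / (msb + (k - 1) * msw)

    # Hypothetical repeated range-of-motion measurements (degrees) for five
    # subjects across two sessions; close agreement should give ICC near 1.
    rom = [[41.0, 42.0], [55.0, 54.0], [63.0, 64.0], [38.0, 39.0], [70.0, 69.0]]
    print(round(icc_oneway(rom), 3))  # -> 0.997
    ```

    Conventional interpretive bands place values above about 0.75 in the "good" range and above 0.9 in the "excellent" range, which is how raw ICC values map onto the qualitative labels used here.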

  18. High-throughput analysis of amphetamines in blood and urine with online solid-phase extraction-liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Fernández, María del Mar Ramírez; Wille, Sarah M R; Samyn, Nele; Wood, Michelle; López-Rivadulla, Manuel; De Boeck, Gert

    2009-01-01

    An automated online solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS-MS) method for the analysis of amphetamines in blood and urine was developed and validated. Chromatographic separation was achieved on a Nucleodur Sphinx RP column with an LC gradient (a mixture of 10 mM ammonium formate buffer and acetonitrile), ensuring the elution of amphetamine, methamphetamine, MDMA, MDA, MDEA, PMA, and ephedrine within 11 min. The method was fully validated, according to international guidelines, using only 100 and 50 microL of blood and urine, respectively. The method showed excellent intra- and interassay precision and good linearity (r(2) > 0.99; 2.5-400 microg/L for blood and 25-1000 microg/L for urine). Limits of quantification were determined to be 2.5 and 25 microg/L for blood and urine, respectively. Limits of detection ranged from 0.05 to 0.5 microg/L for blood and 0.25 to 2.5 microg/L for urine, depending on the compound. Furthermore, the analytes and the processed samples were demonstrated to be stable (in the autosampler for at least 72 h and after three freeze/thaw cycles), and no disturbing matrix effects were observed for any compound. Moreover, no carryover was observed after the analysis of high-concentration samples (15,000 microg/L). The method was subsequently applied to authentic blood and urine samples obtained from forensic cases, which covered a broad range of concentrations. The validation results and actual sample analyses demonstrated that this method is rugged, precise, accurate, and well-suited for routine analysis, as more than 72 samples can be analyzed non-stop in 24 h with minimum sample handling. The combination of high-throughput online SPE with the well-known sensitivity and selectivity assured by MS-MS eliminated the bottleneck associated with sample preparation requirements and provided increased sensitivity, accuracy, and precision.

  19. Heterogeneous Deployment Analysis for Cost-Effective Mobile Network Evolution

    DEFF Research Database (Denmark)

    Coletti, Claudio

    2013-01-01

    The plethora of connected devices, such as attractive smartphones, data dongles and 3G/4G built-in tablet computers, has brought mobile operators to face increasing demand in mobile broadband traffic and services. In addition to the roll-out of Long Term Evolution (LTE), the deployment of small low...... available at the macro layer for wireless backhaul. The main goal is to investigate the LTE downlink performance of different deployment configurations, focusing on spectrum allocation schemes and deployment strategies that are needed to maximize network coverage. Differently from most studies using...... statistical models of deployment areas, the performance analysis is carried out in the form of operator case studies for large-scale deployment scenarios, including realistic macro network layouts and inhomogeneous spatial traffic distributions. Deployment of small cells is performed by means of proposed...

  20. High energy x-ray diffraction/x-ray fluorescence spectroscopy for high-throughput analysis of composition spread thin films.

    Science.gov (United States)

    Gregoire, John M; Dale, Darren; Kazimirov, Alexander; DiSalvo, Francis J; van Dover, R Bruce

    2009-12-01

    High-throughput crystallography is an important tool in materials research, particularly for the rapid assessment of structure-property relationships. We present a technique for simultaneous acquisition of diffraction images and fluorescence spectra on a continuous composition spread thin film using a 60 keV x-ray source. Subsequent noninteractive data processing provides maps of the diffraction profiles, thin film fiber texture, and composition. Even for highly textured films, our diffraction technique provides detection of diffraction from each family of Bragg reflections, which affords direct comparison of the measured profiles with powder patterns of known phases. These techniques are important for high throughput combinatorial studies as they provide structure and composition maps which may be correlated with performance trends within an inorganic library.

  1. Cost-effectiveness analysis of combination therapies for visceral leishmaniasis in the Indian subcontinent.

    Directory of Open Access Journals (Sweden)

    Filip Meheus

    Full Text Available BACKGROUND: Visceral leishmaniasis is a systemic parasitic disease that is fatal unless treated. We assessed the cost and cost-effectiveness of alternative strategies for the treatment of visceral leishmaniasis in the Indian subcontinent. In particular we examined whether combination therapies are a cost-effective alternative compared to monotherapies. METHODS AND FINDINGS: We assessed the cost-effectiveness of all possible mono- and combination therapies for the treatment of visceral leishmaniasis in the Indian subcontinent (India, Nepal and Bangladesh) from a societal perspective using a decision analytical model based on a decision tree. Primary data collected in each country were combined with data from the literature and an expert poll (Delphi method). The cost per patient treated and the average and incremental cost-effectiveness ratios expressed as cost per death averted were calculated. Extensive sensitivity analysis was done to evaluate the robustness of our estimations and conclusions. With a cost of US$92 per death averted, the combination miltefosine-paromomycin was the most cost-effective treatment strategy. The next best alternative was a combination of liposomal amphotericin B with paromomycin, with an incremental cost-effectiveness of $652 per death averted. All other strategies were dominated, with the exception of a single dose of 10 mg/kg of liposomal amphotericin B. While strategies based on liposomal amphotericin B (AmBisome) were found to be the most effective, its current drug cost of US$20 per vial resulted in a higher average cost-effectiveness. Sensitivity analysis showed the conclusion to be robust to variations in the input parameters over their plausible range. CONCLUSIONS: Combination treatments are a cost-effective alternative to current monotherapy for VL. Given their expected impact on the emergence of drug resistance, a switch to combination therapy should be considered once final results from clinical trials are
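    The analysis above ranks strategies on a cost-effectiveness frontier, discarding "dominated" options (those beaten on both cost and effect) and computing incremental ratios between the survivors. A minimal sketch of that bookkeeping with hypothetical (cost, deaths-averted) pairs; it handles strong dominance only, not extended dominance:

    ```python
    def frontier_icers(strategies):
        """Given {name: (cost, effect)}, drop strongly dominated options and
        return incremental cost-effectiveness ratios along the frontier."""
        items = sorted(strategies.items(), key=lambda kv: kv[1])  # by cost, then effect
        frontier, icers = [], {}
        for name, (cost, effect) in items:
            # strongly dominated: a cheaper-or-equal option is at least as effective
            if frontier and effect <= frontier[-1][2]:
                continue
            if frontier:
                _, pcost, peff = frontier[-1]
                icers[name] = (cost - pcost) / (effect - peff)
            frontier.append((name, cost, effect))
        return icers

    # Hypothetical costs (US$) and deaths averted per 1,000 patients treated
    options = {"A": (90_000, 950), "B": (120_000, 960), "C": (110_000, 940)}
    print(frontier_icers(options))  # C is dominated by A; B's ICER is computed vs. A
    ```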

  2. Challenges from variation across regions in cost effectiveness analysis in multi-regional clinical trials

    Directory of Open Access Journals (Sweden)

    Yunbo Chu

    2016-10-01

    Full Text Available Economic evaluation in the form of cost-effectiveness analysis has become a popular means to inform decisions in healthcare. With multi-regional clinical trials in global development programs becoming a new venue for drug efficacy testing in recent decades, questions about methods for cost-effectiveness analysis in the multi-regional clinical trial setting have also emerged. This paper addresses some challenges arising from variation across regions in cost-effectiveness analysis in multi-regional clinical trials. Several discussion points are raised for further attention, and a multi-regional clinical trial example is presented to illustrate the implications for industrial application. The general message is a call for in-depth discussion by all stakeholders to reach agreement on good practice in cost-effectiveness analysis in multi-regional clinical trials. Meanwhile, we recommend additionally considering cost-effectiveness analysis results based on clinical evidence from a homogeneous subpopulation as a sensitivity or scenario analysis, subject to data availability.

  3. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or which fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  4. Development of a fast isocratic LC-MS/MS method for the high-throughput analysis of pyrrolizidine alkaloids in Australian honey.

    Science.gov (United States)

    Griffin, Caroline T; Mitrovic, Simon M; Danaher, Martin; Furey, Ambrose

    2015-01-01

    Honey samples originating from Australia were purchased and analysed for targeted pyrrolizidine alkaloids (PAs) using a new and rapid isocratic LC-MS/MS method. This isocratic method was developed from, and is comparable with, a gradient elution method and resulted in no loss of sensitivity or reduction in chromatographic peak shape. Isocratic elution allows for significantly shorter run times (6 min), eliminates the requirement for column equilibration periods and thus has the advantage of facilitating high-throughput analysis, which is particularly important for regulatory testing laboratories. More than two hundred injections are possible with this new isocratic methodology within a 24-h period, a greater than 50% improvement on all previously published methodologies. Good linear calibrations were obtained for all 10 PAs and four PA N-oxides (PANOs) in spiked honey samples (3.57-357.14 µg l(-1); R(2) ≥ 0.9987). Acceptable inter-day repeatability was achieved for the target analytes in honey, with % RSD values (n = 4) less than 7.4%. Limits of detection (LOD) and limits of quantitation (LOQ) were established with spiked PA and PANO samples, giving an average LOD of 1.6 µg kg(-1) and LOQ of 5.4 µg kg(-1). This method was successfully applied to Australian and New Zealand honey samples sourced from supermarkets in Australia. Analysis showed that 41 of the 59 honey samples were contaminated by PAs, with the mean total sum of PAs being 153 µg kg(-1). Echimidine and lycopsamine were predominant, found in 76% and 88%, respectively, of the positive samples. The average daily exposures, based on the results presented in this study, were 0.051 µg kg(-1) bw day(-1) for adults and 0.204 µg kg(-1) bw day(-1) for children. These results are a cause for concern when compared with the proposed European Food Safety Authority (EFSA), Committee on Toxicity (COT) and Bundesinstitut für Risikobewertung (BfR, Federal Institute for Risk Assessment, Germany) maximum
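    The reported exposures follow from mean contamination × honey intake / body weight. A sketch of that arithmetic; the intake and body-weight figures below are assumptions chosen to reproduce the reported values and are not stated in the abstract:

    ```python
    # exposure (ug/kg bw/day) = PA content (ug/kg honey) * intake (kg/day) / body weight (kg)
    MEAN_PA = 153.0  # ug/kg honey: mean total PA content reported above

    def daily_exposure(intake_kg, body_weight_kg, pa_ug_per_kg=MEAN_PA):
        return pa_ug_per_kg * intake_kg / body_weight_kg

    # Assumed consumption scenarios (illustrative, not from the abstract):
    adult = daily_exposure(intake_kg=0.020, body_weight_kg=60.0)  # 20 g honey, 60 kg adult
    child = daily_exposure(intake_kg=0.010, body_weight_kg=7.5)   # 10 g honey, 7.5 kg child
    print(round(adult, 3), round(child, 3))  # 0.051 0.204, matching the reported figures
    ```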

  5. Assessing the value of mepolizumab for severe eosinophilic asthma: a cost-effectiveness analysis.

    Science.gov (United States)

    Whittington, Melanie D; McQueen, R Brett; Ollendorf, Daniel A; Tice, Jeffrey A; Chapman, Richard H; Pearson, Steven D; Campbell, Jonathan D

    2017-02-01

    Adding mepolizumab to standard treatment with inhaled corticosteroids and controller medications could decrease asthma exacerbations and use of long-term oral steroids in patients with severe disease and increased eosinophils; however, mepolizumab is costly and its cost effectiveness is unknown. To estimate the cost effectiveness of mepolizumab. A Markov model was used to determine the incremental cost per quality-adjusted life year (QALY) gained for mepolizumab plus standard of care (SoC) and for SoC alone. The population, adults with severe eosinophilic asthma, was modeled for a lifetime time horizon. A responder scenario analysis was conducted to determine the cost effectiveness for a cohort able to achieve and maintain asthma control. Over a lifetime treatment horizon, 23.96 exacerbations were averted per patient receiving mepolizumab plus SoC. Avoidance of exacerbations and decrease in long-term oral steroid use resulted in more than $18,000 in cost offsets among those receiving mepolizumab, but treatment costs increased by more than $600,000. Treatment with mepolizumab plus SoC vs SoC alone resulted in a cost-effectiveness estimate of $386,000 per QALY. To achieve cost effectiveness of approximately $150,000 per QALY, mepolizumab would require a more than 60% price discount. At current pricing, treating a responder cohort yielded cost-effectiveness estimates near $160,000 per QALY. The estimated cost effectiveness of mepolizumab exceeds value thresholds. Achieving these thresholds would require significant discounts from the current list price. Alternatively, treatment limited to responders improves the cost effectiveness toward, but remains still slightly above, these thresholds. Payers interested in improving the efficiency of health care resources should consider negotiations of the mepolizumab price and ways to predict and assess the response to mepolizumab. Copyright © 2016 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. 
All rights reserved.

  6. High Throughput Plasma Water Treatment

    Science.gov (United States)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought-stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale-up. In this work, we investigate a potentially scalable, high-throughput plasma water reactor that utilizes a packed-bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  7. Investigation of DNA damage response and apoptotic gene methylation pattern in sporadic breast tumors using high throughput quantitative DNA methylation analysis technology

    Directory of Open Access Journals (Sweden)

    Prakash Neeraj

    2010-11-01

    Full Text Available Abstract Background- Sporadic breast cancer, like many other cancers, is proposed to be a manifestation of abnormal genetic and epigenetic changes. For the past decade our laboratory has identified genes involved in DNA damage response (DDR), apoptosis and immunosurveillance pathways that influence sporadic breast cancer risk in the north Indian population. To further enhance our knowledge at the epigenetic level, we performed a DNA methylation study involving 17 gene promoter regions belonging to the DNA damage response (DDR) and death receptor apoptotic pathways in 162 paired normal and cancerous breast tissues from 81 sporadic breast cancer patients, using a high-throughput quantitative DNA methylation analysis technology. Results- The study identified five genes with statistically significant differences between normal and tumor tissues. Hypermethylation of DR5 (P = 0.001), DCR1 (P = 0.00001), DCR2 (P = 0.0000000005) and BRCA2 (P = 0.007) and hypomethylation of DR4 (P = 0.011) in sporadic breast tumor tissues suggested a weak/aberrant activation of the DDR/apoptotic pathway in breast tumorigenesis. Negative correlation was observed between methylation status and transcript expression levels for TRAIL, DR4, CASP8, ATM, CHEK2, BRCA1 and BRCA2 CpG sites. Categorization of gene methylation with respect to the clinicopathological parameters showed an increase in aberrant methylation patterns in advanced tumors. These uncharacteristic methylation patterns corresponded with decreased death receptor apoptosis (P = 0.047) and DNA damage repair potential (P = 0.004) in advanced tumors. The observation of the BRCA2 -26 G/A 5'UTR polymorphism concomitant with the presence of methylation in the promoter region was novel, and it emerged as a strong candidate for susceptibility to sporadic breast tumors. Conclusion- Our study indicates that methylation of DDR-apoptotic gene promoters in sporadic breast cancer is not a random phenomenon.
Progressive epigenetic alterations in advancing

  8. Identification of MicroRNAs and Their Target Genes Related to the Accumulation of Anthocyanins in Litchi chinensis by High-Throughput Sequencing and Degradome Analysis

    Science.gov (United States)

    Liu, Rui; Lai, Biao; Hu, Bing; Qin, Yonghua; Hu, Guibing; Zhao, Jietang

    2017-01-01

    Litchi (Litchi chinensis Sonn.) is an important subtropical fruit in southern China and the fruit pericarp has attractive red skin at maturity, which is provided by anthocyanins accumulation. To understand the anthocyanin biosynthesis at post-transcriptional level, we investigated the roles of microRNAs (miRNAs) during fruit coloring. In the present study, four small RNA libraries and a mixed degradome library from pericarps of ‘Feizixiao’ litchi at different developmental phases were constructed and sequenced by Solexa technology. A total of 78 conserved miRNAs belonging to 35 miRNA families and 41 novel miRNAs were identified via high-throughput sequencing, and 129 genes were identified as their targets by the recently developed degradome sequencing. miR156a and a novel microRNA (NEW41) were found to be differentially expressed during fruit coloring, indicating they might affect anthocyanin biosynthesis through their target genes in litchi. qRT-PCR analysis confirmed the expression changes of miR156a and the novel microRNA (NEW41) were inversely correlated with the expression profiles of their target genes LcSPL1/2 and LcCHI, respectively, suggesting regulatory roles of these miRNAs during anthocyanin biosynthesis. The target genes of miR156a, LcSPL1/2, encode transcription factors, as evidenced by a localization in the nucleus, that might play roles in the regulation of transcription. To further explore the relationship of LcSPL1/2 with the anthocyanin regulatory genes, yeast two-hybrid and BiFC analyses showed that LcSPL1 proteins could interact with LcMYB1, which is the key regulatory gene in anthocyanin biosynthesis in litchi. This study represents a comprehensive expression profiling of miRNAs in anthocyanin biosynthesis during litchi fruit maturity and confirmed that the miR156-SPL module was conserved in anthocyanin biosynthesis in litchi. PMID:28119728

  9. High-Throughput Content-Based Video Analysis Technologies

    Institute of Scientific and Technical Information of China (English)

    唐胜; 高科; 顾晓光; 颜成钢; 张勇东

    2014-01-01

    Under the environment of Big Data, how to analyze the content of highly concurrent video data in real time is a scientific problem that requires an urgent solution. In this paper, we introduce technologies for high-throughput content-based video analysis aimed at content-based monitoring of web images and videos. We give an intensive survey of the state of developments and trends in four key technologies: efficient video decoding and feature extraction on many-core processors, and high-dimensional indexing and semantic recognition on distributed systems. Furthermore, we introduce our latest research on these technologies: parallel deblocking filters for many-core processors, extraction and mining of highly robust and parallelizable local features, high-dimensional distributed indexing, and ensemble learning for large-scale data. Together these take full advantage of the high parallel computing performance of multi-grain parallel hardware platforms, providing key technical support for important applications such as Internet video monitoring and search.

  10. Users' guide to the orthopaedic literature: What is a cost-effectiveness analysis?

    Directory of Open Access Journals (Sweden)

    Tanner Stephanie

    2008-01-01

    Full Text Available As the costs of healthcare continue to rise, orthopaedic surgeons are being pressured to practice cost-effective healthcare. Consequently, economic evaluations of treatment options are being reported more commonly in the medical and surgical literature. As new orthopaedic procedures and treatments may improve patient outcome and function over traditional treatment options, the effect of the potentially higher costs of new treatments should be formally evaluated. Unfortunately, the resources available for healthcare spending are typically limited. Therefore, cost-effectiveness analyses have become an important and useful tool in informing which procedure or treatment to implement into practice. Cost-effectiveness analysis is a type of economic analysis that compares both the clinical outcomes and the costs of new treatment options against current treatment options or standards of care. For clinicians to be able to apply the results of a cost-effectiveness analysis to their practice, they must be able to critically review the available literature. Conducting an economic analysis is a challenging process, and as a result a number of published economic analyses are of lower quality and may be fraught with bias. It is important that the reader of an economic analysis or cost-effectiveness analysis have the skills required to properly evaluate and critically appraise the methodology used before applying the recommendations to their practice. Using the principles of evidence-based medicine and the questions outlined in the Journal of the American Medical Association's Users' Guide to the Medical Literature, this article attempts to illustrate how to critically appraise a cost-effectiveness analysis in the orthopaedic surgery literature.

  11. Cost-Effectiveness Analysis of Second-Line Chemotherapy Agents for Advanced Gastric Cancer.

    Science.gov (United States)

    Lam, Simon W; Wai, Maya; Lau, Jessica E; McNamara, Michael; Earl, Marc; Udeh, Belinda

    2017-01-01

    Gastric cancer is the fifth most common malignancy and second leading cause of cancer-related mortality. Chemotherapy options for patients who fail first-line treatment are limited. Thus, the objective of this study was to assess the cost-effectiveness of second-line treatment options for patients with advanced or metastatic gastric cancer. Cost-effectiveness analysis using a Markov model to compare the cost-effectiveness of six possible second-line treatment options for patients with advanced gastric cancer who have failed previous chemotherapy: irinotecan, docetaxel, paclitaxel, ramucirumab, paclitaxel plus ramucirumab, and palliative care. The model was performed from a third-party payer's perspective to compare lifetime costs and health benefits associated with the studied second-line therapies. Costs included only relevant direct medical costs. The model assumed chemotherapy cycle lengths of 30 days and a maximum of 24 cycles. A systematic review of the literature was performed to identify clinical data sources and utility and cost data. Quality-adjusted life years (QALYs) and incremental cost-effectiveness ratios (ICERs) were calculated. The primary outcome measure for this analysis was the ICER between different therapies, where the incremental cost was divided by the number of QALYs saved. The ICER was compared with a willingness-to-pay (WTP) threshold set at $50,000/QALY gained, and an exploratory analysis using $160,000/QALY gained was also performed. The model's robustness was tested by using 1-way sensitivity analyses and a 10,000-iteration Monte Carlo probabilistic sensitivity analysis (PSA). Irinotecan had the lowest lifetime cost and was associated with a QALY gain of 0.35 year. Docetaxel, ramucirumab alone, and palliative care were dominated strategies. Paclitaxel and the combination of paclitaxel plus ramucirumab led to higher QALYs gained, at an incremental cost of $86,815 and $1,056,125 per QALY gained, respectively. Based on our prespecified
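    Markov models like the one described advance a cohort through health states in 30-day cycles, accumulating discounted costs and QALYs for each strategy before taking the incremental ratio. A minimal two-strategy sketch; the state names, transition probabilities, costs, and utilities below are hypothetical and do not come from the study:

    ```python
    def markov_cohort(p_progress, p_death, cycle_cost, utility,
                      cycles=24, cycle_days=30, annual_discount=0.03):
        """Three-state model (stable / progressed / dead) run for a fixed
        number of 30-day cycles; returns (discounted cost, discounted QALYs)."""
        stable, progressed = 1.0, 0.0
        cost = qalys = 0.0
        per_cycle_growth = (1 + annual_discount) ** (cycle_days / 365)
        d = 1.0  # discount factor for the current cycle
        for _ in range(cycles):
            cost += d * (stable + progressed) * cycle_cost
            qalys += d * (stable * utility["stable"]
                          + progressed * utility["progressed"]) * cycle_days / 365
            stable, progressed = (stable * (1 - p_progress - p_death),
                                  stable * p_progress + progressed * (1 - p_death))
            d /= per_cycle_growth
        return cost, qalys

    u = {"stable": 0.75, "progressed": 0.5}
    soc = markov_cohort(p_progress=0.20, p_death=0.10, cycle_cost=2_000, utility=u)
    new = markov_cohort(p_progress=0.12, p_death=0.08, cycle_cost=9_000, utility=u)
    icer = (new[0] - soc[0]) / (new[1] - soc[1])  # incremental cost per QALY gained
    print(f"ICER: ${icer:,.0f} per QALY gained")
    ```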

  12. [Cost-effectiveness analysis and diet quality index applied to the WHO Global Strategy].

    Science.gov (United States)

    Machado, Flávia Mori Sarti; Simões, Arlete Naresse

    2008-02-01

    To test the use of cost-effectiveness analysis as a decision-making tool in the production of meals for the inclusion of the recommendations published in the World Health Organization's Global Strategy. Five alternative breakfast menu options were assessed prior to their adoption in a food service at a university in the state of São Paulo, Southeastern Brazil, in 2006. Costs of the different options were based on market prices of food items (direct cost). Health benefits were estimated based on an adaptation of the Diet Quality Index (DQI). Cost-effectiveness ratios were estimated by dividing benefits by costs, and incremental cost-effectiveness ratios were estimated as the cost differential per unit of additional benefit. The meal choice was based on health benefit units associated with direct production cost as well as incremental effectiveness per unit of differential cost. The analysis showed the simplest option with the addition of a fruit (DQI = 64 / cost = R$ 1.58) to be the best alternative. Higher effectiveness was seen in the options with a fruit portion (DQI1 = 64 / DQI3 = 58 / DQI5 = 72) compared with the others (DQI2 = 48 / DQI4 = 58). The estimation of cost-effectiveness ratios allowed identification of the best breakfast option based on cost-effectiveness analysis and the Diet Quality Index. These instruments offer ease of application and objective evaluation, which are key to the process of bringing public or private institutions in line with the Global Strategy directives.
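    The cost-effectiveness ratio described is simply diet-quality benefit divided by direct cost. A worked sketch using the option-1 figures from the abstract; the comparator's cost is hypothetical, since the abstract reports costs for only one option:

    ```python
    # Cost-effectiveness ratio = health benefit (DQI points) / direct cost (R$)
    def ce_ratio(dqi, cost):
        return dqi / cost

    best = ce_ratio(dqi=64, cost=1.58)   # option 1, figures from the abstract
    print(round(best, 1))                # ~40.5 DQI points per R$

    # Incremental ratio between two options: extra DQI points per extra R$
    def incremental(dqi_a, cost_a, dqi_b, cost_b):
        return (dqi_b - dqi_a) / (cost_b - cost_a)

    # Hypothetical comparator: DQI 72 at an assumed cost of R$ 2.10
    print(round(incremental(64, 1.58, 72, 2.10), 1))
    ```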

  13. COST-EFFECTIVENESS ANALYSIS OF ANTI-DIABETIC THERAPY IN A UNIVERSITY TEACHING HOSPITAL

    Directory of Open Access Journals (Sweden)

    Giwa Abdulganiyu

    2014-03-01

    Full Text Available Purpose: To conduct a cost-effectiveness analysis of anti-diabetic therapy in a University Teaching Hospital in 2010. Methods: A retrospective review of selected case-notes was conducted. The World Health Organization Defined Daily Dose method of evaluating drug use, together with a probability method for the potential effectiveness of anti-diabetic therapeutic options drawn from the literature, was employed in determining the cost-effectiveness of each anti-diabetic therapeutic option identified from anti-diabetic drug utilization studies. Sample size, n = 1200. Subjects’ case-notes were selected by systematic random sampling (sampling interval = 1). Results: Glibenclamide (N1.76/unit of effectiveness), which was more cost-effective than chlorpropamide (N2.97/unit of effectiveness) in the management of moderate hyperglycemia in non-obese Type II Diabetes Mellitus, was more frequently prescribed (81.5%). Glibenclamide + Metformin (N7.63/unit of effectiveness), which was more frequently prescribed (92.5%), was not necessarily more cost-effective than Chlorpropamide + Metformin (N9.76/unit of effectiveness) in the management of moderate hyperglycemia in obese Type II Diabetes Mellitus. Biphasic Isophane Insulin (N12.65/unit of effectiveness), which was more cost-effective than soluble insulin + insulin zinc (N30.37/unit of effectiveness) in the management of severe hyperglycemia in non-obese Type II Diabetes Mellitus, was less frequently prescribed (42.3%). Biphasic Isophane Insulin + Metformin (N15.91/unit of effectiveness), which was more cost-effective than soluble insulin + insulin zinc + metformin (N34.45/unit of effectiveness) in the management of severe hyperglycemia in obese Type II Diabetes Mellitus patients, was less frequently prescribed (25%). Conclusions: Prescription of less cost-effective anti-diabetic drugs was common in the hospital.

  14. A model-based cost-effectiveness analysis of osteoporosis screening and treatment strategy for postmenopausal Japanese women.

    Science.gov (United States)

    Yoshimura, M; Moriwaki, K; Noto, S; Takiguchi, T

    2017-02-01

    Although an osteoporosis screening program has been implemented as a health promotion project in Japan, its cost-effectiveness has yet to be elucidated fully. We performed a cost-effectiveness analysis and found that osteoporosis screening and treatment would be cost-effective for Japanese women over 60 years.

  15. A high-throughput neutron spectrometer

    Science.gov (United States)

    Stampfl, Anton; Noakes, Terry; Bartsch, Friedl; Bertinshaw, Joel; Veliscek-Carolan, Jessica; Nateghi, Ebrahim; Raeside, Tyler; Yethiraj, Mohana; Danilkin, Sergey; Kearley, Gordon

    2010-03-01

    A cross-disciplinary high-throughput neutron spectrometer is currently under construction at OPAL, ANSTO's open pool light-water research reactor. The spectrometer is based on the design of a Be-filter spectrometer (FANS) operating at the National Institute of Standards and Technology (NIST) research reactor in the USA. The ANSTO filter-spectrometer will be switched in and out with another neutron spectrometer, the triple-axis spectrometer Taipan. Thus two distinct types of neutron spectrometers will be accessible: one specialised for phonon dispersion analysis and the other, the filter-spectrometer, designed specifically to measure the vibrational density of states. A summary of the design will be given along with a detailed ray-tracing analysis. Some preliminary results from the spectrometer will be presented.

  16. THE EARLY BIRD CATCHES THE WORM: EARLY COST-EFFECTIVENESS ANALYSIS OF NEW MEDICAL TESTS

    NARCIS (Netherlands)

    Buisman, Leander R; Rutten-van Mölken, Maureen P M H; Postmus, Douwe; Luime, Jolanda J; Uyl-de Groot, Carin A; Redekop, William K

    2016-01-01

    OBJECTIVES: There is little specific guidance on performing an early cost-effectiveness analysis (CEA) of medical tests. We developed a framework with general steps and applied it to two cases. METHODS: Step 1 is to narrow down the scope of analysis by defining the test's application and target population

  17. Stool DNA Analysis is Cost-Effective for Colorectal Cancer Surveillance in Patients With Ulcerative Colitis.

    Science.gov (United States)

    Kisiel, John B; Konijeti, Gauree G; Piscitello, Andrew J; Chandra, Tarun; Goss, Thomas F; Ahlquist, David A; Farraye, Francis A; Ananthakrishnan, Ashwin N

    2016-12-01

    Patients with chronic ulcerative colitis are at increased risk for colorectal neoplasia (CRN). Surveillance by white-light endoscopy (WLE) or chromoendoscopy may reduce risk of CRN, but these strategies are underused. Analysis of DNA from stool samples (sDNA) can detect CRN with high levels of sensitivity, but it is not clear if this approach is cost-effective. We simulated these strategies for CRN detection to determine which approach is most cost-effective. We adapted a previously published Markov model to simulate the clinical course of chronic ulcerative colitis, the incidence of cancer or dysplasia, and costs and benefits of care with 4 surveillance strategies: (1) analysis of sDNA and diagnostic chromoendoscopy for patients with positive results, (2) analysis of sDNA with diagnostic WLE for patients with positive results, (3) chromoendoscopy with targeted collection of biopsies, or (4) WLE with random collection of biopsies. Costs were based on 2014 Medicare reimbursement. The primary outcome was the incremental cost-effectiveness ratio (incremental cost/incremental difference in quality-adjusted life-years) compared with no surveillance and a willingness-to-pay threshold of $50,000. All strategies fell below the willingness-to-pay threshold at 2-year intervals. Incremental cost-effectiveness ratios were $16,362 per quality-adjusted life-year for sDNA analysis with diagnostic chromoendoscopy; $18,643 per quality-adjusted life-year for sDNA analysis with diagnostic WLE; $23,830 per quality-adjusted life-year for chromoendoscopy alone; and $27,907 per quality-adjusted life-year for WLE alone. In sensitivity analyses, sDNA analysis with diagnostic chromoendoscopy was more cost-effective than chromoendoscopy alone, up to a cost of $1135 per sDNA test. 
sDNA analysis remained cost-effective at all rates of compliance; when combined with diagnostic chromoendoscopy, this approach was preferred over chromoendoscopy alone, when the specificity of the sDNA test for CRN
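The incremental cost-effectiveness ratios quoted above all follow the same arithmetic: incremental cost divided by incremental QALYs, compared against a willingness-to-pay threshold. A minimal sketch (the figures are hypothetical, not taken from the study):

```python
def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    """Incremental cost per QALY gained versus a reference strategy."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical surveillance strategy vs. no surveillance
ratio = icer(cost_new=25_000, cost_ref=12_000, qaly_new=16.0, qaly_ref=15.2)
print(f"ICER: ${ratio:,.0f}/QALY")                     # → ICER: $16,250/QALY
print("cost-effective at $50,000/QALY:", ratio < 50_000)  # → True
```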

  18. Economic methods for valuing the outcomes of genetic testing: beyond cost-effectiveness analysis.

    Science.gov (United States)

    Grosse, Scott D; Wordsworth, Sarah; Payne, Katherine

    2008-09-01

    Genetic testing in health care can provide information to help with disease prediction, diagnosis, prognosis, and treatment. Assessing the clinical utility of genetic testing requires a process to value and weight different outcomes. This article discusses the relative merits of different economic measures and methods to inform recommendations relative to genetic testing for risk of disease, including cost-effectiveness analysis and cost-benefit analysis. Cost-effectiveness analyses refer to analyses that calculate the incremental cost per unit of health outcomes, such as deaths prevented or life-years saved because of some intervention. Cost-effectiveness analyses that use preference-based measures of health state utility such as quality-adjusted life-years to define outcomes are referred to as cost-utility analyses. Cost-effectiveness analyses presume that health policy decision makers seek to maximize health subject to resource constraints. Cost-benefit analyses can incorporate monetary estimates of willingness-to-pay for genetic testing, including the perceived value of information independent of health outcomes. These estimates can be derived from contingent valuation or discrete choice experiments. Because important outcomes of genetic testing do not fit easily within traditional measures of health, cost-effectiveness analyses do not necessarily capture the full range of outcomes of genetic testing that are important to decision makers and consumers. We recommend that health policy decision makers consider the value to consumers of information and other nonhealth attributes of genetic testing strategies.

  19. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jinyang; Ji, Xinghu [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); He, Zhike, E-mail: zhkhe@whu.edu.cn [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); Suzhou Institute of Wuhan University, Suzhou 215123 (China)

    2015-08-12

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The strategy connects a positive-pressure input device, a sample container and the microfluidic chip through Tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so that the sample is driven from the container into the microchip by positive pressure. The technique is robust and compatible enough to be integrated with T-junction, flow-focusing or valve-assisted droplet microchips. By choosing a PDMS adaptor of suitable dimensions, the microchip can be flexibly equipped with various familiar sample containers, making sampling more straightforward without tedious sample transfer or loading; samples are conveniently changed by simply moving the adaptor from one container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied to quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated by investigating the quenching efficiency of a ruthenium complex on QD fluorescence. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, demonstrating its practicability and potential for high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy for concentration-gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • A multiplex DNA assay was successfully carried out in the droplet platform.

  20. Comparing five alternative methods of breast reconstruction surgery: a cost-effectiveness analysis.

    Science.gov (United States)

    Grover, Ritwik; Padula, William V; Van Vliet, Michael; Ridgway, Emily B

    2013-11-01

    The purpose of this study was to assess the cost-effectiveness of five standardized procedures for breast reconstruction to delineate the best reconstructive approach in postmastectomy patients in the settings of nonirradiated and irradiated chest walls. A decision tree was used to model five breast reconstruction procedures from the provider perspective to evaluate cost-effectiveness. Procedures included autologous flaps with pedicled tissue, autologous flaps with free tissue, latissimus dorsi flaps with breast implants, expanders with implant exchange, and immediate implant placement. All methods were compared with a "do-nothing" alternative. Data for model parameters were collected through a systematic review, and patient health utilities were calculated from an ad hoc survey of reconstructive surgeons. Results were measured in cost (2011 U.S. dollars) per quality-adjusted life-year. Univariate sensitivity analyses and Bayesian multivariate probabilistic sensitivity analysis were conducted. Pedicled autologous tissue and free autologous tissue reconstruction were cost-effective compared with the do-nothing alternative. Pedicled autologous tissue was the slightly more cost-effective of the two. The other procedures were not found to be cost-effective. The results were robust to a number of sensitivity analyses, although the margin between pedicled and free autologous tissue reconstruction is small and affected by some parameter values. Autologous pedicled tissue was slightly more cost-effective than free tissue reconstruction in irradiated and nonirradiated patients. Implant-based techniques were not cost-effective. This is in agreement with the growing trend at academic institutions to encourage autologous tissue reconstruction because of its natural recreation of the breast contour, suppleness, and resiliency in the setting of irradiated recipient beds.

  1. An Optimized High Throughput Clean-Up Method Using Mixed-Mode SPE Plate for the Analysis of Free Arachidonic Acid in Plasma by LC-MS/MS

    OpenAIRE

    Wan Wang; Suzi Qin; Linsen Li; Xiaohua Chen; Qunjie Wang; Junfu Wei

    2015-01-01

    A high throughput sample preparation method was developed utilizing mixed-mode solid phase extraction (SPE) in 96-well plate format for the determination of free arachidonic acid in plasma by LC-MS/MS. Plasma was mixed with 3% aqueous ammonia and loaded into each well of 96-well plate. After washing with water and methanol sequentially, 3% of formic acid in acetonitrile was used to elute arachidonic acid. The collected fraction was injected onto a reversed phase column at 30°C with mobile pha...

  2. Cost-effectiveness analysis of therapeutic options for chronic hepatitis C genotype 3 infected patients.

    Science.gov (United States)

    Gimeno-Ballester, Vicente; Mar, Javier; O'Leary, Aisling; Adams, Róisín; San Miguel, Ramón

    2017-01-01

    This study provides a cost-effectiveness analysis of therapeutic strategies for chronic hepatitis C genotype 3 infected patients in Spain. A Markov model was designed to simulate the progression in a cohort of patients aged 50 years over a lifetime horizon. Sofosbuvir (SOF) plus peginterferon and ribavirin for 12 weeks was a cost-effective option when compared to standard of care (SoC) in the treatment of both 'moderate fibrosis' and 'cirrhotic' patients. Incremental cost-effectiveness ratios were €35,276/QALY and €18,374/QALY respectively. ICERs for SOF plus daclatasvir (DCV) regimens versus SoC were over the threshold limit considered, at €56,178/QALY and €77,378/QALY for 'moderate fibrosis' and 'cirrhotic' patients respectively. Addition of SOF to IFN-based regimens for genotype 3 was cost-effective for both 'moderate fibrosis' and 'cirrhotic' patients. IFN-free options including SOF and DCV association required price reductions lower than the list prices to be considered cost-effective.
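Markov cost-effectiveness models of the kind described here advance a cohort through health states with a transition matrix, accumulating discounted costs and QALYs each cycle. A toy sketch with hypothetical states, probabilities and prices (not the study's inputs):

```python
import numpy as np

# Hypothetical 3-state model: moderate fibrosis -> cirrhosis -> death
P = np.array([[0.90, 0.08, 0.02],   # annual transition probabilities
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
cost = np.array([2_000.0, 12_000.0, 0.0])   # annual cost per state
utility = np.array([0.85, 0.55, 0.0])       # QALY weight per state

state = np.array([1.0, 0.0, 0.0])  # cohort starts in 'moderate fibrosis'
disc = 0.03                        # 3% annual discount rate
total_cost = total_qaly = 0.0
for year in range(40):             # lifetime horizon
    d = 1.0 / (1.0 + disc) ** year
    total_cost += d * (state @ cost)
    total_qaly += d * (state @ utility)
    state = state @ P              # advance the cohort one cycle

print(f"discounted cost: {total_cost:,.0f}, QALYs: {total_qaly:.2f}")
```

Two such runs (with and without treatment) feed the ICER as incremental cost over incremental QALYs.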

  3. High-throughput sequencing and analysis of the gill tissue transcriptome from the deep-sea hydrothermal vent mussel Bathymodiolus azoricus

    Directory of Open Access Journals (Sweden)

    Gomes Paula

    2010-10-01

    Background: Bathymodiolus azoricus is a deep-sea hydrothermal vent mussel found in association with large faunal communities living in chemosynthetic environments on the sea floor near the Azores Islands. Investigation of the exceptional physiological adaptations that vent mussels have evolved in their habitat, including responses to environmental microbes, remains a difficult challenge for deep-sea biologists. In an attempt to reveal genes potentially involved in the deep-sea mussel's innate immunity, we carried out a high-throughput sequence analysis of the freshly collected B. azoricus transcriptome, using gill tissue as the primary source of immune transcripts given its strategic role in filtering potentially infectious microorganisms from the surrounding water. Additionally, a substantial EST data set was produced, from which a comprehensive collection of genes coding for putative proteins was organized in a dedicated database, "DeepSeaVent", the first deep-sea vent animal transcriptome database based on 454 pyrosequencing technology. Results: A normalized cDNA library from gill tissue was sequenced in a full 454 GS-FLX run, producing 778,996 sequencing reads. Assembly of the high-quality reads resulted in 75,407 contigs, of which 3,071 were singletons. A total of 39,425 transcripts were conceptually translated into amino acid sequences, of which 22,023 matched known proteins in the NCBI non-redundant protein database, 15,839 revealed conserved protein domains through InterPro functional classification, and 9,584 were assigned Gene Ontology terms. Queries conducted within the database enabled the identification of genes putatively involved in immune and inflammatory reactions that had not previously been evidenced in the vent mussel. Their physical counterpart was confirmed by semi-quantitative reverse-transcription polymerase chain reaction (RT-PCR) and their RNA transcription level by quantitative PCR (qPCR).

  4. Cost-effectiveness analysis of a statewide media campaign to promote adolescent physical activity.

    Science.gov (United States)

    Peterson, Michael; Chandlee, Margaret; Abraham, Avron

    2008-10-01

    A cost-effectiveness analysis of a statewide social marketing campaign was performed using a statewide surveillance survey distributed to 6th through 12th graders, media production and placement costs, and 2000 census data. Exposure to all three advertisements had the highest impact on both intent and behavior with 65.6% of the respondents considering becoming more active and 58.3% reporting becoming more active. Average cost of the entire campaign was $4.01 per person to see an ad, $7.35 per person to consider being more active, and $8.87 per person to actually become more active, with billboards yielding the most positive cost-effectiveness. Findings highlight market research as an essential part of social marketing campaigns and the importance of using multiple marketing modalities to enhance cost-effectiveness and impact.
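The per-person campaign costs reported above are simple ratios of total spend to the number of people reaching each stage (exposure, intent, behavior). A hedged sketch with made-up figures, not the campaign's actual budget or reach:

```python
def cost_per_outcome(total_cost, reached, rate):
    """Campaign cost per person achieving an outcome at the given rate."""
    return total_cost / (reached * rate)

total = 400_000.0      # hypothetical media production + placement spend
reached = 100_000      # hypothetical number of people exposed to the ads
print(round(total / reached, 2))                         # → 4.0 per exposure
print(round(cost_per_outcome(total, reached, 0.60), 2))  # → 6.67 per person more active
```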

  5. Incremental Cost-Effectiveness Analysis of Gestational Diabetes Mellitus Screening Strategies in Singapore.

    Science.gov (United States)

    Chen, Pin Yu; Finkelstein, Eric A; Ng, Mor Jack; Yap, Fabian; Yeo, George S H; Rajadurai, Victor Samuel; Chong, Yap Seng; Gluckman, Peter D; Saw, Seang Mei; Kwek, Kenneth Y C; Tan, Kok Hian

    2016-01-01

    The objective of this study was to conduct an incremental cost-effectiveness analysis from the payer's perspective in Singapore of 3 gestational diabetes mellitus screening strategies: universal, targeted, or no screening. A decision tree model assessed the primary outcome: incremental cost per quality-adjusted life year (QALY) gained. Probabilities, costs, and utilities were derived from the literature, the Growing Up in Singapore Towards healthy Outcomes (GUSTO) birth cohort study, and the KK Women's and Children's Hospital's database. Relative to targeted screening using risk factors, universal screening generates an incremental cost-effectiveness ratio (ICER) of $USD10,630/QALY gained. Sensitivity analyses show that disease prevalence rates and intervention effectiveness of glycemic management have the biggest impacts on the ICERs. Based on the model and best available data, universal screening is a cost-effective approach for reducing the complications of gestational diabetes mellitus in Singapore as compared with the targeted screening approach or no screening.

  6. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique.

    Science.gov (United States)

    Chen, Jinyang; Ji, Xinghu; He, Zhike

    2015-08-12

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The strategy connects a positive-pressure input device, a sample container and the microfluidic chip through Tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so that the sample is driven from the container into the microchip by positive pressure. The technique is robust and compatible enough to be integrated with T-junction, flow-focusing or valve-assisted droplet microchips. By choosing a PDMS adaptor of suitable dimensions, the microchip can be flexibly equipped with various familiar sample containers, making sampling more straightforward without tedious sample transfer or loading; samples are conveniently changed by simply moving the adaptor from one container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied to quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated by investigating the quenching efficiency of a ruthenium complex on QD fluorescence. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, demonstrating its practicability and potential for high-throughput biosensing.

  7. An improved cell separation technique for marine subsurface sediments: applications for high-throughput analysis using flow cytometry and cell sorting.

    Science.gov (United States)

    Morono, Yuki; Terada, Takeshi; Kallmeyer, Jens; Inagaki, Fumio

    2013-10-01

    Development of an improved technique for separating microbial cells from marine sediments and standardization of a high-throughput and discriminative cell enumeration method were conducted. We separated microbial cells from various types of marine sediment and then recovered the cells using multilayer density gradients of sodium polytungstate and/or Nycodenz, resulting in a notably higher percent recovery of cells than previous methods. The efficiency of cell extraction generally depends on the sediment depth; using the new technique we developed, more than 80% of the total cells were recovered from shallow sediment samples (down to 100 m in depth), whereas ~50% of cells were recovered from deep samples (100-365 m in depth). The separated cells could be rapidly enumerated using flow cytometry (FCM). The data were in good agreement with those obtained from manual microscopic direct counts over the range 10^4-10^8 cells cm^-3. We also demonstrated that sedimentary microbial cells can be efficiently collected using a cell sorter. The combined use of our new cell separation and FCM/cell sorting techniques facilitates high-throughput and precise enumeration of microbial cells in sediments and is amenable to various types of single-cell analyses, thereby enhancing our understanding of microbial life in the largely uncharacterized deep subseafloor biosphere.

  8. Spectrophotometric Analysis of Pigments: A Critical Assessment of a High-Throughput Method for Analysis of Algal Pigment Mixtures by Spectral Deconvolution.

    Directory of Open Access Journals (Sweden)

    Jan-Erik Thrane

    The Gauss-peak spectra (GPS) method represents individual pigment spectra as weighted sums of Gaussian functions, and uses these to model absorbance spectra of phytoplankton pigment mixtures. We here present several improvements for this type of methodology, including adaptation to plate reader technology and efficient model fitting by open source software. We use a one-step modeling of both pigment absorption and background attenuation with non-negative least squares, following a one-time instrument-specific calibration. The fitted background is shown to be higher than a solvent blank, with features reflecting contributions from both scatter and non-pigment absorption. We assessed pigment aliasing due to absorption spectra similarity by Monte Carlo simulation, and used this information to select a robust set of identifiable pigments that are also expected to be common in natural samples. To test the method's performance, we analyzed absorbance spectra of pigment extracts from sediment cores, 75 natural lake samples, and four phytoplankton cultures, and compared the estimated pigment concentrations with concentrations obtained using high performance liquid chromatography (HPLC). The deviance between observed and fitted spectra was generally very low, indicating that measured spectra could successfully be reconstructed as weighted sums of pigment and background components. Concentrations of total chlorophylls and total carotenoids could accurately be estimated for both sediment and lake samples, but individual pigment concentrations (especially carotenoids) proved difficult to resolve due to similarity between their absorbance spectra. In general, our modified-GPS method is a fast, inexpensive, and high-throughput alternative to the GPS method for screening of pigment composition in samples of phytoplankton material.
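The one-step fit of pigment absorption plus a background term by non-negative least squares can be sketched as follows; the Gaussian component spectra and the mixture below are synthetic stand-ins, not calibrated pigment spectra:

```python
import numpy as np
from scipy.optimize import nnls

wl = np.linspace(400, 700, 301)                # wavelength grid (nm)

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical unit-absorbance component spectra (columns of the design matrix)
chl_a = 0.7 * gauss(432, 15) + 0.5 * gauss(665, 12)
carot = gauss(470, 25)
background = np.ones_like(wl)                  # flat attenuation term

A = np.column_stack([chl_a, carot, background])
measured = 1.8 * chl_a + 0.6 * carot + 0.1     # synthetic mixture spectrum

coeffs, residual = nnls(A, measured)           # weights constrained >= 0
print(np.round(coeffs, 3))                     # → [1.8 0.6 0.1]
```

With noise-free synthetic data the true weights are recovered exactly; real extracts add the aliasing problem the authors quantify by Monte Carlo simulation.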

  9. Prevention and treatment of cardiovascular disease in Ethiopia: a cost-effectiveness analysis.

    Science.gov (United States)

    Tolla, Mieraf Taddesse; Norheim, Ole Frithjof; Memirie, Solomon Tessema; Abdisa, Senbeta Guteta; Ababulgu, Awel; Jerene, Degu; Bertram, Melanie; Strand, Kirsten; Verguet, Stéphane; Johansson, Kjell Arne

    2016-01-01

    The coverage of prevention and treatment strategies for ischemic heart disease and stroke is very low in Ethiopia. In view of Ethiopia's meager healthcare budget, it is important to identify the most cost-effective interventions for further scale-up. This paper's objective is to assess cost-effectiveness of prevention and treatment of ischemic heart disease (IHD) and stroke in an Ethiopian setting. Fifteen single interventions and sixteen intervention packages were assessed from a healthcare provider perspective. The World Health Organization's Choosing Interventions that are Cost-Effective model for cardiovascular disease was updated with available country-specific inputs, including demography, mortality and price of traded and non-traded goods. Costs and health benefits were discounted at 3 % per year. Incremental cost-effectiveness ratios are reported in US$ per disability adjusted life year (DALY) averted. Sensitivity analysis was undertaken to assess robustness of our results. Combination drug treatment for individuals having >35 % absolute risk of a CVD event in the next 10 years is the most cost-effective intervention. This intervention costs US$67 per DALY averted and about US$7 million annually. Treatment of acute myocardial infarction (AMI) (costing US$1000-US$7530 per DALY averted) and secondary prevention of IHD and stroke (costing US$1060-US$10,340 per DALY averted) become more efficient when delivered in integrated packages. At an annual willingness-to-pay (WTP) level of about US$3 million, a package consisting of aspirin, streptokinase, ACE-inhibitor and beta-blocker for AMI has the highest probability of being most cost-effective, whereas as WTP increases to > US$7 million, combination drug treatment to individuals having >35 % absolute risk stands out as the most cost-effective strategy. Cost-effectiveness ratios were relatively more sensitive to halving the effectiveness estimates as compared with doubling the price of drugs and laboratory

  10. Cost effectiveness analysis of strategies for tuberculosis control in developing countries.

    NARCIS (Netherlands)

    Baltussen, R.M.P.M.; Floyd, K.; Dye, C.

    2005-01-01

    OBJECTIVE: To assess the costs and health effects of tuberculosis control interventions in Africa and South East Asia in the context of the millennium development goals. DESIGN: Cost effectiveness analysis based on an epidemiological model. SETTING: Analyses undertaken for two regions classified by

  11. Guiding the Development and Use of Cost-Effectiveness Analysis in Education

    Science.gov (United States)

    Levin, Henry M.; Belfield, Clive

    2015-01-01

    Cost-effectiveness analysis is rarely used in education. When it is used, it often fails to meet methodological standards, especially with regard to cost measurement. Although there are occasional criticisms of these failings, we believe that it is useful to provide a listing of the more common concerns and how they might be addressed. Based upon…

  12. 76 FR 7881 - Discount Rates for Cost-Effectiveness Analysis of Federal Programs

    Science.gov (United States)

    2011-02-11

    ... updated annually when the interest rate and inflation assumptions used to prepare the budget of the United... forecast of real interest rates from which the inflation premium has been removed and based on the economic...-dollar flows, as is often required in cost-effectiveness analysis. Real Interest Rates on Treasury...

  13. Economic viewpoints in educational effectiveness : Cost-effectiveness analysis of an educational improvement project

    NARCIS (Netherlands)

    Creemers, B; van der Werf, G

    2000-01-01

    Cost-effectiveness analysis is not only important for decision making in educational policy and practice. Also within educational effectiveness research it is important to establish the costs of educational processes in relationship to their effects. The integrated multilevel educational effectivene

  14. Cost-effectiveness Analysis of Rivaroxaban in the Secondary Prevention of Acute Coronary Syndromes in Sweden

    NARCIS (Netherlands)

    Begum, N.; Stephens, S.; Schoeman, O.; Fraschke, A.; Kirsch, B.; Briere, J.B.; Verheugt, F.W.A.; Hout, B.A. van

    2015-01-01

    BACKGROUND: Worldwide, coronary heart disease accounts for 7 million deaths each year. In Sweden, acute coronary syndrome (ACS) is a leading cause of hospitalization and is responsible for 1 in 4 deaths. OBJECTIVE: The aim of this analysis was to assess the cost-effectiveness of rivaroxaban 2.5 mg t

  15. An analysis of the cost effectiveness of replacing maize with wheat ...

    African Journals Online (AJOL)

    An analysis of the cost effectiveness of replacing maize with wheat offal in ... At this level of wheat offal inclusion, feed cost per ton would be reduced by about 13.2% of the cost of the control diet.

  16. Cost-Effectiveness Analysis of Different Genetic Testing Strategies for Lynch Syndrome in Taiwan.

    Science.gov (United States)

    Chen, Ying-Erh; Kao, Sung-Shuo; Chung, Ren-Hua

    2016-01-01

    Patients with Lynch syndrome (LS) have a significantly increased risk of developing colorectal cancer (CRC) and other cancers. Genetic screening for LS among patients with newly diagnosed CRC aims to identify mutations in the disease-causing genes (i.e., the DNA mismatch repair genes) in the patients, to offer genetic testing for relatives of the patients with the mutations, and then to provide early prevention for the relatives with the mutations. Several genetic tests are available for LS, such as DNA sequencing for MMR genes and tumor testing using microsatellite instability and immunohistochemical analyses. Cost-effectiveness analyses of different genetic testing strategies for LS have been performed in several studies from different countries such as the US and Germany. However, a cost-effectiveness analysis for the testing has not yet been performed in Taiwan. In this study, we evaluated the cost-effectiveness of four genetic testing strategies for LS described in previous studies, while population-specific parameters, such as the mutation rates of the DNA mismatch repair genes and treatment costs for CRC in Taiwan, were used. The incremental cost-effectiveness ratios based on discounted life years gained due to genetic screening were calculated for the strategies relative to no screening and to the previous strategy. Using the World Health Organization standard, which was defined based on Taiwan's Gross Domestic Product per capita, the strategy based on immunohistochemistry as a genetic test followed by BRAF mutation testing was considered to be highly cost-effective relative to no screening. Our probabilistic sensitivity analysis results also suggest that the strategy has a probability of 0.939 of being cost-effective relative to no screening based on the commonly used threshold of $50,000 to determine cost-effectiveness. 
To the best of our knowledge, this is the first cost-effectiveness analysis for evaluating different genetic testing strategies for LS in

  17. FLASH assembly of TALENs for high-throughput genome editing.

    Science.gov (United States)

    Reyon, Deepak; Tsai, Shengdar Q; Khayter, Cyd; Foden, Jennifer A; Sander, Jeffry D; Joung, J Keith

    2012-05-01

    Engineered transcription activator–like effector nucleases (TALENs) have shown promise as facile and broadly applicable genome editing tools. However, no publicly available high-throughput method for constructing TALENs has been published, and large-scale assessments of the success rate and targeting range of the technology remain lacking. Here we describe the fast ligation-based automatable solid-phase high-throughput (FLASH) system, a rapid and cost-effective method for large-scale assembly of TALENs. We tested 48 FLASH-assembled TALEN pairs in a human cell–based EGFP reporter system and found that all 48 possessed efficient gene-modification activities. We also used FLASH to assemble TALENs for 96 endogenous human genes implicated in cancer and/or epigenetic regulation and found that 84 pairs were able to efficiently introduce targeted alterations. Our results establish the robustness of TALEN technology and demonstrate that FLASH facilitates high-throughput genome editing at a scale not currently possible with other genome modification technologies.

  18. Comparative efficiency research (COMER): meta-analysis of cost-effectiveness studies.

    Science.gov (United States)

    Crespo, Carlos; Monleon, Antonio; Díaz, Walter; Ríos, Martín

    2014-12-22

    The aim of this study was to create a new meta-analysis method for cost-effectiveness studies using comparative efficiency research (COMER). We built a new score named total incremental net benefit (TINB), with inverse variance weighting of incremental net benefits (INB). This permits determination of whether an alternative is cost-effective, given a specific threshold (TINB > 0 test). Before validation of the model, the structure of dependence between costs and quality-adjusted life years (QoL) was analysed using copula distributions. The goodness-of-fit of a Spanish prospective observational study (n = 498) was analysed using the Independent, Gaussian, T, Gumbel, Clayton, Frank and Plackett copulas. Validation was carried out by simulating a copula distribution with log-normal distribution for costs and gamma distribution for disutilities. Hypothetical cohorts were created by varying the sample size (n: 15-500) and assuming three scenarios (1-cost-effective; 2-non-cost-effective; 3-dominant). The COMER result was compared to the theoretical result according to the incremental cost-effectiveness ratio (ICER) and the INB, assuming a margin of error of 2,000 and 500 monetary units, respectively. The Frank copula with positive dependence (-0.4279) showed a goodness-of-fit sufficient to represent costs and QoL (p-values 0.524 and 0.808). The theoretical INB was within the 95% confidence interval of the TINB, based on 15 individuals with a probability > 80% for scenarios 1 and 2, and > 90% for scenario 3. The TINB > 0 test with 15 individuals showed p-values of 0.0105 (SD: 0.0411) for scenario 1, 0.613 (SD: 0.265) for scenario 2 and < 0.0001 for scenario 3. COMER is a valid tool for combining cost-effectiveness studies and may be of use to health decision makers.
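The TINB score described above is an inverse-variance-weighted pool of per-study incremental net benefits (INB_i = threshold × ΔQALY_i − ΔCost_i); a positive pooled value supports cost-effectiveness at that threshold. A sketch with hypothetical study-level INBs and variances:

```python
import math

def tinb(inbs, variances):
    """Inverse-variance-weighted pooled incremental net benefit with a
    z-statistic for the TINB > 0 test (study-level inputs are hypothetical)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * x for w, x in zip(weights, inbs)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))       # standard error of the pooled INB
    return pooled, se, pooled / se

# INB_i = threshold * dQALY_i - dCost_i, precomputed per study
pooled, se, z = tinb(inbs=[1500.0, -200.0, 900.0],
                     variances=[250_000.0, 400_000.0, 160_000.0])
print(round(pooled, 1), round(se, 1), round(z, 2))
```

This is the standard fixed-effect pooling scheme; the paper's full method adds the copula-based modeling of the cost/QALY dependence.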

  19. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

    This work demonstrates the detection method for a high throughput droplet based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card based agglutination assay.

  20. The clinical and cost effectiveness of group art therapy for people with non-psychotic mental health disorders: a systematic review and cost-effectiveness analysis.

    Science.gov (United States)

    Uttley, Lesley; Stevenson, Matt; Scope, Alison; Rawdin, Andrew; Sutton, Anthea

    2015-07-07

    The majority of mental health problems are non-psychotic (e.g., depression, anxiety, and phobias). For some people, art therapy may be a more acceptable alternative form of psychological therapy than standard forms of treatment, such as talking therapies. This study was part of a health technology assessment commissioned by the National Institute for Health Research, UK and aimed to systematically appraise the clinical and cost-effective evidence for art therapy for people with non-psychotic mental health disorders. Comprehensive literature searches for studies examining art therapy in populations with non-psychotic mental health disorders were performed in May 2013. A quantitative systematic review of clinical effectiveness and a systematic review of studies evaluating the cost-effectiveness of group art therapy were conducted. Eleven randomised controlled trials were included (533 patients). Meta-analysis was not possible due to clinical heterogeneity and insufficient comparable data on outcome measures across studies. The control groups varied between studies but included: no treatment/wait-list, attention placebo controls and psychological therapy comparators. Art therapy was associated with significant positive changes relative to the control group in mental health symptoms in 7 of the 11 studies. A de novo model was constructed and populated with data identified from the clinical review. Scenario analyses were conducted allowing comparisons of group art therapy with wait-list control and group art therapy with group verbal therapy. Group art-therapy appeared cost-effective compared with wait-list control with high certainty although generalisability to the target population was unclear; group verbal therapy appeared more cost-effective than art therapy but there was considerable uncertainty and a sizeable probability that art therapy was more cost effective. From the limited available evidence art therapy was associated with positive effects compared with

  1. Qgui: A high-throughput interface for automated setup and analysis of free energy calculations and empirical valence bond simulations in biological systems.

    Science.gov (United States)

    Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav

    2015-07-01

    Structural information and activity data have increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present calculation of high-precision Arrhenius plots to obtain the thermodynamic activation enthalpy and entropy with Qgui from running a large number of EVB simulations.
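
    The LIE estimate mentioned above combines average ligand-surroundings interaction energies from two simulations (ligand bound and ligand free in water). A minimal sketch of the standard LIE expression follows, using commonly cited coefficient values; the energy differences are hypothetical and the exact coefficients Qgui applies are not given in this summary.

```python
def lie_binding_free_energy(d_vdw: float, d_elec: float,
                            alpha: float = 0.18, beta: float = 0.5,
                            gamma: float = 0.0) -> float:
    """LIE estimate: dG_bind ~ alpha*d<U_vdW> + beta*d<U_el> + gamma (kcal/mol).

    d_vdw and d_elec are the bound-minus-free differences in the average
    van der Waals and electrostatic ligand-surroundings energies.
    """
    return alpha * d_vdw + beta * d_elec + gamma

# hypothetical energy differences (bound minus free), kcal/mol
dg = lie_binding_free_energy(d_vdw=-20.0, d_elec=-6.0)
print(f"estimated binding free energy: {dg:.1f} kcal/mol")  # ≈ -6.6 kcal/mol
```

    In practice the averages come from the MD sampling that Qgui automates; the coefficients are often refitted per ligand class.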

  2. A comprehensive analysis of in vitro and in vivo genetic fitness of Pseudomonas aeruginosa using high-throughput sequencing of transposon libraries.

    Directory of Open Access Journals (Sweden)

    David Skurnik

    Full Text Available High-throughput sequencing of transposon (Tn) libraries created within entire genomes identifies and quantifies the contribution of individual genes and operons to the fitness of organisms in different environments. We used insertion-sequencing (INSeq) to analyze the contribution to fitness of all non-essential genes in the chromosome of Pseudomonas aeruginosa strain PA14 based on a library of ∼300,000 individual Tn insertions. In vitro growth in LB provided a baseline for comparison with the survival of the Tn insertion strains following 6 days of colonization of the murine gastrointestinal tract, as well as a comparison with Tn inserts subsequently able to systemically disseminate to the spleen following induction of neutropenia. Sequencing was performed following DNA extraction from the recovered bacteria, digestion with the MmeI restriction enzyme that hydrolyzes DNA 16 bp away from the end of the Tn insert, and fractionation into oligonucleotides of 1,200-1,500 bp that were prepared for high-throughput sequencing. Changes in frequency of Tn inserts into the P. aeruginosa genome were used to quantify the in vivo fitness cost resulting from loss of a gene. 636 genes had <10 sequencing reads in LB and were thus defined as unable to grow in this medium. During in vivo infection there were major losses of strains with Tn inserts in almost all known virulence factors, as well as in respiration, energy utilization, ion pump, nutritional and prophage genes. Many new candidates for virulence factors were also identified. There were consistent changes in the recovery of Tn inserts in genes within most operons, and Tn insertions into some genes enhanced in vivo fitness. Strikingly, 90% of the non-essential genes were required for in vivo survival following systemic dissemination during neutropenia. These experiments resulted in the identification of the P. aeruginosa strain PA14 genes necessary for optimal survival in the mucosal and systemic environments of a mammalian

  3. Cost-effectiveness of antibiotics for COPD management: observational analysis using CPRD data

    Directory of Open Access Journals (Sweden)

    Sarah J. Ronaldson

    2017-06-01

    Full Text Available It is often difficult to determine the cause of chronic obstructive pulmonary disease (COPD) exacerbations, and antibiotics are frequently prescribed. This study conducted an observational cost-effectiveness analysis of prescribing antibiotics for exacerbations of COPD based on routinely collected data from patient electronic health records. A cohort of 45,375 patients aged 40 years or more who attended their general practice for a COPD exacerbation during 2000–2013 was identified from the Clinical Practice Research Datalink. Two groups were formed (“immediate antibiotics” or “no antibiotics”) based on whether antibiotics were prescribed during the index general practice (GP) consultation, with data analysed according to subsequent healthcare resource use. A cost-effectiveness analysis was undertaken from the perspective of the UK National Health Service, using a time horizon of 4 weeks in the base case. The use of antibiotics for COPD exacerbations resulted in cost savings and an improvement in all outcomes analysed; i.e. GP visits, hospitalisations, community respiratory team referrals, all referrals, infections and subsequent antibiotics prescriptions were lower for the antibiotics group. Hence, the use of antibiotics was dominant over no antibiotics. The economic analysis suggests that use of antibiotics for COPD exacerbations is a cost-effective alternative to not prescribing antibiotics for patients who present to their GP, and remains cost-effective when longer time horizons of 3 months and 12 months are considered. It would be useful for a definitive trial to be undertaken in this area to determine the cost-effectiveness of antibiotics for COPD exacerbations.

  4. Cost-Effectiveness Analysis of an Automated Medication System Implemented in a Danish Hospital Setting.

    Science.gov (United States)

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    To evaluate the cost-effectiveness of an automated medication system (AMS) implemented in a Danish hospital setting. An economic evaluation was performed alongside a controlled before-and-after effectiveness study with one control ward and one intervention ward. The primary outcome measure was the number of errors in the medication administration process observed prospectively before and after implementation. To determine the difference in proportion of errors after implementation of the AMS, logistic regression was applied with the presence of error(s) as the dependent variable. Time, group, and interaction between time and group were the independent variables. The cost analysis used the hospital perspective with a short-term incremental costing approach. The total 6-month costs with and without the AMS were calculated as well as the incremental costs. The number of avoided administration errors was related to the incremental costs to obtain the cost-effectiveness ratio expressed as the cost per avoided administration error. The AMS resulted in a statistically significant reduction in the proportion of errors in the intervention ward compared with the control ward. The cost analysis showed that the AMS increased the ward's 6-month cost by €16,843. The cost-effectiveness ratio was estimated at €2.01 per avoided administration error, €2.91 per avoided procedural error, and €19.38 per avoided clinical error. The AMS was effective in reducing errors in the medication administration process at a higher overall cost. The cost-effectiveness analysis showed that the AMS was associated with affordable cost-effectiveness rates. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
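
    The cost-effectiveness ratios above follow from dividing the incremental cost by the number of avoided errors. A minimal sketch of that arithmetic, using the abstract's reported €16,843 incremental cost together with a hypothetical, back-calculated count of avoided administration errors (the count itself is not reported in this summary):

```python
def cost_per_avoided_error(incremental_cost_eur: float, avoided_errors: int) -> float:
    """Incremental cost-effectiveness ratio: euros per avoided error."""
    if avoided_errors <= 0:
        raise ValueError("need at least one avoided error")
    return incremental_cost_eur / avoided_errors

# 6-month incremental cost reported in the abstract
incremental_cost = 16_843.0
# hypothetical count, chosen so the ratio matches the reported €2.01
avoided_admin_errors = 8_380

ratio = cost_per_avoided_error(incremental_cost, avoided_admin_errors)
print(f"€{ratio:.2f} per avoided administration error")  # ≈ €2.01
```

    The same division with the avoided procedural and clinical error counts yields the €2.91 and €19.38 figures quoted in the abstract.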

  5. Cost-effectiveness analysis: adding value to assessment of animal health welfare and production.

    Science.gov (United States)

    Babo Martins, S; Rushton, J

    2014-12-01

    Cost-effectiveness analysis (CEA) has been extensively used in economic assessments in fields related to animal health, namely in human health where it provides a decision-making framework for choices about the allocation of healthcare resources. Conversely, in animal health, cost-benefit analysis has been the preferred tool for economic analysis. In this paper, the use of CEA in related areas and the role of this technique in assessments of animal health, welfare and production are reviewed. Cost-effectiveness analysis can add further value to these assessments, particularly in programmes targeting animal welfare or animal diseases with an impact on human health, where outcomes are best valued in natural effects rather than in monetary units. Importantly, CEA can be performed during programme implementation stages to assess alternative courses of action in real time.

  6. Cost-effectiveness of cardiotocography plus ST analysis of the fetal electrocardiogram compared with cardiotocography only.

    Science.gov (United States)

    Vijgen, Sylvia M C; Westerhuis, Michelle E M H; Opmeer, Brent C; Visser, Gerard H A; Moons, Karl G M; Porath, Martina M; Oei, Guid S; Van Geijn, Herman P; Bolte, Antoinette C; Willekes, Christine; Nijhuis, Jan G; Van Beek, Erik; Graziosi, Giuseppe C M; Schuitemaker, Nico W E; Van Lith, Jan M M; Van Den Akker, Eline S A; Drogtrop, Addy P; Van Dessel, Hendrikus J H M; Rijnders, Robbert J P; Oosterbaan, Herman P; Mol, Ben Willem J; Kwee, Anneke

    2011-07-01

    To assess the cost-effectiveness of addition of ST analysis of the fetal electrocardiogram (ECG; STAN) to cardiotocography (CTG) for fetal surveillance during labor compared with CTG only. Cost-effectiveness analysis based on a randomized clinical trial on ST analysis of the fetal ECG. Obstetric departments of three academic and six general hospitals in The Netherlands. Population: Laboring women with a singleton high-risk pregnancy, a fetus in cephalic presentation, a gestational age >36 weeks and an indication for internal electronic fetal monitoring. A trial-based cost-effectiveness analysis was performed from a health-care provider perspective. Primary health outcome was the incidence of metabolic acidosis measured in the umbilical artery. Direct medical costs were estimated from start of labor to childbirth. Cost-effectiveness was expressed as costs to prevent one case of metabolic acidosis. The incidence of metabolic acidosis was 0.7% in the ST-analysis group and 1.0% in the CTG-only group (relative risk 0.70; 95% confidence interval 0.38-1.28). Per delivery, the mean costs per patient of CTG plus ST analysis (n = 2,827) were €1,345 vs. €1,316 for CTG only (n = 2,840), with a mean difference of €29 (95% confidence interval -€9 to €77) until childbirth. The incremental costs of ST analysis to prevent one case of metabolic acidosis were €9,667. The additional costs of monitoring by ST analysis of the fetal ECG are very limited when compared with monitoring by CTG only and very low compared with the total costs of delivery. © 2011 The Authors. Acta Obstetricia et Gynecologica Scandinavica © 2011 Nordic Federation of Societies of Obstetrics and Gynecology.
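
    The €9,667 figure can be reproduced from the abstract's own numbers: the cost to prevent one case is the mean cost difference divided by the absolute risk reduction (1.0% − 0.7% = 0.3%). A sketch:

```python
def cost_to_prevent_one_case(cost_diff: float,
                             risk_control: float,
                             risk_intervention: float) -> float:
    """Incremental cost per case prevented = cost difference / absolute risk reduction."""
    arr = risk_control - risk_intervention
    if arr <= 0:
        raise ValueError("intervention must reduce risk")
    return cost_diff / arr

# values from the abstract: €29 extra per delivery; 1.0% vs 0.7% metabolic acidosis
icer = cost_to_prevent_one_case(29.0, 0.010, 0.007)
print(f"€{icer:,.0f} per case of metabolic acidosis prevented")  # ≈ €9,667
```

    Note how sensitive the ratio is to the small absolute risk reduction: halving the risk difference would double the cost per case prevented.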

  7. Cost-Effectiveness Analysis of Three Leprosy Case Detection Methods in Northern Nigeria

    Science.gov (United States)

    Ezenduka, Charles; Post, Erik; John, Steven; Suraj, Abdulkarim; Namadi, Abdulahi; Onwujekwe, Obinna

    2012-01-01

    Background Despite several leprosy control measures in Nigeria, child proportion and disability grade 2 cases remain high while new cases have not significantly reduced, suggesting continuous spread of the disease. Hence, there is the need to review detection methods to enhance identification of early cases for effective control and prevention of permanent disability. This study evaluated the cost-effectiveness of three leprosy case detection methods in Northern Nigeria to identify the most cost-effective approach for detection of leprosy. Methods A cross-sectional study was carried out to evaluate the additional benefits of using several case detection methods in addition to routine practice in two north-eastern states of Nigeria. Primary and secondary data were collected from routine practice records and the Nigerian Tuberculosis and Leprosy Control Programme of 2009. The methods evaluated were Rapid Village Survey (RVS), Household Contact Examination (HCE) and the Traditional Healers incentive method (TH). Effectiveness was measured as the number of new leprosy cases detected, and cost-effectiveness was expressed as cost per case detected. Costs were measured from both providers' and patients' perspectives. Additional costs and effects of each method were estimated by comparing each method against routine practice and expressed as an incremental cost-effectiveness ratio (ICER). All costs were converted to the U.S. dollar at the 2010 exchange rate. Univariate sensitivity analysis was used to evaluate uncertainties around the ICER. Results The ICER for HCE was $142 per additional case detected at all contact levels, making it the most cost-effective method. At an ICER of $194 per additional case detected, the TH method detected more cases at a lower cost than the RVS, which was not cost-effective at $313 per additional case detected. Sensitivity analysis showed that varying the proportion of shared costs and subsistent wage for valuing unpaid time did not significantly change the

  8. A cost-effectiveness analysis of long-term intermittent catheterisation with hydrophilic and uncoated catheters

    DEFF Research Database (Denmark)

    Clark, J F; Mealing, S J; Scott, D A

    2016-01-01

    STUDY DESIGN: Cost-effectiveness analysis. OBJECTIVE: To establish a model to investigate the cost-effectiveness for people with spinal cord injury (SCI), from a lifetime perspective, of the usage of two different single-use catheter designs: hydrophilic-coated (HC) and uncoated (UC). The model includes the long-term sequelae of impaired renal function and urinary tract infection (UTI). SETTING: Analysis based on a UK perspective. METHODS: A probabilistic Markov decision model was constructed to compare lifetime costs and quality-adjusted life years, taking renal and UTI health states into consideration, as well as other catheter-related events. UTI event rates for the primary data set were based on data from hospital settings to ensure controlled and accurate reporting. A sensitivity analysis was applied to evaluate best- and worst-case scenarios. RESULTS: The model predicts that a 36-year...

  9. Automated high-throughput measurement of body movements and cardiac activity of Xenopus tropicalis tadpoles

    Directory of Open Access Journals (Sweden)

    Kay Eckelt

    2014-07-01

    Full Text Available Xenopus tadpoles are an emerging model for developmental, genetic and behavioral studies. A small size, optical accessibility of most of their organs, together with a close genetic and structural relationship to humans make them a convenient experimental model. However, there is only a limited toolset available to measure behavior and organ function of these animals at medium or high throughput. Herein, we describe an imaging-based platform to quantify body and autonomic movements of Xenopus tropicalis tadpoles of advanced developmental stages. Animals alternate between periods of quiescence and locomotor movements, and display buccal pumping for oxygen uptake from water as well as rhythmic cardiac movements. We imaged up to 24 animals in parallel and automatically tracked and quantified their movements by using image analysis software. Animal trajectories, moved distances, activity time, buccal pumping rates and heartbeat rates were calculated and used to characterize the effects of test compounds. We evaluated the effects of propranolol and atropine, observing a dose-dependent bradycardia and tachycardia, respectively. This imaging and analysis platform is a simple, cost-effective, high-throughput in vivo assay system for genetic, toxicological or pharmacological characterizations.

  10. High-throughput DNA extraction of forensic adhesive tapes.

    Science.gov (United States)

    Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes

    2016-09-01

    Tape-lifting has since its introduction in the early 2000s become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit-for-purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01 ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  11. Cost-effectiveness analysis of interventions for migraine in four low- and middle-income countries.

    Science.gov (United States)

    Linde, Mattias; Steiner, Timothy J; Chisholm, Dan

    2015-02-18

    Evidence of the cost and effects of interventions for reducing the global burden of migraine remains scarce. Our objective was to estimate the population-level cost-effectiveness of evidence-based migraine interventions and their contributions towards reducing current burden in low- and middle-income countries. Using a standard WHO approach to cost-effectiveness analysis (CHOICE), we modelled core set intervention strategies for migraine, taking account of coverage and efficacy as well as non-adherence. The setting was primary health care including pharmacies. We modelled 26 intervention strategies implemented during 10 years. These included first-line acute and prophylactic drugs, and the expected consequences of adding consumer-education and provider-training. Total population-level costs and effectiveness (healthy life years [HLY] gained) were combined to form average and incremental cost-effectiveness ratios. We executed runs of the model for the general populations of China, India, Russia and Zambia. Of the strategies considered, acute treatment of attacks with acetylsalicylic acid (ASA) was by far the most cost-effective and generated a HLY for less than US$ 100. Adding educational actions increased annual costs by 1-2 US cents per capita of the population. Cost-effectiveness ratios then became slightly less favourable but still less than US$ 100 per HLY gained for ASA. An incremental cost of > US$ 10,000 would have to be paid per extra HLY by adding a triptan in a stepped-care treatment paradigm. For prophylaxis, amitriptyline was more cost-effective than propranolol or topiramate. Self-management with simple analgesics was by far the most cost-effective strategy for migraine treatment in low- and middle-income countries and represents a highly efficient use of health resources. Consumer education and provider training are expected to accelerate progress towards desired levels of coverage and adherence, cost relatively little to implement, and can

  12. A cost-effectiveness analysis of screening urine dipsticks in well-child care.

    Science.gov (United States)

    Sekhar, Deepa L; Wang, Li; Hollenbeak, Christopher S; Widome, Mark D; Paul, Ian M

    2010-04-01

    Despite data suggesting that routine urine screening for chronic kidney disease (CKD) has low diagnostic yield and the American Academy of Pediatrics 2007 recommendation to discontinue this screening, pediatricians may not have recognized this change. Because the new recommendation marks a major alteration in the practice guidelines, we sought to evaluate the cost-effectiveness of dipstick urinalysis for detection of CKD from the primary care practitioner's perspective. Decision analysis was used to model a screening dipstick urinalysis strategy relative to a no-screening strategy. Data on the incidence of hematuria and proteinuria in children were derived from published reports of large cohorts of school-aged children. Direct costs were estimated from the perspective of the primary care practitioner. The measure of effectiveness was the rate of diagnoses of CKD. Cost-effectiveness was evaluated by using an incremental cost-effectiveness ratio. Expected costs and effectiveness for the no-screening strategy were 0 dollars because no resources were used and no cases of CKD were diagnosed. The screening strategy involved a cost per dipstick of 3.05 dollars. Accounting for both true-positive and false-positive initial screens, 14.2% of the patients required a second dipstick as per typical clinical care, bringing the expected cost of the screening strategy to 3.47 dollars per patient. In the screening strategy, 1 case of CKD was diagnosed per 800 screened, and the incremental cost-effectiveness ratio was 2779.50 dollars per case diagnosed. Urine dipstick is inexpensive, but it is a poor screening test for CKD and a cost-ineffective procedure for the primary care provider. These data support the change in the American Academy of Pediatrics guidelines on the use of screening dipstick urinalysis. Clinicians must consider the cost-effectiveness of preventive care procedures to make better use of available resources.
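
    The screening-arm arithmetic above can be sketched directly from the figures in the abstract: the expected cost per patient combines the dipstick price with the 14.2% repeat-test fraction, and dividing by the diagnosis rate (1 case per 800 screened) gives the cost per case. Small discrepancies with the published $3.47 and $2,779.50 come from rounding of the inputs.

```python
cost_per_dipstick = 3.05       # dollars, from the abstract
repeat_fraction = 0.142        # 14.2% of patients need a second dipstick
cases_per_screened = 1 / 800   # 1 CKD diagnosis per 800 children screened

# expected cost per patient, counting the second test where required
expected_cost = cost_per_dipstick * (1 + repeat_fraction)

# incremental cost-effectiveness ratio: dollars per case of CKD diagnosed
icer = expected_cost / cases_per_screened

print(f"expected cost per patient: ${expected_cost:.2f}")  # ≈ $3.48
print(f"cost per case diagnosed:   ${icer:,.2f}")          # ≈ $2,786
```

    Because the no-screening strategy costs nothing and diagnoses nothing, the incremental ratio here reduces to the screening strategy's own cost per case.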

  13. A High-Throughput, Field-Based Phenotyping Technology for Tall Biomass Crops.

    Science.gov (United States)

    Salas Fernandez, Maria G; Bao, Yin; Tang, Lie; Schnable, Patrick S

    2017-08-01

    Recent advances in omics technologies have not been accompanied by equally efficient, cost-effective, and accurate phenotyping methods required to dissect the genetic architecture of complex traits. Even though high-throughput phenotyping platforms have been developed for controlled environments, field-based aerial and ground technologies have only been designed and deployed for short-stature crops. Therefore, we developed and tested Phenobot 1.0, an auto-steered and self-propelled field-based high-throughput phenotyping platform for tall dense canopy crops, such as sorghum (Sorghum bicolor). Phenobot 1.0 was equipped with laterally positioned and vertically stacked stereo RGB cameras. Images collected from 307 diverse sorghum lines were reconstructed in 3D for feature extraction. User interfaces were developed, and multiple algorithms were evaluated for their accuracy in estimating plant height and stem diameter. Tested feature extraction methods included the following: (1) User-interactive Individual Plant Height Extraction (UsIn-PHe) based on dense stereo three-dimensional reconstruction; (2) Automatic Hedge-based Plant Height Extraction (Auto-PHe) based on dense stereo 3D reconstruction; (3) User-interactive Dense Stereo Matching Stem Diameter Extraction; and (4) User-interactive Image Patch Stereo Matching Stem Diameter Extraction (IPaS-Di). Comparative genome-wide association analysis and ground-truth validation demonstrated that both UsIn-PHe and Auto-PHe were accurate methods to estimate plant height, while Auto-PHe had the additional advantage of being a completely automated process. For stem diameter, IPaS-Di generated the most accurate estimates of this biomass-related architectural trait. In summary, our technology was proven robust to obtain ground-based high-throughput plant architecture parameters of sorghum, a tall and densely planted crop species. © 2017 American Society of Plant Biologists. All Rights Reserved.

  14. A High-Throughput, Field-Based Phenotyping Technology for Tall Biomass Crops

    Science.gov (United States)

    2017-01-01

    Recent advances in omics technologies have not been accompanied by equally efficient, cost-effective, and accurate phenotyping methods required to dissect the genetic architecture of complex traits. Even though high-throughput phenotyping platforms have been developed for controlled environments, field-based aerial and ground technologies have only been designed and deployed for short-stature crops. Therefore, we developed and tested Phenobot 1.0, an auto-steered and self-propelled field-based high-throughput phenotyping platform for tall dense canopy crops, such as sorghum (Sorghum bicolor). Phenobot 1.0 was equipped with laterally positioned and vertically stacked stereo RGB cameras. Images collected from 307 diverse sorghum lines were reconstructed in 3D for feature extraction. User interfaces were developed, and multiple algorithms were evaluated for their accuracy in estimating plant height and stem diameter. Tested feature extraction methods included the following: (1) User-interactive Individual Plant Height Extraction (UsIn-PHe) based on dense stereo three-dimensional reconstruction; (2) Automatic Hedge-based Plant Height Extraction (Auto-PHe) based on dense stereo 3D reconstruction; (3) User-interactive Dense Stereo Matching Stem Diameter Extraction; and (4) User-interactive Image Patch Stereo Matching Stem Diameter Extraction (IPaS-Di). Comparative genome-wide association analysis and ground-truth validation demonstrated that both UsIn-PHe and Auto-PHe were accurate methods to estimate plant height, while Auto-PHe had the additional advantage of being a completely automated process. For stem diameter, IPaS-Di generated the most accurate estimates of this biomass-related architectural trait. In summary, our technology was proven robust to obtain ground-based high-throughput plant architecture parameters of sorghum, a tall and densely planted crop species. PMID:28620124

  15. High-throughput sequencing analysis of the bacteria in the dust storm which passed over Canberra, Australia on 22-23 September 2009

    Science.gov (United States)

    Munday, Chris; De Deckker, Patrick; Tapper, Nigel; Allison, Gwen

    2014-05-01

    Following a prolonged drought in Australia in the first decade of the 21st century, several dust storms affected the heavily populated East coast of Australia. The largest such storm occurred on 22-23 September 2009 and had a front of an estimated 3000 km. A 24-hr average PM10 concentration of over 2,000 μg/m3 was recorded in several locations and an hourly peak of over 15,000 μg/m3 was recorded (Leys et al. 2011). Over two time periods, duplicate aerosol samples were collected on 47 mm diameter cellulose nitrate membranes at a location removed from anthropogenic influences. One set of samples was collected on the afternoon the dust event started and another was collected overnight. Additionally, overnight rainfall was collected in a sterile bottle. DNA was directly extracted from one membrane from each time point for molecular cloning and high-throughput sequencing, while the other was cultivated on Tryptic Soy Agar (TSA). High-throughput sequencing was performed using the 454 Titanium platform. From the three samples, 19,945 curated sequences were obtained representing 942 OTUs, with the three samples approximately equal in number. Unclassified Rhizobiales and Stenotrophomonas were the most abundant groups which could be attributed names. A total of 942 OTUs were identified (cutoff = 0.03), and despite the temporal relation of the samples, only eleven were found in all three samples, indicating that the dust storm evolved in composition as it passed over the region. Approximately 800 and 500 CFU/m3 were found in the two cultivated samples, tenfold more than was collected from previous dust events (Lim et al. 2011). Identification of cultivars revealed a dominance of the gram-positive Firmicutes phylum, while the clone library showed a more even distribution of taxa, with Actinobacteria the most common and Firmicutes comprising less than 10% of sequences. Collectively, the analyses indicate that the concentration of cultivable organisms during the dust storm dramatically
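
    Counting how many OTUs are shared across all samples, as in the "only eleven in all three" result above, is a set-intersection computation over the OTU membership of each sample. A toy sketch with hypothetical OTU identifiers (real IDs come from clustering the 454 reads at the 0.03 cutoff):

```python
# hypothetical OTU sets for the three samples
afternoon = {"otu1", "otu2", "otu3", "otu4"}
overnight = {"otu2", "otu3", "otu5"}
rainfall  = {"otu3", "otu4", "otu5", "otu6"}

# OTUs present in every sample
shared_all = afternoon & overnight & rainfall
print(sorted(shared_all))  # ['otu3']

# a small shared core relative to the total suggests compositional turnover
total = len(afternoon | overnight | rainfall)
print(f"{len(shared_all)} of {total} OTUs shared by all samples")
```

    In the study, 11 shared OTUs out of 942 is exactly this kind of small core, supporting the interpretation that the storm's bacterial composition evolved in transit.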

  16. Effectiveness and cost-effectiveness of antidepressants in primary care: a multiple treatment comparison meta-analysis and cost-effectiveness model.

    Directory of Open Access Journals (Sweden)

    Joakim Ramsberg

    Full Text Available OBJECTIVE: To determine effectiveness and cost-effectiveness over a one-year time horizon of pharmacological first line treatment in primary care for patients with moderate to severe depression. DESIGN: A multiple treatment comparison meta-analysis was employed to determine the relative efficacy in terms of remission of 10 antidepressants (citalopram, duloxetine, escitalopram, fluoxetine, fluvoxamine, mirtazapine, paroxetine, reboxetine, sertraline and venlafaxine). The estimated remission rates were then applied in a decision-analytic model in order to estimate costs and quality of life with different treatments at one year. DATA SOURCES: Meta-analyses of remission rates from randomised controlled trials, and cost and quality-of-life data from published sources. RESULTS: The most favourable pharmacological treatment in terms of remission was escitalopram with an 8- to 12-week probability of remission of 0.47. Despite a high acquisition cost, this clinical effectiveness translated into escitalopram being both more effective and having a lower total cost than all other comparators from a societal perspective. From a healthcare perspective, the cost per QALY of escitalopram was €3,732 compared with venlafaxine. CONCLUSION: Of the investigated antidepressants, escitalopram has the highest probability of remission and is the most effective and cost-effective pharmacological treatment in a primary care setting, when evaluated over a one-year time horizon. Small differences in remission rates may be important when assessing costs and cost-effectiveness of antidepressants.

  17. The practical problems of applying cost-effectiveness analysis to joint finance programmes

    OpenAIRE

    Karen Gerard; Ken Wright

    1990-01-01

    Joint finance is money allocated by the Department of Health to NHS authorities to promote policies of inter-agency collaboration which prevent people being admitted to hospital, facilitate earlier discharge from hospital, or save on NHS resources generally. Worries have been expressed that joint finance has not been used as effectively or efficiently as it might have been. This paper is concerned with the practical application of cost-effectiveness analysis to policies or schemes which typic...

  18. A High-Throughput Automated Microfluidic Platform for Calcium Imaging of Taste Sensing

    Directory of Open Access Journals (Sweden)

    Yi-Hsing Hsiao

    2016-07-01

    Full Text Available The human enteroendocrine L cell line NCI-H716, expressing taste receptors and taste signaling elements, constitutes a unique model for the studies of cellular responses to glucose, appetite regulation, gastrointestinal motility, and insulin secretion. Targeting these gut taste receptors may provide novel treatments for diabetes and obesity. However, NCI-H716 cells are cultured in suspension and tend to form multicellular aggregates, preventing high-throughput calcium imaging due to interferences caused by laborious immobilization and stimulus delivery procedures. Here, we have developed an automated microfluidic platform that is capable of trapping more than 500 single cells into microwells with a loading efficiency of 77% within two minutes, delivering multiple chemical stimuli and performing calcium imaging with enhanced spatial and temporal resolutions when compared to bath perfusion systems. Results revealed the presence of heterogeneity in cellular responses to the type, concentration, and order of applied sweet and bitter stimuli. Sucralose and denatonium benzoate elicited robust increases in the intracellular Ca2+ concentration. However, glucose evoked a rapid elevation of intracellular Ca2+ followed by reduced responses to subsequent glucose stimulation. Using Gymnema sylvestre as a blocking agent for the sweet taste receptor confirmed that different taste receptors were utilized for sweet and bitter tastes. This automated microfluidic platform is cost-effective, easy to fabricate and operate, and may be generally applicable for high-throughput and high-content single-cell analysis and drug screening.

  19. High-throughput templated multisegment synthesis of gold nanowires and nanorods.

    Science.gov (United States)

    Burdick, Jared; Alonas, Eric; Huang, Huang-Chiao; Rege, Kaushal; Wang, Joseph

    2009-02-11

    A cost-effective, high-throughput method for generating gold nanowires and/or nanorods based on a multisegment template electrodeposition approach is described. Using this method, multiple nanowires/nanorods can be generated from a single pore of alumina template membranes by alternately depositing segments of desirable (e.g., gold) and non-desirable metals (e.g., silver), followed by dissolution of the template and the non-desirable metal. Critical cost analysis indicates substantial savings in material requirements, processing times, and processing costs compared to the commonly used single-segment method. In addition to solid gold nanowires/nanorods, high yields of porous gold nanowires/nanorods are obtained by depositing alternate segments of gold-silver alloy and silver from the same gold-silver plating solution followed by selective dissolution of the silver from both segments. It is anticipated that this high-throughput method for synthesizing solid and porous gold nanowires and nanorods will accelerate their use in sensing, electronic, and biomedical applications.

  20. Analysis of the gut microbiota by high-throughput sequencing of the V5-V6 regions of the 16S rRNA gene in donkey.

    Science.gov (United States)

    Liu, Xinfeng; Fan, Hanlu; Ding, Xiangbin; Hong, Zhongshan; Nei, Yongwei; Liu, Zhongwei; Li, Guangpeng; Guo, Hong

    2014-05-01

    Considerable evidence suggests that the gut microbiota is complex in many mammals and that gut bacterial communities are essential for maintaining gut homeostasis. To date, research on the gut microbiota of the donkey is surprisingly scarce. We therefore performed high-throughput sequencing of the 16S rRNA gene V5-V6 hypervariable regions from fecal material to characterize the gut microbiota of healthy donkeys and to compare the gut microbiota of male and female donkeys. Sixty healthy donkeys (30 males and 30 females) were enrolled in the study; a total of 915,691 validated reads were obtained, and the bacteria found belonged to 21 phyla and 183 genera. At the phylum level, the bacterial community composition was similar in male and female donkeys and was dominated by Firmicutes (64% in males and 64% in females) and Bacteroidetes (23% in males and 21% in females), followed by Verrucomicrobia, Euryarchaeota, Spirochaetes, and Proteobacteria. At the genus level, Akkermansia was the most abundant genus (23% in males and 17% in females), followed by Sporobacter, Methanobrevibacter, and Treponema, all detected at higher proportions in males than in females. By contrast, Acinetobacter and Lysinibacillus were less abundant in males than in females. In addition, six phyla and 15 genera differed significantly in abundance between male and female donkeys. These findings provide previously unknown information about the gut microbiota of donkeys and a foundation for future investigations of gut bacterial factors that may influence the development and progression of gastrointestinal disease in donkeys and other animals.

  1. Second generation of pseudotype-based serum neutralization assay for Nipah virus antibodies: sensitive and high-throughput analysis utilizing secreted alkaline phosphatase.

    Science.gov (United States)

    Kaku, Yoshihiro; Noguchi, Akira; Marsh, Glenn A; Barr, Jennifer A; Okutani, Akiko; Hotta, Kozue; Bazartseren, Boldbaatar; Fukushi, Shuetsu; Broder, Christopher C; Yamada, Akio; Inoue, Satoshi; Wang, Lin-Fa

    2012-01-01

    Nipah virus (NiV), Paramyxoviridae, Henipavirus, is classified as a biosafety level (BSL) 4 pathogen, along with the closely related Hendra virus (HeV). A novel serum neutralization test was developed for measuring NiV neutralizing antibodies under BSL2 conditions using a recombinant vesicular stomatitis virus (VSV) expressing secreted alkaline phosphatase (SEAP) and pseudotyped with NiV F/G proteins (VSV-NiV-SEAP). A unique characteristic of this novel assay is the ability to obtain neutralization titers by measuring SEAP activity in supernatant using a common ELISA plate reader. This confers a remarkable advantage over the first generation of NiV-pseudotypes expressing green fluorescent protein or luciferase, which require expensive and specific measuring equipment. Using panels of NiV- and HeV-specific sera from various species, the VSV-NiV-SEAP assay demonstrated neutralizing antibody status (positive/negative) consistent with that obtained by conventional live NiV test, and gave higher antibody titers than the latter. Additionally, when screening sixty-six fruit bat sera at one dilution, the VSV-NiV-SEAP assay produced identical results to the live NiV test and only required a very small amount (2μl) of sera. The results suggest that this novel VSV-NiV-SEAP assay is safe, useful for high-throughput screening of sera using an ELISA plate reader, and has high sensitivity and specificity.

  2. An Optimized High Throughput Clean-Up Method Using Mixed-Mode SPE Plate for the Analysis of Free Arachidonic Acid in Plasma by LC-MS/MS.

    Science.gov (United States)

    Wang, Wan; Qin, Suzi; Li, Linsen; Chen, Xiaohua; Wang, Qunjie; Wei, Junfu

    2015-01-01

    A high-throughput sample preparation method was developed utilizing mixed-mode solid-phase extraction (SPE) in a 96-well plate format for the determination of free arachidonic acid in plasma by LC-MS/MS. Plasma was mixed with 3% aqueous ammonia and loaded into each well of the 96-well plate. After washing with water and methanol sequentially, 3% formic acid in acetonitrile was used to elute arachidonic acid. The collected fraction was injected onto a reversed-phase column at 30°C with a mobile phase of acetonitrile/water (70:30, v/v) and detected by LC-MS/MS with electrospray ionization (ESI) in multiple reaction monitoring (MRM) mode. The calibration curve ranged from 10 to 2500 ng/mL with good linearity (r² = 0.9999). Recoveries were in the range of 99.38% to 103.21% with RSDs less than 6%. The limit of detection was 3 ng/mL.
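
    The linearity figure quoted above comes from an ordinary least-squares fit of the calibration standards. As a minimal illustration, the sketch below fits a line and computes r² in pure Python; the standard concentrations and peak areas are hypothetical, not the study's data.

```python
# Ordinary least-squares calibration line and coefficient of determination (r^2).
# Concentrations and peak areas below are hypothetical, spanning 10-2500 ng/mL.
def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

conc = [10, 50, 100, 500, 1000, 2500]               # ng/mL (hypothetical)
area = [1020, 5070, 10150, 50400, 101000, 252000]   # detector counts (hypothetical)
slope, intercept, r2 = linear_fit(conc, area)
print(round(r2, 4))  # close to 1 for a well-behaved calibration
```

    In practice the fit would be done in the instrument vendor's software or with a statistics library; the point is only that r² near 1 over the full working range is what "good linearity" asserts.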

  3. Detailed analysis and follow-up studies of a high-throughput screening for indoleamine 2,3-dioxygenase 1 (IDO1) inhibitors.

    Science.gov (United States)

    Röhrig, Ute F; Majjigapu, Somi Reddy; Chambon, Marc; Bron, Sylvian; Pilotte, Luc; Colau, Didier; Van den Eynde, Benoît J; Turcatti, Gerardo; Vogel, Pierre; Zoete, Vincent; Michielin, Olivier

    2014-09-12

    Indoleamine 2,3-dioxygenase 1 (IDO1) is a key regulator of immune responses and therefore an important therapeutic target for the treatment of diseases that involve pathological immune escape, such as cancer. Here, we describe a robust and sensitive high-throughput screen (HTS) for IDO1 inhibitors using the Prestwick Chemical Library of 1200 FDA-approved drugs and the Maybridge HitFinder Collection of 14,000 small molecules. Of the 60 hits selected for follow-up studies, 14 displayed IC50 values below 20 μM under the secondary assay conditions, and 4 showed activity in cellular tests. In view of the high attrition rate, we used both experimental and computational techniques to identify and characterize compounds inhibiting IDO1 through unspecific mechanisms such as chemical reactivity, redox cycling, or aggregation. One specific IDO1 inhibitor scaffold, that of the imidazole antifungal agents, was chosen for rational structure-based lead optimization, which led to smaller, more soluble compounds with micromolar activity.

  4. The Microbiome and Metabolites in Fermented Pu-erh Tea as Revealed by High-Throughput Sequencing and Quantitative Multiplex Metabolite Analysis.

    Science.gov (United States)

    Zhang, Yongjie; Skaar, Ida; Sulyok, Michael; Liu, Xingzhong; Rao, Mingyong; Taylor, John W

    2016-01-01

    Pu-erh is a tea produced in Yunnan, China by microbial fermentation of fresh Camellia sinensis leaves by two processes, the traditional raw fermentation and the faster, ripened fermentation. We characterized fungal and bacterial communities in leaves and in both Pu-erhs by high-throughput rDNA-amplicon sequencing, and we characterized the profile of bioactive extrolite mycotoxins in Pu-erh teas by quantitative liquid chromatography-tandem mass spectrometry. We identified 390 fungal and 629 bacterial OTUs from leaves and both Pu-erhs. Major findings are: 1) fungal diversity drops and bacterial diversity rises due to raw or ripened fermentation, 2) fungal and bacterial community composition changes significantly between fresh leaves and both raw and ripened Pu-erh, 3) aging causes significant changes in the microbial community of raw, but not ripened, Pu-erh, and 4) ripened and well-aged raw Pu-erh have similar microbial communities that are distinct from those of young, raw Pu-erh tea. Twenty-five toxic metabolites, mainly of fungal origin, were detected, with patulin and asperglaucide dominating, at levels supporting the Chinese custom of discarding the first preparation of Pu-erh and using the wet tea to brew a pot for consumption.

  5. A Cost-Effectiveness Analysis of the First Federally Funded Antismoking Campaign

    Science.gov (United States)

    Xu, Xin; Alexander, Robert L.; Simpson, Sean A.; Goates, Scott; Nonnemaker, James M.; Davis, Kevin C.; McAfee, Tim

    2015-01-01

    Background In 2012, CDC launched the first federally funded national mass media antismoking campaign. The Tips From Former Smokers (Tips) campaign resulted in a 12% relative increase in population-level quit attempts. Purpose Cost-effectiveness analysis was conducted in 2013 to evaluate Tips from a funding agency’s perspective. Methods Estimates of sustained cessations; premature deaths averted; undiscounted life years (LYs) saved; and quality-adjusted life years (QALYs) gained by Tips were estimated. Results Tips saved about 179,099 QALYs and prevented 17,109 premature deaths in the U.S. With the campaign cost of roughly $48 million, Tips spent approximately $480 per quitter, $2,819 per premature death averted, $393 per LY saved, and $268 per QALY gained. Conclusions Tips was not only successful at reducing smoking-attributable morbidity and mortality but also was a highly cost-effective mass media intervention. PMID:25498550
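
    The per-QALY figure reported above follows directly from the campaign cost and total QALYs saved. A quick check with the rounded numbers in the abstract (the per-death figure differs slightly from the published $2,819 because the campaign cost is given only as "roughly $48 million"):

```python
# Cost-per-outcome checks using the rounded figures from the abstract.
campaign_cost = 48_000_000   # "roughly $48 million"
qalys_saved = 179_099
deaths_averted = 17_109

cost_per_qaly = campaign_cost / qalys_saved
cost_per_death = campaign_cost / deaths_averted
print(round(cost_per_qaly))   # 268, matching the reported $268 per QALY
print(round(cost_per_death))  # ~2806; the published $2,819 reflects the exact cost
```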

  6. Cost-effectiveness analysis of management strategies for obscure GI bleeding.

    Science.gov (United States)

    Gerson, Lauren; Kamal, Ahmad

    2008-11-01

    Of patients seen with GI hemorrhage, approximately 5% will have a small-bowel source, and management of these patients entails considerable expense. We performed a decision analysis to explore the optimal management strategy for obscure GI hemorrhage. We used a cost-effectiveness analysis to compare no therapy (reference arm) with 5 competing modalities for a 50-year-old patient with obscure overt bleeding: (1) push enteroscopy, (2) intraoperative enteroscopy, (3) angiography, (4) initial anterograde double-balloon enteroscopy (DBE) followed by retrograde DBE if the patient had ongoing bleeding, and (5) small-bowel capsule endoscopy (CE) followed by DBE guided by the CE findings. The model included prevalence rates for small-bowel lesions, the sensitivity of each intervention, and the probability of spontaneous bleeding cessation. We examined total costs and quality-adjusted life years (QALYs) over a 1-year time period. An initial DBE was the most cost-effective approach. The no-therapy arm cost $532 and was associated with 0.870 QALYs, compared with $2407 and 0.956 QALYs for the DBE approach, which resulted in an incremental cost-effectiveness ratio of $20,833 per QALY gained. Compared with the DBE approach, an initial CE was more costly and less effective. The initial DBE arm resulted in an 86% bleeding cessation rate, compared with 76% for the CE arm and 59% for the no-therapy arm. The model results were robust to a wide range of sensitivity analyses. A limitation is the model's short time horizon, a consequence of the lack of long-term data on the natural history of rebleeding from small-intestinal lesions. An initial DBE is a cost-effective approach for patients with obscure bleeding. However, capsule-directed DBE may be associated with better long-term outcomes because of the potential for fewer complications and decreased utilization of endoscopic resources.
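
    The incremental cost-effectiveness ratio quoted above is the cost difference divided by the QALY difference between the two arms. With the rounded figures in the abstract the ratio comes out slightly higher than the published $20,833 (which implies an unrounded QALY difference near 0.090):

```python
# ICER = (cost difference) / (QALY difference) between the two arms.
cost_none, qaly_none = 532, 0.870    # no-therapy reference arm
cost_dbe, qaly_dbe = 2407, 0.956     # initial-DBE arm

icer = (cost_dbe - cost_none) / (qaly_dbe - qaly_none)
print(round(icer))  # ~21802 with these rounded inputs (paper reports $20,833)
```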

  7. Cost-Effectiveness Analysis of Heat and Moisture Exchangers in Mechanically Ventilated Critically Ill Patients

    Science.gov (United States)

    Menegueti, Mayra Goncalves; Auxiliadora-Martins, Maria; Nunes, Altacilio Aparecido

    2016-01-01

    Background: Moisturizing, heating and filtering the gases inspired through mechanical ventilation (MV) circuits help to reduce the adverse effects of MV. However, there is still no consensus on whether these measures improve patient prognosis, shorten MV duration, decrease airway secretion, or lower the incidence of ventilator-associated pneumonia (VAP) and other complications. Objectives: The aim of this study was to estimate the incremental cost-effectiveness ratio of heat and moisture exchanger (HME) filters for preventing VAP compared with the heated humidifiers (HH) presently adopted by intensive care unit (ICU) services within the Brazilian Unified Health System. Patients and Methods: This study was a cost-effectiveness analysis (CEA) comparing HME and HH in preventing VAP (outcome) in mechanically ventilated adult patients admitted to an ICU of a public university hospital. Results: The analysis considered a period of 12 months, MV durations of 11 and 12 days for patients in the HH and HME groups, respectively, and daily costs of R$ 16.46 and R$ 13.42 for HH and HME, respectively. HME was more attractive; costs ranged from R$ 21,000.00 to R$ 22,000.00 with effectiveness close to 0.71, compared with a cost of R$ 30,000.00 and effectiveness between 0.69 and 0.70 for HH. HME and HH differed significantly in incremental effectiveness. Even after an effectiveness gain of 1.5% in favor of HH, and despite the wide variation in the VAP rate, HME effectiveness remained stable. The mean HME cost-effectiveness ratio was lower than the mean HH cost-effectiveness ratio, with the HME value close to R$ 44,000.00. Conclusions: Our findings revealed that HH and HME differ very little in effectiveness, which makes interpretation of the results in the context of clinical practice difficult. Nonetheless, there is no doubt that HME is advantageous, as this technology incurs lower direct costs. PMID:27843770

  8. Cost-effectiveness analysis of baclofen and chlordiazepoxide in uncomplicated alcohol-withdrawal syndrome

    Directory of Open Access Journals (Sweden)

    Vikram K Reddy

    2014-01-01

    Full Text Available Objectives: Benzodiazepines (BZDs) are the first-line drugs in alcohol-withdrawal syndrome (AWS). Baclofen, a gamma-aminobutyric acid B (GABA-B) agonist, controls withdrawal symptoms without causing significant adverse effects. The objective of this study was to compare the cost-effectiveness of baclofen and chlordiazepoxide in the management of uncomplicated AWS. Materials and Methods: This was a randomized, open-label, standard-controlled, parallel-group cost-effectiveness analysis (CEA) of baclofen and chlordiazepoxide in 60 participants with uncomplicated AWS. Clinical efficacy was measured by Clinical Institute Withdrawal Assessment for Alcohol (CIWA-Ar) scores. Lorazepam was used as a supplemental medication if withdrawal symptoms could not be controlled effectively by the study drugs alone. Both direct and indirect medical costs were considered, and the CEA was analyzed from both the patient's perspective and the third-party perspective. Results: The average cost-effectiveness ratio (ACER) from the patient's perspective was Rs. 5,308.61 per symptom-free day for baclofen and Rs. 2,951.95 for chlordiazepoxide. The ACER from the third-party perspective was Rs. 895.01 per symptom-free day for baclofen and Rs. 476.29 for chlordiazepoxide. Participants on chlordiazepoxide had more symptom-free days than the baclofen group on analysis by the Mann-Whitney test (U = 253.50, P = 0.03). Conclusion: Both study drugs provided relief of withdrawal symptoms. Chlordiazepoxide was more cost-effective than baclofen; baclofen was relatively less effective and more expensive.
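
    The average cost-effectiveness ratio (ACER) used above is simply total cost divided by the effectiveness measure, here symptom-free days. The abstract reports only the resulting ratios, not the underlying totals, so the numbers in this sketch are hypothetical and serve only to show the calculation:

```python
def acer(total_cost, symptom_free_days):
    """Average cost-effectiveness ratio: cost per symptom-free day."""
    return total_cost / symptom_free_days

# Hypothetical totals for illustration only; the study's totals are not reported.
print(round(acer(29519.50, 10), 2))  # Rs. 2951.95 per symptom-free day
```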

  9. Identification and analysis of Red Sea mangrove (Avicennia marina) microRNAs by high-throughput sequencing and their association with stress responses.

    Directory of Open Access Journals (Sweden)

    Basel Khraiwesh

    Full Text Available Although RNA silencing has been studied primarily in model plants, advances in high-throughput sequencing technologies have enabled profiling of the small RNA components of many more plant species, providing insights into the ubiquity and conservation of some miRNA-based regulatory mechanisms. Small RNAs of 20 to 24 nucleotides (nt) are important regulators of gene transcript levels through either transcriptional or posttranscriptional gene silencing, contributing to genome maintenance and controlling a variety of developmental and physiological processes. Here, we used deep sequencing and molecular methods to create an inventory of the small RNAs in the mangrove species Avicennia marina. We identified 26 novel mangrove miRNAs and 193 conserved miRNAs belonging to 36 families. We determined that 2 of the novel miRNAs were produced from known miRNA precursors and that 4 were likely to be species-specific by the criterion that we found no homologs in other plant species. We used qRT-PCR to analyze the expression of miRNAs and their target genes in different tissue sets, and some demonstrated tissue-specific expression. Furthermore, we predicted potential targets of these putative miRNAs based on sequence homology and validated them experimentally through endonucleolytic cleavage assays. Our results suggest that the expression profiles of miRNAs and their predicted targets could be useful in exploring the significance of conservation patterns in plants, particularly in response to abiotic stress. Because of their well-developed abilities in this regard, mangroves and other extremophiles are excellent models for such exploration.

  10. High-throughput sequencing and degradome analysis reveal altered expression of miRNAs and their targets in a male-sterile cybrid pummelo (Citrus grandis).

    Science.gov (United States)

    Fang, Yan-Ni; Zheng, Bei-Bei; Wang, Lun; Yang, Wei; Wu, Xiao-Meng; Xu, Qiang; Guo, Wen-Wu

    2016-08-09

    G1 + HBP is a male-sterile cybrid line with a nuclear genome from Hirado Buntan pummelo (C. grandis Osbeck) (HBP) and a mitochondrial genome from "Guoqing No.1" (G1, Satsuma mandarin), which provides a good opportunity to study male sterility and nuclear-cytoplasmic cross talk in citrus. High-throughput sRNA and degradome sequencing were applied to identify miRNAs and their targets in G1 + HBP and its fertile type HBP during reproductive development. A total of 184 known miRNAs, 22 novel miRNAs and 86 target genes were identified. Some of the targets are transcription factors involved in floral development, such as auxin response factors (ARFs), SQUAMOSA promoter binding protein box (SBP-box), MYB, basic region-leucine zipper (bZIP), APETALA2 (AP2) and transport inhibitor response 1 (TIR1). Eight target genes were confirmed to be sliced by their corresponding miRNAs using 5' RACE technology. Based on sequencing abundance, 42 miRNAs differentially expressed between the sterile line G1 + HBP and the fertile line HBP were identified. Differential expression of miRNAs and their target genes between the two lines was validated by quantitative RT-PCR, and reciprocal expression patterns between some miRNAs and their targets were demonstrated. The regulatory mechanism of miR167a was investigated by yeast one-hybrid and dual-luciferase assays, which showed that a dehydration-responsive element binding (DREB) transcription factor binds to the miR167a promoter and transcriptionally represses miR167 expression. Our study reveals the altered expression of miRNAs and their target genes in a male-sterile line of pummelo and highlights that a miRNA regulatory network may be involved in floral bud development and cytoplasmic male sterility in citrus.

  11. Simultaneous analysis of 22 volatile organic compounds in cigarette smoke using gas sampling bags for high-throughput solid-phase microextraction.

    Science.gov (United States)

    Sampson, Maureen M; Chambers, David M; Pazo, Daniel Y; Moliere, Fallon; Blount, Benjamin C; Watson, Clifford H

    2014-07-15

    Quantifying volatile organic compounds (VOCs) in cigarette smoke is necessary to establish smoke-related exposure estimates and evaluate emerging products and potential reduced-exposure products. In response to this need, we developed an automated, multi-VOC quantification method for machine-generated, mainstream cigarette smoke using solid-phase microextraction gas chromatography-mass spectrometry (SPME-GC-MS). This method was developed to simultaneously quantify a broad range of smoke VOCs (i.e., carbonyls and volatiles, which historically have been measured by separate assays) for large exposure assessment studies. Our approach collects and maintains vapor-phase smoke in a gas sampling bag, where it is homogenized with isotopically labeled analogue internal standards and sampled using gas-phase SPME. High throughput is achieved by SPME automation using a CTC Analytics platform and custom bag tray. This method has successfully quantified 22 structurally diverse VOCs (e.g., benzene and associated monoaromatics, aldehydes and ketones, furans, acrylonitrile, 1,3-butadiene, vinyl chloride, and nitromethane) in the microgram range in mainstream smoke from 1R5F and 3R4F research cigarettes smoked under ISO (Cambridge Filter or FTC) and Intense (Health Canada or Canadian Intense) conditions. Our results are comparable to previous studies with few exceptions. Method accuracy was evaluated with third-party reference samples (≤15% error). Short-term diffusion losses from the gas sampling bag were minimal, with a 10% decrease in absolute response after 24 h. For most analytes, research cigarette inter- and intrarun precisions were ≤20% relative standard deviation (RSD). This method provides an accurate and robust means to quantify VOCs in cigarette smoke spanning a range of yields that is sufficient to characterize smoke exposure estimates.
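
    Quantification against isotopically labeled analogue internal standards, as described above, typically scales the analyte-to-label peak-area ratio by the amount of label spiked into the sample. This is a generic isotope-dilution sketch, not the authors' exact calculation; all names and numbers are invented for illustration:

```python
def isotope_dilution_amount(analyte_area, label_area, label_amount_ug, rrf=1.0):
    """Analyte amount from the analyte-to-label peak-area ratio.

    rrf is the relative response factor from calibration (assumed 1.0 here).
    """
    return (analyte_area / label_area) * label_amount_ug / rrf

# Hypothetical measurement: peak areas and spike amount are example values.
amount = isotope_dilution_amount(analyte_area=84000, label_area=70000,
                                 label_amount_ug=50.0)
print(round(amount, 1))  # 60.0 ug with these example inputs
```

    Because the analyte and its labeled analogue behave nearly identically during extraction and ionization, the ratio-based calculation cancels out losses and instrument drift, which is what makes this approach robust for large studies.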

  12. Analysis of epitopes on dengue virus envelope protein recognized by monoclonal antibodies and polyclonal human sera by a high throughput assay.

    Directory of Open Access Journals (Sweden)

    Hong-En Lin

    2012-01-01

    Full Text Available BACKGROUND: The envelope (E protein of dengue virus (DENV is the major target of neutralizing antibodies and vaccine development. While previous studies on domain III or domain I/II alone have reported several epitopes of monoclonal antibodies (mAbs against DENV E protein, the possibility of interdomain epitopes and the relationship between epitopes and neutralizing potency remain largely unexplored. METHODOLOGY/PRINCIPAL FINDINGS: We developed a dot blot assay by using 67 alanine mutants of predicted surface-exposed E residues as a systematic approach to identify epitopes recognized by mAbs and polyclonal sera, and confirmed our findings using a capture-ELISA assay. Of the 12 mouse mAbs tested, three recognized a novel epitope involving residues (Q211, D215, P217 at the central interface of domain II, and three recognized residues at both domain III and the lateral ridge of domain II, suggesting a more frequent presence of interdomain epitopes than previously appreciated. Compared with mAbs generated by traditional protocols, the potent neutralizing mAbs generated by a new protocol recognized multiple residues in A strand or residues in C strand/CC' loop of DENV2 and DENV1, and multiple residues in BC loop and residues in DE loop, EF loop/F strand or G strand of DENV1. The predominant epitopes of anti-E antibodies in polyclonal sera were found to include both fusion loop and non-fusion residues in the same or adjacent monomer. CONCLUSIONS/SIGNIFICANCE: Our analyses have implications for epitope-specific diagnostics and epitope-based dengue vaccines. This high throughput method has tremendous application for mapping both intra and interdomain epitopes recognized by human mAbs and polyclonal sera, which would further our understanding of humoral immune responses to DENV at the epitope level.

  13. Identification and Analysis of Red Sea Mangrove (Avicennia marina) microRNAs by High-Throughput Sequencing and Their Association with Stress Responses

    KAUST Repository

    Khraiwesh, Basel

    2013-04-08

    Although RNA silencing has been studied primarily in model plants, advances in high-throughput sequencing technologies have enabled profiling of the small RNA components of many more plant species, providing insights into the ubiquity and conservatism of some miRNA-based regulatory mechanisms. Small RNAs of 20 to 24 nucleotides (nt) are important regulators of gene transcript levels by either transcriptional or by posttranscriptional gene silencing, contributing to genome maintenance and controlling a variety of developmental and physiological processes. Here, we used deep sequencing and molecular methods to create an inventory of the small RNAs in the mangrove species, Avicennia marina. We identified 26 novel mangrove miRNAs and 193 conserved miRNAs belonging to 36 families. We determined that 2 of the novel miRNAs were produced from known miRNA precursors and 4 were likely to be species-specific by the criterion that we found no homologs in other plant species. We used qRT-PCR to analyze the expression of miRNAs and their target genes in different tissue sets and some demonstrated tissue-specific expression. Furthermore, we predicted potential targets of these putative miRNAs based on a sequence homology and experimentally validated through endonucleolytic cleavage assays. Our results suggested that expression profiles of miRNAs and their predicted targets could be useful in exploring the significance of the conservation patterns of plants, particularly in response to abiotic stress. Because of their well-developed abilities in this regard, mangroves and other extremophiles are excellent models for such exploration. © 2013 Khraiwesh et al.

  14. Hospital-centered violence intervention programs: a cost-effectiveness analysis.

    Science.gov (United States)

    Chong, Vincent E; Smith, Randi; Garcia, Arturo; Lee, Wayne S; Ashley, Linnea; Marks, Anne; Liu, Terrence H; Victorino, Gregory P

    2015-04-01

    Hospital-centered violence intervention programs (HVIPs) reduce violent injury recidivism. However, dedicated cost analyses of such programs have not yet been published. We hypothesized that the HVIP at our urban trauma center is a cost-effective means for reducing violent injury recidivism. We conducted a cost-utility analysis using a state-transition (Markov) decision model, comparing participation in our HVIP with standard risk reduction for patients injured because of firearm violence. Model inputs were derived from our trauma registry and published literature. The 1-year recidivism rate for participants in our HVIP was 2.5%, compared with 4% for those receiving standard risk reduction resources. Total per-person costs of each violence prevention arm were similar: $3,574 for our HVIP and $3,515 for standard referrals. The incremental cost effectiveness ratio for our HVIP was $2,941. Our HVIP is a cost-effective means of preventing recurrent episodes of violent injury in patients hurt by firearms. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. The value of hygiene promotion: cost-effectiveness analysis of interventions in developing countries.

    Science.gov (United States)

    Sijbesma, Christine; Christoffers, Trea

    2009-11-01

    Hygiene promotion can greatly improve the benefits of water and sanitation programmes in developing countries at relatively limited cost. There are, however, few studies with hard data on the costs and effectiveness of individual programmes, and even fewer have compared the cost-effectiveness of different promotional approaches. This article argues that objectively measured reductions of key sanitation and hygiene risks are better than DALYs for evaluating hygiene and sanitation promotion programmes. It presents a framework for the cost-effectiveness analysis of such programmes, which is used to analyse six field programmes. At costs ranging from US$ 1.05 to US$ 1.74 per person per year (in 1999 dollar values), they achieved (almost) complete abandonment of open defecation and considerable improvements in keeping toilets free from faecal soiling, safe disposal of child faeces, and/or washing hands with soap after defecation, before eating and after cleaning children's bottoms. However, only two studies used a quasi-experimental design (before-and-after studies in the intervention and matched control areas), and only two measured costs and the degree to which results were sustained after the programme had ended. If the promotion of good sanitation and hygiene is to receive the political and managerial support it deserves, every water, sanitation and/or hygiene programme should report data on inputs, costs, processes and effects over time. More and better research reflecting the model presented here is also needed to compare the cost-effectiveness of different promotional approaches.

  16. Cost-effectiveness analysis of cataract surgery with intraocular lens implantation: extracapsular cataract extraction versus phacoemulsification

    Directory of Open Access Journals (Sweden)

    Mohd R.A. Manaf

    2007-03-01

    Full Text Available A randomized, single-blinded clinical trial comparing the cost-effectiveness of cataract surgery by extracapsular cataract extraction (ECCE) and phacoemulsification (PEA) was conducted at Hospital Universiti Kebangsaan Malaysia (HUKM) from March 2000 until August 2001. The costs of cataract surgery incurred by the hospital, patients and households were calculated preoperatively, at one week, at two months (for both techniques) and at six months (for ECCE only). Effectiveness of cataract surgery was assessed using the Visual Function 14 (VF-14), a quality-of-life measure specific to vision. The cost analysis of 50 subjects in each of the ECCE and PEA groups showed that the average cost for one ECCE at six months post-operation was USD 458 (± USD 72) and for PEA was USD 528 (± USD 125). The VF-14 score showed a significant increase at one week, two months and six months post-operation compared with the preoperative score for both techniques (p<0.001). However, there was no significant difference between them (p = 0.225). This study indicated that ECCE is more cost-effective than PEA, with a cost per one-unit increment in VF-14 score of USD 14 compared with USD 20 for PEA. (Med J Indones 2007; 16:25-31) Keywords: cataract, cost-effectiveness, extracapsular cataract extraction, phacoemulsification, visual function 14

  17. Cyclosporine versus tacrolimus: cost-effectiveness analysis for renal transplantation in Brazil

    Science.gov (United States)

    Guerra, Augusto Afonso; Silva, Grazielle Dias; Andrade, Eli Iola Gurgel; Cherchiglia, Mariângela Leal; Costa, Juliana de Oliveira; Almeida, Alessandra Maciel; Acurcio, Francisco de Assis

    2015-01-01

    OBJECTIVE To analyze the cost-effectiveness of treatment regimens with cyclosporine or tacrolimus, five years after renal transplantation. METHODS This cost-effectiveness analysis was based on historical cohort data obtained between 2000 and 2004 and involved 2,022 patients treated with cyclosporine or tacrolimus, matched 1:1 for gender, age, and type and year of transplantation. Graft survival and the direct costs of medical care obtained from the National Health System (SUS) databases were used as outcome results. RESULTS Most of the patients were women, with a mean age of 36.6 years. The most frequent diagnosis of chronic renal failure was glomerulonephritis/nephritis (27.7%). In five years, the tacrolimus group had an average life expectancy gain of 3.96 years at an annual cost of R$78,360.57 compared with the cyclosporine group with a gain of 4.05 years and an annual cost of R$61,350.44. CONCLUSIONS After matching, the study indicated better survival of patients treated with regimens using tacrolimus. However, regimens containing cyclosporine were more cost-effective. PMID:25741648

  19. [Cost-effectiveness analysis of universal screening for thyroid disease in pregnant women in Spain].

    Science.gov (United States)

    Donnay Candil, Sergio; Balsa Barro, José Antonio; Álvarez Hernández, Julia; Crespo Palomo, Carlos; Pérez-Alcántara, Ferrán; Polanco Sánchez, Carlos

    2015-01-01

    To assess the cost-effectiveness of universal screening for thyroid disease in pregnant women in Spain as compared to high-risk screening and no screening. A decision-analytic model comparing the incremental cost per quality-adjusted life year (QALY) of universal screening versus high-risk screening and versus no screening was used for the pregnancy and postpartum period. Probabilities from randomized controlled trials were considered for adverse obstetrical outcomes. A Markov model was used to assess the lifetime period after the first postpartum year and to account for development of overt hypothyroidism. The main assumptions in the model and the use of resources were assessed by local clinical experts. The analysis considered direct healthcare costs only. Universal screening gained 0.011 QALYs over high-risk screening and 0.014 QALYs over no screening. Total direct costs per patient were €5,786 for universal screening, €5,791 for high-risk screening, and €5,781 for no screening. Universal screening was dominant compared to risk-based screening and a very cost-effective alternative compared to no screening. Use of universal screening instead of high-risk screening would result in €2,653,854 in annual savings for the Spanish National Health System. Universal screening for thyroid disease in pregnant women in the first trimester is dominant in Spain as compared to risk-based screening, and is cost-effective as compared to no screening (incremental cost-effectiveness ratio of €374 per QALY). Moreover, it allows diagnosing and treating cases of clinical and subclinical hypothyroidism that may not be detected when only high-risk women are screened. Copyright © 2014 SEEN. Published by Elsevier España, S.L.U. All rights reserved.
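The dominance and cost-per-QALY logic used in abstracts like this one can be sketched in a few lines. This is a minimal illustration, not code from the study; note that the rounded per-patient figures reported above yield approximately €357/QALY for universal versus no screening, slightly below the reported €374/QALY, which was presumably computed from unrounded inputs.

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained.

    Returns None when the new strategy dominates (no more costly and
    more effective), in which case no ratio is reported."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly > 0:
        return None  # dominant strategy
    return d_cost / d_qaly

# Universal vs. high-risk screening: €5,786 vs. €5,791, +0.011 QALYs
print(icer(5786, 5791, 0.011, 0.0))  # None: universal screening dominates

# Universal vs. no screening: €5,786 vs. €5,781, +0.014 QALYs
print(round(icer(5786, 5781, 0.014, 0.0)))  # ~357 €/QALY from rounded inputs
```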

  20. Cost-effectiveness analysis of the available strategies for diagnosing malaria in outpatient clinics in Zambia

    Directory of Open Access Journals (Sweden)

    Chanda Pascalina

    2009-04-01

    Full Text Available Abstract Background Malaria in Zambia accounts for about 4 million clinical cases and 8,000 deaths annually. Artemether-lumefantrine (ACT), a relatively expensive drug, is being used as first-line treatment of uncomplicated malaria. However, diagnostic capacity in Zambia is low, leading to potentially avoidable wastage of drugs due to unnecessary antimalarial treatment. Methods A cost-effectiveness evaluation of the three current alternatives for malaria diagnosis (clinical, microscopy, and rapid diagnostic tests - RDT) was conducted in 12 facilities from 4 districts in Zambia. The analysis was conducted alongside an observational study, thus reflecting practice in health facilities under routine conditions. Average and incremental cost-effectiveness ratios were estimated from the providers' perspective. Effectiveness was measured in relation to malaria cases correctly diagnosed by each strategy. Results Average cost-effectiveness ratios showed that RDTs were more efficient (US$ 6.5) than either microscopy (US$ 11.9) or clinical diagnosis (US$ 17.1) per malaria case correctly diagnosed. Relative to clinical diagnosis, the incremental cost per case correctly diagnosed and treated was US$ 2.6 for RDT and US$ 9.6 for microscopy. RDTs would be much cheaper to scale up than microscopy. The findings were robust to changes in assumptions and various parameters. Conclusion RDTs were the most cost-effective method for correctly diagnosing malaria in primary health facilities in Zambia when compared to clinical and microscopy strategies. However, the treatment prescription practices of health workers can affect the potential of a diagnostic test to lead to savings on antimalarials. The results of this study will serve to inform policy makers on which alternatives will be most efficient in reducing malaria misdiagnosis, by taking into account both the costs and effects of each strategy.
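The two ratio types this abstract reports can be illustrated with a short sketch. The cost and case counts below are hypothetical, chosen only so the resulting ratios match the abstract's figures (US$ 6.5 per case for RDT, US$ 17.1 for clinical diagnosis, and an incremental US$ 2.6 per additional correct diagnosis); the study's actual totals are not given in the abstract.

```python
def average_cer(total_cost, cases_correct):
    """Average cost-effectiveness ratio: cost per case correctly diagnosed."""
    return total_cost / cases_correct

def incremental_cer(cost_a, cases_a, cost_b, cases_b):
    """Incremental CER of strategy A over comparator B:
    extra cost per additional case correctly diagnosed."""
    return (cost_a - cost_b) / (cases_a - cases_b)

# Hypothetical totals (US$, cases) consistent with the abstract's ratios
rdt_cost, rdt_cases = 9425.0, 1450      # RDT strategy
clin_cost, clin_cases = 6669.0, 390     # clinical diagnosis

print(average_cer(rdt_cost, rdt_cases))                      # 6.5 per case
print(average_cer(clin_cost, clin_cases))                    # 17.1 per case
print(incremental_cer(rdt_cost, rdt_cases,
                      clin_cost, clin_cases))                # 2.6 per extra case
```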

  1. Environmental cost-effectiveness analysis in intertemporal natural resource policy: evaluation of selective fishing gear.

    Science.gov (United States)

    Kronbak, Lone Grønbæk; Vestergaard, Niels

    2013-12-15

    In most decision-making involving natural resources, the achievements of a given policy (e.g., an improved ecosystem or biodiversity) are difficult to measure in monetary units. To address this problem, the current paper develops an environmental cost-effectiveness analysis (ECEA) to include intangible benefits in intertemporal natural resource problems. This approach can assist managers in prioritizing management actions as least-cost solutions to achieve quantitative policy targets. The ECEA framework is applied to a selective-gear policy case in Danish mixed trawl fisheries in the Kattegat and Skagerrak. The empirical analysis demonstrates how a policy with large negative net benefits might be justified if intangible benefits are included.

  2. High-throughput crystallization screening.

    Science.gov (United States)

    Skarina, Tatiana; Xu, Xiaohui; Evdokimova, Elena; Savchenko, Alexei

    2014-01-01

    Protein structure determination by X-ray crystallography depends on obtaining a single protein crystal suitable for diffraction data collection. Because of this requirement, protein crystallization represents a key step in protein structure determination. The conditions for protein crystallization have to be determined empirically for each protein, making this step a bottleneck in the structure determination process. Typical protein crystallization practice involves the parallel setup and monitoring of a considerable number of individual protein crystallization experiments (also called crystallization trials). In these trials, aliquots of purified protein are mixed with a range of solutions composed of a precipitating agent, a buffer, and sometimes an additive that have previously been successful in prompting protein crystallization. The individual chemical conditions in which a particular protein shows signs of crystallization are used as a starting point for further crystallization experiments, the goal being to optimize the formation of individual protein crystals of sufficient size and quality to make them suitable for diffraction data collection. Thus the composition of the primary crystallization screen is critical for successful crystallization. Systematic analysis of crystallization experiments carried out on several hundred proteins as part of large-scale structural genomics efforts allowed optimization of the protein crystallization protocol and identification of a minimal set of 96 crystallization solutions (the "TRAP" screen) that, in our experience, led to crystallization of the maximum number of proteins.
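The workflow above (score each trial in a 96-condition screen, carry promising conditions into optimization) can be sketched with a small data structure. Everything here is hypothetical for illustration: the condition names, the four-level scoring scale, and the cutoff are not from the source, and the real TRAP screen compositions are not reproduced.

```python
# Illustrative scoring scale for trial observations (not from the source)
SCORES = {"clear": 0, "precipitate": 1, "microcrystals": 2, "crystal": 3}

def hits(trials, min_score=2):
    """Return conditions showing signs of crystallization, best score first.

    trials: mapping of condition description -> observed outcome string."""
    scored = [(cond, SCORES[obs]) for cond, obs in trials.items()
              if SCORES[obs] >= min_score]
    return [cond for cond, _ in sorted(scored, key=lambda t: -t[1])]

# Hypothetical observations from three wells of a primary screen
trials = {
    "A1: 0.2 M NaCl, 20% PEG 3350":      "crystal",
    "A2: 0.1 M HEPES pH 7.5, 2 M AmSO4": "precipitate",
    "A3: 0.2 M MgCl2, 25% PEG 4000":     "microcrystals",
}
print(hits(trials))  # conditions to carry into optimization screens
```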

  3. Cost/effectiveness analysis of atorvastatin in patients with acute coronary syndromes

    Directory of Open Access Journals (Sweden)

    Simona Cammarota

    2010-06-01

    Full Text Available Introduction: recent clinical trials found that high-dose statin therapy, compared with conventional-dose statin therapy, reduces the risk of cardiovascula