WorldWideScience

Sample records for saturation throughput analysis

  1. Promoter analysis by saturation mutagenesis

    Directory of Open Access Journals (Sweden)

    Baliga Nitin

    2001-01-01

    Full Text Available Gene expression and regulation are mediated by DNA sequences, in most instances located directly upstream of the coding sequences, which recruit transcription factors, regulators, and an RNA polymerase in a spatially defined fashion. Few nucleotides within a promoter make contact with the bound proteins. The minimal set of nucleotides that can recruit a protein factor is called a cis-acting element. This article describes a powerful mutagenesis strategy that can be employed to define cis-acting elements at the molecular level. Technical details including primer design, saturation mutagenesis, construction of promoter libraries, phenotypic analysis, data analysis, and interpretation are discussed.

  2. A high-throughput colorimetric assay for screening halohydrin dehalogenase saturation mutagenesis libraries.

    Science.gov (United States)

    Tang, Lixia; Li, Yang; Wang, Xiong

    2010-06-01

    Here we report a high-throughput pH indicator-based assay to measure the activity of halohydrin dehalogenases (HheC). The assay relies on the absorbance change at 560 nm and the visual color change of phenol red in a weakly buffered system, due to the release of protons from the enzyme-catalyzed ring-closure reactions. The assay can be performed in a microplate format using whole cells, which makes it simple and robust and therefore suitable for library screening. The assay was further validated using two previously studied HheC variants, D80N and W249F, which exhibit 200-fold lower and 2-fold higher k(cat) values, respectively, toward 1,3-dichloro-2-propanol than the wild-type HheC. In addition, a saturation mutagenesis library of HheC was screened with the developed assay for the ability to efficiently catalyze the conversion of 1,3-dichloro-2-propanol. After screening 500 colonies, one mutant, W139C, was identified and was further purified and characterized. Kinetic analysis indicates that the resulting mutant shows 2- and 5-fold improvements in k(cat) toward 1,3-DCP and (R,S)-p-nitro-2-bromo-1-phenylethanol, respectively, although it exhibits higher K(m) values than the wild-type enzyme. The method described herein represents a useful tool given the need for high-throughput screening of halohydrin dehalogenase mutants. 2010 Elsevier B.V. All rights reserved.

  3. Throughput analysis of ALOHA with cooperative diversity

    OpenAIRE

    Göktürk, Sarper Muharrem; Erçetin, Özgür; Gürbüz, Özgür

    2008-01-01

    Cooperative transmissions emulate multi-antenna systems and can improve the quality of signal reception. In this paper, we propose and analyze a cross-layer random access scheme, C-ALOHA, that enables cooperative transmissions in the context of an ALOHA system. Our analysis shows that, over a fading channel, C-ALOHA can improve the throughput by 30% compared to the standard ALOHA protocol.

  4. THROUGHPUT ANALYSIS OF EXTENDED ARQ SCHEMES

    African Journals Online (AJOL)

    ... formation transmitted in digital communication systems. Such schemes ... Department of Electronics and Telecommunications Engineering, University of Dar es ... (2): 165-176. Kundaeli, H. N. (2013). Throughput-Delay Analysis of the SR-ST-GBN ARQ Scheme. Mediterranean Journal of Electronics and Communication.

  5. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

    High-throughput screening (HTS) techniques have been applied to many research fields. Robotic microarray printing and automated microtiter plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; finer images can be obtained by forming the imaging probe from capillaries with smaller outer diameters. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not need to have optical properties different from those of the substrate. UV absorption detection allows nearly universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used to screen local distribution variations of specific biomolecules in a tissue or to screen multiple immobilized catalysts. A second high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of an enzyme microarray is focused onto the CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on an enzyme spot gives the reaction rate. The same microarray can be used many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different

  6. Throughput Analysis of Large Wireless Networks with Regular Topologies

    Directory of Open Access Journals (Sweden)

    Kezhu Hong

    2007-04-01

    Full Text Available The throughput of large wireless networks with regular topologies is analyzed under two medium-access control schemes: synchronous array method (SAM and slotted ALOHA. The regular topologies considered are square, hexagon, and triangle. Both nonfading channels and Rayleigh fading channels are examined. Furthermore, both omnidirectional antennas and directional antennas are considered. Our analysis shows that the SAM leads to a much higher network throughput than the slotted ALOHA. The network throughput in this paper is measured in either bits-hops per second per Hertz per node or bits-meters per second per Hertz per node. The exact connection between the two measures is shown for each topology. With these two fundamental units, the network throughput shown in this paper can serve as a reliable benchmark for future works on network throughput of large networks.

  7. Throughput Analysis of Large Wireless Networks with Regular Topologies

    Directory of Open Access Journals (Sweden)

    Hong Kezhu

    2007-01-01

    Full Text Available The throughput of large wireless networks with regular topologies is analyzed under two medium-access control schemes: synchronous array method (SAM and slotted ALOHA. The regular topologies considered are square, hexagon, and triangle. Both nonfading channels and Rayleigh fading channels are examined. Furthermore, both omnidirectional antennas and directional antennas are considered. Our analysis shows that the SAM leads to a much higher network throughput than the slotted ALOHA. The network throughput in this paper is measured in either bits-hops per second per Hertz per node or bits-meters per second per Hertz per node. The exact connection between the two measures is shown for each topology. With these two fundamental units, the network throughput shown in this paper can serve as a reliable benchmark for future works on network throughput of large networks.

  8. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  9. Analysis on Inductance and Torque of PMSM Considering Magnetic Saturation

    Science.gov (United States)

    Cao, Xiao-Hua; Wang, Xin; Wei, Heng

    2017-07-01

    This paper analyses a surface-mounted PMSM under Id=0 vector control. A finite element simulation of the 2D static magnetic field is carried out in Ansoft, the resulting data are processed and analysed with MATLAB, and the variation of torque and inductance under different load conditions is examined, with particular attention to the impact of magnetic saturation on torque and inductance. Based on this analysis of magnetic saturation, the paper puts forward a control and design scheme for the PMSM.

  10. Max-plus algebraic throughput analysis of synchronous dataflow graphs

    NARCIS (Netherlands)

    de Groote, Robert; Kuper, Jan; Broersma, Haitze J.; Smit, Gerardus Johannes Maria

    2012-01-01

    In this paper we present a novel approach to throughput analysis of synchronous dataflow (SDF) graphs. Our approach is based on describing the evolution of actor firing times as a linear time-invariant system in max-plus algebra. Experimental results indicate that our approach is faster than
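
    As an illustration of the max-plus idea (not the authors' implementation): for a strongly connected SDF graph the actor firing times evolve as x(k) = A ⊗ x(k-1) in max-plus algebra, and the throughput is the reciprocal of the max-plus eigenvalue of A, i.e. the maximum cycle mean, which Karp's algorithm computes directly. The 3-actor matrix below is a made-up example, not one from the paper.

      import numpy as np

      NEG_INF = float("-inf")

      def max_cycle_mean(A):
          """Maximum cycle mean of a max-plus matrix A (Karp's algorithm).
          A[i, j] is the weight of edge j -> i, or -inf if the edge is absent."""
          n = A.shape[0]
          # D[k, v] = maximum weight of any walk of length exactly k ending in v
          D = np.full((n + 1, n), NEG_INF)
          D[0, :] = 0.0
          for k in range(1, n + 1):
              for v in range(n):
                  cand = [D[k - 1, u] + A[v, u] for u in range(n) if A[v, u] > NEG_INF]
                  if cand:
                      D[k, v] = max(cand)
          best = NEG_INF
          for v in range(n):
              if D[n, v] == NEG_INF:
                  continue
              vals = [(D[n, v] - D[k, v]) / (n - k) for k in range(n) if D[k, v] > NEG_INF]
              best = max(best, min(vals))
          return best

      # Hypothetical 3-actor graph: A[i, j] is the delay from actor j's previous
      # firing to actor i's next firing (execution times, assumed values).
      A = np.array([[2.0, NEG_INF, 3.0],
                    [1.0, 4.0, NEG_INF],
                    [NEG_INF, 2.0, 1.0]])

      lam = max_cycle_mean(A)   # max-plus eigenvalue = maximum cycle mean
      print("cycle mean:", lam, "-> throughput (firings per time unit):", 1.0 / lam)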

  11. Microscopic analysis of saturable absorbers: Semiconductor saturable absorber mirrors versus graphene

    Energy Technology Data Exchange (ETDEWEB)

    Hader, J.; Moloney, J. V. [Nonlinear Control Strategies, Inc., 3542 N. Geronimo Ave., Tucson, Arizona 85705 (United States); College of Optical Sciences, University of Arizona, Tucson, Arizona 85721 (United States)]; Yang, H.-J.; Scheller, M. [College of Optical Sciences, University of Arizona, Tucson, Arizona 85721 (United States)]; Koch, S. W. [Department of Physics and Materials Sciences Center, Philipps Universität Marburg, Renthof 5, 35032 Marburg (Germany)]

    2016-02-07

    Fully microscopic many-body calculations are used to study the influence of strong sub-picosecond pulses on the carrier distributions and corresponding optical response in saturable absorbers used for mode-locking—semiconductor (quantum well) saturable absorber mirrors (SESAMs) and single layer graphene based saturable absorber mirrors (GSAMs). Unlike in GSAMs, the saturation fluence and recovery time in SESAMs show a strong spectral dependence. While the saturation fluence in the SESAM is minimal at the excitonic bandgap, the optimal recovery time and least pulse distortion due to group delay dispersion are found for excitation higher in the first subband. For excitation near the SESAM bandgap, the saturation fluence is about one tenth of that in the GSAM. At energies above the bandgap, the fluences in both systems become similar. A strong dependence of the saturation fluence on the pulse width in both systems is caused by carrier relaxation during the pulse. The recovery time in graphene is found to be about two to four times faster than that in the SESAMs. The occurrence of negative differential transmission in graphene is shown to be caused by dopant related carriers. In SESAMs, a negative differential transmission is found when exciting below the excitonic resonance where excitation induced dephasing leads to an enhancement of the absorption. Comparisons of the simulation data to the experiment show a very good quantitative agreement.

  12. Microscopic analysis of saturable absorbers: Semiconductor saturable absorber mirrors versus graphene

    Science.gov (United States)

    Hader, J.; Yang, H.-J.; Scheller, M.; Moloney, J. V.; Koch, S. W.

    2016-02-01

    Fully microscopic many-body calculations are used to study the influence of strong sub-picosecond pulses on the carrier distributions and corresponding optical response in saturable absorbers used for mode-locking—semiconductor (quantum well) saturable absorber mirrors (SESAMs) and single layer graphene based saturable absorber mirrors (GSAMs). Unlike in GSAMs, the saturation fluence and recovery time in SESAMs show a strong spectral dependence. While the saturation fluence in the SESAM is minimal at the excitonic bandgap, the optimal recovery time and least pulse distortion due to group delay dispersion are found for excitation higher in the first subband. For excitation near the SESAM bandgap, the saturation fluence is about one tenth of that in the GSAM. At energies above the bandgap, the fluences in both systems become similar. A strong dependence of the saturation fluence on the pulse width in both systems is caused by carrier relaxation during the pulse. The recovery time in graphene is found to be about two to four times faster than that in the SESAMs. The occurrence of negative differential transmission in graphene is shown to be caused by dopant related carriers. In SESAMs, a negative differential transmission is found when exciting below the excitonic resonance where excitation induced dephasing leads to an enhancement of the absorption. Comparisons of the simulation data to the experiment show a very good quantitative agreement.

  13. Identifiability analysis of Prandtl-Ishlinskii hysteresis model with saturation

    Science.gov (United States)

    Sjöström, Mårten; Gulliksson, Mårten

    2008-02-01

    A new class of Preisach operators based on play operators with an inverse in a closed form and allowing for saturation has recently been proposed. Its existence criteria and identification procedure were considered in earlier articles. The present paper analyses the identification procedure with respect to the sensitivity to the underlying functions (i.e. the intrinsic behaviour of the hysteretic system), to the spline approximation, and to the least-squares error (LSE) estimation procedure. The analysis shows that model errors are significantly influenced by large derivatives of the underlying functions. Spline approximations generally have little effect on model errors. In particular, an upper bound on the relative parameter error due to measurement discrepancies has been derived for the LSE problem. The bound increases as data are measured closer to saturation.

  14. Web-based visual analysis for high-throughput genomics.

    Science.gov (United States)

    Goecks, Jeremy; Eberhard, Carl; Too, Tomithy; Nekrutenko, Anton; Taylor, James

    2013-06-13

    Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput

  15. Throughput Analysis of an Adaptation Rule in the HARQ Environment

    Directory of Open Access Journals (Sweden)

    K. Kotuliakova

    2003-06-01

    Full Text Available In this paper we analyze an adaptation rule that estimates the channel state and switches between hybrid ARQ (automatic repeat request) and pure ARQ. A convolutional code was chosen as the FEC (forward error correction) code in the hybrid ARQ part, and a go-back-N ARQ scheme is used in both cases. The adaptation rule is based on counting ACKs and NAKs, and its throughput analysis is presented.

  16. Analysis of a Heroin Epidemic Model with Saturated Treatment Function

    Directory of Open Access Journals (Sweden)

    Isaac Mwangi Wangari

    2017-01-01

    Full Text Available A mathematical model is developed that examines how heroin addiction spreads in society. The model is formulated to take into account the treatment of heroin users by incorporating a realistic functional form that “saturates”, representing the limited availability of treatment. Bifurcation analysis reveals that the model has an intrinsic backward bifurcation whenever the saturation parameter is larger than a fixed threshold. We are particularly interested in studying the model’s global stability. In the absence of backward bifurcations, Lyapunov functions can often be found and used to prove global stability. However, in the presence of backward bifurcations, such Lyapunov functions may not exist or may be difficult to construct. We make use of the geometric approach to global stability to derive a condition that ensures that the system is globally asymptotically stable. Numerical simulations are also presented to give a more complete representation of the model dynamics. Sensitivity analysis performed by Latin hypercube sampling (LHS) suggests that the effective contact rate in the population, the relapse rate of heroin users undergoing treatment, and the extent of saturation of heroin users are mechanisms fuelling heroin epidemic proliferation.
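
    As a hedged illustration of this kind of model (the compartments, equations, and parameter values below are assumptions for illustration, not the paper's exact system), a susceptible/user/treatment ODE with a saturated treatment term b*U/(1 + omega*U) can be integrated as follows:

      import numpy as np
      from scipy.integrate import solve_ivp

      # Illustrative parameters (assumed, not taken from the paper)
      Lam, mu, beta, delta = 0.02, 0.02, 0.5, 0.05   # recruitment, natural death, contact rate, extra death
      b, omega, rho = 0.4, 5.0, 0.15                 # treatment uptake, saturation strength, relapse rate

      def heroin_model(t, y):
          S, U, T = y                                # susceptibles, users, users in treatment
          N = S + U + T
          treat = b * U / (1.0 + omega * U)          # saturated treatment function
          dS = Lam - beta * S * U / N - mu * S
          dU = beta * S * U / N - treat + rho * T - (mu + delta) * U
          dT = treat - rho * T - mu * T
          return [dS, dU, dT]

      sol = solve_ivp(heroin_model, (0.0, 400.0), [0.99, 0.01, 0.0], rtol=1e-8)
      print("long-run (S, U, T):", np.round(sol.y[:, -1], 4))   # hints at endemic vs. drug-free equilibrium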

  17. Container Throughput Forecasting Using Dynamic Factor Analysis and ARIMAX Model

    Directory of Open Access Journals (Sweden)

    Marko Intihar

    2017-11-01

    Full Text Available The paper examines the impact of integrating macroeconomic indicators on the accuracy of a container throughput time series forecasting model. For this purpose, a Dynamic factor analysis and an AutoRegressive Integrated Moving-Average model with eXogenous inputs (ARIMAX) are used. Both methodologies are integrated into a novel four-stage heuristic procedure. Firstly, dynamic factors are extracted from external macroeconomic indicators influencing the observed throughput. Secondly, the family of ARIMAX models of different orders is generated based on the derived factors. In the third stage, diagnostic and goodness-of-fit testing is applied, which includes statistical criteria such as fit performance, information criteria, and parsimony. Finally, the best model is heuristically selected and tested on the real data of the Port of Koper. The results show that by incorporating macroeconomic indicators into the forecasting model, more accurate future throughput forecasts can be achieved. The model is also used to produce forecasts for the next four years, indicating more oscillatory behaviour in 2018-2020. Hence, care must be taken concerning any major investment decisions initiated by management. It is believed that the proposed model might be a useful reinforcement of the existing forecasting module in the observed port.
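
    A minimal sketch of the ARIMAX step on synthetic data, using the SARIMAX class from statsmodels (which accepts exogenous regressors); the series, the single stand-in "factor", and the model order (1,1,1) are assumptions for illustration, not the Port of Koper data or the heuristically selected model:

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(0)

      # Synthetic monthly container throughput plus one "dynamic factor" standing in
      # for the macroeconomic factors extracted in the first stage of the procedure.
      n = 120
      factor = np.cumsum(rng.normal(0, 1, n))                 # assumed macro factor
      throughput = 500 + 5 * np.arange(n) + 30 * factor + rng.normal(0, 20, n)

      endog = pd.Series(throughput)
      exog = pd.DataFrame({"factor": factor})

      # One candidate from the ARIMAX family; the paper selects the order heuristically
      # via information criteria and goodness-of-fit tests.
      model = SARIMAX(endog, exog=exog, order=(1, 1, 1))
      fit = model.fit(disp=False)
      print(fit.aic, fit.bic)                                 # criteria used for model selection

      # Forecast the next 12 periods, assuming future factor values are available
      future_factor = pd.DataFrame({"factor": np.full(12, factor[-1])})
      print(fit.forecast(steps=12, exog=future_factor))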

  18. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    Science.gov (United States)

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples, which have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards of known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  19. Throughput Analysis Model for IEEE 802.11e EDCA with Multiple Access Categories

    Directory of Open Access Journals (Sweden)

    Y. Lee

    2013-08-01

    Full Text Available The IEEE 802.11e standard has been specified to support differentiated quality of service (QoS), one of the critical issues in conventional IEEE 802.11 wireless local area networks (WLANs). Enhanced Distributed Channel Access (EDCA) is the fundamental and mandatory contention-based channel access method of IEEE 802.11e, and delivers traffic based on differentiated Access Categories (ACs). A general three-dimensional Markov chain model of IEEE 802.11e EDCA for performance analysis is proposed in this paper. The analytical model considers multiple stations with an arbitrary number of different ACs. It also differentiates the contention window (CW) sizes and the arbitration interframe spaces (AIFSs), and considers the virtual collision mechanism. Based on the model, the saturation throughput of EDCA is derived, and the accuracy of the proposed model is validated via simulations.
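
    A hedged sketch of the underlying saturation-throughput calculation, reduced to the classic single-access-category (Bianchi-style) DCF model rather than the multi-AC EDCA chain of the record; the contention parameters and slot/frame durations are assumed round numbers, not values from the paper:

      from scipy.optimize import brentq

      # Classic single-AC saturation model; all constants below are assumed.
      n, W, m = 10, 32, 5                  # stations, minimum contention window, max backoff stage

      def tau_of_p(p):
          # stationary transmission probability given the conditional collision probability p
          return (2 * (1 - 2 * p)) / ((1 - 2 * p) * (W + 1) + p * W * (1 - (2 * p) ** m))

      def fixed_point(p):
          # consistency condition p = 1 - (1 - tau)^(n-1)
          return p - (1 - (1 - tau_of_p(p)) ** (n - 1))

      # bracket kept below the removable singularity of tau_of_p at p = 0.5;
      # for these parameters the root lies near p ~ 0.3
      p = brentq(fixed_point, 1e-6, 0.45)
      tau = tau_of_p(p)

      P_tr = 1 - (1 - tau) ** n                    # prob. of at least one transmission in a slot
      P_s = n * tau * (1 - tau) ** (n - 1) / P_tr  # prob. that a transmission succeeds

      # assumed slot/frame durations (microseconds) and payload size (bits)
      sigma, T_s, T_c, payload = 9.0, 300.0, 250.0, 8 * 1500
      S = (P_s * P_tr * payload) / ((1 - P_tr) * sigma + P_tr * P_s * T_s + P_tr * (1 - P_s) * T_c)
      print("saturation throughput ~ %.1f Mbit/s" % S)   # bits per microsecond = Mbit/s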

  20. High throughput inclusion body sizing: Nano particle tracking analysis.

    Science.gov (United States)

    Reichelt, Wieland N; Kaineder, Andreas; Brillmann, Markus; Neutsch, Lukas; Taschauer, Alexander; Lohninger, Hans; Herwig, Christoph

    2017-06-01

    The expression of pharmaceutically relevant proteins in Escherichia coli frequently triggers inclusion body (IB) formation caused by protein aggregation. In the scientific literature, substantial effort has been devoted to the quantification of IB size. However, particle-based methods used up to this point to analyze the physical properties of representative numbers of IBs lack sensitivity and/or orthogonal verification. Using high pressure freezing and automated freeze substitution for transmission electron microscopy (TEM), the cytosolic inclusion body structure was preserved within the cells. TEM imaging in combination with manual grey scale image segmentation allowed the quantification of relative areas covered by the inclusion body within the cytosol. As a high-throughput method, nanoparticle tracking analysis (NTA) enables one to derive the diameter of inclusion bodies in cell homogenate from a measurement of their Brownian motion. The NTA analysis of fixed (glutaraldehyde) and non-fixed IBs suggests that high pressure homogenization annihilates the native physiological shape of IBs. Nevertheless, the ratio of particle counts of non-fixed and fixed samples could potentially serve as a factor for particle stickiness. In this contribution, we establish image segmentation of TEM pictures as an orthogonal method to size biological particles in the cytosol of cells. More importantly, NTA has been established as a particle-based, fast and high-throughput method (1000-3000 particles), thus constituting a much more accurate and representative analysis than currently available methods. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
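
    As background on the NTA principle (an illustrative calculation, not the authors' pipeline): the diffusion coefficient extracted from tracked Brownian motion is converted to a hydrodynamic diameter via the Stokes-Einstein relation; the temperature, viscosity, and MSD slopes below are assumed values.

      import numpy as np

      # Stokes-Einstein: D = k_B * T / (3 * pi * eta * d)  =>  d = k_B * T / (3 * pi * eta * D)
      k_B = 1.380649e-23        # J/K
      T = 298.15                # K (assumed measurement temperature)
      eta = 0.89e-3             # Pa*s, viscosity of water near 25 C (assumed)

      def hydrodynamic_diameter(msd_slope):
          """msd_slope: mean squared displacement per unit lag time (m^2/s) for 2-D tracks.
          For 2-D Brownian motion MSD = 4*D*tau, so D = slope / 4."""
          D = msd_slope / 4.0
          return k_B * T / (3.0 * np.pi * eta * D)

      # Hypothetical per-particle MSD slopes extracted from NTA tracks
      msd_slopes = np.array([4.0e-12, 8.0e-12, 2.0e-12])     # m^2/s
      diam_nm = hydrodynamic_diameter(msd_slopes) * 1e9
      print("estimated diameters (nm):", np.round(diam_nm, 1))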

  1. A novel approach to deterministic performance analysis of guidance loops with saturation

    NARCIS (Netherlands)

    Weiss, M.; Bucco, D.

    2011-01-01

    Since saturation often plays an important role in limiting the performance of guidance loops, performance analysis of guidance loops with saturation has been a popular subject of investigation. Most work so far has concentrated on the effect of saturation on stochastic performance of guidance loops,

  2. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  3. Interactive Visual Analysis of High Throughput Text Streams

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL]; Potok, Thomas E [ORNL]; Patton, Robert M [ORNL]; Goodall, John R [ORNL]; Maness, Christopher S [ORNL]; Senter, James K [ORNL]

    2012-01-01

    The scale, velocity, and dynamic nature of large scale social media systems like Twitter demand a new set of visual analytics techniques that support near real-time situational awareness. Social media systems are credited with escalating social protest during recent large scale riots. Virtual communities form rapidly in these online systems, and they occasionally foster violence and unrest which is conveyed in the users' language. Techniques for analyzing broad trends over these networks or reconstructing conversations within small groups have been demonstrated in recent years, but state-of-the-art tools are inadequate at supporting near real-time analysis of these high-throughput streams of unstructured information. In this paper, we present an adaptive system to discover and interactively explore these virtual networks, as well as detect sentiment, highlight change, and discover spatio-temporal patterns.

  4. Pattern Analysis of Oxygen Saturation Variability in Healthy Individuals: Entropy of Pulse Oximetry Signals Carries Information about Mean Oxygen Saturation

    Directory of Open Access Journals (Sweden)

    Amar S. Bhogal

    2017-08-01

    Full Text Available Pulse oximetry is routinely used for monitoring patients' oxygen saturation levels with little regard to the variability of this physiological variable. There are few published studies on oxygen saturation variability (OSV), with none describing the variability and its pattern in a healthy adult population. The aim of this study was to characterize the pattern of OSV using several parameters: the regularity (sample entropy analysis), the self-similarity [detrended fluctuation analysis (DFA)], and the complexity [multiscale entropy (MSE) analysis]; and, secondly, to determine whether any changes occur with age. The study population consisted of 36 individuals. The “young” population consisted of 20 individuals [mean (±1 SD) age = 21.0 (±1.36) years] and the “old” population consisted of 16 individuals [mean (±1 SD) age = 50.0 (±10.4) years]. Through DFA analysis, OSV was shown to exhibit fractal-like patterns. The sample entropy revealed the variability to be more regular than heart rate variability and respiratory rate variability. There was also a significant inverse correlation between mean oxygen saturation and sample entropy in healthy individuals. Additionally, the MSE analysis described a complex fluctuation pattern, which was reduced with age (p < 0.05). These findings suggest partial “uncoupling” of the cardio-respiratory control system that occurs with aging. Overall, this study has characterized OSV using pre-existing tools. We have shown that entropy analysis of pulse oximetry signals carries information about body oxygenation. This may have the potential to be used in clinical practice to detect differences in diseased patient subsets.
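
    A minimal sample entropy implementation applied to a synthetic SpO2-like trace (the embedding dimension m = 2 and tolerance r = 0.2·SD are common defaults; the data are simulated, not the study's recordings):

      import numpy as np

      def sample_entropy(x, m=2, r_factor=0.2):
          """Sample entropy SampEn(m, r) of a 1-D series (naive O(N^2) implementation)."""
          x = np.asarray(x, dtype=float)
          r = r_factor * np.std(x)
          N = len(x)

          def count_matches(length):
              # same number of templates (N - m) for lengths m and m+1, the standard convention
              templates = np.array([x[i:i + length] for i in range(N - m)])
              count = 0
              for i in range(len(templates)):
                  dist = np.max(np.abs(templates - templates[i]), axis=1)   # Chebyshev distance
                  count += np.sum(dist <= r) - 1                            # exclude the self-match
              return count

          B = count_matches(m)
          A = count_matches(m + 1)
          return -np.log(A / B)

      # Synthetic SpO2-like trace: ~97% baseline with small correlated fluctuations (assumed)
      rng = np.random.default_rng(1)
      spo2 = 97 + np.cumsum(rng.normal(0, 0.05, 600))
      print("SampEn(m=2, r=0.2*SD):", round(sample_entropy(spo2), 3))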

  5. An analysis of the saturation of a high gain FEL

    Energy Technology Data Exchange (ETDEWEB)

    Gluckstern, R.L.; Okamoto, Hiromi (Maryland Univ., College Park, MD (United States). Dept. of Physics); Krinsky, S. (Brookhaven National Lab., Upton, NY (United States))

    1992-12-01

    We study the saturated state of an untapered free electron laser in the Compton regime, arising after exponential amplification of an initial low level of radiation by an initially monoenergetic, unbunched electron beam. The saturated state of the FEL is described by oscillations about an equilibrium state. Using the two invariants of the motion, and certain assumptions motivated by computer simulations, we provide approximate analytic descriptions of the radiation field and electron distribution in the saturation regime. We first consider a one-dimensional approximation, and later extend our approach to treat an electron beam of finite radial extent. Of note is a result on the radiated power in the case of an electron beam with small radius.

  6. Functional approach to high-throughput plant growth analysis

    Science.gov (United States)

    2013-01-01

    Method Taking advantage of the current rapid development in imaging systems and computer vision algorithms, we present HPGA, a high-throughput phenotyping platform for plant growth modeling and functional analysis, which produces a better understanding of energy distribution with regard to the balance between growth and defense. HPGA has two components, PAE (Plant Area Estimation) and GMA (Growth Modeling and Analysis). In PAE, by taking the complex leaf overlap problem into consideration, the area of every plant is measured from top-view images in four steps. Given the abundant measurements obtained with PAE, in the second module GMA, a nonlinear growth model is applied to generate growth curves, followed by functional data analysis. Results Experimental results on the model plant Arabidopsis thaliana show that, compared to an existing approach, HPGA reduces the error rate of measuring plant area by half. The application of HPGA to the cfq mutant plants under fluctuating light reveals a correlation between low photosynthetic rates and small plant area (compared to wild type), which raises the hypothesis that knocking out cfq changes the sensitivity of the energy distribution under fluctuating light conditions to repress leaf growth. Availability HPGA is available at http://www.msu.edu/~jinchen/HPGA. PMID:24565437

  7. Throughput Analysis of Fading Sensor Networks with Regular and Random Topologies

    Directory of Open Access Journals (Sweden)

    Liu Xiaowen

    2005-01-01

    Full Text Available We present closed-form expressions of the average link throughput for sensor networks with a slotted ALOHA MAC protocol in Rayleigh fading channels. We compare networks with three regular topologies in terms of throughput, transmit efficiency, and transport capacity. In particular, for square lattice networks, we present a sensitivity analysis of the maximum throughput and the optimum transmit probability with respect to the signal-to-interference ratio threshold. For random networks with nodes distributed according to a two-dimensional Poisson point process, the average throughput is analytically characterized and numerically evaluated. It turns out that although regular networks have an only slightly higher average link throughput than random networks for the same link distance, regular topologies have a significant benefit when the end-to-end throughput in multihop connections is considered.
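
    As a hedged illustration of this kind of analysis (a Monte Carlo stand-in, not the paper's closed-form expressions): the sketch below estimates the centre-link throughput of slotted ALOHA on a square lattice with Rayleigh fading in the interference-limited regime for a few transmit probabilities; the lattice size, path-loss exponent, and SIR threshold are assumptions.

      import numpy as np

      rng = np.random.default_rng(2)

      # Square lattice of nodes; the centre node transmits to its right-hand neighbour (unit distance).
      L = 11                                            # lattice is L x L (assumed size)
      xs, ys = np.meshgrid(np.arange(L), np.arange(L))
      nodes = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
      tx_i = len(nodes) // 2                            # centre node (row-major index)
      rx_i = tx_i + 1                                   # its right-hand neighbour

      alpha, theta, trials = 4.0, 3.0, 4000             # path-loss exponent, SIR threshold, MC slots
      d = np.linalg.norm(nodes - nodes[rx_i], axis=1)   # distances from every node to the receiver
      d[rx_i] = np.inf                                  # the receiver itself never interferes

      def link_throughput(p):
          """Successful transmissions per slot on the centre link (interference-limited, noise ignored)."""
          success = 0
          for _ in range(trials):
              if rng.random() > p:                      # desired transmitter silent this slot
                  continue
              active = rng.random(len(nodes)) < p       # other nodes transmit independently w.p. p
              active[tx_i] = False                      # counted as the desired signal, not interference
              active[rx_i] = False                      # half-duplex receiver does not transmit
              gains = rng.exponential(1.0, size=len(nodes))   # Rayleigh fading -> exponential power gains
              signal = rng.exponential(1.0)                   # desired link has unit distance
              interference = np.sum(active * gains * d ** (-alpha))
              success += signal >= theta * interference
          return success / trials

      for p in (0.05, 0.1, 0.2, 0.4):
          print("p = %.2f  ->  link throughput ~ %.3f" % (p, link_throughput(p)))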

  8. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing

  9. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  10. Performance Analysis of Throughput at Bahir Dar University LAN ...

    African Journals Online (AJOL)

    Computer scientists and network users have discovered that standard TCP does not perform well in high bandwidth delay environments. As a model, the Local Area Network of Bahir Dar University Engineering Faculty was tested and reported. In this paper, we explore the challenges of achieving high throughput over real ...

  11. Fully Bayesian Analysis of High-throughput Targeted Metabolomics Assays

    Science.gov (United States)

    High-throughput metabolomic assays that allow simultaneous targeted screening of hundreds of metabolites have recently become available in kit form. Such assays provide a window into understanding changes to biochemical pathways due to chemical exposure or disease, and are usefu...

  12. Trade-Off Analysis in High-Throughput Materials Exploration.

    Science.gov (United States)

    Volety, Kalpana K; Huyberechts, Guido P J

    2017-03-13

    This Research Article presents a strategy to identify the optimum compositions in metal alloys with certain desired properties in a high-throughput screening environment, using a multiobjective optimization approach. In addition to the identification of the optimum compositions in a primary screening, the strategy also allows pointing to regions in the compositional space where further exploration in a secondary screening could be carried out. The strategy for the primary screening is a combination of two multiobjective optimization approaches namely Pareto optimality and desirability functions. The experimental data used in the present study have been collected from over 200 different compositions belonging to four different alloy systems. The metal alloys (comprising Fe, Ti, Al, Nb, Hf, Zr) are synthesized and screened using high-throughput technologies. The advantages of such a kind of approach compared to the limitations of the traditional and comparatively simpler approaches like ranking and calculating figures of merit are discussed.
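
    A toy illustration of the two multiobjective ingredients named above, Pareto filtering and desirability aggregation, on made-up two-property screening data (the property names, ranges, and desirability bounds are assumptions, not the alloy data of the study):

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical screening results: two properties to maximise for 50 candidate compositions
      props = rng.uniform(0.0, 1.0, size=(50, 2))          # e.g. scaled hardness and corrosion resistance

      def pareto_mask(values):
          """True for points not dominated by any other point (larger is better in every column)."""
          n = len(values)
          mask = np.ones(n, dtype=bool)
          for i in range(n):
              dominated = np.all(values >= values[i], axis=1) & np.any(values > values[i], axis=1)
              if np.any(dominated):
                  mask[i] = False
          return mask

      def desirability(values, lo, hi):
          """Simple one-sided Derringer-type desirability: 0 below lo, 1 above hi, linear in between."""
          return np.clip((values - lo) / (hi - lo), 0.0, 1.0)

      d1 = desirability(props[:, 0], 0.2, 0.9)
      d2 = desirability(props[:, 1], 0.3, 0.8)
      overall = np.sqrt(d1 * d2)                            # geometric mean of individual desirabilities

      front = np.where(pareto_mask(props))[0]
      print("Pareto-optimal candidates:", front)
      print("best by overall desirability:", int(np.argmax(overall)), "score", round(float(overall.max()), 3))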

  13. Computational analysis of high-throughput flow cytometry data.

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2012-08-01

    Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells, and appropriate analytical tools are readily available to handle this traditional role of the technology. Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires both advances in hardware and analytical tools. The historical perspective of flow cytometry operation, how the field has changed, and what the key changes have been are discussed. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible.

  14. Computational analysis of high-throughput flow cytometry data

    Science.gov (United States)

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2015-01-01

    Introduction Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells, and appropriate analytical tools are readily available to handle this traditional role of the technology. Areas covered Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires both advances in hardware and analytical tools. The historical perspective of flow cytometry operation, how the field has changed, and what the key changes have been are discussed. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. Expert opinion There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible. PMID:22708834

  15. Throughput Analysis for a High-Performance FPGA-Accelerated Real-Time Search Application

    Directory of Open Access Journals (Sweden)

    Wim Vanderbauwhede

    2012-01-01

    Full Text Available We propose an FPGA design for the relevancy computation part of a high-throughput real-time search application. The application matches terms in a stream of documents against a static profile, held in off-chip memory. We present a mathematical analysis of the throughput of the application and apply it to the problem of scaling the Bloom filter used to discard nonmatches.
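
    As a hedged sketch of the scaling question (standard Bloom filter formulas, with assumed profile size and match rate rather than the parameters in the paper): the false-positive rate determines how much of the stream is forwarded past the filter to the exact relevancy computation, and hence the sustainable throughput.

      import math

      def bloom_fp_rate(n_items, m_bits, k_hashes):
          """Classical approximation of the Bloom filter false-positive probability."""
          return (1.0 - math.exp(-k_hashes * n_items / m_bits)) ** k_hashes

      # Assumed profile size and stream characteristics (illustrative only)
      n = 100_000                      # profile terms stored in the filter
      match_rate = 0.02                # fraction of streamed terms that truly match the profile

      for m_bits in (2**20, 2**22, 2**24):
          k = max(1, round(m_bits / n * math.log(2)))       # near-optimal number of hashes ~ (m/n) ln 2
          fp = bloom_fp_rate(n, m_bits, k)
          # fraction of terms forwarded to the (slower) exact relevancy computation
          forwarded = match_rate + (1 - match_rate) * fp
          print(f"m = {m_bits:>9} bits, k = {k:3d}: FP ~ {fp:.2e}, forwarded fraction ~ {forwarded:.4f}")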

  16. An image analysis toolbox for high-throughput C. elegans assays.

    Science.gov (United States)

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H; Riklin-Raviv, Tammy; Conery, Annie L; O'Rourke, Eyleen J; Sokolnicki, Katherine L; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M; Carpenter, Anne E

    2012-04-22

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available through the open-source CellProfiler project and enables objective scoring of whole-worm high-throughput image-based assays of C. elegans for the study of diverse biological pathways that are relevant to human disease.

  17. Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis

    Directory of Open Access Journals (Sweden)

    Na Wen

    2016-07-01

    Full Text Available This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on key performances of throughput, multifunctionality, and absolute quantification.

  18. Urban Saturated Power Load Analysis Based on a Novel Combined Forecasting Model

    Directory of Open Access Journals (Sweden)

    Huiru Zhao

    2015-03-01

    Full Text Available Analysis of urban saturated power loads is helpful to coordinate urban power grid construction and economic social development. There are two different kinds of forecasting models: the logistic curve model focuses on the growth law of the data itself, while the multi-dimensional forecasting model considers several influencing factors as the input variables. To improve forecasting performance, a novel combined forecasting model for saturated power load analysis was proposed in this paper, which combined the above two models. Meanwhile, the weights of these two models in the combined forecasting model were optimized by employing a fruit fly optimization algorithm. Using Hubei Province as the example, the effectiveness of the proposed combined forecasting model was verified, demonstrating a higher forecasting accuracy. The analysis result shows that the power load of Hubei Province will reach saturation in 2039, and the annual maximum power load will reach about 78,630 MW. The results obtained from this proposed hybrid urban saturated power load analysis model can serve as a reference for sustainable development for urban power grids, regional economies, and society at large.
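
    A minimal sketch of the two ingredients on synthetic data: a logistic (saturation) curve fitted with scipy and a simple multi-factor regression, combined with a weight chosen by a generic optimiser standing in for the fruit fly optimization algorithm; all series and parameter values are invented for illustration, not the Hubei Province data.

      import numpy as np
      from scipy.optimize import curve_fit, minimize

      def logistic(t, K, a, b):
          """Saturating growth curve: the load approaches the saturation level K."""
          return K / (1.0 + a * np.exp(-b * t))

      rng = np.random.default_rng(4)
      years = np.arange(2000, 2018)
      t = (years - years[0]).astype(float)

      # Synthetic annual peak load (MW) following a noisy logistic trend, plus two
      # made-up drivers (GDP and population indices) for the multi-factor model.
      load = logistic(t, 80000.0, 15.0, 0.25) * (1 + rng.normal(0, 0.02, len(t)))
      gdp = 1.0 + 0.06 * t + rng.normal(0, 0.02, len(t))
      pop = 1.0 + 0.01 * t + rng.normal(0, 0.005, len(t))

      # Model 1: logistic curve (growth law of the load series itself)
      p_log, _ = curve_fit(logistic, t, load, p0=(1.2 * load.max(), 10.0, 0.3))
      f_log = logistic(t, *p_log)

      # Model 2: multi-factor linear regression on the drivers
      X = np.column_stack([np.ones_like(t), gdp, pop])
      coef, *_ = np.linalg.lstsq(X, load, rcond=None)
      f_reg = X @ coef

      # Combined model: weight w on the logistic forecast, (1 - w) on the regression,
      # with w chosen by a generic optimiser (stand-in for the fruit fly algorithm).
      def sse(w):
          w0 = np.clip(w[0], 0.0, 1.0)
          return np.sum((w0 * f_log + (1 - w0) * f_reg - load) ** 2)

      w = float(np.clip(minimize(sse, x0=[0.5], method="Nelder-Mead").x[0], 0.0, 1.0))
      print("weight on logistic model:", round(w, 3))
      print("estimated saturation level (MW):", round(float(p_log[0])))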

  19. HTPheno: an image analysis pipeline for high-throughput plant phenotyping.

    Science.gov (United States)

    Hartmann, Anja; Czauderna, Tobias; Hoffmann, Roberto; Stein, Nils; Schreiber, Falk

    2011-05-12

    In the last few years high-throughput analysis methods have become state-of-the-art in the life sciences. One of the latest developments is automated greenhouse systems for high-throughput plant phenotyping. Such systems allow the non-destructive screening of plants over a period of time by means of image acquisition techniques. During such screening different images of each plant are recorded and must be analysed by applying sophisticated image analysis algorithms. This paper presents an image analysis pipeline (HTPheno) for high-throughput plant phenotyping. HTPheno is implemented as a plugin for ImageJ, an open source image processing software. It provides the possibility to analyse colour images of plants which are taken in two different views (top view and side view) during a screening. Within the analysis different phenotypical parameters for each plant such as height, width and projected shoot area of the plants are calculated for the duration of the screening. HTPheno is applied to analyse two barley cultivars. HTPheno, an open source image analysis pipeline, supplies a flexible and adaptable ImageJ plugin which can be used for automated image analysis in high-throughput plant phenotyping and therefore to derive new biological insights, such as determination of fitness.

  20. HTPheno: An image analysis pipeline for high-throughput plant phenotyping

    Directory of Open Access Journals (Sweden)

    Stein Nils

    2011-05-01

    Full Text Available Abstract Background In the last few years high-throughput analysis methods have become state-of-the-art in the life sciences. One of the latest developments is automated greenhouse systems for high-throughput plant phenotyping. Such systems allow the non-destructive screening of plants over a period of time by means of image acquisition techniques. During such screening different images of each plant are recorded and must be analysed by applying sophisticated image analysis algorithms. Results This paper presents an image analysis pipeline (HTPheno) for high-throughput plant phenotyping. HTPheno is implemented as a plugin for ImageJ, an open source image processing software. It provides the possibility to analyse colour images of plants which are taken in two different views (top view and side view) during a screening. Within the analysis different phenotypical parameters for each plant such as height, width and projected shoot area of the plants are calculated for the duration of the screening. HTPheno is applied to analyse two barley cultivars. Conclusions HTPheno, an open source image analysis pipeline, supplies a flexible and adaptable ImageJ plugin which can be used for automated image analysis in high-throughput plant phenotyping and therefore to derive new biological insights, such as determination of fitness.

  1. Meta-analysis of prospective cohort studies evaluating the association of saturated fat with cardiovascular disease.

    Science.gov (United States)

    Siri-Tarino, Patty W; Sun, Qi; Hu, Frank B; Krauss, Ronald M

    2010-03-01

    A reduction in dietary saturated fat has generally been thought to improve cardiovascular health. The objective of this meta-analysis was to summarize the evidence related to the association of dietary saturated fat with risk of coronary heart disease (CHD), stroke, and cardiovascular disease (CVD; CHD inclusive of stroke) in prospective epidemiologic studies. Twenty-one studies identified by searching MEDLINE and EMBASE databases and secondary referencing qualified for inclusion in this study. A random-effects model was used to derive composite relative risk estimates for CHD, stroke, and CVD. During 5-23 y of follow-up of 347,747 subjects, 11,006 developed CHD or stroke. Intake of saturated fat was not associated with an increased risk of CHD, stroke, or CVD. The pooled relative risk estimates that compared extreme quantiles of saturated fat intake were 1.07 (95% CI: 0.96, 1.19; P = 0.22) for CHD, 0.81 (95% CI: 0.62, 1.05; P = 0.11) for stroke, and 1.00 (95% CI: 0.89, 1.11; P = 0.95) for CVD. Consideration of age, sex, and study quality did not change the results. A meta-analysis of prospective epidemiologic studies showed that there is no significant evidence for concluding that dietary saturated fat is associated with an increased risk of CHD or CVD. More data are needed to elucidate whether CVD risks are likely to be influenced by the specific nutrients used to replace saturated fat.
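
    As an illustration of the standard calculation behind such pooled estimates (DerSimonian-Laird random-effects pooling of log relative risks), with invented study-level values rather than the 21 studies analysed here:

      import numpy as np

      # Hypothetical study-level relative risks and 95% CIs (NOT the studies in this record)
      rr = np.array([1.10, 0.95, 1.25, 0.88, 1.05])
      ci_lo = np.array([0.85, 0.70, 0.95, 0.60, 0.80])
      ci_hi = np.array([1.42, 1.29, 1.64, 1.29, 1.38])

      y = np.log(rr)                                     # effect sizes on the log scale
      se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)  # SE recovered from the CI width
      v = se ** 2

      # DerSimonian-Laird estimate of the between-study variance tau^2
      w_fixed = 1.0 / v
      y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
      Q = np.sum(w_fixed * (y - y_fixed) ** 2)
      k = len(y)
      tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)))

      # Random-effects pooled estimate and 95% CI
      w = 1.0 / (v + tau2)
      pooled = np.sum(w * y) / np.sum(w)
      se_pooled = np.sqrt(1.0 / np.sum(w))
      lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
      print("pooled RR = %.2f (95%% CI %.2f-%.2f)" % tuple(np.exp([pooled, lo, hi])))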

  2. High-throughput analysis of the impact of antibiotics on the human intestinal microbiota composition

    NARCIS (Netherlands)

    Ladirat, S.E.; Schols, H.A.; Nauta, A.; Schoterman, M.H.C.; Keijser, B.J.F.; Montijn, R.C.; Gruppen, H.; Schuren, F.H.J.

    2013-01-01

    Antibiotic treatments can lead to a disruption of the human microbiota. In this in-vitro study, the impact of antibiotics on adult intestinal microbiota was monitored in a new high-throughput approach: a fermentation screening-platform was coupled with a phylogenetic microarray analysis

  3. ESSENTIALS: Software for Rapid Analysis of High Throughput Transposon Insertion Sequencing Data.

    NARCIS (Netherlands)

    Zomer, A.L.; Burghout, P.J.; Bootsma, H.J.; Hermans, P.W.M.; Hijum, S.A.F.T. van

    2012-01-01

    High-throughput analysis of genome-wide random transposon mutant libraries is a powerful tool for (conditional) essential gene discovery. Recently, several next-generation sequencing approaches, e.g. Tn-seq/INseq, HITS and TraDIS, have been developed that accurately map the site of transposon

  4. Improvement of water saturation shift referencing by sequence and analysis optimization to enhance chemical exchange saturation transfer imaging.

    Science.gov (United States)

    Müller-Lutz, Anja; Matuschke, Felix; Schleich, Christoph; Wickrath, Frithjof; Boos, Johannes; Schmitt, Benjamin; Wittsack, Hans-Jörg

    2016-07-01

    To optimize B0-field inhomogeneity correction for chemical exchange saturation transfer (CEST) imaging by investigating different water saturation shift referencing (WASSR) Z-spectrum shapes and different frequency correction techniques. WASSR Z-spectra were simulated for different B1-fields and pulse durations (PD). Two parameter settings were used for further simulations and experiments (WASSR1: B1=0.1 μT, PD=50ms; WASSR2: B1=0.3 μT, PD=40ms). Four frequency correction techniques were investigated: 1) MinW: Minimum of the spline-interpolated WASSR-spectrum; 2) MSCF: maximum symmetry center frequency algorithm; 3) PMSCF: further development of MSCF algorithm; 4) BFit: fit with Bloch equations. Performance of frequency correction was assessed with Monte-Carlo simulations and in-vivo MR examinations in the brain and intervertebral disks. Different shapes of WASSR-Z-spectra were obtained by changing B1 and PD including spectra with one (1-Peak) or two (2-Peak) minima. WASSR1 resulted in 1-Peak WASSR-spectrum, whereas WASSR2 resulted in 2-Peak WASSR-spectrum. Both Monte-Carlo simulations and in-vivo MR examinations revealed highest accuracy of field-inhomogeneity correction with WASSR1 combined with PMSCF or BFit. Using a WASSR sequence, which results in a Z-spectrum with a single absorption peak, in combination with advanced postprocessing algorithms enables improved B0-field inhomogeneity correction for CEST imaging. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.

  6. Throughput and Delay Analysis of HARQ with Code Combining over Double Rayleigh Fading Channels

    KAUST Repository

    Chelli, Ali

    2018-01-15

    This paper proposes the use of hybrid automatic repeat request (HARQ) with code combining (HARQ-CC) to offer reliable communications over double Rayleigh channels. The double Rayleigh fading channel is of particular interest to vehicle-to-vehicle communication systems as well as amplify-and-forward relaying and keyhole channels. This work studies the performance of HARQ-CC over double Rayleigh channels from an information theoretic perspective. Analytical approximations are derived for the ε-outage capacity, the average number of transmissions, and the throughput of HARQ-CC. Moreover, we evaluate the delay experienced by Poisson arriving packets for HARQ-CC. We provide analytical expressions for the average waiting time, the packets sojourn time, the average consumed power, and the energy efficiency. In our investigation, we take into account the impact of imperfect feedback on different performance metrics. Additionally, we explore the tradeoff between energy efficiency and the throughput. The proposed scheme is shown to maintain the outage probability below a specified threshold ε which ensures the link reliability. Meanwhile, HARQ-CC adapts implicitly the transmission rate to the channel conditions such that the throughput is maximized. Our results demonstrate that HARQ-CC allows improving the achievable communication rate compared to fixed time diversity schemes. To maximize the throughput of HARQ-CC, the rate per HARQ round should be less than the rate required to meet the outage constraint. Our investigation of the performance of HARQ-CC over Rayleigh and double Rayleigh channels shows that double Rayleigh channels have a higher severity of fading and result in a larger degradation of the throughput. Our analysis reveals that HARQ with incremental redundancy (HARQ-IR) achieves a larger throughput compared to HARQ-CC, while HARQ-CC is simpler to implement, has a lower decoding
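
    A hedged Monte Carlo sketch of the HARQ-CC mechanism over a double Rayleigh channel (not the paper's analytical approximations): per-round SNRs add up under Chase combining, an outage is declared if decoding still fails after the maximum number of rounds, and throughput follows from a renewal-reward argument; the rate, average SNR, and round limit are assumptions.

      import numpy as np

      rng = np.random.default_rng(5)

      R = 2.0                   # bits per channel use per HARQ round (assumed)
      snr = 10 ** (10.0 / 10)   # average SNR of 10 dB (assumed)
      M = 4                     # maximum number of HARQ rounds (assumed)
      trials = 50_000

      rounds_used = np.zeros(trials)
      success = np.zeros(trials, dtype=bool)

      for i in range(trials):
          acc = 0.0
          for m in range(1, M + 1):
              # double Rayleigh fading: squared channel magnitude = product of two exponentials
              g = rng.exponential(1.0) * rng.exponential(1.0)
              acc += snr * g                      # Chase combining adds up the received SNRs
              if np.log2(1.0 + acc) >= R:         # decodable after combining m rounds
                  rounds_used[i], success[i] = m, True
                  break
          else:
              rounds_used[i] = M                  # all rounds exhausted -> outage

      throughput = R * success.mean() / rounds_used.mean()   # renewal-reward estimate, bit/s/Hz
      print("outage prob. ~ %.4f, avg rounds ~ %.2f, throughput ~ %.3f bit/s/Hz"
            % (1 - success.mean(), rounds_used.mean(), throughput))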

  7. Unique problems associated with seismic analysis of partially gas-saturated unconsolidated sediments

    Science.gov (United States)

    Lee, M.W.; Collett, T.S.

    2009-01-01

    Gas hydrate stability conditions restrict the occurrence of gas hydrate to unconsolidated and high water-content sediments at shallow depths. Because of these host sediment properties, seismic and well log data acquired for the detection of free gas and associated gas hydrate-bearing sediments often require nonconventional analysis. For example, a conventional method of identifying free gas using the compressional/shear-wave velocity (Vp/Vs) ratio at the logging frequency will not work unless the free-gas saturations are more than about 40%. The P-wave velocity dispersion of partially gas-saturated sediments causes a problem in interpreting well log velocities and seismic data. Using the White, J.E. [1975. Computed seismic speeds and attenuation in rocks with partial gas saturation. Geophysics 40, 224-232] model for partially gas-saturated sediments, the difference between well log and seismic velocities can be reconciled. The inclusion of P-wave velocity dispersion in interpreting well log data is, therefore, essential to identify free gas and to tie surface seismic data to synthetic seismograms.

  8. Dietary saturated fat intake is inversely associated with bone density in humans: analysis of NHANES III.

    Science.gov (United States)

    Corwin, Rebecca L; Hartman, Terryl J; Maczuga, Steven A; Graubard, Barry I

    2006-01-01

    Mounting evidence indicates that the amount and type of fat in the diet can have important effects on bone health. Most of this evidence is derived from animal studies. Of the few human studies that have been conducted, relatively small numbers of subjects and/or primarily female subjects were included. The present study assessed the relation of dietary fat to hip bone mineral density (BMD) in men and women using NHANES III data (n = 14,850). Multivariate models using SAS-callable SUDAAN were used to adjust for the sampling scheme. Models were adjusted for age, sex, weight, height, race, total energy and calcium intakes, smoking, and weight-bearing exercise. Data from women were further adjusted for use of hormone replacement therapy. Including dietary protein, vitamin C, and beta-carotene in the model did not influence the outcome. Analysis of covariance was used to generate mean BMD by quintile of total and saturated fat intake for 4 sex/age groups. Saturated fat intake was negatively associated with BMD at several hip sites. The greatest effects were seen among men: BMD (95% CI) was 0.922 g/cm2 (0.909-0.935) in the highest quintile of saturated fat intake versus 0.963 g/cm2 (0.950-0.976) in the lowest quintile. These data indicate that BMD is negatively associated with saturated fat intake, and that men may be particularly vulnerable to these effects.

  9. Synchrotron radiation measurement of multiphase fluid saturations in porous media: Experimental technique and error analysis

    Science.gov (United States)

    Tuck, David M.; Bierck, Barnes R.; Jaffé, Peter R.

    1998-06-01

    Multiphase flow in porous media is an important research topic. In situ, nondestructive experimental methods for studying multiphase flow are important for improving our understanding and the theory. Rapid changes in fluid saturation, characteristic of immiscible displacement, are difficult to measure accurately using gamma rays due to practical restrictions on source strength. Our objective is to describe a synchrotron radiation technique for rapid, nondestructive saturation measurements of multiple fluids in porous media, and to present a precision and accuracy analysis of the technique. Synchrotron radiation provides a high-intensity, inherently collimated photon beam of tunable energy which can yield accurate measurements of fluid saturation in just one second. Measurements were obtained with precision of ±0.01 or better for tetrachloroethylene (PCE) in a 2.5 cm thick glass-bead porous medium using a counting time of 1 s. The normal distribution was shown to provide acceptable confidence limits for PCE saturation changes. Sources of error include the heat load on the monochromator, periodic movement of the source beam, and errors in the stepping-motor positioning system. Hypodermic needles pushed into the medium to inject PCE changed porosity in a region within approximately ±1 mm of the injection point. Improved mass balance between the known and measured PCE injection volumes was obtained when appropriate corrections were applied to calibration values near the injection point.

  10. High-throughput ion beam analysis at imec

    Science.gov (United States)

    Meersschaut, J.; Vandervorst, W.

    2017-09-01

    We describe the ion beam analysis activities at imec. Rutherford backscattering spectrometry and time of flight-energy (TOF-E) elastic recoil detection analysis are pursued to support the nano-electronics research and development. We outline the experimental set-up and we introduce a new data acquisition software platform. Finally, we illustrate the use of Rutherford backscattering spectrometry to map the thickness of a metallic thin film on a 300 mm Si wafer.

  11. DNA from buccal swabs suitable for high-throughput SNP multiplex analysis.

    Science.gov (United States)

    McMichael, Gai L; Gibson, Catherine S; O'Callaghan, Michael E; Goldwater, Paul N; Dekker, Gustaaf A; Haan, Eric A; MacLennan, Alastair H

    2009-12-01

    We sought a convenient and reliable method for collection of genetic material that is inexpensive and noninvasive and suitable for self-collection and mailing and a compatible, commercial DNA extraction protocol to meet quantitative and qualitative requirements for high-throughput single nucleotide polymorphism (SNP) multiplex analysis on an automated platform. Buccal swabs were collected from 34 individuals as part of a pilot study to test commercially available buccal swabs and DNA extraction kits. DNA was quantified on a spectrofluorometer with Picogreen dsDNA prior to testing the DNA integrity with predesigned SNP multiplex assays. Based on the pilot study results, the Catch-All swabs and Isohelix buccal DNA isolation kit were selected for our high-throughput application and extended to a further 1140 samples as part of a large cohort study. The average DNA yield in the pilot study (n=34) was 1.94 ± 0.54 μg with a 94% genotyping pass rate. For the high-throughput application (n=1140), the average DNA yield was 2.44 ± 1.74 μg with a ≥93% genotyping pass rate. The Catch-All buccal swabs are a convenient and cost-effective alternative to blood sampling. Combined with the Isohelix buccal DNA isolation kit, they provided DNA of sufficient quantity and quality for high-throughput SNP multiplex analysis.

  12. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.

  13. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity improved protocols for PTM enrichment and recently established pipelines...... for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis...

  14. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) encompasses a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce large amounts of data, presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their annotation in Gene Ontology using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers with any level of programming experience to benefit from efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening by using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  15. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  16. High-Throughput Mutational Analysis of a Twister Ribozyme.

    Science.gov (United States)

    Kobori, Shungo; Yokobayashi, Yohei

    2016-08-22

    Recent discoveries of new classes of self-cleaving ribozymes in diverse organisms have triggered renewed interest in the chemistry and biology of ribozymes. Functional analysis and engineering of ribozymes often involve performing biochemical assays on multiple ribozyme mutants. However, because each ribozyme mutant must be individually prepared and assayed, the number and variety of mutants that can be studied are severely limited. All of the single and double mutants of a twister ribozyme (a total of 10 296 mutants) were generated and assayed for their self-cleaving activity by exploiting deep sequencing to count the numbers of cleaved and uncleaved sequences for every mutant. Interestingly, we found that the ribozyme is highly robust against mutations such that 71 % and 30 % of all single and double mutants, respectively, retain detectable activity under the assay conditions. It was also observed that the structural elements that comprise the ribozyme exhibit distinct sensitivity to mutations. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
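
    A minimal sketch of the counting step described above, assuming hypothetical per-mutant read counts: the cleavage fraction is computed from the numbers of cleaved and uncleaved reads, and mutants above an arbitrary activity threshold are flagged. The mutant names, counts, pseudocount, and 10% threshold are illustrative, not taken from the paper.

      # Minimal sketch: per-mutant cleavage fraction from deep-sequencing counts.
      # The counts and the 10% activity threshold are illustrative assumptions.
      counts = {
          "WT":   {"cleaved": 9200, "uncleaved": 800},
          "A12G": {"cleaved": 4100, "uncleaved": 5900},
          "C30U": {"cleaved":   90, "uncleaved": 9910},
      }

      def cleavage_fraction(cleaved, uncleaved, pseudocount=1):
          total = cleaved + uncleaved + 2 * pseudocount
          return (cleaved + pseudocount) / total

      for mutant, c in counts.items():
          frac = cleavage_fraction(c["cleaved"], c["uncleaved"])
          active = frac >= 0.10
          print(f"{mutant}: cleaved fraction = {frac:.3f}, active = {active}")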

  17. Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules.

    Directory of Open Access Journals (Sweden)

    Antonino Ingargiola

    Full Text Available We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions.

  18. Numerical analysis of fast saturable absorber mode-locked Yb(3+) lasers under large modulation depth.

    Science.gov (United States)

    Tokurakawa, Masaki; Shirakawa, Akira

    2015-10-05

    Numerical analysis of fast saturable absorber mode-locked Yb(3+)-doped solid state lasers is reported. The analysis includes a special case in which the spectral bandwidth of the short pulse is larger than the fluorescence bandwidth of the gain material. The relationships between the shortest available pulse duration and the modulation depth are shown for standard bulk and thin-disk laser geometries with several gain materials. The characteristic phenomena observed in our previous Kerr-lens mode-locked laser experiments were reproduced in the simulation.

  19. Computational and Statistical Methods for High-Throughput Mass Spectrometry-Based PTM Analysis.

    Science.gov (United States)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analysis allows the quantitative comparison of thousands of modified peptides over different conditions. However, the large and complex datasets produced pose multiple data interpretation challenges, ranging from spectral interpretation to statistical and multivariate analyses. Here, we present a typical workflow to interpret such data.

  20. A Concept for a Sensitive Micro Total Analysis System for High Throughput Fluorescence Imaging

    OpenAIRE

    Rabner, Arthur; Shacham, Yosi

    2006-01-01

    This paper discusses possible methods for on-chip fluorescent imaging for integrated bio-sensors. The integration of optical and electro-optical accessories, according to the suggested methods, can improve the performance of fluorescence imaging. It can boost the signal-to-background ratio by a few orders of magnitude in comparison to conventional discrete setups. The methods presented in this paper are oriented towards building reproducible arrays for high-throughput micro total analysis...

  1. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data

    OpenAIRE

    Althammer, Sonja Daniela; González-Vallinas Rostes, Juan, 1983-; Ballaré, Cecilia Julia; Beato, Miguel; Eyras Jiménez, Eduardo

    2011-01-01

    Motivation: High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein–DNA and protein–RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. Results: We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or b...

  2. High-throughput microfluidic device for single cell analysis using multiple integrated soft lithographic pumps.

    Science.gov (United States)

    Patabadige, Damith E W; Mickleburgh, Tom; Ferris, Lorin; Brummer, Gage; Culbertson, Anne H; Culbertson, Christopher T

    2016-05-01

    The ability to accurately control fluid transport in microfluidic devices is key for developing high-throughput methods for single cell analysis. Making small, reproducible changes to flow rates to optimize lysis and injection using pumps external to the microfluidic device is, however, challenging and time-consuming. To improve the throughput and increase the number of cells analyzed, we have integrated previously reported micropumps into a microfluidic device that can increase the cell analysis rate to ∼1000 cells/h and operate for over an hour continuously. In order to increase the flow rates sufficiently to handle cells at a higher throughput, three sets of pumps were multiplexed. These pumps are simple, low-cost, durable, easy to fabricate, and biocompatible. They provide precise control of the flow rate up to 9.2 nL/s. These devices were used to automatically transport, lyse, and electrophoretically separate T-lymphocyte cells loaded with Oregon Green and 6-carboxyfluorescein. Peak overlap statistics predicted the number of fully resolved single-cell electropherograms seen. In addition, there was no change in the average fluorescent dye peak areas, indicating that the cells remained intact and the dyes did not leak out of the cells over the 1 h analysis time. The cell lysate peak area distribution followed that expected of an asynchronous steady-state population of immortalized cells. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Current developments in high-throughput analysis for microalgae cellular contents.

    Science.gov (United States)

    Lee, Tsung-Hua; Chang, Jo-Shu; Wang, Hsiang-Yu

    2013-11-01

    Microalgae have emerged as one of the most promising feedstocks for biofuels and bio-based chemical production. However, due to the lack of effective tools enabling rapid and high-throughput analysis of the content of microalgae biomass, the efficiency of screening and identification of microalgae with desired functional components from the natural environment is usually quite low. Moreover, the real-time monitoring of the production of target components from microalgae is also difficult. Recently, research efforts focusing on overcoming this limitation have started. In this review, the recent development of high-throughput methods for analyzing microalgae cellular contents is summarized. The future prospects and impacts of these detection methods in microalgae-related processing and industries are also addressed. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Throughput and delay analysis of IEEE 802.15.6-based CSMA/CA protocol.

    Science.gov (United States)

    Ullah, Sana; Chen, Min; Kwak, Kyung Sup

    2012-12-01

    The IEEE 802.15.6 is a new communication standard on Wireless Body Area Network (WBAN) that focuses on a variety of medical, Consumer Electronics (CE) and entertainment applications. In this paper, the throughput and delay performance of the IEEE 802.15.6 is presented. Numerical formulas are derived to determine the maximum throughput and minimum delay limits of the IEEE 802.15.6 for an ideal channel with no transmission errors. These limits are derived for different frequency bands and data rates. Our analysis is validated by extensive simulations using a custom C++ simulator. Based on analytical and simulation results, useful conclusions are derived for network provisioning and packet size optimization for different applications.
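
    The general structure of such maximum-throughput and minimum-delay limits is sketched below for a single data frame acknowledged immediately after an inter-frame space, under an ideal error-free channel. The timing and overhead values are illustrative placeholders rather than figures taken from the IEEE 802.15.6 standard, and the two example PHY data rates are assumptions.

      # Sketch of the generic structure of max-throughput / min-delay limits for
      # a stop-and-wait data/ACK exchange over an ideal channel. All timing and
      # overhead values are illustrative placeholders, not standard parameters.
      def max_throughput_min_delay(payload_bits, data_rate_bps,
                                   overhead_bits=200, t_ack_s=40e-6, t_ifs_s=50e-6):
          t_data = (payload_bits + overhead_bits) / data_rate_bps
          t_cycle = t_data + t_ifs_s + t_ack_s + t_ifs_s   # one complete exchange
          return payload_bits / t_cycle, t_cycle           # (bit/s, seconds)

      for rate in (121.4e3, 971.4e3):   # two example PHY data rates (assumed)
          tp, d = max_throughput_min_delay(payload_bits=8 * 255, data_rate_bps=rate)
          print(f"{rate/1e3:.1f} kb/s PHY: max throughput {tp/1e3:.1f} kb/s, "
                f"min delay {d*1e3:.2f} ms")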

  5. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  6. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  7. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    Science.gov (United States)

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  8. High-throughput glycosylation analysis of therapeutic immunoglobulin G by capillary gel electrophoresis using a DNA analyzer.

    NARCIS (Netherlands)

    Reusch, D.; Haberger, M.; Kailich, T.; Heidenreich, A.K.; Kampe, M.; Bulau, P.; Wuhrer, M.

    2014-01-01

    The Fc glycosylation of therapeutic antibodies is crucial for their effector functions and their behavior in pharmacokinetics and pharmacodynamics. To monitor the Fc glycosylation in bioprocess development and characterization, high-throughput techniques for glycosylation analysis are needed. Here,

  9. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    Science.gov (United States)

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. Key advantages of FIA-MS over chromatography-MS are fast analysis (short typical run times) and method simplicity, and FIA-MS offers high throughput without compromising sensitivity, precision and accuracy as much as ambient MS techniques. Consequently, FIA-MS is increasingly becoming recognized as a suitable technique for applications where quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis such as salting out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4(+) QuEChERS, and (3) the need to improve efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical capability and

  10. Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping.

    Science.gov (United States)

    Klukas, Christian; Chen, Dijun; Pape, Jean-Michel

    2014-06-01

    High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input data and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment that contains 33 maize (Zea mays 'Fernandez') plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy, up to 0.98 and 0.95, respectively. In summary, IAP provides a comprehensive set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. © 2014 American Society of Plant Biologists. All Rights Reserved.

  11. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter
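
    As a conceptual illustration (not the flowCore implementation), the sketch below optimizes the cofactor of the hyperbolic arcsine transform by maximizing a Gaussian profile log-likelihood that includes the Jacobian of the transformation; the toy two-population data and the search grid are assumptions.

      import numpy as np

      rng = np.random.default_rng(1)

      def asinh_loglik(x, cofactor):
          """Profile Gaussian log-likelihood of y = asinh(x / cofactor),
          including the Jacobian of the transformation (Box-Cox style)."""
          y = np.arcsinh(x / cofactor)
          sigma2 = y.var()
          n = x.size
          loglik_y = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
          log_jacobian = -0.5 * np.log(x**2 + cofactor**2).sum()
          return loglik_y + log_jacobian

      # Toy "fluorescence" data: a dim and a bright population (illustrative only).
      x = np.concatenate([rng.normal(50, 30, 5000), rng.lognormal(8, 0.4, 5000)])

      cofactors = np.logspace(0, 4, 200)        # grid search over the parameter
      best = max(cofactors, key=lambda b: asinh_loglik(x, b))
      print(f"ML-optimized arcsinh cofactor ~ {best:.1f}")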

  12. Hydrogel Droplet Microfluidics for High-Throughput Single Molecule/Cell Analysis.

    Science.gov (United States)

    Zhu, Zhi; Yang, Chaoyong James

    2017-01-17

    Heterogeneity among individual molecules and cells has posed significant challenges to traditional bulk assays, due to the assumption of average behavior, which would lose important biological information in heterogeneity and result in a misleading interpretation. Single molecule/cell analysis has become an important and emerging field in biological and biomedical research for insights into heterogeneity between large populations at high resolution. Compared with the ensemble bulk method, single molecule/cell analysis explores the information on time trajectories, conformational states, and interactions of individual molecules/cells, all key factors in the study of chemical and biological reaction pathways. Various powerful techniques have been developed for single molecule/cell analysis, including flow cytometry, atomic force microscopy, optical and magnetic tweezers, single-molecule fluorescence spectroscopy, and so forth. However, some of them have the low-throughput issue that has to analyze single molecules/cells one by one. Flow cytometry is a widely used high-throughput technique for single cell analysis but lacks the ability for intercellular interaction study and local environment control. Droplet microfluidics becomes attractive for single molecule/cell manipulation because single molecules/cells can be individually encased in monodisperse microdroplets, allowing high-throughput analysis and manipulation with precise control of the local environment. Moreover, hydrogels, cross-linked polymer networks that swell in the presence of water, have been introduced into droplet microfluidic systems as hydrogel droplet microfluidics. By replacing an aqueous phase with a monomer or polymer solution, hydrogel droplets can be generated on microfluidic chips for encapsulation of single molecules/cells according to the Poisson distribution. The sol-gel transition property endows the hydrogel droplets with new functionalities and diversified applications in single
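
    The Poisson-loading argument mentioned above can be made concrete with a short sketch: for a mean occupancy λ (cells per droplet), the fraction of droplets holding exactly one cell, and the single-cell fraction among occupied droplets, follow directly from the Poisson probability mass function. The loading densities used below are illustrative.

      from math import exp, factorial

      def poisson_pmf(k, lam):
          return lam**k * exp(-lam) / factorial(k)

      # Fraction of droplets expected to hold exactly one cell, and the fraction
      # of occupied droplets that are single-cell, for a few loading densities.
      for lam in (0.1, 0.3, 1.0):        # mean cells per droplet (illustrative)
          p0 = poisson_pmf(0, lam)
          p1 = poisson_pmf(1, lam)
          single_among_occupied = p1 / (1 - p0)
          print(f"lambda={lam}: P(1 cell)={p1:.3f}, "
                f"single-cell fraction of occupied droplets={single_among_occupied:.3f}")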

  13. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high-throughput instrumentation to analyze large sets of clinical samples quantitatively and reproducibly, in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be accurate and rapid. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples in a quantitative, qualitative, and reproducible manner. The study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validates the approach on a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The present study shows that the online system, capable of handling samples in a 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase of peptide quantities with increasing concentration, with greater linearity than the off-line method. Hence we suggest that inclusion of this online system in the proteomic pipeline will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  14. Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines

    Science.gov (United States)

    Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.

    2017-01-01

    Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black-box processes in a directed graph. This strategy to modularize code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU 'circular memory' data buffers that enable ready introduction of arbitrary functions into the processing path for 'streams' of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.

  15. Throughput-Delay Analysis of Random Linear Network Coding for Wireless Broadcasting

    CERN Document Server

    Swapna, B T; Shroff, Ness B

    2011-01-01

    In an unreliable single-hop broadcast network setting, we investigate the throughput and decoding-delay performance of random linear network coding as a function of the coding window size and the network size. Our model consists of a source transmitting packets of a single flow to a set of $n$ users over independent erasure channels. The source performs random linear network coding (RLNC) over $k$ (coding window size) packets and broadcasts them to the users. We note that the broadcast throughput of RLNC must vanish with increasing $n$, for any fixed $k$. Hence, in contrast to other works in the literature, we investigate how the coding window size $k$ must scale for increasing $n$. Our analysis reveals that the coding window size of $\Theta(\ln(n))$ represents a phase transition rate, below which the throughput converges to zero, and above which it converges to the broadcast capacity. Further, we characterize the asymptotic distribution of decoding delay and provide approximate expressions for the mean and v...
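
    The scaling result can be illustrated with a simple Monte Carlo sketch of the same single-hop erasure-broadcast model (a simulation aid, not the authors' analysis): each user needs $k$ successful receptions, the source stops when the slowest user is done, and the throughput is $k$ divided by the expected number of transmissions. The erasure probability and the choices of $k$ below are assumptions.

      import numpy as np

      rng = np.random.default_rng(2)

      def rlnc_broadcast_throughput(n_users, k, erasure, trials=2000):
          """Monte Carlo estimate of RLNC broadcast throughput for a coding
          window of k packets: the source keeps transmitting coded packets
          until every user has collected k of them (large field size assumed,
          so any k received coded packets are decodable)."""
          # Transmissions needed by one user = k successes over an erasure channel.
          per_user = k + rng.negative_binomial(k, 1 - erasure, size=(trials, n_users))
          total_tx = per_user.max(axis=1)   # the slowest user dictates when to stop
          return k / total_tx.mean()        # information packets per transmission

      n, eps = 1000, 0.2
      for k in (1, 5, int(np.log(n)), 10 * int(np.log(n))):
          thr = rlnc_broadcast_throughput(n, k, eps)
          print(f"k={k:3d}: throughput ~ {thr:.3f}  (erasure capacity = {1 - eps})")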

  16. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequences analysis.

  17. Quantitative dot blot analysis (QDB), a versatile high throughput immunoblot method.

    Science.gov (United States)

    Tian, Geng; Tang, Fangrong; Yang, Chunhua; Zhang, Wenfeng; Bergquist, Jonas; Wang, Bin; Mi, Jia; Zhang, Jiandi

    2017-08-29

    Lacking access to an affordable method of high-throughput immunoblot analysis for daily use remains a big challenge for scientists worldwide. We propose here Quantitative Dot Blot analysis (QDB) to meet this demand. With its defined linear range, QDB analysis fundamentally transforms the traditional immunoblot method into a true quantitative assay. Its convenience in analyzing large numbers of samples also enables bench scientists to examine protein expression levels across multiple parameters. In addition, the small amount of sample lysate needed for analysis means significant savings in research resources and effort. This method was evaluated at both the cellular and tissue levels, with unexpected observations that would otherwise be hard to achieve using conventional immunoblot methods like Western blot analysis. Using the QDB technique, we were able to observe a significant age-dependent alteration of CAPG protein expression level in TRAMP mice. We believe that the adoption of QDB analysis would have an immediate impact on biological and biomedical research by providing much-needed high-throughput information at the protein level in this "Big Data" era.

  18. Theoretical analysis of saturation and limit cycles in short pulse FEL oscillators

    Energy Technology Data Exchange (ETDEWEB)

    Piovella, N.; Chaix, P.; Jaroszynski, D. [Commissariat à l'Énergie Atomique, Bruyères-le-Châtel (France)] [and others]

    1995-12-31

    We derive a model for the nonlinear evolution of a short-pulse oscillator from low signal up to saturation in the small-gain regime. This system is controlled by only two independent parameters: cavity detuning and losses. Using a closure relation, this model reduces to a closed set of 5 nonlinear partial differential equations for the EM field and moments of the electron distribution. An analysis of the linearised system allows us to define and calculate the eigenmodes characterising the small-signal regime. An arbitrary solution of the complete nonlinear system can then be expanded in terms of these eigenmodes. This allows various observed nonlinear behaviours to be interpreted, including steady-state saturation, limit cycles, and transition to chaos. The single-mode approximation reduces to a Landau-Ginzburg equation. It yields the gain, nonlinear frequency shift, and efficiency as functions of cavity detuning and cavity losses. A generalisation to two modes provides a simple description of the limit-cycle behaviour, as a competition between these two modes. An analysis of the transitions to more complex dynamics is also given. Finally, the analytical results are compared to the experimental data from the FELIX experiment.

  19. Spatial analysis and retail saturation of the shopping centers in Czech Republic

    Directory of Open Access Journals (Sweden)

    Marek Záboj

    2009-01-01

    Full Text Available The paper deals with a spatial analysis of shopping centers across the individual regions of the Czech Republic. Centers over 10 000 m2 were identified with their total retail space, and a gravitational model was utilized to determine the break point between two competing regions – the point at which a person residing in an intermediate community would be likely to travel to one region rather than the other. The next part is aimed at the calculation of saturation indicators. The first is the coefficient of shopping-center saturation, measured as the retail space in all shopping centers per capita in each region (the average in the Czech Republic is 0.115 m2 per capita; the lowest ratio was determined in the South-Bohemian region – 0.033, followed by the Karlovy Vary – 0.035, Hradec Králové – 0.049 and Vysočina – 0.048 regions; the highest ratio was determined in the Liberec region – 0.275, followed by the Central-Bohemian including Prague – 0.253, South-Moravian – 0.182 and Plzen – 0.157 regions). The second is the index of retail saturation – the population of the region is multiplied by the monthly expenditure on the goods and services the retailer wants to sell in shopping centers, and this is divided by the total retail space in all shopping centers in the given region (the average in the Czech Republic is 77 094.77 CZK per m2; the same ranking was found – the highest index in the South-Bohemian region – 175 909 and the lowest index in the Liberec region – 20 224). The main result is a comparison of the regions according to their shopping-center saturation and a recommendation to potential investors, developers and retailers on where, according to the given indicators of the spatial analysis, the best site is to invest and build a new complex of retail stores.
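
    The indicators described above can be written down compactly. The sketch below uses a Reilly-type breaking-point formula for the gravitational model (the abstract does not state which variant was used) together with the per-capita saturation coefficient and the index of retail saturation as defined in the text; all input numbers are illustrative, not the regional data from the paper.

      from math import sqrt

      def breaking_point(distance_ab, pop_a, pop_b):
          # Reilly-type breaking point: distance from centre B at which shoppers
          # are indifferent between competing regions A and B (one common form of
          # the gravitational model; the paper does not spell out its variant).
          return distance_ab / (1 + sqrt(pop_a / pop_b))

      def saturation_coefficient(total_retail_space_m2, population):
          # Shopping-center retail space per capita (m2 per inhabitant).
          return total_retail_space_m2 / population

      def index_of_retail_saturation(population, monthly_expenditure, retail_space_m2):
          # IRS as described in the abstract: demand (population x expenditure)
          # divided by the total shopping-center retail space.
          return population * monthly_expenditure / retail_space_m2

      # Illustrative numbers only (not the regional data used in the paper).
      print(round(breaking_point(100.0, 1_200_000, 300_000), 1))           # km from B
      print(round(saturation_coefficient(250_000, 1_200_000), 3))           # m2 per capita
      print(round(index_of_retail_saturation(1_200_000, 5_000, 250_000)))   # CZK per m2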

  20. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.

  1. Compositional analysis: a valid approach to analyze microbiome high-throughput sequencing data.

    Science.gov (United States)

    Gloor, Gregory B; Reid, Gregor

    2016-08-01

    A workshop held at the 2015 annual meeting of the Canadian Society of Microbiologists highlighted compositional data analysis methods and the importance of exploratory data analysis for the analysis of microbiome data sets generated by high-throughput DNA sequencing. A summary of the content of that workshop, a review of new methods of analysis, and information on the importance of careful analyses are presented herein. The workshop focussed on explaining the rationale behind the use of compositional data analysis, and a demonstration of these methods for the examination of 2 microbiome data sets. A clear understanding of bioinformatics methodologies and the type of data being analyzed is essential, given the growing number of studies uncovering the critical role of the microbiome in health and disease and the need to understand alterations to its composition and function following intervention with fecal transplant, probiotics, diet, and pharmaceutical agents.
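
    As one concrete example of the compositional approach discussed at the workshop (illustrative, not the workshop's own code), the centred log-ratio (CLR) transform maps each sample of counts to log-ratios against its geometric mean, after adding a pseudocount to handle zeros; the toy count table below is an assumption.

      import numpy as np

      def clr(counts, pseudocount=0.5):
          """Centred log-ratio transform, a standard compositional-data method.
          A pseudocount replaces zeros before taking logs."""
          x = np.asarray(counts, dtype=float) + pseudocount
          logx = np.log(x)
          return logx - logx.mean(axis=-1, keepdims=True)

      # Toy OTU count table: 3 samples x 4 taxa (illustrative numbers).
      table = np.array([[120, 30, 0, 850],
                        [ 90, 45, 5, 660],
                        [ 10, 80, 2, 408]])
      print(np.round(clr(table), 2))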

  2. WormScan: a technique for high-throughput phenotypic analysis of Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mark D Mathew

    Full Text Available BACKGROUND: There are four main phenotypes that are assessed in whole-organism studies of Caenorhabditis elegans: mortality, movement, fecundity and size. Procedures have been developed that focus on the digital analysis of some, but not all, of these phenotypes and may be limited by expense and throughput. We have developed WormScan, an automated image acquisition system that allows quantitative analysis of each of these four phenotypes on standard NGM plates seeded with E. coli. This system is very easy to implement and has the capacity to be used in high-throughput analysis. METHODOLOGY/PRINCIPAL FINDINGS: Our system employs a readily available consumer-grade flatbed scanner. The method uses light stimulus from the scanner rather than physical stimulus to induce movement. With two sequential scans it is possible to quantify the induced phototactic response. To demonstrate the utility of the method, we measured the phenotypic response of C. elegans to phosphine gas exposure. We found that stimulation of movement by the light of the scanner was equivalent to physical stimulation for the determination of mortality. WormScan also provided a quantitative assessment of health for the survivors. Habituation from light stimulation of continuous scans was similar to habituation caused by physical stimulus. CONCLUSIONS/SIGNIFICANCE: There are existing systems for the automated phenotypic data collection of C. elegans. The specific advantages of our method over existing systems are high-throughput assessment of a greater range of phenotypic endpoints, including determination of mortality and quantification of the mobility of survivors. Our system is also inexpensive and very easy to implement. Even though we have focused on demonstrating the usefulness of WormScan in toxicology, it can be used in a wide range of additional C. elegans studies, including lifespan determination, development, pathology and behavior. Moreover, we have even adapted the

  3. SmartGrain: high-throughput phenotyping software for measuring seed shape through image analysis.

    Science.gov (United States)

    Tanabata, Takanari; Shibaya, Taeko; Hori, Kiyosumi; Ebana, Kaworu; Yano, Masahiro

    2012-12-01

    Seed shape and size are among the most important agronomic traits because they affect yield and market price. To obtain accurate seed size data, a large number of measurements are needed because there is little difference in size among seeds from one plant. To promote genetic analysis and selection for seed shape in plant breeding, efficient, reliable, high-throughput seed phenotyping methods are required. We developed SmartGrain software for high-throughput measurement of seed shape. This software uses a new image analysis method to reduce the time taken in the preparation of seeds and in image capture. Outlines of seeds are automatically recognized from digital images, and several shape parameters, such as seed length, width, area, and perimeter length, are calculated. To validate the software, we performed a quantitative trait locus (QTL) analysis for rice (Oryza sativa) seed shape using backcrossed inbred lines derived from a cross between japonica cultivars Koshihikari and Nipponbare, which showed small differences in seed shape. SmartGrain removed areas of awns and pedicels automatically, and several QTLs were detected for six shape parameters. The allelic effect of a QTL for seed length detected on chromosome 11 was confirmed in advanced backcross progeny; the cv Nipponbare allele increased seed length and, thus, seed weight. High-throughput measurement with SmartGrain reduced sampling error and made it possible to distinguish between lines with small differences in seed shape. SmartGrain could accurately recognize seed not only of rice but also of several other species, including Arabidopsis (Arabidopsis thaliana). The software is free to researchers.

  4. Requirements for reliable determination of binding affinity constants by saturation analysis approach.

    Science.gov (United States)

    Borgna, Jean-Louis

    2004-12-01

    Accurate calculation of the equilibrium association constant (K) and binding site concentration (N) related to a receptor (R)/ligand (L) interaction, via R saturation analysis, requires exact determination of the specifically bound L concentration (B(S)) and the unbound L concentration (U) at equilibrium. However, most binding determinations involve a procedure for separation of bound and unbound L. In such situations, it was previously shown that correct calculation of B(S) and U from binding data requires prior determination of alpha, i.e. the procedure parameter representing the proportion of equilibrium B(S) recovered after running the separation process, and of kn, i.e. the equilibrium nonspecific binding coefficient. For the simplest model of R/L interaction, the consequences of alpha neglect and/or kn neglect on determination of K and N, via R saturation analysis, are investigated. When alpha but not kn has been determined, B(S) can be accurately calculated, whereas U is overestimated by factor (kn + 1). Consequently the type (linear or hyperbolic) of theoretic curves obtained by usual representations (such as the Scatchard, the Lineweaver-Burk or the Michaelis-Menten plot) of the R/L binding is unchanged; these curves afford correct N and underestimation of K by factor (kn + 1). When alpha has not been determined, the calculated B(S) and U are underestimated and overestimated, respectively. Then erroneous representations of the R/L binding result (e.g. instead of regular straight line segments, the Scatchard plot and the Lineweaver-Burk plot involve convex-upward and convex-downward hyperbola portions, respectively, suggestive of positive cooperativity of L binding), which leads to incorrect N and K. Errors in N and K would depend on (i) the binding (K, N and kn) and method (alpha) parameters and (ii) the expressions used to calculate approximate B(S) and U values. Simulations involving variable alpha, KN and kn values indicate that: (1) the magnitude of error in N determination (mainly
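
    For reference, a minimal sketch of the relations underlying this discussion, written for the simplest one-site model in the notation of the abstract (the exact expressions used in the paper may differ):

    $$B_S = \frac{K\,N\,U}{1 + K\,U}, \qquad \frac{B_S}{U} = K\,(N - B_S)\quad\text{(Scatchard form)}.$$

    If kn is neglected while alpha is known, the apparent unbound concentration becomes $U_{\mathrm{app}} = (k_n + 1)\,U$, so the Scatchard slope, and hence K, is underestimated by the same factor while the intercept N is unchanged:

    $$\frac{B_S}{U_{\mathrm{app}}} = \frac{K}{k_n + 1}\,(N - B_S).$$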

  5. Source screening module for contaminant transport analysis through vadose and saturated zones.

    Science.gov (United States)

    Bedekar, Vivek; Neville, Christopher; Tonkin, Matthew

    2012-01-01

    At complex sites there may be many potential sources of contaminants within the vadose zone. Screening-level analyses are useful to identify which potential source areas should be the focus of detailed investigation and analysis. A source screening module (SSM) has been developed to support preliminary evaluation of the threat posed by vadose zone waste sites on groundwater quality. This tool implements analytical solutions to simulate contaminant transport through the unsaturated and saturated zones to predict time-varying concentrations at potential groundwater receptors. The SSM integrates several transport processes in a single simulation that is implemented within a user-friendly, Microsoft Excel™-based interface. © 2012, The Author(s). Ground Water © 2012, National Ground Water Association.
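
    As an illustration of the class of analytical solutions that such screening tools typically implement (the abstract does not specify which solutions the SSM uses), the classical one-dimensional advection-dispersion solution for a continuous source of concentration $C_0$ is

    $$\frac{C(x,t)}{C_0} = \frac{1}{2}\left[\operatorname{erfc}\!\left(\frac{x - v t}{2\sqrt{D t}}\right) + \exp\!\left(\frac{v x}{D}\right)\operatorname{erfc}\!\left(\frac{x + v t}{2\sqrt{D t}}\right)\right],$$

    where $v$ is the average linear velocity and $D$ the longitudinal dispersion coefficient; retardation and first-order decay terms are commonly added in screening-level formulations.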

  6. Throughput and Delay Performance Analysis of Packet Aggregation Scheme for PRMA

    DEFF Research Database (Denmark)

    Zhang, Qi; Iversen, Villy Bæk; Fitzek, Frank H.P.

    2008-01-01

    , the system throughput depends on the size of packets and the number of consecutive packets. Statistics from existing wireless data networks using the PRMA protocol show that the system throughput is quite low because of inconsecutive small packets. In order to improve the throughput, packet

  7. Microfluidic cell microarray platform for high throughput analysis of particle-cell interactions.

    Science.gov (United States)

    Tong, Ziqiu; Rajeev, Gayathri; Guo, Keying; Ivask, Angela; McCormick, Scott; Lombi, Enzo; Priest, Craig; Voelcker, Nicolas H

    2018-03-02

    With the advances in nanotechnology, particles with various sizes, shapes, surface chemistries and compositions can be easily produced. Nano- and microparticles have been extensively explored in many industrial and clinical applications. Ensuring that the particles themselves do not have any toxic effects on the biological system is of paramount importance. This paper describes a proof-of-concept method in which a microfluidic system is used in conjunction with a cell microarray technique, aiming to streamline the analysis of particle-cell interactions in a high-throughput manner. Polymeric microparticles with different particle surface functionalities were first used to investigate the efficiency of particle-cell adhesion under dynamic flow. Silver nanoparticles (AgNPs, 10 nm in diameter) perfused at different concentrations (0 to 20 μg/ml) in parallel streams over the cells in the microchannel exhibited higher toxicity compared to the static culture in the 96-well plate format. The developed microfluidic system can be easily scaled up to accommodate a larger number of microchannels for high-throughput analysis of the potential toxicity of a wide range of particles in a single experiment.

  8. High-throughput SNP-genotyping analysis of the relationships among Ponto-Caspian sturgeon species

    Science.gov (United States)

    Rastorguev, Sergey M; Nedoluzhko, Artem V; Mazur, Alexander M; Gruzdeva, Natalia M; Volkov, Alexander A; Barmintseva, Anna E; Mugue, Nikolai S; Prokhortchouk, Egor B

    2013-01-01

    Legally certified sturgeon fisheries require population protection and conservation methods, including DNA tests to identify the source of valuable sturgeon roe. However, the available genetic data are insufficient to distinguish between different sturgeon populations, and are even unable to distinguish between some species. We performed high-throughput single-nucleotide polymorphism (SNP)-genotyping analysis on different populations of Russian (Acipenser gueldenstaedtii), Persian (A. persicus), and Siberian (A. baerii) sturgeon species from the Caspian Sea region (Volga and Ural Rivers), the Azov Sea, and two Siberian rivers. We found that Russian sturgeons from the Volga and Ural Rivers were essentially indistinguishable, but they differed from Russian sturgeons in the Azov Sea, and from Persian and Siberian sturgeons. We identified eight SNPs that were sufficient to distinguish these sturgeon populations with 80% confidence and allowed the development of markers to distinguish sturgeon species. Finally, on the basis of our SNP data, we propose that the A. baerii-like mitochondrial DNA found in some Russian sturgeons from the Caspian Sea arose via an introgression event during the Pleistocene glaciation. In summary, high-throughput genotyping analysis of several sturgeon populations was performed, SNP markers for species identification were defined, and a possible explanation for the presence of the baerii-like mitotype in some Russian sturgeons in the Caspian Sea was suggested. PMID:24567827

  9. Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis

    Directory of Open Access Journals (Sweden)

    Tu Jing

    2012-01-01

    Background: Multiplexing has become a major limitation of next-generation sequencing (NGS) when applied to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Results: Here we introduce pair-barcode sequencing (PBS), an economic and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were analyzed simultaneously through the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated and about 64% of them were assigned to both barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation obtained with different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. Conclusions: By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be practically analyzed in parallel, so that high-throughput sequencing economically meets the requirements of samples with low sequencing-throughput demands.
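
    The combinatorial idea behind pair-barcoding is that m forward and n reverse barcodes index m x n libraries (here 4 x 8 = 32). The short sketch below illustrates demultiplexing by a forward/reverse barcode pair; the barcode sequences and read layout are hypothetical, not those used in the study.

      from itertools import product

      # Illustrative pair-barcode demultiplexing; barcodes and read layout are hypothetical.
      forward = ["ACGT", "TGCA", "GATC", "CTAG"]                 # 4 forward barcodes
      reverse = ["AAGG", "CCTT", "GGAA", "TTCC",
                 "AGAG", "TCTC", "GAGA", "CTCT"]                 # 8 reverse barcodes

      # 4 x 8 = 32 pair-barcode combinations, one per library
      library_of = {pair: i for i, pair in enumerate(product(forward, reverse))}

      def assign(read):
          """Assign a read to a library if both barcodes match, else return None."""
          fwd, rev = read[:4], read[-4:]                         # barcodes flank the insert
          return library_of.get((fwd, rev))

      print(len(library_of))                                     # 32
      print(assign("ACGT" + "TTAGCCGGAT" + "CCTT"))              # library index or None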

  10. INSIDIA: A FIJI Macro Delivering High-Throughput and High-Content Spheroid Invasion Analysis.

    Science.gov (United States)

    Moriconi, Chiara; Palmieri, Valentina; Di Santo, Riccardo; Tornillo, Giusy; Papi, Massimiliano; Pilkington, Geoff; De Spirito, Marco; Gumbleton, Mark

    2017-10-01

    Time-series image capture of in vitro 3D spheroidal cancer models embedded within an extracellular matrix affords examination of spheroid growth and cancer cell invasion. However, a customizable, comprehensive and open source solution for the quantitative analysis of such spheroid images is lacking. Here, the authors describe INSIDIA (INvasion SpheroID ImageJ Analysis), an open-source macro implemented as a customizable software algorithm running on the FIJI platform, that enables high-throughput high-content quantitative analysis of spheroid images (both bright-field gray and fluorescent images) with the output of a range of parameters defining the spheroid "tumor" core and its invasive characteristics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

    The human genome is dynamic in structure, complicating researchers' attempts at fully understanding it. Time-series fluorescence in situ hybridization (FISH) imaging has increased our ability to observe genome structure, but due to cell-type and experimental variability these data are often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined, with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins.

    Science.gov (United States)

    Schwämmle, Veit; Verano-Braga, Thiago; Roepstorff, Peter

    2015-11-03

    The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity, improved protocols for PTM enrichment and recently established pipelines for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis, multivariate analysis and data interpretation. We furthermore discuss the potential of future developments that will help to gain deep insight into the PTM-ome and its biological role in cells. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. An exploratory data analysis method to reveal modular latent structures in high-throughput data

    Directory of Open Access Journals (Sweden)

    Yu Tianwei

    2010-08-01

    Background: Modular structures are ubiquitous across various types of biological networks. The study of network modularity can help reveal regulatory mechanisms in systems biology, evolutionary biology and developmental biology. Identifying putative modular latent structures from high-throughput data using exploratory analysis can help better interpret the data and generate new hypotheses. Unsupervised learning methods designed for global dimension reduction or clustering fall short of identifying modules with factors acting in linear combinations. Results: We present an exploratory data analysis method named MLSA (Modular Latent Structure Analysis) to estimate modular latent structures, which can find co-regulative modules that involve non-coexpressive genes. Conclusions: Through simulations and real-data analyses, we show that the method can recover modular latent structures effectively. In addition, the method also performed very well on data generated from sparse global latent factor models. The R code is available at http://userwww.service.emory.edu/~tyu8/MLSA/.

  14. STATISTICAL METHODS FOR THE ANALYSIS OF HIGH-THROUGHPUT METABOLOMICS DATA

    Directory of Open Access Journals (Sweden)

    Jörg Bartel

    2013-01-01

    Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have been developed as well. The focus of this mini review will be on recent advancements in the analysis of metabolomics data, especially those utilizing Gaussian graphical models and independent component analysis.

  15. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Full Text Available Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. However recently, a number of tools specific for metabolomics data have been developed as well. The focus of this mini review will be on recent advancements in the analysis of metabolomics data especially by utilizing Gaussian graphical models and independent component analysis.

  16. An exploratory data analysis method to reveal modular latent structures in high-throughput data.

    Science.gov (United States)

    Yu, Tianwei

    2010-08-27

    Modular structures are ubiquitous across various types of biological networks. The study of network modularity can help reveal regulatory mechanisms in systems biology, evolutionary biology and developmental biology. Identifying putative modular latent structures from high-throughput data using exploratory analysis can help better interpret the data and generate new hypotheses. Unsupervised learning methods designed for global dimension reduction or clustering fall short of identifying modules with factors acting in linear combinations. We present an exploratory data analysis method named MLSA (Modular Latent Structure Analysis) to estimate modular latent structures, which can find co-regulative modules that involve non-coexpressive genes. Through simulations and real-data analyses, we show that the method can recover modular latent structures effectively. In addition, the method also performed very well on data generated from sparse global latent factor models. The R code is available at http://userwww.service.emory.edu/~tyu8/MLSA/.

  17. High-throughput gender identification of penguin species using melting curve analysis.

    Science.gov (United States)

    Tseng, Chao-Neng; Chang, Yung-Ting; Chiu, Hui-Tzu; Chou, Yii-Cheng; Huang, Hurng-Wern; Cheng, Chien-Chung; Liao, Ming-Hui; Chang, Hsueh-Wei

    2014-04-03

    Most species of penguins are sexually monomorphic, and therefore it is difficult to visually identify their genders for monitoring population stability in terms of sex ratio analysis. In this study, we evaluated the suitability of melting curve analysis (MCA) for high-throughput gender identification of penguins. Preliminary tests indicated that the Griffiths P2/P8 primers were not suitable for MCA. Based on sequence alignment of the Chromo-Helicase-DNA binding protein (CHD)-W and CHD-Z genes from four species of penguins (Pygoscelis papua, Aptenodytes patagonicus, Spheniscus magellanicus, and Eudyptes chrysocome), we redesigned forward primers for the CHD-W/CHD-Z-common region (PGU-ZW2) and the CHD-W-specific region (PGU-W2) to be used in combination with the reverse Griffiths P2 primer. When tested with P. papua samples, PCR using the P2/PGU-ZW2 and P2/PGU-W2 primer sets generated two amplicons of 148 and 356 bp, respectively, which were easily resolved in 1.5% agarose gels. MCA indicated that the melting temperature (Tm) values for the P2/PGU-ZW2 and P2/PGU-W2 amplicons of P. papua samples were 79.75°C-80.5°C and 81.0°C-81.5°C, respectively. Females displayed both ZW-common and W-specific Tm peaks, whereas males were positive only for the ZW-common peak. Taken together, our redesigned primers coupled with MCA allow precise high-throughput gender identification for P. papua, and potentially for other penguin species such as A. patagonicus, S. magellanicus, and E. chrysocome as well.
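
    The decision rule implied by the reported Tm values for P. papua fits in a few lines: a sample showing both the ZW-common and the W-specific peak is called female, one showing only the ZW-common peak is called male. The sketch below encodes exactly that rule; the peak-window handling is an illustrative simplification, not the authors' software.

      # Classification logic implied by the abstract for P. papua melting-curve data.
      # Tm windows are the values reported for P. papua; this is a sketch only.
      ZW_COMMON = (79.75, 80.5)    # P2/PGU-ZW2 amplicon Tm range (deg C)
      W_SPECIFIC = (81.0, 81.5)    # P2/PGU-W2 amplicon Tm range (deg C)

      def in_window(tm, window):
          lo, hi = window
          return lo <= tm <= hi

      def classify(tm_peaks):
          """Return 'female', 'male' or 'undetermined' from a list of observed Tm peaks."""
          has_zw = any(in_window(t, ZW_COMMON) for t in tm_peaks)
          has_w = any(in_window(t, W_SPECIFIC) for t in tm_peaks)
          if has_zw and has_w:
              return "female"
          if has_zw:
              return "male"
          return "undetermined"

      print(classify([80.1, 81.2]))   # female
      print(classify([80.0]))         # male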

  18. Bayesian analysis of high-throughput quantitative measurement of protein-DNA interactions.

    Directory of Open Access Journals (Sweden)

    David D Pollock

    Transcriptional regulation depends upon the binding of transcription factor (TF) proteins to DNA in a sequence-dependent manner. Although many experimental methods address the interaction between DNA and proteins, they generally do not comprehensively and accurately assess the full binding repertoire (the complete set of sequences that might be bound with at least moderate strength). Here, we develop and evaluate through simulation an experimental approach that allows simultaneous high-throughput quantitative analysis of TF binding affinity to thousands of potential DNA ligands. Tens of thousands of putative binding targets can be mixed with a TF, and both the pre-bound and bound target pools sequenced. A hierarchical Bayesian Markov chain Monte Carlo approach determines posterior estimates for the dissociation constants, sequence-specific binding energies, and free TF concentrations. A unique feature of our approach is that dissociation constants are jointly estimated from their inferred degree of binding and from a model of binding energetics, depending on how many sequence reads are available and the explanatory power of the energy model. Careful experimental design is necessary to obtain accurate results over a wide range of dissociation constants. This approach, which we call Simultaneous Ultra high-throughput Ligand Dissociation EXperiment (SULDEX), is theoretically capable of rapid and accurate elucidation of an entire TF-binding repertoire.

  19. Construction and analysis of high-density linkage map using high-throughput sequencing data.

    Directory of Open Access Journals (Sweden)

    Dongyuan Liu

    Linkage maps enable the study of important biological questions. The construction of high-density linkage maps appears more feasible since the advent of next-generation sequencing (NGS), which eases SNP discovery and high-throughput genotyping of large populations. However, the marker number explosion and genotyping errors from NGS data challenge the computational efficiency and linkage map quality of linkage study methods. Here we report the HighMap method for constructing high-density linkage maps from NGS data. HighMap employs an iterative ordering and error correction strategy based on a k-nearest neighbor algorithm and a Monte Carlo multipoint maximum likelihood algorithm. A simulation study shows that HighMap can create a linkage map with three times as many markers as ordering-only methods while offering more accurate marker orders and stable genetic distances. Using HighMap, we constructed a common carp linkage map with 10,004 markers. The singleton rate was less than one-ninth of that generated by JoinMap4.1. Its total map distance was 5,908 cM, consistent with reports on low-density maps. HighMap is an efficient method for constructing high-density, high-quality linkage maps from high-throughput population NGS data. It will facilitate genome assembly, comparative genomic analysis, and QTL studies. HighMap is available at http://highmap.biomarker.com.cn/.

  20. High throughput imaging and analysis for biological interpretation of agricultural plants and environmental interaction

    Science.gov (United States)

    Hong, Hyundae; Benac, Jasenka; Riggsbee, Daniel; Koutsky, Keith

    2014-03-01

    High-throughput (HT) phenotyping of crops is essential to increase yield in environments deteriorated by climate change. The controlled environment of a greenhouse offers an ideal platform to study the genotype-to-phenotype linkages for crop screening. Advanced imaging technologies are used to study plants' responses to resource limitations such as water and nutrient deficiency. Advanced imaging technologies coupled with automation make HT phenotyping in the greenhouse not only feasible but practical. Monsanto has a state-of-the-art automated greenhouse (AGH) facility. Handling of the soil, pots, water and nutrients is completely automated. Images of the plants are acquired by multiple hyperspectral and broadband cameras. The hyperspectral cameras cover wavelengths from visible light through short-wave infrared (SWIR). In-house developed software analyzes the images to measure plant morphological and biochemical properties. We measure phenotypic metrics such as plant area, height, and width as well as biomass. Hyperspectral imaging allows us to measure biochemical metrics such as chlorophyll, anthocyanin, and foliar water content. The last 4 years of AGH operations on crops such as corn, soybean, and cotton have demonstrated successful application of imaging and analysis technologies for high-throughput plant phenotyping. Using HT phenotyping, scientists have shown strong correlations to environmental conditions, such as water and nutrient deficits, as well as the ability to tease apart distinct differences in the genetic backgrounds of crops.

  1. High-throughput mouse phenotyping using non-rigid registration and robust principal component analysis

    Science.gov (United States)

    Xie, Zhongliu; Kitamoto, Asanobu; Tamura, Masaru; Shiroishi, Toshihiko; Gillies, Duncan

    2016-03-01

    Intensive international efforts are underway towards phenotyping the mouse genome, by knocking out each of its ≈25,000 genes one by one for comparative study. With vast amounts of data to analyze, the traditional method of time-consuming histological examination is clearly impractical, leading to an overwhelming demand for a high-throughput phenotyping framework, especially one employing biomedical image informatics to efficiently identify phenotypes concerning morphological abnormality. Existing work has either relied excessively on volumetric analytics, which are insensitive to phenotypes that do not involve severe volume variations, or been tailored to specific defects and thus fails to serve a general phenotyping purpose. Furthermore, the prevailing requirement of an atlas for image segmentation, in contrast to its limited availability, further complicates the issue in practice. In this paper we propose a high-throughput general-purpose phenotyping framework that is able to efficiently perform batch-wise anomaly detection without prior knowledge of the phenotype and without the need for atlas-based segmentation. Anomaly detection is centered on the combined use of group-wise non-rigid image registration and robust principal component analysis (RPCA) for feature extraction and decomposition.
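
    The abstract names robust principal component analysis but does not spell out the variant; the standard convex formulation (Candès et al.) decomposes the stacked, registered image data M into a low-rank part L (the shared anatomy) plus a sparse part S (candidate anomalies). In LaTeX notation, and assuming this generic form rather than the paper's exact penalty:

      \min_{L,\,S} \; \|L\|_{*} + \lambda \|S\|_{1}
      \quad \text{subject to} \quad M = L + S,
      \qquad \lambda \approx 1/\sqrt{\max(m,n)} \ \text{for an } m \times n \ \text{matrix } M,

    where the nuclear norm \|L\|_{*} is the sum of the singular values of L and the l1 norm promotes entry-wise sparsity in S.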

  2. Multiplex mRNA assay using electrophoretic tags for high-throughput gene expression analysis.

    Science.gov (United States)

    Tian, Huan; Cao, Liching; Tan, Yuping; Williams, Stephen; Chen, Lili; Matray, Tracy; Chenna, Ahmed; Moore, Sean; Hernandez, Vincent; Xiao, Vivian; Tang, Mengxiang; Singh, Sharat

    2004-09-08

    We describe a novel multiplexing technology using a library of small fluorescent molecules, termed eTag molecules, to code and quantify mRNA targets. eTag molecules, which have the same fluorometric properties but distinct charge-to-mass ratios, possess pre-defined electrophoretic characteristics and can be resolved using capillary electrophoresis. Coupled with the primary Invader mRNA assay, eTag molecules were applied to simultaneously quantify up to 44 mRNA targets. This multiplexing approach was validated by examining a panel of inflammation-responsive genes in human umbilical vein endothelial cells stimulated with the inflammatory cytokine interleukin 1beta. The laser-induced fluorescence detection and electrokinetic sample injection process in capillary electrophoresis allows sensitive quantification of thousands of copies of mRNA molecules in a reaction. The assay is precise, as evaluated by measuring the Z' factor, a simple dimensionless characteristic used in high-throughput screening with mRNA assays. Our data demonstrate the synergy between the multiplexing capability of eTag molecules, detected by sensitive capillary electrophoresis, and the isothermal linear amplification characteristics of the Invader assay. The eTag multiplex mRNA assay presents a unique platform for sensitive, high-sample-throughput, multiplex gene expression analysis.
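
    The Z' factor mentioned above has a standard definition (Zhang, Chung and Oldenburg, 1999): one minus three times the sum of the positive- and negative-control standard deviations, divided by the absolute difference of their means. The sketch below computes it on hypothetical control readouts; the numbers are not from the study.

      import numpy as np

      def z_prime(pos, neg):
          """Standard Z' factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
          pos, neg = np.asarray(pos, float), np.asarray(neg, float)
          return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

      # Hypothetical control readouts (e.g. eTag signal for stimulated vs unstimulated cells)
      positive = [980, 1010, 995, 1005, 990]
      negative = [102, 98, 105, 95, 100]
      print(round(z_prime(positive, negative), 3))    # values above 0.5 indicate an excellent assay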

  3. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience.

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays and writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.
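
    The abstract says data are distributed by partitioning a spatial index but does not name the scheme; a Morton (Z-order) key, which interleaves the bits of the x, y, z voxel coordinates so that nearby voxels get nearby keys, is one common way to build such a partitionable index. The sketch below is generic and is not claimed to be the project's actual layout.

      def morton3d(x, y, z, bits=21):
          """Interleave the bits of (x, y, z) into a single Morton (Z-order) key.

          Nearby voxels map to nearby keys, so ranges of keys can be assigned to
          cluster nodes as spatial partitions. Generic sketch, not the Open
          Connectome Project's actual index layout.
          """
          key = 0
          for i in range(bits):
              key |= ((x >> i) & 1) << (3 * i)
              key |= ((y >> i) & 1) << (3 * i + 1)
              key |= ((z >> i) & 1) << (3 * i + 2)
          return key

      print(morton3d(3, 5, 1))                    # small example key
      print(morton3d(1024, 2048, 64) % 16)        # e.g. modulo maps a key to one of 16 partitions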

  4. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes—neural connectivity maps of the brain—using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems—reads to parallel disk arrays and writes to solid-state storage—to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992

  5. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    Science.gov (United States)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment, in particular for the first steps of skimming rapidly through hundreds of TB of low-relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high-throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
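
    A skim-and-aggregate step of the kind described might look like the following PySpark sketch: read a large volume of monitoring records, filter down to the relevant slice, and aggregate per host. The input path and field names are invented placeholders, not CERN's actual schema.

      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      # Minimal PySpark sketch of a skimming/aggregation step over monitoring data.
      # The path and column names (host, metric, value) are invented placeholders.
      spark = SparkSession.builder.appName("metric-skim").getOrCreate()

      df = spark.read.json("hdfs:///monitoring/2017/*.json")     # bulk, low-relevance input

      relevant = df.filter(F.col("metric") == "disk_io_wait")    # keep the small relevant slice

      summary = (relevant
                 .groupBy("host")
                 .agg(F.avg("value").alias("mean_wait"),
                      F.max("value").alias("peak_wait")))

      summary.write.parquet("hdfs:///skimmed/disk_io_wait_by_host")
      spark.stop()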

  6. High-throughput FTIR-based bioprocess analysis of recombinant cyprosin production.

    Science.gov (United States)

    Sampaio, Pedro N; Sales, Kevin C; Rosa, Filipa O; Lopes, Marta B; Calado, Cecília R C

    2017-01-01

    To increase the knowledge of the recombinant cyprosin production process in Saccharomyces cerevisiae cultures, it is relevant to implement efficient bioprocess monitoring techniques. The present work focuses on the implementation of a mid-infrared (MIR) spectroscopy-based tool for monitoring the recombinant culture in a rapid, economic, and high-throughput (using a microplate system) mode. Multivariate data analysis on the MIR spectra of culture samples was conducted. Principal component analysis (PCA) enabled capturing the general metabolic status of the yeast cells, as replicated samples appear grouped together in the score plot and groups of culture samples according to the main growth phase can be clearly distinguished. The PCA-loading vectors also revealed spectral regions, and the corresponding chemical functional groups and biomolecules, that mostly contributed to the cell biomolecular fingerprint associated with the culture growth phase. These data were corroborated by the analysis of the samples' second derivative spectra. Partial least squares (PLS) regression models built on the MIR spectra showed high predictive ability for estimating the bioprocess critical variables: biomass (R² = 0.99, RMSEP 2.8%); cyprosin activity (R² = 0.98, RMSEP 3.9%); glucose (R² = 0.93, RMSECV 7.2%); galactose (R² = 0.97, RMSEP 4.6%); ethanol (R² = 0.97, RMSEP 5.3%); and acetate (R² = 0.95, RMSEP 7.0%). In conclusion, high-throughput MIR spectroscopy and multivariate data analysis were effective in identifying the main growth phases and specific cyprosin production phases along the yeast culture as well as in quantifying the critical variables of the process. This knowledge will promote future optimization and control of the recombinant cyprosin bioprocess according to the Quality by Design framework.
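
    A PLS calibration of the kind reported (predicting a process variable such as biomass from MIR spectra, with cross-validated R² and RMSE) can be prototyped as below. The spectra and reference values are random placeholders standing in for real data, and the number of latent variables is arbitrary.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      # Sketch of a PLS calibration; X and y are placeholders, not real MIR spectra.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(60, 400))                              # 60 samples x 400 wavenumbers
      y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)   # surrogate biomass values

      pls = PLSRegression(n_components=5)                         # arbitrary number of latent variables
      y_cv = cross_val_predict(pls, X, y, cv=10).ravel()          # 10-fold cross-validated predictions

      r2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
      rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
      print(f"R2 = {r2:.2f}, RMSECV = {rmsecv:.3f}")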

  7. Utility of lab-on-a-chip technology for high-throughput nucleic acid and protein analysis

    DEFF Research Database (Denmark)

    Hawtin, Paul; Hardern, Ian; Wittig, Rainer

    2005-01-01

    samples is used to stratify gene sets for disease discovery. Finally, the applicability of a high-throughput LoaC system for assessing protein purification is demonstrated. The improvements in workflow processes, speed of analysis, data accuracy and reproducibility, and automated data analysis...

  8. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid the issues of data glut associated with rapid, high-sample-throughput analysis.

  9. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, Jacqueline Alexandra [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid the issues of data glut associated with rapid, high-sample-throughput analysis.

  10. Resonant waveguide grating imagers for single cell analysis and high throughput screening

    Science.gov (United States)

    Fang, Ye

    2015-08-01

    Resonant waveguide grating (RWG) systems illuminate an array of diffractive nanograting waveguide structures in a microtiter plate to establish an evanescent wave for measuring tiny changes in local refractive index arising from the dynamic mass redistribution of living cells upon stimulation. The whole-plate RWG imager enables high-throughput profiling and screening of drugs. The microfluidic RWG imager not only manifests distinct receptor signaling waves but also differentiates long-acting agonism and antagonism. The spatially resolved RWG imager allows for single-cell analysis, including receptor signaling heterogeneity and the invasion of cancer cells in a spheroidal structure through a 3-dimensional extracellular matrix. The high-frequency RWG imager permits real-time detection of drug-induced cardiotoxicity. The wide coverage in target, pathway, assay, and cell phenotype has made RWG systems a powerful tool in both basic research and the early drug discovery process.

  11. High-throughput protein extraction and immunoblotting analysis in Saccharomyces cerevisiae.

    Science.gov (United States)

    Lorenz, Todd C; Anand, Vikram C; Payne, Gregory S

    2008-01-01

    A variety of Saccharomyces cerevisiae strain libraries allow for systematic analysis of strains bearing gene deletions, repressible genes, overexpressed genes, or modified genes on a genome-wide scale. Here we introduce a method for culturing yeast strains in 96-well format to achieve log-phase growth and a high-throughput technique for generating whole-cell protein extracts from these cultures using sodium dodecyl sulfate and heat lysis. We subsequently describe a procedure to analyze these whole-cell extracts by immunoblotting for alkaline phosphatase and carboxypeptidase yscS to identify strains with defects in protein transport pathways or protein glycosylation. These methods should be readily adaptable to many different areas of interest.

  12. Analysis of data throughput in communication between PLCs and HMI/SCADA systems

    Science.gov (United States)

    Mikolajek, Martin; Koziorek, Jiri

    2016-09-01

    This paper is focused on the analysis of data throughput in communication between PLCs and HMI/SCADA systems. The first part of the paper discusses the basic problems of communication between PLC and HMI systems. The next part deals with specific types of PLC-HMI communication requests. For these cases, the paper discusses response times and data throughput [1-3]. The subsequent section contains a practical part with various data exchanges between a Siemens PLC and HMI. The communication possibilities described in this article are focused on using an OPC server for visualization software, a custom HMI system, and an application developed in-house using .NET technology. The last part of this article contains some communication solutions.

  13. Improved structure, function and compatibility for CellProfiler: modular high-throughput image analysis software.

    Science.gov (United States)

    Kamentsky, Lee; Jones, Thouis R; Fraser, Adam; Bray, Mark-Anthony; Logan, David J; Madden, Katherine L; Ljosa, Vebjorn; Rueden, Curtis; Eliceiri, Kevin W; Carpenter, Anne E

    2011-04-15

    There is a strong and growing need in the biology research community for accurate, automated image analysis. Here, we describe CellProfiler 2.0, which has been engineered to meet the needs of its growing user base. It is more robust and user friendly, with new algorithms and features to facilitate high-throughput work. ImageJ plugins can now be run within a CellProfiler pipeline. CellProfiler 2.0 is free and open source, available at http://www.cellprofiler.org under the GPL v. 2 license. It is available as a packaged application for Macintosh OS X and Microsoft Windows and can be compiled for Linux. anne@broadinstitute.org Supplementary data are available at Bioinformatics online.

  14. [Comprehensive analysis of up-conversion luminescence saturation phenomena of ErYb:oxyfluoride vitroceramics].

    Science.gov (United States)

    Chen, Xiao-bo; Song, Zeng-fu

    2005-02-01

    The saturation phenomenon of the up-conversion luminescence of erbium-ytterbium co-doped oxyfluoride vitroceramics (ErYb:FOV), when excited by a 966 nm diode laser, was investigated comprehensively in the present article. A new kind of "characteristic saturation phenomenon", which results from energy diffusion, was found: the slope of the logI-logP curve (the double-logarithmic variation of the up-conversion luminescence intensity I with the laser power P) increases evidently toward the regular multi-photon relation as the laser spot size increases. The "typical saturation phenomenon" resulting from the exhaustion of the ground-state population has a large influence as well, causing these logI-logP curves to bend gradually as the laser power increases. Interestingly, this "typical saturation phenomenon" can be reduced markedly, and even vanishes, when the pumping laser power density is decreased sufficiently.
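
    The multi-photon relation referred to above is I proportional to P^n, so log I = n log P + const, and the slope of a log-log fit estimates the apparent photon number n; saturation shows up as a reduction of that slope at high power. The sketch below fits the slope on placeholder data, not on the measurements of the paper.

      import numpy as np

      # I ~ P**n gives log(I) = n*log(P) + const; the log-log slope estimates n.
      # Powers and intensities below are placeholders, not measured data.
      P = np.array([10, 20, 40, 80, 160, 320], dtype=float)   # pump power (mW)
      I = 0.05 * P ** 2                                        # ideal two-photon response

      n_apparent, _ = np.polyfit(np.log(P), np.log(I), 1)
      print(round(n_apparent, 2))    # ~2 without saturation; saturation lowers the slope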

  15. Fractal analysis of fracture increasing spontaneous imbibition in porous media with gas-saturated

    KAUST Repository

    Cai, Jianchao

    2013-08-01

    Spontaneous imbibition (SI) of wetting liquid into matrix blocks due to capillary pressure is regarded as an important recovery mechanism in low-permeability fractured reservoirs. In this paper, an analytical model is proposed for characterizing SI horizontally from a single plane fracture into gas-saturated matrix blocks. The presented model is based on the fractal character of pores in the porous matrix, with the gravity force included in the entire imbibition process. The accumulated mass of wetting liquid imbibed into matrix blocks is related to a number of factors such as contact area, pore fractal dimension, tortuosity, maximum pore size, porosity, liquid density and viscosity, surface tension, contact angle, as well as the height and tilt angle of the fracture. The mechanism of fracture-enhanced SI is analyzed accordingly. Because of the effect of the fracture, the gravity force acts in favor of the imbibition process. Additionally, the farther a pore lies from the top of the fracture, the more influential the hydrostatic pressure is on the imbibition. The presented fractal analysis of horizontal spontaneous imbibition from a single fracture could also shed light on the scaling study of the mass transfer function between the matrix and fracture systems of fractured reservoirs. © 2013 World Scientific Publishing Company.

  16. Linear Stability Analysis of Penetrative Convection via Internal Heating in a Ferrofluid Saturated Porous Layer

    Directory of Open Access Journals (Sweden)

    Amit Mahajan

    2017-05-01

    Penetrative convection due to purely internal heating in a horizontal ferrofluid-saturated porous layer is examined by performing linear stability analysis. Four different types of heat supply functions are considered. The Darcy model is used to incorporate the effect of the porous medium. Numerical solutions are obtained by using the Chebyshev pseudospectral method, and the results are discussed for three boundary conditions: when both boundaries are impermeable and conducting; when both boundaries are conducting, with the lower boundary impermeable and the upper boundary free; and when both boundaries are impermeable, with the lower boundary conducting and the upper boundary held at constant heat flux. The effect of the Langevin parameter, width of the ferrofluid layer, permeability parameter, and nonlinearity of the fluid magnetization has been observed at the onset of penetrative convection for water- and ester-based ferrofluids. It is seen that the Langevin parameter, width of the ferrofluid layer, and permeability parameter have stabilizing effects on the onset of convection, while the nonlinearity of the fluid magnetization advances the onset of convection.
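
    The Chebyshev pseudospectral method mentioned above discretizes the stability eigenvalue problem with the Chebyshev differentiation matrix on Gauss-Lobatto points. The sketch below builds that standard matrix (cf. Trefethen, Spectral Methods in MATLAB); it is a generic building block, not the authors' solver.

      import numpy as np

      # Chebyshev differentiation matrix on Gauss-Lobatto points (standard construction).
      def cheb(N):
          if N == 0:
              return np.zeros((1, 1)), np.array([1.0])
          x = np.cos(np.pi * np.arange(N + 1) / N)                  # collocation points
          c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
          X = np.tile(x, (N + 1, 1)).T
          dX = X - X.T
          D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))           # off-diagonal entries
          D -= np.diag(D.sum(axis=1))                               # diagonal entries
          return D, x

      D, x = cheb(16)
      print(np.max(np.abs(D @ x**3 - 3 * x**2)))    # derivative of x^3; error near machine precision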

  17. Analysis of Familial Tendencies in Transferrin Saturation in a Korean Population.

    Science.gov (United States)

    Oh, Sung-Hee; Jeong, Tae-Dong; Lee, Woochang; Chun, Sail; Min, Won-Ki

    2015-10-01

    Despite the high transferrin saturation (TS) level in Koreans, the p.Cys282Tyr and p.His63Asp mutations are markedly less frequent than in Caucasians. We aimed to determine TS levels and their familial tendencies in a Korean population using nationwide data from the Fifth Korea National Health and Nutrition Examination Survey (KNHANES V-1 2010). A total of 4904 subjects without a history of hepatitis B and C virus infection or liver cirrhosis, and who were negative for anemia and hepatitis B antigen, were enrolled. A familial tendency analysis was performed in 260 families. Parents were grouped into four quartiles based on their TS levels. Offspring were categorized according to the mean parental TS quartile scores (1.0, 1.5, 2.0, 2.5, 3.0, 3.5, and 4.0). A familial tendency was evaluated by comparing the mean TS of offspring across the seven parental groups. The mean TS was 39.3 ± 15.6% for Korean males and 33.2 ± 12.9% for Korean females, and both were significantly higher than those of Caucasians reported in the HEIRS study (30.6 ± 11.0% for males, 25.6 ± 10.6% for females; P < 0.05). These findings suggest that determinants of TS exist in Koreans beyond the p.Cys282Tyr and p.His63Asp mutations commonly identified in Caucasians.

  18. Saturated Switching Systems

    CERN Document Server

    Benzaouia, Abdellah

    2012-01-01

    Saturated Switching Systems treats the problem of actuator saturation, inherent in all dynamical systems by using two approaches: positive invariance in which the controller is designed to work within a region of non-saturating linear behaviour; and saturation technique which allows saturation but guarantees asymptotic stability. The results obtained are extended from the linear systems in which they were first developed to switching systems with uncertainties, 2D switching systems, switching systems with Markovian jumping and switching systems of the Takagi-Sugeno type. The text represents a thoroughly referenced distillation of results obtained in this field during the last decade. The selected tool for analysis and design of stabilizing controllers is based on multiple Lyapunov functions and linear matrix inequalities. All the results are illustrated with numerical examples and figures many of them being modelled using MATLAB®. Saturated Switching Systems will be of interest to academic researchers in con...

  19. pep2pro: the high-throughput proteomics data processing, analysis and visualization tool

    Directory of Open Access Journals (Sweden)

    Matthias eHirsch-Hoffmann

    2012-06-01

    The pep2pro database was built to support effective high-throughput proteome data analysis. Its database schema allows the coherent integration of search results from different database-dependent search algorithms and filtering of the data, including control for unambiguous assignment of peptides to proteins. The capacity of the pep2pro database has been exploited in data analysis of various Arabidopsis proteome datasets. The diversity of the datasets and the associated scientific questions required thorough querying of the data. This was supported by the relational format structure of the data that links all information on the sample, spectrum, search database and algorithm to peptide and protein identifications and their post-translational modifications. After publication of datasets they are made available on the pep2pro website at www.pep2pro.ethz.ch. Further, the pep2pro data analysis pipeline also handles data export to the PRIDE database (http://www.ebi.ac.uk/pride) and data retrieval by the MASCP Gator (http://gator.masc-proteomics.org/). pep2pro will continue to be used for analysis of additional datasets and as a data warehouse. The capacity of the pep2pro database for proteome data analysis has now also been made publicly available through the release of pep2pro4all, which consists of a database schema and a script that will populate the database with mass spectrometry data provided in mzIdentML format.

  20. pep2pro: the high-throughput proteomics data processing, analysis, and visualization tool.

    Science.gov (United States)

    Hirsch-Hoffmann, Matthias; Gruissem, Wilhelm; Baerenfaller, Katja

    2012-01-01

    The pep2pro database was built to support effective high-throughput proteome data analysis. Its database schema allows the coherent integration of search results from different database-dependent search algorithms and filtering of the data, including control for unambiguous assignment of peptides to proteins. The capacity of the pep2pro database has been exploited in data analysis of various Arabidopsis proteome datasets. The diversity of the datasets and the associated scientific questions required thorough querying of the data. This was supported by the relational format structure of the data that links all information on the sample, spectrum, search database, and algorithm to peptide and protein identifications and their post-translational modifications. After publication of datasets they are made available on the pep2pro website at www.pep2pro.ethz.ch. Further, the pep2pro data analysis pipeline also handles data export to the PRIDE database (http://www.ebi.ac.uk/pride) and data retrieval by the MASCP Gator (http://gator.masc-proteomics.org/). pep2pro will continue to be used for analysis of additional datasets and as a data warehouse. The capacity of the pep2pro database for proteome data analysis has now also been made publicly available through the release of pep2pro4all, which consists of a database schema and a script that will populate the database with mass spectrometry data provided in mzIdentML format.

  1. Analysis of Contaminant Transport through the Vadose and Saturated Zones for Source Screening

    Science.gov (United States)

    Bedekar, V.; Neville, C. J.; Tonkin, M. J.

    2010-12-01

    At complex sites there may be many potential source areas. Screening-level analyses are useful to identify which of the source areas should be the focus of detailed investigation and analysis. A screening tool has been developed to evaluate the threat posed by waste sites on groundwater quality. This tool implements analytical solutions to simulate contaminant transport through the vadose and saturated zones and predict time-varying concentrations at potential groundwater receptors. The screening tool is developed within a user-friendly, Microsoft Excel™-based interface; however, care has been taken to implement rigorous solutions. The screening tool considers the following mechanisms: (a) partitioning of soil contamination into an equivalent dissolved concentration; for a time-invariant source, the solution is generalized from [3] for sorption and decay, while for a time-varying source the solution represents a special, degenerate case of a solution implemented in ATRANS [2]; (b) one-dimensional (1D) transport of the dissolved contamination through the vadose zone considering 1D dispersion, equilibrium sorption, and first-order transformation reactions, with steady-state infiltration and moisture content assumed; (c) blending (mixing) of ambient water quality in the saturated zone with the contaminated water leaching from the vadose zone; and (d) three-dimensional (3D) transport through the saturated zone using the formulation provided in [2], considering advection, dispersion, sorption, and first-order transformation reactions. The solution is derived using integral transform methods, following approaches adopted in [1] and [4]. Independent verification showed that the analytical techniques implemented in this study generate solutions that closely approximate those obtained using sophisticated numerical approaches, with a systematic over-estimate of the likely impact to groundwater that (predictably) stems from the use of a 1D approximation in the vadose zone. As a
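
    For the vadose-zone step (b), the classical Ogata-Banks solution for a continuous constant-concentration source with linear retardation (and no decay) gives a feel for the kind of analytical expression involved. The sketch below is a simplified stand-in, not the SSM or ATRANS formulation, which additionally handle first-order transformation reactions and time-varying sources.

      import numpy as np
      from scipy.special import erfc

      def ogata_banks(x, t, v, D, R=1.0, C0=1.0):
          """Classical 1D advection-dispersion solution for a continuous constant-
          concentration source at x = 0, with linear retardation factor R and no
          decay. Simplified illustration only."""
          vr, Dr = v / R, D / R                        # retarded velocity and dispersion
          denom = 2.0 * np.sqrt(Dr * t)
          return 0.5 * C0 * (erfc((x - vr * t) / denom)
                             + np.exp(v * x / D) * erfc((x + vr * t) / denom))

      # Example with illustrative numbers: 2 m below the source after 5 years
      print(ogata_banks(x=2.0, t=5.0, v=0.5, D=0.1, R=2.0))   # fraction of the source concentration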

  2. Meta-analysis of prospective cohort studies evaluating the association of saturated fat with cardiovascular disease

    Science.gov (United States)

    Siri-Tarino, Patty W; Sun, Qi; Hu, Frank B

    2010-01-01

    Background: A reduction in dietary saturated fat has generally been thought to improve cardiovascular health. Objective: The objective of this meta-analysis was to summarize the evidence related to the association of dietary saturated fat with risk of coronary heart disease (CHD), stroke, and cardiovascular disease (CVD; CHD inclusive of stroke) in prospective epidemiologic studies. Design: Twenty-one studies identified by searching MEDLINE and EMBASE databases and secondary referencing qualified for inclusion in this study. A random-effects model was used to derive composite relative risk estimates for CHD, stroke, and CVD. Results: During 5–23 y of follow-up of 347,747 subjects, 11,006 developed CHD or stroke. Intake of saturated fat was not associated with an increased risk of CHD, stroke, or CVD. The pooled relative risk estimates that compared extreme quantiles of saturated fat intake were 1.07 (95% CI: 0.96, 1.19; P = 0.22) for CHD, 0.81 (95% CI: 0.62, 1.05; P = 0.11) for stroke, and 1.00 (95% CI: 0.89, 1.11; P = 0.95) for CVD. Consideration of age, sex, and study quality did not change the results. Conclusions: A meta-analysis of prospective epidemiologic studies showed that there is no significant evidence for concluding that dietary saturated fat is associated with an increased risk of CHD or CVD. More data are needed to elucidate whether CVD risks are likely to be influenced by the specific nutrients used to replace saturated fat. PMID:20071648
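
    The random-effects pooling used in such a meta-analysis is commonly the DerSimonian-Laird inverse-variance method applied to log relative risks. The sketch below recovers standard errors from 95% confidence limits and returns the pooled RR with its CI; the study-level inputs are placeholders, not the data of the 21 cohort studies.

      import numpy as np

      def pooled_rr_random_effects(rr, ci_upper):
          """DerSimonian-Laird random-effects pooling of study relative risks.
          Standard errors are recovered from the upper 95% confidence limits."""
          log_rr = np.log(rr)
          se = (np.log(ci_upper) - log_rr) / 1.96          # SE of log RR from 95% CI
          w = 1.0 / se ** 2                                # fixed-effect (inverse-variance) weights
          fixed = np.sum(w * log_rr) / np.sum(w)
          q = np.sum(w * (log_rr - fixed) ** 2)            # Cochran's Q
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(rr) - 1)) / c)         # between-study variance
          w_re = 1.0 / (se ** 2 + tau2)                    # random-effects weights
          pooled = np.sum(w_re * log_rr) / np.sum(w_re)
          se_pooled = np.sqrt(1.0 / np.sum(w_re))
          return np.exp(pooled), np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)

      rr = np.array([1.10, 0.95, 1.20, 1.02, 0.90])        # placeholder study relative risks
      upper = np.array([1.40, 1.25, 1.55, 1.30, 1.20])     # placeholder upper 95% limits
      print(pooled_rr_random_effects(rr, upper))           # pooled RR and 95% CI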

  3. Central venous oxygen saturation: analysis, clinical use and effects on mortality.

    Science.gov (United States)

    Reid, Megan

    2013-09-01

    The aim of this literature review was to provide a clear definition of central venous oxygen saturation (ScvO₂), highlight the differences between ScvO₂ and mixed venous oxygen saturation (SvO₂), show how it can be used clinically, and describe the effect central venous oxygen saturation has on mortality. Many articles concentrate on individual aspects of ScvO₂, such as its use in early goal-directed therapy, but few provide a full overview of what it means, how to interpret results and how it can be used clinically. The keywords searched included central venous oxygen saturation, ScvO₂, mixed venous oxygen saturation, SvO₂, early goal-directed therapy, sepsis, and mortality. Where possible only publications within the last 10 years were used, but key publications were not excluded if they fell outside this time frame. Central venous oxygen saturation (ScvO₂) is a very important measurement which can be easily taken in a critical care environment by both medical and nursing staff. It provides an understanding of the patient's oxygen delivery, oxygen consumption and cardiac output. It has a key role within early goal-directed therapy and has been shown to decrease mortality when taken and analysed appropriately. This literature review will highlight to nursing staff within the critical care environment the importance of central venous oxygen saturation measurement and interpretation. By raising awareness of the importance of this measurement, it is hoped nursing staff will be proactive in both taking this test and analysing the results, thereby facilitating better care for the septic, critically ill patient and improving outcomes for these patients. © 2013 British Association of Critical Care Nurses.

  4. High throughput quantitative phenotyping of plant resistance using chlorophyll fluorescence image analysis.

    Science.gov (United States)

    Rousseau, Céline; Belin, Etienne; Bove, Edouard; Rousseau, David; Fabre, Frédéric; Berruyer, Romain; Guillaumès, Jacky; Manceau, Charles; Jacques, Marie-Agnès; Boureau, Tristan

    2013-06-13

    In order to select for quantitative plant resistance to pathogens, high-throughput approaches that can precisely quantify disease severity are needed. Automation and the use of calibrated image analysis should provide more accurate, objective and faster analyses than visual assessments. In contrast to conventional visible imaging, chlorophyll fluorescence imaging is not sensitive to environmental light variations and provides single-channel images amenable to segmentation by simple thresholding approaches. Among the various parameters used in chlorophyll fluorescence imaging, the maximum quantum yield of photosystem II photochemistry (Fv/Fm) is well adapted to phenotyping disease severity. Fv/Fm is an indicator of plant stress that displays a robust contrast between infected and healthy tissues. In the present paper, we aimed to segment Fv/Fm images in order to quantify disease severity. Based on the Fv/Fm values of each pixel of the image, a thresholding approach was developed to delimit diseased areas. A first step consisted of setting up thresholds to reproduce visual observations by trained raters of symptoms caused by Xanthomonas fuscans subsp. fuscans (Xff) CFBP4834-R on Phaseolus vulgaris cv. Flavert. In order to develop a thresholding approach applicable to any cultivar or species, a second step was based on modeling pixel-wise Fv/Fm distributions as mixtures of Gaussian distributions. Such modeling may discriminate various stages of symptom development but over-weights artifacts that can occur on mock-inoculated samples. Therefore, we developed a thresholding approach based on the probability of misclassification of a healthy pixel. Then, a clustering step is performed on the diseased areas to discriminate between various stages of alteration of plant tissues. Notably, the use of chlorophyll fluorescence imaging could detect pre-symptomatic areas. The interest of this image analysis procedure for assessing the levels of quantitative resistance
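
    The two ingredients described, a Gaussian-mixture model of pixel-wise Fv/Fm values and a threshold set from the probability of misclassifying a healthy pixel, can be prototyped as below. The pixel values are simulated and the 1% misclassification level is arbitrary; this is an illustration of the idea, not the authors' pipeline.

      import numpy as np
      from sklearn.mixture import GaussianMixture
      from scipy.stats import norm

      # Simulated pixel-wise Fv/Fm values standing in for real fluorescence images.
      rng = np.random.default_rng(2)
      healthy = rng.normal(0.80, 0.02, 5000)              # healthy tissue Fv/Fm
      diseased = rng.normal(0.45, 0.08, 1500)             # symptomatic tissue Fv/Fm
      pixels = np.concatenate([healthy, diseased]).reshape(-1, 1)

      gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
      means = gmm.means_.ravel()
      sds = np.sqrt(gmm.covariances_.ravel())
      healthy_idx = np.argmax(means)                      # component with the higher Fv/Fm

      # Threshold such that a healthy pixel is misclassified with probability 1%
      threshold = norm.ppf(0.01, loc=means[healthy_idx], scale=sds[healthy_idx])
      severity = np.mean(pixels.ravel() < threshold)      # fraction of pixels called diseased
      print(round(threshold, 3), round(severity, 3))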

  5. ESSENTIALS: Software for Rapid Analysis of High Throughput Transposon Insertion Sequencing Data

    Science.gov (United States)

    Zomer, Aldert; Burghout, Peter; Bootsma, Hester J.; Hermans, Peter W. M.; van Hijum, Sacha A. F. T.

    2012-01-01

    High-throughput analysis of genome-wide random transposon mutant libraries is a powerful tool for (conditional) essential gene discovery. Recently, several next-generation sequencing approaches, e.g. Tn-seq/INseq, HITS and TraDIS, have been developed that accurately map the site of transposon insertions by mutant-specific amplification and sequence readout of DNA flanking the transposon insertions site, assigning a measure of essentiality based on the number of reads per insertion site flanking sequence or per gene. However, analysis of these large and complex datasets is hampered by the lack of an easy to use and automated tool for transposon insertion sequencing data. To fill this gap, we developed ESSENTIALS, an open source, web-based software tool for researchers in the genomics field utilizing transposon insertion sequencing analysis. It accurately predicts (conditionally) essential genes and offers the flexibility of using different sample normalization methods, genomic location bias correction, data preprocessing steps, appropriate statistical tests and various visualizations to examine the results, while requiring only a minimum of input and hands-on work from the researcher. We successfully applied ESSENTIALS to in-house and published Tn-seq, TraDIS and HITS datasets and we show that the various pre- and post-processing steps on the sequence reads and count data with ESSENTIALS considerably improve the sensitivity and specificity of predicted gene essentiality. PMID:22900082

  6. High-throughput analysis of lectin-oligosaccharide interactions by automated frontal affinity chromatography.

    Science.gov (United States)

    Nakamura-Tsuruta, Sachiko; Uchiyama, Noboru; Hirabayashi, Jun

    2006-01-01

    Frontal affinity chromatography (FAC) is a quantitative method that enables sensitive and reproducible measurements of interactions between lectins and oligosaccharides. The method is suitable even for the measurement of low-affinity interactions and is based on a simple procedure and a clear principle. To achieve high-throughput and efficient analysis, an automated FAC system was developed. The system designated FAC-1 consists of two isocratic pumps, an autosampler, and a couple of miniature columns (bed volume, 31.4 microl) connected in parallel to either a fluorescence or an ultraviolet detector. By use of this parallel-column system, the time required for each analysis was reduced substantially. Under the established conditions, fewer than 10 hrs are required for 100 interaction analyses, consuming as little as 1 pmol pyridylaminated oligosaccharide for each analysis. This strategy for FAC should contribute to the construction of a lectin-oligosaccharide interaction database essential for future glycomics. Overall features and practical protocols for interaction analyses using FAC-1 are described.

  7. PhenStat: A Tool Kit for Standardized Analysis of High Throughput Phenotypic Data.

    Directory of Open Access Journals (Sweden)

    Natalja Kurbatova

    The lack of reproducibility in animal phenotyping experiments is a growing concern among the biomedical community. One contributing factor is the inadequate description of statistical analysis methods that prevents researchers from replicating results even when the original data are provided. Here we present PhenStat, a freely available R package that provides a variety of statistical methods for the identification of phenotypic associations. The methods have been developed for high-throughput phenotyping pipelines implemented across various experimental designs, with an emphasis on managing temporal variation. PhenStat is targeted to two user groups: small-scale users who wish to interact with and test data from large resources, and large-scale users who require an automated statistical analysis pipeline. The software provides guidance to the user for selecting appropriate analysis methods based on the dataset and is designed to allow for additions and modifications as needed. The package was tested on mouse and rat data and is used by the International Mouse Phenotyping Consortium (IMPC). By providing raw data and the version of PhenStat used, resources like the IMPC give users the ability to replicate and explore results within their own computing environment.

  8. Rapid and sensitive solid phase extraction-large volume injection-gas chromatography for the analysis of mineral oil saturated and aromatic hydrocarbons in cardboard and dried foods.

    Science.gov (United States)

    Moret, Sabrina; Barp, Laura; Purcaro, Giorgia; Conte, Lanfranco S

    2012-06-22

    A rapid off-line solid phase extraction-large volume injection-gas chromatography-flame ionisation detection (SPE-LVI-GC-FID) method, based on the use of silver silica gel and low solvent consumption, was developed for mineral oil saturated hydrocarbon (MOSH) and mineral oil aromatic hydrocarbon (MOAH) determination in cardboard and dried foods packaged in cardboard. The SPE method was validated using LVI with a conventional on-column injector and the retention gap technique (which allowed injection of up to 50 μL of sample). Detector response was linear over the entire concentration range tested (0.5-250 μg/mL), recoveries were practically quantitative, repeatability was good (coefficients of variation lower than 7%) and the limit of quantification was adequate to quantify the envisioned limit of 0.15 mg/kg proposed in Germany for MOAH analysis in food samples packaged in recycled cardboard. Rapid heating of the GC oven allowed sample throughput to be increased (3-4 samples per hour) and sensitivity to be enhanced. The proposed method was used for MOSH and MOAH determination in selected food samples usually commercialised in cardboard packaging. The most contaminated was a tea sample (102.2 and 7.9 mg/kg of MOSH and MOAH below n-C25, respectively), followed by a rice and a sugar powder sample, all packaged in recycled cardboard. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Torque Analysis With Saturation Effects for Non-Salient Single-Phase Permanent-Magnet Machines

    DEFF Research Database (Denmark)

    Lu, Kaiyuan; Ritchie, Ewen

    2011-01-01

    The effects of saturation on torque production for non-salient, single-phase, permanent-magnet machines are studied in this paper. An analytical torque equation is proposed to predict the instantaneous torque with saturation effects. Compared to the existing methods, it is computationally faster and easier to calculate the torque using the proposed analytical equation. It can also handle arbitrary armature current waveforms and arbitrary flux linkage waveforms produced by the rotor permanent magnets. The results obtained by using the proposed torque equation are validated using finite...

  10. Production responses of Holstein dairy cows when fed supplemental fat containing saturated free fatty acids: a meta-analysis.

    Science.gov (United States)

    Hu, Wenping; Boerman, Jacquelyn P; Aldrich, James M

    2017-08-01

    A meta-analysis was conducted to evaluate the effects of supplemental fat containing saturated free fatty acids (FA) on milk performance of Holstein dairy cows. A database was developed from 21 studies published between 1991 and 2016 that included 502 dairy cows and a total of 29 to 30 comparisons between dietary treatment and control without fat supplementation. Only saturated free FA (>80% of total FA) was considered as the supplemental fat. Concentration of the supplemental fat was not higher than 3.5% of diet dry matter (DM). Dairy cows were offered total mixed ration, and fed individually. Statistical analysis was conducted using random- or mixed-effects models with the Metafor package in R. Sub-group analysis showed that there were no differences in studies between randomized block design and Latin square/crossover design for dry matter intake (DMI) and milk production responses to the supplemental fat (all response variables, p≥0.344). The supplemental fat across all studies improved milk yield, milk fat concentration and yield, and milk protein yield by 1.684 kg/d (p<0.001), 0.095 percent unit (p = 0.003), 0.072 kg/d (p<0.001), and 0.036 kg/d (p<0.001), respectively, but tended to decrease milk protein concentration (mean difference = −0.022 percent unit; p = 0.063) while DMI (mean difference = 0.061 kg/d; p = 0.768) remained unchanged. The assessment of heterogeneity suggested that no substantial heterogeneity occurred among all studies for DMI and milk production responses to the supplemental fat (all response variables, I2≤24.1%; p≥0.166). The effects of saturated free FA were quantitatively evaluated. Higher milk production and yields of milk fat and protein, with DMI remaining unchanged, indicated that saturated free FA, supplemented at ≤3.5% dietary DM from commercially available fat sources, likely improved the efficiency of milk production. Nevertheless, more studies are needed to assess the variation of production responses to different saturated free FA, either C16:0 or C18:0 alone, or in combination with potentially optimal ratio, when supplemented in dairy cow diets.
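
    The pooling described above was done with the Metafor package in R; a minimal Python sketch of the same kind of random-effects calculation (DerSimonian-Laird estimator of between-study variance, inverse-variance weighting) is shown below. The per-study effect sizes and variances are invented and do not correspond to the studies in the database.

```python
# Minimal random-effects meta-analysis sketch (DerSimonian-Laird), illustrating
# the kind of pooling performed with metafor; not the authors' code, and the
# effect sizes below are invented.
import numpy as np

# Hypothetical per-study mean differences in milk yield (kg/d) and variances.
yi = np.array([1.2, 2.0, 1.5, 0.8, 2.4])
vi = np.array([0.30, 0.25, 0.40, 0.35, 0.50])

# Fixed-effect weights and Cochran's Q.
w = 1.0 / vi
y_fe = np.sum(w * yi) / np.sum(w)
Q = np.sum(w * (yi - y_fe) ** 2)
df = len(yi) - 1

# Between-study variance (tau^2) by the DerSimonian-Laird estimator.
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects pooled estimate, 95% CI and I^2 heterogeneity statistic.
w_re = 1.0 / (vi + tau2)
y_re = np.sum(w_re * yi) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

print(f"pooled mean difference = {y_re:.3f} kg/d "
      f"(95% CI {y_re - 1.96 * se_re:.3f} to {y_re + 1.96 * se_re:.3f}), "
      f"tau^2 = {tau2:.3f}, I^2 = {I2:.1f}%")
```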

  11. High-throughput analysis of drug dissociation from serum proteins using affinity silica monoliths.

    Science.gov (United States)

    Yoo, Michelle J; Hage, David S

    2011-08-01

    A noncompetitive peak decay method was used with 1 mm×4.6 mm id silica monoliths to measure the dissociation rate constants (kd) for various drugs with human serum albumin (HSA) and α1-acid glycoprotein (AGP). Flow rates up to 9 mL/min were used in these experiments, resulting in analysis times of only 20-30 s. Using a silica monolith containing immobilized HSA, dissociation rate constants were measured for amitriptyline, carboplatin, cisplatin, chloramphenicol, nortriptyline, quinidine, and verapamil, giving values that ranged from 0.37 to 0.78 s(-1). Similar work with an immobilized AGP silica monolith gave kd values for amitriptyline, nortriptyline, and lidocaine of 0.39-0.73 s(-1). These kd values showed good agreement with values determined for drugs with similar structures and/or affinities for HSA or AGP. It was found that a kd of up to roughly 0.80 s(-1) could be measured by this approach. This information made it possible to obtain a better understanding of the advantages and possible limitations of the noncompetitive peak decay method and in the use of affinity silica monoliths for the high-throughput analysis of drug-protein dissociation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
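
    The core of the noncompetitive peak decay measurement is that the logarithm of the decaying elution profile falls off linearly with time at a rate set by kd. The sketch below, on synthetic data only, shows that idea as a straight-line fit to log(signal); it is not the authors' analysis code.

```python
# Illustrative sketch of extracting a dissociation rate constant from a peak
# decay experiment: the logarithm of the decaying elution signal is fitted to
# a straight line whose negative slope approximates kd. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
kd_true = 0.5                       # s^-1, value used to simulate the decay
t = np.linspace(0.0, 6.0, 120)      # seconds
signal = 100.0 * np.exp(-kd_true * t) * (1 + 0.02 * rng.standard_normal(t.size))

# Linear regression on log(signal): log S = log S0 - kd * t.
slope, intercept = np.polyfit(t, np.log(signal), 1)
print(f"estimated kd = {-slope:.3f} s^-1 (simulated with {kd_true} s^-1)")
```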

  12. Throughput Analysis on 3-Dimensional Underwater Acoustic Network with One-Hop Mobile Relay.

    Science.gov (United States)

    Zhong, Xuefeng; Chen, Fangjiong; Fan, Jiasheng; Guan, Quansheng; Ji, Fei; Yu, Hua

    2018-01-16

    Underwater acoustic communication network (UACN) has been considered an essential infrastructure for ocean exploitation. Performance analysis of UACN is important in underwater acoustic network deployment and management. In this paper, we analyze the network throughput of three-dimensional randomly deployed transmitter-receiver pairs. Due to the long delay of acoustic channels, complicated networking protocols with heavy signaling overhead may not be appropriate. In this paper, we consider only one-hop or two-hop transmission, to save the signaling cost. That is, we assume the transmitter sends the data packet to the receiver by one-hop direct transmission, or by two-hop transmission via mobile relays. We derive the closed-form formulation of the packet delivery rate with respect to the transmission delay and the number of transmitter-receiver pairs. The correctness of the derivation results is verified by computer simulations. Our analysis indicates how to obtain a precise tradeoff between the delay constraint and the network capacity.
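
    A toy Monte Carlo, with invented range, deadline and region parameters, can illustrate the quantity the closed-form analysis targets: the fraction of randomly placed transmitter-receiver pairs whose packets are delivered within a delay budget, either directly or via a mobile relay when both hops are within acoustic range. This is only an illustrative stand-in for the paper's derivation.

```python
# Toy Monte Carlo sketch (not the paper's closed-form derivation): packets are
# delivered directly when the receiver lies within acoustic transmission range,
# and otherwise via a mobile relay when both hops are within range; delivery
# also requires the total propagation delay to meet a deadline. All parameter
# values are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
SOUND_SPEED = 1500.0   # m/s, nominal underwater sound speed
REGION = 4000.0        # cube edge (m) over which nodes are dropped uniformly
RANGE = 1500.0         # maximum acoustic transmission range (m)
DEADLINE = 3.0         # end-to-end delay budget (s)
TRIALS = 100_000

tx = rng.uniform(0, REGION, size=(TRIALS, 3))
rx = rng.uniform(0, REGION, size=(TRIALS, 3))
relay = rng.uniform(0, REGION, size=(TRIALS, 3))

d_direct = np.linalg.norm(tx - rx, axis=1)
d_hop1 = np.linalg.norm(tx - relay, axis=1)
d_hop2 = np.linalg.norm(relay - rx, axis=1)

one_hop = (d_direct <= RANGE) & (d_direct / SOUND_SPEED <= DEADLINE)
two_hop = (~one_hop) & (d_hop1 <= RANGE) & (d_hop2 <= RANGE) & \
          ((d_hop1 + d_hop2) / SOUND_SPEED <= DEADLINE)

print(f"one-hop delivery rate    : {one_hop.mean():.3f}")
print(f"one- or two-hop delivery : {(one_hop | two_hop).mean():.3f}")
```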

  13. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data.

    Science.gov (United States)

    Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo

    2011-12-15

    High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or broad signals, CLIP-Seq and RNA-Seq. We prove the effectiveness of Pyicos to select for significant signals and show that its accuracy is comparable and sometimes superior to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS datatypes through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Availability: open-source software, with tutorials and protocol files, is available at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy. Contact: eduardo.eyras@upf.edu. Supplementary data are available at Bioinformatics online.

  14. High-throughput Analysis of Large Microscopy Image Datasets on CPU-GPU Cluster Platforms.

    Science.gov (United States)

    Teodoro, George; Pan, Tony; Kurc, Tahsin M; Kong, Jun; Cooper, Lee A D; Podhorszki, Norbert; Klasky, Scott; Saltz, Joel H

    2013-05-01

    Analysis of large pathology image datasets offers significant opportunities for the investigation of disease morphology, but the resource requirements of analysis pipelines limit the scale of such studies. Motivated by a brain cancer study, we propose and evaluate a parallel image analysis application pipeline for high throughput computation of large datasets of high resolution pathology tissue images on distributed CPU-GPU platforms. To achieve efficient execution on these hybrid systems, we have built runtime support that allows us to express the cancer image analysis application as a hierarchical data processing pipeline. The application is implemented as a coarse-grain pipeline of stages, where each stage may be further partitioned into another pipeline of fine-grain operations. The fine-grain operations are efficiently managed and scheduled for computation on CPUs and GPUs using performance aware scheduling techniques along with several optimizations, including architecture aware process placement, data locality conscious task assignment, data prefetching, and asynchronous data copy. These optimizations are employed to maximize the utilization of the aggregate computing power of CPUs and GPUs and minimize data copy overheads. Our experimental evaluation shows that the cooperative use of CPUs and GPUs achieves significant improvements on top of GPU-only versions (up to 1.6×) and that the execution of the application as a set of fine-grain operations provides more opportunities for runtime optimizations and attains better performance than coarser-grain, monolithic implementations used in other works. An implementation of the cancer image analysis pipeline using the runtime support was able to process an image dataset consisting of 36,848 4Kx4K-pixel image tiles (about 1.8TB uncompressed) in less than 4 minutes (150 tiles/second) on 100 nodes of a state-of-the-art hybrid cluster system.

  15. Blood gas analysis, blood saturation and chosen parameters of spirometric examination in NSCLC patients undergoing chemotherapy and pulmonary rehabilitation.

    Science.gov (United States)

    Tokarski, Sławomir; Tokarska, Kamila; Schwarz, Ewa; Obrebska, Agnieszka; Mejer, Anna; Kowalski, Jan

    2014-04-01

    In industrialized countries, lung cancer is associated with the highest mortality among carcinomas. Progression of the disease is associated with diminished tolerance for physical activities, aggravated dyspnea and lowering of life quality. The aim of the study was the evaluation of blood gas, blood saturation and chosen parameters of spirometric examination in NSCLC patients undergoing chemotherapy and pulmonary rehabilitation. Analysis of capillary blood was done using a RapidPoint 405 Siemens device. Spirometric examination was done using a PNEUMO abcMED device. Forty-nine patients with inoperable NSCLC were subjected to the examination. This included 38 men and 11 women aged 46-75 years (mean age 63 +/- 7.5 years), who were separated into two groups: group I--25 patients undergoing standard chemotherapy (group C); group II--24 patients undergoing standard chemotherapy and pulmonary rehabilitation (group CK). All patients were subjected to blood gas analysis, blood saturation analysis and spirometric examination twice, before and after first-line chemotherapy. The increase of pO2 and SaO2 in blood, and of FEV1 and FVC in spirometric examination, was significantly higher in patients undergoing pulmonary rehabilitation and chemotherapy (group II), as shown by blood gas analysis, blood saturation analysis and chosen parameters of spirometric analysis. Pulmonary rehabilitation in patients with lung cancer seems to be an important form of supplementary treatment.

  16. Transient performances analysis of wind turbine system with induction generator including flux saturation and skin effect

    DEFF Research Database (Denmark)

    Li, H.; Zhao, B.; Han, L.

    2010-01-01

    In order to analyze correctly the effect of different models for induction generators on the transient performances of large wind power generation, wind turbine driven squirrel cage induction generator (SCIG) models taking into account both main and leakage flux saturation and skin effect were pr...

  17. Design, Analysis and Simulation of Magnetic Biased Inductors with Saturation-Gap

    DEFF Research Database (Denmark)

    Aguilar, Andres Revilla; Munk-Nielsen, Stig

    2014-01-01

    Permanent magnet biasing is a known technique for increasing the energy storage capability of inductors operating in DC applications. The opposing flux introduced by a permanent magnet will extend the saturation flux limit of a given magnetic material. When full biasing of the core is achieved...

  18. Design and Performance Analysis of Multi-tier Heterogeneous Network through Coverage, Throughput and Energy Efficiency

    Directory of Open Access Journals (Sweden)

    A. Shabbir,

    2017-12-01

    Full Text Available The unprecedented acceleration in the wireless industry strongly compels wireless operators to increase their data network throughput, capacity and coverage on an emergent basis. In upcoming 5G heterogeneous networks, the inclusion of low power nodes (LPNs) like pico cells and femto cells for increasing the network's throughput, capacity and coverage is gaining momentum. Addition of LPNs on such a massive level will eventually make a network densely populated in terms of base stations (BSs). The dense deployment of BSs will lead to high operating expenditure (Op-Ex), capital expenditure (Cap-Ex) and, most importantly, high energy consumption in future generation networks. Recognizing these network issues, this research work investigates the data throughput and energy efficiency of a 5G multi-tier heterogeneous network. The network is modeled using tools from stochastic geometry. Monte Carlo results confirmed that rational deployment of LPNs can contribute towards increased throughput along with better energy efficiency of the overall network.
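
    The flavour of such a stochastic-geometry evaluation can be sketched with a small Monte Carlo in which macro and pico base stations are dropped as Poisson point processes, the typical user associates with the strongest received signal, and coverage and Shannon throughput are averaged over realizations. Densities, powers, the path-loss exponent and the noise level below are invented, and the sketch is not the authors' model.

```python
# Illustrative Monte Carlo sketch of a two-tier heterogeneous network modelled
# with Poisson point processes; all parameter values are invented.
import numpy as np

rng = np.random.default_rng(2)
AREA_HALF = 2000.0                        # simulate a 4 km x 4 km window (m)
LAMBDA = {"macro": 1e-6, "pico": 5e-6}    # BS densities per m^2
POWER = {"macro": 40.0, "pico": 1.0}      # transmit powers (W)
ALPHA = 3.5                               # path-loss exponent
NOISE = 1e-13                             # thermal noise power (W)
BW = 10e6                                 # bandwidth (Hz)
TRIALS = 20_000

def drop_tier(density):
    """Drop a Poisson number of BSs uniformly in the square window."""
    n = rng.poisson(density * (2 * AREA_HALF) ** 2)
    return rng.uniform(-AREA_HALF, AREA_HALF, size=(n, 2))

rates = []
for _ in range(TRIALS):
    rx_power = []
    for tier, density in LAMBDA.items():
        pos = drop_tier(density)
        if len(pos) == 0:
            continue
        d = np.maximum(np.linalg.norm(pos, axis=1), 1.0)   # user at the origin
        rx_power.append(POWER[tier] * d ** (-ALPHA))
    if not rx_power:
        continue
    p = np.concatenate(rx_power)
    serving = p.max()                       # associate with the strongest BS
    sinr = serving / (p.sum() - serving + NOISE)
    rates.append(BW * np.log2(1.0 + sinr))

rates = np.array(rates)
print(f"coverage P(SINR > 0 dB) : {(rates > BW).mean():.3f}")
print(f"mean user throughput    : {rates.mean() / 1e6:.1f} Mbit/s")
```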

  19. Research on combination forecast of port cargo throughput based on time series and causality analysis

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2013-03-01

    Full Text Available Purpose: The purpose of this paper is to develop a combined model, composed of a grey-forecast model and a Logistic-growth-curve model, to improve the accuracy of cargo throughput forecasts for the port. The authors also use existing data from a real port to verify the validity of the combined model. Design/methodology/approach: A literature review is undertaken to find an appropriate forecast model of cargo throughput for the port. Through researching the related forecast models, the authors identify the individual models that merit further study. Finally, the authors combine two individual models (the grey-forecast model and the Logistic-growth-curve model) into one combined model to forecast the port cargo throughput, and apply the model to a physical port in China to test its validity. Findings: Tested against the observed cargo throughput data of the physical port, the results show that the combined model can achieve relatively high forecast accuracy when little additional information is available. Furthermore, the forecasts made by the combined model are more accurate than those of either individual model. Research limitations/implications: The study provides a new combined forecast model of cargo throughput that improves forecast accuracy with relatively little information. The limitation of the model is that it requires the cargo throughput of the port to follow an S-shaped trend. Practical implications: The model is not limited by external conditions such as geography or culture. It was used to predict the cargo throughput of one real port in China in 2015, which provided instructive guidance for the port's development. Originality/value: This is one of the few studies to improve the accuracy of cargo throughput forecasts with little information.
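
    A hedged sketch of such a combination is given below: a GM(1,1) grey model and a logistic growth curve are each fitted to a short, made-up throughput series and their forecasts averaged with equal weights (the paper's own weighting scheme may differ).

```python
# Hedged sketch of a grey/logistic combined forecast; the data and the equal
# weighting are illustrative only, not the paper's model.
import numpy as np
from scipy.optimize import curve_fit

x0 = np.array([120.0, 145.0, 168.0, 185.0, 196.0, 203.0, 207.0])  # made-up throughput

def gm11_forecast(x0, steps):
    """GM(1,1): fit on the accumulated series, return fitted + forecast values."""
    n = len(x0)
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])                  # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(np.concatenate([[0.0], x1_hat]))  # back to the original series

def logistic(t, K, a, b):
    return K / (1.0 + a * np.exp(-b * t))

t = np.arange(len(x0))
popt, _ = curve_fit(logistic, t, x0, p0=[220.0, 1.0, 0.5], maxfev=10_000)

steps = 3
t_all = np.arange(len(x0) + steps)
grey = gm11_forecast(x0, steps)
logi = logistic(t_all, *popt)
combined = 0.5 * grey + 0.5 * logi                 # equal weights for illustration

for k in range(len(x0), len(x0) + steps):
    print(f"year +{k - len(x0) + 1}: grey {grey[k]:.1f}, "
          f"logistic {logi[k]:.1f}, combined {combined[k]:.1f}")
```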

  20. High-throughput analysis reveals novel maternal germline RNAs crucial for primordial germ cell preservation and proper migration

    OpenAIRE

    Owens, Dawn A.; Butler, Amanda M.; Aguero, Tristan H.; Newman, Karen M.; Van Booven, Derek; King, Mary Lou

    2017-01-01

    During oogenesis, hundreds of maternal RNAs are selectively localized to the animal or vegetal pole, including determinants of somatic and germline fates. Although microarray analysis has identified localized determinants, it is not comprehensive and is limited to known transcripts. Here, we utilized high-throughput RNA-sequencing analysis to comprehensively interrogate animal and vegetal pole RNAs in the fully grown Xenopus laevis oocyte. We identified 411 (198 annotated) and 27 (15 annotate...

  1. Combinatorial approach toward high-throughput analysis of direct methanol fuel cells.

    Science.gov (United States)

    Jiang, Rongzhong; Rong, Charles; Chu, Deryn

    2005-01-01

    A 40-member array of direct methanol fuel cells (with stationary fuel and convective air supplies) was generated by electrically connecting the fuel cells in series. High-throughput analysis of these fuel cells was realized by fast screening of voltages between the two terminals of a fuel cell at constant current discharge. A large number of voltage-current curves (200) were obtained by screening the voltages through multiple small-current steps. A Gaussian distribution was used to statistically analyze the large number of experimental data. The standard deviation (σ) of the voltages of these fuel cells increased linearly with discharge current. The voltage-current curves at various fuel concentrations were simulated with an empirical equation of voltage versus current and a linear equation of σ versus current. The simulated voltage-current curves fitted the experimental data well. With increasing methanol concentration from 0.5 to 4.0 M, the Tafel slope of the voltage-current curves (at σ = 0.0) changed from 28 to 91 mV dec(-1), the cell resistance from 2.91 to 0.18 Ω, and the power output from 3 to 18 mW cm(-2).
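
    One common empirical polarisation form, V = E0 − b·log10(i) − R·i, combines a Tafel term (slope b) with an ohmic term (resistance R); fitting it to a voltage-current curve recovers exactly the kind of Tafel slope and cell resistance quoted above. The sketch below fits synthetic data and is not necessarily the authors' exact empirical equation.

```python
# Illustrative curve fit of a simple empirical fuel-cell polarisation model
# V = E0 - b*log10(i) - R*i to synthetic voltage-current data.
import numpy as np
from scipy.optimize import curve_fit

def polarisation(i, e0, b, r):
    """Cell voltage (V) as a function of current i (A) in a simple empirical model."""
    return e0 - b * np.log10(i) - r * i

rng = np.random.default_rng(3)
i = np.linspace(0.02, 1.0, 40)                           # current steps, A
v_true = polarisation(i, 0.70, 0.060, 0.20)              # "true" parameters
v_meas = v_true + 0.003 * rng.standard_normal(i.size)    # add measurement noise

(e0, b, r), _ = curve_fit(polarisation, i, v_meas, p0=[0.8, 0.05, 0.1])
print(f"Tafel slope ~ {1000 * b:.0f} mV/decade, cell resistance ~ {r:.2f} Ohm")
```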

  2. Galaxy Workflows for Web-based Bioinformatics Analysis of Aptamer High-throughput Sequencing Data

    Directory of Open Access Journals (Sweden)

    William H Thiel

    2016-01-01

    Full Text Available Development of RNA and DNA aptamers for diagnostic and therapeutic applications is a rapidly growing field. Aptamers are identified through iterative rounds of selection in a process termed SELEX (Systematic Evolution of Ligands by EXponential enrichment). High-throughput sequencing (HTS) revolutionized the modern SELEX process by identifying millions of aptamer sequences across multiple rounds of aptamer selection. However, these vast aptamer HTS datasets necessitated bioinformatics techniques. Herein, we describe a semiautomated approach to analyze aptamer HTS datasets using the Galaxy Project, a web-based open source collection of bioinformatics tools that were originally developed to analyze genome, exome, and transcriptome HTS data. Using a series of Workflows created in the Galaxy webserver, we demonstrate efficient processing of aptamer HTS data and compilation of a database of unique aptamer sequences. Additional Workflows were created to characterize the abundance and persistence of aptamer sequences within a selection and to filter sequences based on these parameters. A key advantage of this approach is that the online nature of the Galaxy webserver and its graphical interface allow for the analysis of HTS data without the need to compile code or install multiple programs.

  3. Galaxy Workflows for Web-based Bioinformatics Analysis of Aptamer High-throughput Sequencing Data.

    Science.gov (United States)

    Thiel, William H

    2016-01-01

    Development of RNA and DNA aptamers for diagnostic and therapeutic applications is a rapidly growing field. Aptamers are identified through iterative rounds of selection in a process termed SELEX (Systematic Evolution of Ligands by EXponential enrichment). High-throughput sequencing (HTS) revolutionized the modern SELEX process by identifying millions of aptamer sequences across multiple rounds of aptamer selection. However, these vast aptamer HTS datasets necessitated bioinformatics techniques. Herein, we describe a semiautomated approach to analyze aptamer HTS datasets using the Galaxy Project, a web-based open source collection of bioinformatics tools that were originally developed to analyze genome, exome, and transcriptome HTS data. Using a series of Workflows created in the Galaxy webserver, we demonstrate efficient processing of aptamer HTS data and compilation of a database of unique aptamer sequences. Additional Workflows were created to characterize the abundance and persistence of aptamer sequences within a selection and to filter sequences based on these parameters. A key advantage of this approach is that the online nature of the Galaxy webserver and its graphical interface allow for the analysis of HTS data without the need to compile code or install multiple programs. Copyright © 2016 Official journal of the American Society of Gene & Cell Therapy. Published by Elsevier Inc. All rights reserved.

  4. A Concept for a Sensitive Micro Total Analysis System for High Throughput Fluorescence Imaging

    Directory of Open Access Journals (Sweden)

    Yosi Shacham

    2006-04-01

    Full Text Available This paper discusses possible methods for on-chip fluorescent imaging for integrated bio-sensors. The integration of optical and electro-optical accessories, according to the suggested methods, can improve the performance of fluorescence imaging. It can boost the signal to background ratio by a few orders of magnitude in comparison to conventional discrete setups. The methods presented in this paper are oriented towards building reproducible arrays for high-throughput micro total analysis systems (μTAS). The first method relates to side illumination of the fluorescent material placed into micro-compartments of the lab-on-chip. Its significance is in the high utilization of excitation energy for low concentrations of fluorescent material. The utilization of a transparent μLED chip, in the second method, allows the placement of the excitation light sources on the same optical axis as the emission detector, such that the excitation and emission rays are directed in opposite directions. The third method presents spatial filtering of the excitation background.

  5. Microscopic description of oxide perovskites and automated high-throughput analysis of their energy landscape

    Science.gov (United States)

    Pizzi, Giovanni; Cepellotti, Andrea; Kozinsky, Boris; Marzari, Nicola

    Even though ferroelectric materials like BaTiO3 or KNbO3 have been used for decades in a broad range of technological applications, there is still significant debate in the literature concerning their microscopic behavior. For instance, many perovskite materials display a high-temperature cubic phase with zero net polarization, but its microscopic nature is still unclear, with some materials displaying a very complex energy landscape with multiple local minima. In order to investigate and clarify the microscopic nature of oxide perovskites, we perform a study on a set of about 50 representative ABO3 systems. We use spacegroup techniques to systematically analyze all possible local displacement patterns that are compatible with a net paraelectric phase, but can provide local non-zero ferroelectric moments. The energetics and the stability of these patterns are then assessed by combining the spacegroup analysis with DFT calculations. All calculations are managed and analyzed using our high-throughput platform AiiDA (www.aiida.net). Using this technique, we are able to describe the different classes of microscopic models underlying the perovskite systems.

  6. MultiSense: A Multimodal Sensor Tool Enabling the High-Throughput Analysis of Respiration.

    Science.gov (United States)

    Keil, Peter; Liebsch, Gregor; Borisjuk, Ljudmilla; Rolletschek, Hardy

    2017-01-01

    The high-throughput analysis of respiratory activity has become an important component of many biological investigations. Here, a technological platform, denoted the "MultiSense tool," is described. The tool enables the parallel monitoring of respiration in 100 samples over an extended time period, by dynamically tracking the concentrations of oxygen (O2) and/or carbon dioxide (CO2) and/or pH within an airtight vial. Its flexible design supports the quantification of respiration based on either oxygen consumption or carbon dioxide release, thereby allowing for the determination of the physiologically significant respiratory quotient (the ratio between the quantities of CO2 released and the O2 consumed). It requires an LED light source to be mounted above the sample, together with a CCD camera system, adjusted to enable the capture of analyte-specific wavelengths, and fluorescent sensor spots inserted into the sample vial. Here, a demonstration is given of the use of the MultiSense tool to quantify respiration in imbibing plant seeds, for which an appropriate step-by-step protocol is provided. The technology can be easily adapted for a wide range of applications, including the monitoring of gas exchange in any kind of liquid culture system (algae, embryo and tissue culture, cell suspensions, microbial cultures).
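
    The respiratory quotient mentioned above is simply the ratio of the CO2 release rate to the O2 consumption rate in the closed vial; a minimal sketch on synthetic gas-concentration time series is shown below (the slopes and noise level are invented).

```python
# Minimal sketch of a respiratory-quotient calculation: O2 consumption and CO2
# release rates are taken as linear slopes of the gas concentrations in a
# closed vial, and RQ = CO2 released / O2 consumed. Synthetic data only.
import numpy as np

t_h = np.linspace(0.0, 10.0, 21)                 # hours
o2_pct = 20.9 - 0.35 * t_h + 0.02 * np.random.default_rng(4).standard_normal(t_h.size)
co2_pct = 0.04 + 0.30 * t_h + 0.02 * np.random.default_rng(5).standard_normal(t_h.size)

o2_slope = np.polyfit(t_h, o2_pct, 1)[0]          # % per hour (negative)
co2_slope = np.polyfit(t_h, co2_pct, 1)[0]        # % per hour (positive)

rq = co2_slope / (-o2_slope)
print(f"O2 consumption: {-o2_slope:.2f} %/h, CO2 release: {co2_slope:.2f} %/h, RQ = {rq:.2f}")
```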

  7. Vidjil: A Web Platform for Analysis of High-Throughput Repertoire Sequencing.

    Science.gov (United States)

    Duez, Marc; Giraud, Mathieu; Herbert, Ryan; Rocher, Tatiana; Salson, Mikaël; Thonier, Florian

    2016-01-01

    The B and T lymphocytes are white blood cells playing a key role in adaptive immunity. A part of their DNA, called the V(D)J recombinations, is specific to each lymphocyte and enables recognition of specific antigens. Today, with new sequencing techniques, one can obtain billions of DNA sequences from these regions. With dedicated Repertoire Sequencing (RepSeq) methods, it is now possible to picture populations of lymphocytes and to monitor more accurately the immune response as well as pathologies such as leukemia. Vidjil is an open-source platform for the interactive analysis of high-throughput sequencing data from lymphocyte recombinations. It contains an algorithm gathering reads into clonotypes according to their V(D)J junctions, a web application made of a sample, experiment and patient database, and a visualization for the analysis of clonotypes over time. Vidjil is implemented in C++, Python and Javascript and licensed under the GPLv3 open-source license. Source code, binaries and a public web server are available at http://www.vidjil.org and at http://bioinfo.lille.inria.fr/vidjil. Using the Vidjil web application consists of four steps: 1. uploading a raw sequence file (typically a FASTQ); 2. running RepSeq analysis software; 3. visualizing the results; 4. annotating the results and saving them for future use. For the end-user, the Vidjil web application needs no specific installation and just requires a connection and a modern web browser. Vidjil is used by labs in hematology or immunology for research and clinical applications.

  8. Forensic soil DNA analysis using high-throughput sequencing: a comparison of four molecular markers.

    Science.gov (United States)

    Young, Jennifer M; Weyrich, Laura S; Cooper, Alan

    2014-11-01

    Soil analyses, such as mineralogy, geophysics, texture and colour, are commonly used in forensic casework to link a suspect to a crime scene. However, DNA analysis can also be applied to characterise the vast diversity of organisms present in soils. DNA metabarcoding and high-throughput sequencing (HTS) now offer a means to improve discrimination between forensic soil samples by identifying individual taxa and exploring non-culturable microbial species. Here, we compare the small-scale reproducibility and resolution of four molecular markers targeting different taxa (bacterial 16S rRNA, eukaryotic 18S rRNA, plant trnL intron and fungal internal transcribed spacer I (ITS1) rDNA) to distinguish two sample sites. We also assess the background DNA level associated with each marker and examine the effects of filtering Operational Taxonomic Units (OTUs) detected in extraction blank controls. From this study, we show that non-bacterial taxa in soil, particularly fungi, can provide the greatest resolution between the sites, whereas plant markers may be problematic for forensic discrimination. ITS and 18S markers exhibit reliable amplification, and both show high discriminatory power with low background DNA levels. The 16S rRNA marker showed comparable discriminatory power post filtering; however, it presented the highest level of background DNA. The discriminatory power of all markers was increased by applying OTU filtering steps, with the greatest improvement observed from the removal of any sequences detected in extraction blanks. This study demonstrates the potential use of multiple DNA markers for forensic soil analysis using HTS, and identifies some of the standardisation and evaluation steps necessary before this technique can be applied in casework. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
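
    The blank-filtering step that gave the largest improvement can be sketched in a few lines: any OTU detected in an extraction blank is removed from all samples before sites are compared. The table values and column names below are hypothetical.

```python
# Hedged sketch of blank filtering: OTUs detected in extraction blank controls
# are removed from all samples before comparing sites. Values are hypothetical.
import pandas as pd

otu = pd.DataFrame(
    {"siteA_1": [120, 0, 45, 3], "siteA_2": [98, 2, 51, 0],
     "siteB_1": [0, 310, 12, 5], "blank_1": [0, 0, 7, 0], "blank_2": [0, 0, 3, 1]},
    index=["OTU_1", "OTU_2", "OTU_3", "OTU_4"],
)

blank_cols = [c for c in otu.columns if c.startswith("blank")]
contaminants = otu[blank_cols].sum(axis=1) > 0     # OTUs seen in any blank
filtered = otu.loc[~contaminants, [c for c in otu.columns if c not in blank_cols]]

print("removed as possible contamination:", list(otu.index[contaminants]))
print(filtered)
```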

  9. Settlement Analysis of Saturated Tailings Dam Treated by CFG Pile Composite Foundation

    Directory of Open Access Journals (Sweden)

    Jinxing Lai

    2016-01-01

    Full Text Available Cement fly ash gravel (CFG) pile composite foundation is an effective and economic foundation treatment approach, which is significant to building foundations, subgrade construction, and so forth. The present paper aims at investigating the settlement behavior of saturated tailings dam soft ground under CFG pile composite foundation treatment, using FEM and a laboratory model test. The proposed findings demonstrate that CFG pile treatment is effective in reinforcing a saturated tailings dam and that loading has little influence on the settlement of soil between piles. The variation of the settlement of soil between piles in the FEM shows good agreement with the laboratory model test. Additionally, the cushion deformation modulus has a small effect on the composite foundation settlement, although the cushion thickness will generate a certain influence on the settlement distribution of the composite foundation.

  10. ZebraZoom: an automated program for high-throughput behavioral analysis and categorization

    Directory of Open Access Journals (Sweden)

    Olivier eMirat

    2013-06-01

    Full Text Available The zebrafish larva stands out as an emergent model organism for translational studies involving gene or drug screening thanks to its size, genetics, and permeability. At the larval stage, locomotion occurs in short episodes punctuated by periods of rest. Although phenotyping behavior is a key component of large-scale screens, it has not yet been automated in this model system. We developed ZebraZoom, a program to automatically track larvae and identify maneuvers for many animals performing discrete movements. Our program detects each episodic movement and extracts large-scale statistics on motor patterns to produce a quantification of the locomotor repertoire. We used ZebraZoom to identify motor defects induced by a glycinergic receptor antagonist. The analysis of the blind mutant atoh7 (lak) revealed small locomotor defects associated with the mutation. Using multiclass supervised machine learning, ZebraZoom categorizes all episodes of movement for each larva into one of three possible maneuvers: slow forward swim, routine turn, and escape. ZebraZoom reached 91% accuracy for categorization of stereotypical maneuvers that four independent experimenters unanimously identified. For all maneuvers in the data set, ZebraZoom agreed with four independent experimenters in 73.2-82.5% of cases. We modeled the series of maneuvers performed by larvae as Markov chains and observed that larvae often repeated the same maneuvers within a group. When analyzing subsequent maneuvers performed by different larvae, we found that larva-larva interactions occurred as series of escapes. Overall, ZebraZoom reaches the level of precision found in manual analysis but accomplishes tasks in a high-throughput format necessary for large screens.
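
    The Markov-chain view of maneuver sequences amounts to estimating a first-order transition matrix by counting consecutive maneuver pairs; a small sketch on invented sequences (not ZebraZoom output) is shown below.

```python
# Small sketch of the Markov-chain view of maneuver sequences: a first-order
# transition matrix is estimated by counting consecutive pairs. The example
# sequences are invented.
import numpy as np

STATES = ["slow_swim", "routine_turn", "escape"]
IDX = {s: i for i, s in enumerate(STATES)}

# Hypothetical per-larva maneuver sequences.
sequences = [
    ["slow_swim", "slow_swim", "routine_turn", "slow_swim", "escape", "escape"],
    ["routine_turn", "routine_turn", "slow_swim", "slow_swim", "slow_swim"],
    ["escape", "escape", "routine_turn", "slow_swim"],
]

counts = np.zeros((3, 3))
for seq in sequences:
    for a, b in zip(seq[:-1], seq[1:]):
        counts[IDX[a], IDX[b]] += 1

# Row-normalise counts into transition probabilities (rows with no data -> 0).
row_sums = counts.sum(axis=1, keepdims=True)
transition = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

for i, s in enumerate(STATES):
    print(s, np.round(transition[i], 2))
```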

  11. WormSizer: high-throughput analysis of nematode size and shape.

    Directory of Open Access Journals (Sweden)

    Brad T Moore

    Full Text Available The fundamental phenotypes of growth rate, size and morphology are the result of complex interactions between genotype and environment. We developed a high-throughput software application, WormSizer, which computes size and shape of nematodes from brightfield images. Existing methods for estimating volume either coarsely model the nematode as a cylinder or assume the worm shape or opacity is invariant. Our estimate is more robust to changes in morphology or optical density as it only assumes radial symmetry. This open source software is written as a plugin for the well-known image-processing framework Fiji/ImageJ. It may therefore be extended easily. We evaluated the technical performance of this framework, and we used it to analyze growth and shape of several canonical Caenorhabditis elegans mutants in a developmental time series. We confirm quantitatively that a Dumpy (Dpy) mutant is short and fat and that a Long (Lon) mutant is long and thin. We show that daf-2 insulin-like receptor mutants are larger than wild-type upon hatching but grow slowly, and that WormSizer can distinguish dauer larvae from normal larvae. We also show that a Small (Sma) mutant is actually smaller than wild-type at all stages of larval development. WormSizer works with Uncoordinated (Unc) and Roller (Rol) mutants as well, indicating that it can be used with mutants despite behavioral phenotypes. We used our complete data set to perform a power analysis, giving users a sense of how many images are needed to detect different effect sizes. Our analysis confirms and extends on existing phenotypic characterization of well-characterized mutants, demonstrating the utility and robustness of WormSizer.
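
    Under the radial-symmetry assumption, the volume estimate reduces to integrating π·r(s)^2 along the worm's midline; the back-of-the-envelope sketch below uses a made-up radius profile and is not the WormSizer implementation.

```python
# Back-of-the-envelope volume estimate under radial symmetry: the worm is
# treated as a stack of thin discs along its midline, V = integral of pi*r^2 ds.
# The radius profile is made up.
import numpy as np

length_um = 1000.0
s = np.linspace(0.0, length_um, 200)                     # position along midline
radius_um = 35.0 * np.sin(np.pi * s / length_um) + 5.0   # tapered radius profile

volume_um3 = np.trapz(np.pi * radius_um ** 2, s)
print(f"estimated volume ~ {volume_um3 / 1e6:.2f} x 10^6 um^3")
```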

  12. NucTools: analysis of chromatin feature occupancy profiles from high-throughput sequencing data.

    Science.gov (United States)

    Vainshtein, Yevhen; Rippe, Karsten; Teif, Vladimir B

    2017-02-14

    Biomedical applications of high-throughput sequencing methods generate a vast amount of data in which numerous chromatin features are mapped along the genome. The results are frequently analysed by creating binary data sets that link the presence/absence of a given feature to specific genomic loci. However, the nucleosome occupancy or chromatin accessibility landscape is essentially continuous. It is currently a challenge in the field to cope with continuous distributions of deep sequencing chromatin readouts and to integrate the different types of discrete chromatin features to reveal linkages between them. Here we introduce the NucTools suite of Perl scripts as well as MATLAB- and R-based visualization programs for a nucleosome-centred downstream analysis of deep sequencing data. NucTools accounts for the continuous distribution of nucleosome occupancy. It allows calculations of nucleosome occupancy profiles averaged over several replicates, comparisons of nucleosome occupancy landscapes between different experimental conditions, and the estimation of the changes of integral chromatin properties such as the nucleosome repeat length. Furthermore, NucTools facilitates the annotation of nucleosome occupancy with other chromatin features like binding of transcription factors or architectural proteins, and epigenetic marks like histone modifications or DNA methylation. The applications of NucTools are demonstrated for the comparison of several datasets for nucleosome occupancy in mouse embryonic stem cells (ESCs) and mouse embryonic fibroblasts (MEFs). The typical workflows of data processing and integrative analysis with NucTools reveal information on the interplay of nucleosome positioning with other features such as for example binding of a transcription factor CTCF, regions with stable and unstable nucleosomes, and domains of large organized chromatin K9me2 modifications (LOCKs). As potential limitations and problems we discuss how inter-replicate variability of

  13. Comprehensive Essentiality Analysis of the Mycobacterium tuberculosis Genome via Saturating Transposon Mutagenesis.

    Science.gov (United States)

    DeJesus, Michael A; Gerrick, Elias R; Xu, Weizhen; Park, Sae Woong; Long, Jarukit E; Boutte, Cara C; Rubin, Eric J; Schnappinger, Dirk; Ehrt, Sabine; Fortune, Sarah M; Sassetti, Christopher M; Ioerger, Thomas R

    2017-01-17

    For decades, identifying the regions of a bacterial chromosome that are necessary for viability has relied on mapping integration sites in libraries of random transposon mutants to find loci that are unable to sustain insertion. To date, these studies have analyzed subsaturated libraries, necessitating the application of statistical methods to estimate the likelihood that a gap in transposon coverage is the result of biological selection and not the stochasticity of insertion. As a result, the essentiality of many genomic features, particularly small ones, could not be reliably assessed. We sought to overcome this limitation by creating a completely saturated transposon library in Mycobacterium tuberculosis. In assessing the composition of this highly saturated library by deep sequencing, we discovered that a previously unknown sequence bias of the Himar1 element rendered approximately 9% of potential TA dinucleotide insertion sites less permissible for insertion. We used a hidden Markov model of essentiality that accounted for this unanticipated bias, allowing us to confidently evaluate the essentiality of features that contained as few as 2 TA sites, including open reading frames (ORFs), experimentally identified noncoding RNAs, methylation sites, and promoters. In addition, several essential regions that did not correspond to known features were identified, suggesting uncharacterized functions that are necessary for growth. This work provides an authoritative catalog of essential regions of the M. tuberculosis genome and a statistical framework for applying saturating mutagenesis to other bacteria. Sequencing of transposon-insertion mutant libraries has become a widely used tool for probing the functions of genes under various conditions. The Himar1 transposon is generally believed to insert with equal probabilities at all TA dinucleotides, and therefore its absence in a mutant library is taken to indicate biological selection against the corresponding mutant
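
    The statistical intuition behind such essentiality calls can be sketched with a naive independence model (ignoring the Himar1 sequence bias and the hidden Markov model the authors actually use): if a fraction d of all TA sites in the library carry insertions, the chance that a gene with n TA sites is empty purely by chance is (1 − d)^n.

```python
# Simplified sketch of the intuition behind essentiality calls from a highly
# saturated Himar1 library, under a naive independence assumption (not the
# bias-corrected HMM used in the paper).
from math import log10

def prob_empty_by_chance(n_ta_sites, library_density):
    """Probability of observing zero insertions in n TA sites by chance alone."""
    return (1.0 - library_density) ** n_ta_sites

density = 0.85          # e.g. 85% of all TA sites carry at least one insertion
for n in (2, 5, 10, 30):
    p = prob_empty_by_chance(n, density)
    print(f"{n:>2} TA sites empty: p = {p:.2e} (-log10 p = {-log10(p):.1f})")
```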

  14. Dye Tracer Technique and Color Image Analysis For Describing Saturation State and 3d Axi-symmetrical Flow Pattern

    Science.gov (United States)

    Abriak, N. E.; Gandola, F.; Haverkamp, R.

    Dye tracer techniques have been widely used for visualising water flow patterns in soils and, particularly, for determining the volumetric water content in one- and two-dimensional laboratory experiments. The present study deals with a 3-dimensional laboratory experiment (axi-symmetrical condition) using a colour visualisation technique and image analysis for determining the spatial distribution of the water content. The infiltration of a dye (fluorescein) mixed with water is achieved under axi-symmetrical conditions in a Plexiglas tank (50×50×60 cm) filled with sand at low saturation. Both infiltration and drainage processes are visualised under blue light conditions and recorded on videotape. The image analysis technique used for determining the saturation state is based on the use of a limited colour palette, which allows the evolution of the saturation state in the sand to be quantified. Simultaneously, nine tensiometers connected to a data acquisition system are used to determine the negative water pressure in the sand. The measurement of the suction values confirms the existence of a second water wetting front (after the dye flow) due to the initial mobile water content in the sand.

  15. Micro-scaled high-throughput digestion of plant tissue samples for multi-elemental analysis

    Directory of Open Access Journals (Sweden)

    Husted Søren

    2009-09-01

    Full Text Available Abstract Background Quantitative multi-elemental analysis by inductively coupled plasma (ICP) spectrometry depends on a complete digestion of solid samples. However, fast and thorough sample digestion is a challenging analytical task which constitutes a bottleneck in modern multi-elemental analysis. Additional obstacles may be that sample quantities are limited and elemental concentrations low. In such cases, digestion in small volumes with minimum dilution and contamination is required in order to obtain high accuracy data. Results We have developed a micro-scaled microwave digestion procedure and optimized it for accurate elemental profiling of plant materials (1-20 mg dry weight). A commercially available 64-position rotor with 5 ml disposable glass vials, originally designed for microwave-based parallel organic synthesis, was used as a platform for the digestion. The novel micro-scaled method was successfully validated by the use of various certified reference materials (CRM) with matrices rich in starch, lipid or protein. When the micro-scaled digestion procedure was applied on single rice grains or small batches of Arabidopsis seeds (1 mg, corresponding to approximately 50 seeds), the obtained elemental profiles closely matched those obtained by conventional analysis using digestion in large volume vessels. Accumulated elemental contents derived from separate analyses of rice grain fractions (aleurone, embryo and endosperm) closely matched the total content obtained by analysis of the whole rice grain. Conclusion A high-throughput micro-scaled method has been developed which enables digestion of small quantities of plant samples for subsequent elemental profiling by ICP-spectrometry. The method constitutes a valuable tool for screening of mutants and transformants. In addition, the method facilitates studies of the distribution of essential trace elements between and within plant organs which is relevant for, e.g., breeding programmes aiming at

  16. Micro-scaled high-throughput digestion of plant tissue samples for multi-elemental analysis.

    Science.gov (United States)

    Hansen, Thomas H; Laursen, Kristian H; Persson, Daniel P; Pedas, Pai; Husted, Søren; Schjoerring, Jan K

    2009-09-26

    Quantitative multi-elemental analysis by inductively coupled plasma (ICP) spectrometry depends on a complete digestion of solid samples. However, fast and thorough sample digestion is a challenging analytical task which constitutes a bottleneck in modern multi-elemental analysis. Additional obstacles may be that sample quantities are limited and elemental concentrations low. In such cases, digestion in small volumes with minimum dilution and contamination is required in order to obtain high accuracy data. We have developed a micro-scaled microwave digestion procedure and optimized it for accurate elemental profiling of plant materials (1-20 mg dry weight). A commercially available 64-position rotor with 5 ml disposable glass vials, originally designed for microwave-based parallel organic synthesis, was used as a platform for the digestion. The novel micro-scaled method was successfully validated by the use of various certified reference materials (CRM) with matrices rich in starch, lipid or protein. When the micro-scaled digestion procedure was applied on single rice grains or small batches of Arabidopsis seeds (1 mg, corresponding to approximately 50 seeds), the obtained elemental profiles closely matched those obtained by conventional analysis using digestion in large volume vessels. Accumulated elemental contents derived from separate analyses of rice grain fractions (aleurone, embryo and endosperm) closely matched the total content obtained by analysis of the whole rice grain. A high-throughput micro-scaled method has been developed which enables digestion of small quantities of plant samples for subsequent elemental profiling by ICP-spectrometry. The method constitutes a valuable tool for screening of mutants and transformants. In addition, the method facilitates studies of the distribution of essential trace elements between and within plant organs which is relevant for, e.g., breeding programmes aiming at improvement of the micronutrient density in edible

  17. Throughput performance analysis of multirate, multiclass S-ALOHA OFFH-CDMA packet networks

    DEFF Research Database (Denmark)

    Raddo, Thiago R.; Sanches, Anderson L.; Borges, Ben Hur V

    2015-01-01

    In this paper, we propose a new throughput expression for multirate, multiclass slotted-ALOHA optical fast frequency hopping code-division multiple-access (OFFH-CDMA) packet networks, considering a Poisson distribution for composite packet arrivals. We analyze the packet throughput performance of a three-class OFFH-CDMA network, where multirate transmissions are achieved via manipulation of the users' code parameters. It is shown that users transmitting at low rates interfere considerably with the performance of high-rate users. Finally, we perform a validation procedure to demonstrate...
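
    A generic, much-simplified stand-in for this kind of analysis is a Monte Carlo of multiclass slotted ALOHA with Poisson composite arrivals, in which a packet survives a slot only if the number of simultaneous interferers stays below a class-dependent tolerance; the offered loads and tolerances below are invented, and the model is not the paper's OFFH-CDMA interference analysis.

```python
# Generic Monte Carlo sketch of multiclass slotted-ALOHA throughput under
# Poisson composite arrivals. A packet of class i is assumed to survive the
# slot when the number of simultaneously transmitted interfering packets does
# not exceed a class-dependent tolerance; all parameters are invented.
import numpy as np

rng = np.random.default_rng(6)
OFFERED_LOAD = {"class1": 0.6, "class2": 0.4, "class3": 0.2}   # packets/slot
TOLERANCE = {"class1": 4, "class2": 2, "class3": 1}            # max interferers
SLOTS = 100_000

success = {c: 0 for c in OFFERED_LOAD}
for _ in range(SLOTS):
    arrivals = {c: rng.poisson(g) for c, g in OFFERED_LOAD.items()}
    total = sum(arrivals.values())
    for c, n in arrivals.items():
        if n == 0:
            continue
        interferers = total - 1            # other packets seen by one packet
        if interferers <= TOLERANCE[c]:
            success[c] += n                # all packets of this class succeed

for c in OFFERED_LOAD:
    print(f"{c}: throughput ~ {success[c] / SLOTS:.3f} packets/slot")
```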

  18. Optimizing MRI Logistics: Prospective Analysis of Performance, Efficiency, and Patient Throughput.

    Science.gov (United States)

    Beker, Kevin; Garces-Descovich, Alejandro; Mangosing, Jason; Cabral-Goncalves, Ines; Hallett, Donna; Mortele, Koenraad J

    2017-10-01

    The objective of this study is to optimize MRI logistics through evaluation of MRI workflow and analysis of performance, efficiency, and patient throughput in a tertiary care academic center. For 2 weeks, workflow data from two outpatient MRI scanners were prospectively collected and stratified by value added to the process (i.e., value-added time, business value-added time, or non-value-added time). Two separate time cycles were measured: the actual MRI process cycle as well as the complete length of patient stay in the department. In addition, the impact and frequency of delays across all observations were measured. A total of 305 MRI examinations were evaluated, including body (34.1%), neurologic (28.9%), musculoskeletal (21.0%), and breast examinations (16.1%). The MRI process cycle lasted a mean of 50.97 ± 24.4 (SD) minutes per examination; the mean non-value-added time was 13.21 ± 18.77 minutes (25.87% of the total process cycle time). The mean length-of-stay cycle was 83.51 ± 33.63 minutes; the mean non-value-added time was 24.33 ± 24.84 minutes (29.14% of the total patient stay). The delay with the highest frequency (5.57%) was IV or port placement, which had a mean delay of 22.82 minutes. The delay with the greatest impact on time was MRI arthrography for which joint injection of contrast medium was necessary but was not accounted for in the schedule (mean delay, 42.2 minutes; frequency, 1.64%). Of 305 patients, 34 (11.15%) did not arrive at or before their scheduled time. Non-value-added time represents approximately one-third of the total MRI process cycle and patient length of stay. Identifying specific delays may expedite the application of targeted improvement strategies, potentially increasing revenue, efficiency, and overall patient satisfaction.

  19. Identification of microRNAs from Eugenia uniflora by high-throughput sequencing and bioinformatics analysis.

    Science.gov (United States)

    Guzman, Frank; Almerão, Mauricio P; Körbes, Ana P; Loss-Morais, Guilherme; Margis, Rogerio

    2012-01-01

    microRNAs or miRNAs are small non-coding regulatory RNAs that play important functions in the regulation of gene expression at the post-transcriptional level by targeting mRNAs for degradation or inhibiting protein translation. Eugenia uniflora is a plant native to tropical America with pharmacological and ecological importance, and there have been no previous studies concerning its gene expression and regulation. To date, no miRNAs have been reported in Myrtaceae species. Small RNA and RNA-seq libraries were constructed to identify miRNAs and pre-miRNAs in Eugenia uniflora. Solexa technology was used to perform high throughput sequencing of the library, and the data obtained were analyzed using bioinformatics tools. From 14,489,131 small RNA clean reads, we obtained 1,852,722 mature miRNA sequences representing 45 conserved families that have been identified in other plant species. Further analysis using contigs assembled from RNA-seq allowed the prediction of secondary structures of 25 known and 17 novel pre-miRNAs. The expression of twenty-seven identified miRNAs was also validated using RT-PCR assays. Potential targets were predicted for the most abundant mature miRNAs in the identified pre-miRNAs based on sequence homology. This study is the first large scale identification of miRNAs and their potential targets from a species of the Myrtaceae family without genomic sequence resources. Our study provides more information about the evolutionary conservation of the regulatory network of miRNAs in plants and highlights species-specific miRNAs.

  20. Live imaging of muscles in Drosophila metamorphosis: Towards high-throughput gene identification and function analysis.

    Science.gov (United States)

    Puah, Wee Choo; Wasser, Martin

    2016-03-01

    Time-lapse microscopy in developmental biology is an emerging tool for functional genomics. Phenotypic effects of gene perturbations can be studied non-invasively at multiple time points in chronological order. During metamorphosis of Drosophila melanogaster, time-lapse microscopy using fluorescent reporters allows visualization of alternative fates of larval muscles, which are a model for the study of genes related to muscle wasting. While doomed muscles enter hormone-induced programmed cell death, a smaller population of persistent muscles survives to adulthood and undergoes morphological remodeling that involves atrophy in early, and hypertrophy in late pupation. We developed a method that combines in vivo imaging, targeted gene perturbation and image analysis to identify and characterize genes involved in muscle development. Macrozoom microscopy helps to screen for interesting muscle phenotypes, while confocal microscopy in multiple locations over 4-5 days produces time-lapse images that are used to quantify changes in cell morphology. Performing a similar investigation using fixed pupal tissues would be too time-consuming and therefore impractical. We describe three applications of our pipeline. First, we show how quantitative microscopy can track and measure morphological changes of muscle throughout metamorphosis and analyze genes involved in atrophy. Second, our assay can help to identify genes that either promote or prevent histolysis of abdominal muscles. Third, we apply our approach to test new fluorescent proteins as live markers for muscle development. We describe mKO2 tagged Cysteine proteinase 1 (Cp1) and Troponin-I (TnI) as examples of proteins showing developmental changes in subcellular localization. Finally, we discuss strategies to improve throughput of our pipeline to permit genome-wide screens in the future. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Production responses of Holstein dairy cows when fed supplemental fat containing saturated free fatty acids: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Wenping Hu

    2017-08-01

    Full Text Available Objective A meta-analysis was conducted to evaluate the effects of supplemental fat containing saturated free fatty acids (FA) on milk performance of Holstein dairy cows. Methods A database was developed from 21 studies published between 1991 and 2016 that included 502 dairy cows and a total of 29 to 30 comparisons between dietary treatment and control without fat supplementation. Only saturated free FA (>80% of total FA) was considered as the supplemental fat. Concentration of the supplemental fat was not higher than 3.5% of diet dry matter (DM). Dairy cows were offered total mixed ration, and fed individually. Statistical analysis was conducted using random- or mixed-effects models with Metafor package in R. Results Sub-group analysis showed that there were no differences in studies between randomized block design and Latin square/crossover design for dry matter intake (DMI) and milk production responses to the supplemental fat (all response variables, p≥0.344). The supplemental fat across all studies improved milk yield, milk fat concentration and yield, and milk protein yield by 1.684 kg/d (p<0.001), 0.095 percent unit (p = 0.003), 0.072 kg/d (p<0.001), and 0.036 kg/d (p<0.001), respectively, but tended to decrease milk protein concentration (mean difference = −0.022 percent unit; p = 0.063) while DMI (mean difference = 0.061 kg/d; p = 0.768) remained unchanged. The assessment of heterogeneity suggested that no substantial heterogeneity occurred among all studies for DMI and milk production responses to the supplemental fat (all response variables, I2≤24.1%; p≥0.166). Conclusion The effects of saturated free FA were quantitatively evaluated. Higher milk production and yields of milk fat and protein, with DMI remaining unchanged, indicated that saturated free FA, supplemented at ≤3.5% dietary DM from commercially available fat sources, likely improved the efficiency of milk production. Nevertheless, more studies are needed to assess the

  2. High-throughput genome editing and phenotyping facilitated by high resolution melting curve analysis.

    Science.gov (United States)

    Thomas, Holly R; Percival, Stefanie M; Yoder, Bradley K; Parant, John M

    2014-01-01

    With the goal to generate and characterize the phenotypes of null alleles in all genes within an organism and the recent advances in custom nucleases, genome editing limitations have moved from mutation generation to mutation detection. We previously demonstrated that High Resolution Melting (HRM) analysis is a rapid and efficient means of genotyping known zebrafish mutants. Here we establish optimized conditions for HRM based detection of novel mutant alleles. Using these conditions, we demonstrate that HRM is highly efficient at mutation detection across multiple genome editing platforms (ZFNs, TALENs, and CRISPRs); we observed nuclease generated HRM positive targeting in 1 of 6 (16%) open pool derived ZFNs, 14 of 23 (60%) TALENs, and 58 of 77 (75%) CRISPR nucleases. Successful targeting, based on HRM of G0 embryos correlates well with successful germline transmission (46 of 47 nucleases); yet, surprisingly mutations in the somatic tail DNA weakly correlate with mutations in the germline F1 progeny DNA. This suggests that analysis of G0 tail DNA is a good indicator of the efficiency of the nuclease, but not necessarily a good indicator of germline alleles that will be present in the F1s. However, we demonstrate that small amplicon HRM curve profiles of F1 progeny DNA can be used to differentiate between specific mutant alleles, facilitating rare allele identification and isolation; and that HRM is a powerful technique for screening possible off-target mutations that may be generated by the nucleases. Our data suggest that micro-homology based alternative NHEJ repair is primarily utilized in the generation of CRISPR mutant alleles and allows us to predict likelihood of generating a null allele. Lastly, we demonstrate that HRM can be used to quickly distinguish genotype-phenotype correlations within F1 embryos derived from G0 intercrosses. Together these data indicate that custom nucleases, in conjunction with the ease and speed of HRM, will facilitate future high-throughput

  3. High-throughput genome editing and phenotyping facilitated by high resolution melting curve analysis.

    Directory of Open Access Journals (Sweden)

    Holly R Thomas

    facilitate future high-throughput mutation generation and analysis needed to establish mutants in all genes of an organism.

  4. Investigating Functional Extension of Optical Coherence Tomography for Spectroscopic Analysis of Blood Oxygen Saturation

    Science.gov (United States)

    Chen, Siyu

    Over the past two decades, optical coherence tomography (OCT) has been successfully applied to various fields of biomedical research and clinical study, including cardiology, urology, dermatology, dentistry, oncology, and, most successfully, ophthalmology. This dissertation seeks to extend current OCT practice, which is still largely morphology-based, into a new dimension: functional analysis of metabolic activities in vivo. More specifically, the investigation is focused on retrieving blood oxygen saturation (sO2) using intrinsic hemoglobin optical absorption contrast. Most mammalian cells rely on aerobic respiration to support cellular function, which means they consume oxygen to create adenosine triphosphate (ATP). Metabolic rate of oxygen (MRO2), a key hemodynamic parameter, characterizes how much oxygen is consumed during a given period of time, reflecting the metabolic activity of the target tissue. For example, retinal neurons are highly active and almost entirely rely on the moment-to-moment oxygen supply from the retinal circulation. Thus, variation in MRO2 reveals the instantaneous activity of these neurons, shedding light on physiological and pathophysiological changes in cellular function. Eventually, measuring MRO2 can potentially provide a biomarker for early-stage disease diagnosis and serve as one benchmark for evaluating the effectiveness of medical intervention during disease management. Essential to calculating MRO2, blood sO2 measurement using spectroscopic OCT analysis has been attempted as early as 2003. OCT is intrinsically sensitive to the blood optical absorption spectrum due to its wide-band illumination and a detection scheme relying on back-scattered photons. However, accurate retrieval of blood sO2 using conventional near infrared (NIR) OCT systems in vivo has remained challenging. It was not until the development of OCT systems using visible light illumination (vis-OCT) that accurate measurement of blood sO2 was reported in live
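
    The core of the spectroscopic sO2 retrieval described above can be sketched as a linear unmixing problem: the wavelength-dependent attenuation recovered from the OCT signal is fit as a combination of oxy- and deoxyhemoglobin extinction spectra, and sO2 is the oxygenated fraction. The code below is an illustrative Python sketch only; the "extinction spectra" are smooth synthetic curves rather than tabulated hemoglobin coefficients, and the measurement is simulated.

```python
import numpy as np

wl = np.linspace(520, 600, 41)                      # wavelength grid in nm (visible band)
# Placeholder "extinction spectra" for HbO2 and Hb -- synthetic curves, NOT
# tabulated values; substitute real molar extinction data in practice.
eps_hbo2 = 1.0 + 0.5 * np.sin((wl - 520) / 80 * 2 * np.pi)
eps_hb   = 1.2 + 0.4 * np.cos((wl - 520) / 80 * 1.5 * np.pi)

def simulate_attenuation(s_o2, c_total=1.0, noise=0.01, rng=np.random.default_rng(0)):
    """Synthesize an attenuation spectrum for a vessel with a given sO2."""
    mu = c_total * (s_o2 * eps_hbo2 + (1 - s_o2) * eps_hb)
    return mu + rng.normal(0, noise, mu.size)

def fit_so2(mu_measured):
    """Least-squares fit of the two-chromophore model; returns the estimated sO2."""
    A = np.column_stack([eps_hbo2, eps_hb])          # design matrix
    c_hbo2, c_hb = np.linalg.lstsq(A, mu_measured, rcond=None)[0]
    c_hbo2, c_hb = max(c_hbo2, 0.0), max(c_hb, 0.0)  # clamp nonphysical negatives
    return c_hbo2 / (c_hbo2 + c_hb)

print(f"recovered sO2 = {fit_so2(simulate_attenuation(0.95)):.2f}")  # expect ~0.95
```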

  5. Analysis of nitrogen saturation potential in Rocky Mountain tundra and forest: implications for aquatic systems

    Science.gov (United States)

    Baron, Jill S.; Ojima, Dennis S.; Holland, Elisabeth A.; Parton, William J.

    1994-01-01

    We employed grass and forest versions of the CENTURY model under a range of N deposition values (0.02–1.60 g N m−2 y−1) to explore the possibility that high observed lake and stream N was due to terrestrial N saturation of alpine tundra and subalpine forest in Loch Vale Watershed, Rocky Mountain National Park, Colorado. Model results suggest that N is limiting to subalpine forest productivity, but that excess leachate from alpine tundra is sufficient to account for the current observed stream N. Tundra leachate, combined with N leached from exposed rock surfaces, produces high N loads in aquatic ecosystems above treeline in the Colorado Front Range. A combination of terrestrial leaching, large N inputs from snowmelt, high watershed gradients, rapid hydrologic flushing and lake turnover times, and possibly other nutrient limitations of aquatic organisms constrains high elevation lakes and streams from assimilating even small increases in atmospheric N. CENTURY model simulations further suggest that, while increased N deposition will worsen the situation, nitrogen saturation is an ongoing phenomenon.

  6. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  7. Predicting gene function through systematic analysis and quality assessment of high-throughput data.

    Science.gov (United States)

    Kemmeren, Patrick; Kockelkorn, Thessa T J P; Bijma, Theo; Donders, Rogier; Holstege, Frank C P

    2005-04-15

    Determining gene function is an important challenge arising from the availability of whole genome sequences. Until recently, approaches based on sequence homology were the only high-throughput method for predicting gene function. Use of high-throughput generated experimental data sets for determining gene function has been limited for several reasons. Here a new approach is presented for integration of high-throughput data sets, leading to prediction of function based on relationships supported by multiple types and sources of data. This is achieved with a database containing 125 different high-throughput data sets describing phenotypes, cellular localizations, protein interactions and mRNA expression levels from Saccharomyces cerevisiae, using a bit-vector representation and information content-based ranking. The approach takes characteristic and qualitative differences between the data sets into account, is highly flexible, efficient and scalable. Database queries result in predictions for 543 uncharacterized genes, based on multiple functional relationships each supported by at least three types of experimental data. Some of these are experimentally verified, further demonstrating their reliability. The results also generate insights into the relative merits of different data types and provide a coherent framework for functional genomic datamining. Free availability over the Internet. f.c.p.holstege@med.uu.nl http://www.genomics.med.uu.nl/pub/pk/comb_gen_network.

  8. Application of high-throughput technologies to a structural proteomics-type analysis of Bacillus anthracis

    NARCIS (Netherlands)

    Au, K.; Folkers, G.E.; Kaptein, R.

    2006-01-01

    A collaborative project between two Structural Proteomics In Europe (SPINE) partner laboratories, York and Oxford, aimed at high-throughput (HTP) structure determination of proteins from Bacillus anthracis, the aetiological agent of anthrax and a biomedically important target, is described. Based

  9. Patient Throughput in a Sports Medicine Clinic With the Implementation of an Athletic Trainer: A Retrospective Analysis.

    Science.gov (United States)

    Nicolello, Timothy S; Pecha, Forrest Q; Omdal, Reed L; Nilsson, Kurt J; Homaechevarria, Alejandro A

    2016-10-31

    Orthopaedic clinics have acquired a multitude of health professionals to improve clinic efficiency. More recently, athletic trainers (ATs) have been utilized to improve clinical efficiency and patient care because of their extensive background in musculoskeletal injuries and anatomy. Improved clinical efficiency allows for increased patient visits, potentially enhancing patient access and downstream revenue via relative value units (RVUs). The addition of an AT into a sports medicine physician's clinic will increase total patient throughput and overall RVU production. Retrospective analysis. Level 4. Patients seen by each of the 2 primary care sports medicine physicians at St Luke's Sports Medicine for a 2-year period were retrospectively evaluated. The initial clinic model included the physician and a medical assistant; during the second year of analysis an AT was added to the clinic staffing model. Two-tailed t tests were used to determine significant differences in patient volume between the 2 periods of data collection. Through the implementation of an AT, patient throughput increased by 0.7 patients per hour over 2 half-day clinics, a 25% increase (P ...), with a corresponding increase per clinic day (P ...). Clinical efficiency was improved with the addition of an AT. Total physician RVUs improved, thereby raising the potential revenue of both the physician and health care institution. Employing ATs in a sports medicine clinic may improve clinical productivity and financial stability, thereby validating the incorporation of ATs into the established clinical model. Limited research exists measuring patient throughput with an AT in a sports medicine clinic. This study investigates patient throughput and the subsequent increase in work-based RVUs. © 2016 The Author(s).

  10. Statistical analysis of liquid seepage in partially saturated heterogeneous fracture systems

    Energy Technology Data Exchange (ETDEWEB)

    Liou, Tai -Sheng [Univ. of California, Berkeley, CA (United States)

    1999-12-01

    Field evidence suggests that water flow in unsaturated fracture systems may occur along fast preferential flow paths. However, conventional macroscale continuum approaches generally predict the downward migration of water as a spatially uniform wetting front subjected to strong imbibition into the partially saturated rock matrix. One possible cause of this discrepancy may be the spatially random geometry of the fracture surfaces, and hence, the irregular fracture aperture. Therefore, a numerical model was developed in this study to investigate the effects of geometric features of natural rock fractures on liquid seepage and solute transport in 2-D planar fractures under isothermal, partially saturated conditions. The fractures were conceptualized as 2-D heterogeneous porous media that are characterized by their spatially correlated permeability fields. A statistical simulator, which uses a simulated annealing (SA) algorithm, was employed to generate synthetic permeability fields. Hypothesized geometric features that are expected to be relevant for seepage behavior, such as spatially correlated asperity contacts, were considered in the SA algorithm. Most importantly, a new perturbation mechanism for SA was developed in order to consider specifically the spatial correlation near conditioning asperity contacts. Numerical simulations of fluid flow and solute transport were then performed in these synthetic fractures by the flow simulator TOUGH2, assuming that the effects of matrix permeability, gas phase pressure, capillary/permeability hysteresis, and molecular diffusion can be neglected. Results of flow simulation showed that liquid seepage in partially saturated fractures is characterized by localized preferential flow, along with bypassing, funneling, and localized ponding. The seepage pattern is dominated by the fraction of asperity contacts and their shape, size, and spatial correlation. However, the correlation structure of the permeability field is less important

  11. Bifurcation analysis of a semiconductor laser with saturable absorber and delayed optical feedback

    CERN Document Server

    Terrien, Soizic; Broderick, Neil G R

    2016-01-01

    Semiconductor lasers exhibit a wealth of dynamics, from emission of a constant beam of light, to periodic oscillations and excitability. Self-pulsing regimes, where the laser periodically releases a short pulse of light, are particularly interesting for many applications, from material science to telecommunications. Self-pulsing regimes need to produce pulses very regularly and, as such, they are also known to be particularly sensitive to perturbations, such as noise or light injection. We investigate the effect of delayed optical feedback on the dynamics of a self-pulsing semiconductor laser with saturable absorber (SLSA). More precisely, we consider the Yamada model with delay -- a system of three delay-differential equations (DDEs) for two slow and one fast variable -- which has been shown to reproduce accurately self-pulsing features as observed in SLSA experimentally. This model is also of broader interest because it is quite closely related to mathematical models of other self-pulsing systems, such as e...
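
    For reference, a commonly used dimensionless form of the Yamada model with delayed feedback is reproduced below. The notation (G for gain, Q for saturable absorption, I for intensity) and the way the delay enters the intensity equation follow standard treatments of this laser model rather than the paper itself, so the exact parametrization used by the authors may differ.

```latex
% Hedged sketch, not the paper's exact equations: two slow variables (G, Q)
% and one fast variable (I), with delayed optical feedback of strength kappa
% and delay tau appended to the intensity equation.
\begin{aligned}
  \dot G &= \gamma_G\,(A - G - G I),\\
  \dot Q &= \gamma_Q\,(B - Q - a Q I),\\
  \dot I &= (G - Q - 1)\, I + \kappa\, I(t-\tau),
\end{aligned}
```

    where A is the pump parameter, B the absorption, a the ratio of the saturation intensities of gain and absorber, γ_G and γ_Q the slow carrier relaxation rates, and κ, τ the feedback strength and delay.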

  12. Dual stator winding variable speed asynchronous generator: magnetic equivalent circuit with saturation, FEM analysis and experiments

    Science.gov (United States)

    Tutelea, L. N.; Muntean, N.; Deaconu, S. I.; Cunţan, C. D.

    2016-02-01

    The authors carried out a theoretical and experimental study of dual stator winding squirrel-cage asynchronous generator (DSWA) behaviour in the presence of a saturation (non-sinusoidal) regime due to variable-speed operation. The main aims are the derivation of relations for calculating the equivalent parameters of the machine windings, FEM validation of the parameters and characteristics with the free FEMM 4.2 computing software, and practical experimental tests for verifying them. The study is limited to three-phase, dual stator winding squirrel-cage asynchronous generators of small rated power, the type most commonly used in small adjustable-speed wind or hydro power plants. The tests were carried out using a three-phase asynchronous generator with a rated power of 6 kVA.

  13. Saturation of THz detection in InGaAs-based HEMTs: a numerical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mahi, A. [Centre Universitaire Nour Bachir, B.P. 900, 32000 El Bayadh (Algeria); Palermo, C., E-mail: christophe.palermo@umontpellier.fr [University of Montpellier, IES, UMR 5214, 34000 Montpellier (France); CNRS, IES, UMR 5214, 34000 Montpellier (France); Marinchio, H. [University of Montpellier, IES, UMR 5214, 34000 Montpellier (France); CNRS, IES, UMR 5214, 34000 Montpellier (France); Belgachi, A. [University of Bechar, Bechar 08000 (Algeria); Varani, L. [University of Montpellier, IES, UMR 5214, 34000 Montpellier (France); CNRS, IES, UMR 5214, 34000 Montpellier (France)

    2016-11-01

    Using numerical simulations, we investigate the large-signal photoresponse of InGaAs high electron mobility transistors subjected to THz radiation. The pseudo-2D hydrodynamic model used considers the electron density and velocity conservation equations. A third equation is solved either to describe average energy conservation or to keep the average energy constantly equal to its thermal equilibrium value. In both cases, the calculated photoresponse increases with the incoming power density at its smallest values. At higher values, a saturation of the photoresponse is observed, in agreement with experimental results, only when energy conservation is accounted for. This allows the limitation of the transistor's detection features to be related to the electron heating phenomenon.

  14. A micromechanical analysis of damage propagation in fluid-saturated cracked media

    Science.gov (United States)

    Dormieux, Luc; Kondo, Djimedo; Ulm, Franz-Josef

    2006-07-01

    We first revisit the well-known framework of Linear Elastic Fracture Mechanics (LEFM) in the case of a fluid-saturated crack. We next consider a representative elementary volume (r.e.v.) of cracked medium comprising a family of cracks characterized by the corresponding crack density parameter ɛ. Generalizing the classical energy approach of LEFM, the proposed damage criterion is written in terms of the thermodynamic force associated with ɛ, which is estimated by means of standard homogenization schemes. This criterion proves to involve a macroscopic effective strain tensor, or alternatively the Terzaghi effective stress tensor. The stability of damage propagation is discussed for various homogenization schemes. A comparison with experimental results is presented in the case of a uniaxial tensile test on concrete. To cite this article: L. Dormieux et al., C. R. Mecanique 334 (2006).

  15. Low-frequency asymptotic analysis of seismic reflection from afluid-saturated medium

    Energy Technology Data Exchange (ETDEWEB)

    Silin, D.B.; Korneev, V.A.; Goloshubin, G.M.; Patzek, T.W.

    2004-04-14

    Reflection of a seismic wave from a plane interface between two elastic media does not depend on the frequency. If one of the media is poroelastic and fluid-saturated, then the reflection becomes frequency-dependent. This paper presents a low-frequency asymptotic formula for the reflection of a seismic plane p-wave from a fluid-saturated porous medium. The obtained asymptotic scaling of the frequency-dependent component of the reflection coefficient shows that it is asymptotically proportional to the square root of the product of the reservoir fluid mobility and the frequency of the signal. The dependence of this scaling on the dynamic Darcy's law relaxation time is investigated as well. Derivation of the main equations of the theory of poroelasticity from the dynamic filtration theory reveals that this relaxation time is proportional to Biot's tortuosity parameter.
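
    The scaling stated in the abstract can be written compactly. The symbols below are assumptions made here for illustration rather than the paper's notation: R_0 is the elastic (zero-frequency) reflection coefficient, κ/η the reservoir fluid mobility, ω the angular frequency of the signal, and C a constant set by the elastic and poroelastic properties of the two media.

```latex
% Hedged restatement of the quoted low-frequency scaling.
R(\omega) \;\approx\; R_0 \;+\; C \sqrt{\frac{\kappa}{\eta}\,\omega\,}, \qquad \omega \to 0 .
```

    The frequency-dependent part of the reflection coefficient is thus asymptotically proportional to the square root of the product of fluid mobility and signal frequency, consistent with the statement above.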

  16. Estimating saturated hydraulic conductivity and air permeability from soil physical properties using state-space analysis

    DEFF Research Database (Denmark)

    Poulsen, Tjalfe; Møldrup, Per; Nielsen, Don

    2003-01-01

    Estimates of soil hydraulic conductivity (K) and air permeability (k(a)) at given soil-water potentials are often used as reference points in constitutive models for K and k(a) as functions of moisture content and are, therefore, a prerequisite for predicting migration of water, air, and dissolved and gaseous chemicals in the vadose zone. In this study, three modeling approaches were used to identify the dependence of saturated hydraulic conductivity (K-S) and air permeability at -100 cm H2O soil-water potential (k(a100)) on soil physical properties in undisturbed soil: (i) Multiple regression, (ii) ARIMA (autoregressive integrated moving average) modeling, and (iii) State-space modeling. In addition to actual soil property values, ARIMA and state-space models account for effects of spatial correlation in soil properties. Measured data along two 70-m-long transects at a 20-year old constructed...

  17. High-throughput quantitative analysis of domoic acid directly from mussel tissue using Laser Ablation Electrospray Ionization - tandem mass spectrometry.

    Science.gov (United States)

    Beach, Daniel G; Walsh, Callee M; McCarron, Pearse

    2014-12-15

    Eliminating sample extraction or liquid chromatography steps from methods for analysis of the neurotoxin Domoic Acid (DA) in shellfish could greatly increase throughput in food safety testing laboratories worldwide. To this end, we have investigated the use of Laser Ablation Electrospray Ionization (LAESI) with tandem mass spectrometry (MS/MS) detection for DA analysis directly from mussel tissue homogenates without sample extraction, cleanup or separation. DA could be selectively detected directly from mussel tissue homogenates using MS/MS in selected reaction monitoring scan mode. The quantitative capabilities of LAESI-MS/MS for DA analysis from mussel tissue were evaluated by analysis of four mussel tissue reference materials using matrix-matched calibration. Linear response was observed from 1 mg/kg to 40 mg/kg and the method limit of detection was 1 mg/kg. Results for DA analysis in tissue within the linear range were in good agreement with two established methods, LC-UV and LC-MS/MS (recoveries from 103 to 125%). Beyond the linear range, extraction and clean-up were required to achieve good quantitation. Most notable is the extremely rapid analysis time of about 10 s per sample by LAESI-MS/MS, which corresponds to a significant increase in sample throughput compared with existing methodology for routine DA analysis. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
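
    The matrix-matched calibration used for quantitation can be illustrated with a short sketch: fit a straight line to the responses of DA-spiked tissue standards across the reported 1-40 mg/kg linear range and invert it for an unknown sample. The response values below are invented placeholders, not data from the study.

```python
import numpy as np

conc = np.array([1, 5, 10, 20, 40], float)                  # mg/kg DA in matrix-matched standards
response = np.array([210, 1020, 2100, 4150, 8300], float)   # SRM peak areas (placeholder values)

slope, intercept = np.polyfit(conc, response, 1)             # linear calibration curve
r2 = np.corrcoef(conc, response)[0, 1] ** 2

unknown_response = 3000.0                                    # placeholder reading for a sample
estimated_conc = (unknown_response - intercept) / slope      # invert the calibration
print(f"slope = {slope:.1f}, R^2 = {r2:.4f}, estimated DA = {estimated_conc:.1f} mg/kg")
```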

  18. Integrated Analysis Platform: An Open-Source Information System for High-Throughput Plant Phenotyping

    Science.gov (United States)

    Klukas, Christian; Chen, Dijun; Pape, Jean-Michel

    2014-01-01

    High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of data for both input data and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for build-in data plotting and result export. For validation of IAP, we performed an example experiment that contains 33 maize (Zea mays ‘Fernandez’) plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data in high accuracy up to 0.98 and 0.95, respectively. In summary, IAP provides a multiple set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. PMID:24760818

  19. Development of carbon plasma-coated multiwell plates for high-throughput mass spectrometric analysis of highly lipophilic fermentation products.

    Science.gov (United States)

    Heinig, Uwe; Scholz, Susanne; Dahm, Pia; Grabowy, Udo; Jennewein, Stefan

    2010-08-01

    Classical approaches to strain improvement and metabolic engineering rely on rapid qualitative and quantitative analyses of the metabolites of interest. As an analytical tool, mass spectrometry (MS) has proven to be efficient and nearly universally applicable for timely screening of metabolites. Furthermore, gas chromatography (GC)/MS- and liquid chromatography (LC)/MS-based metabolite screens can often be adapted to high-throughput formats. We recently engineered a Saccharomyces cerevisiae strain to produce taxa-4(5),11(12)-diene, the first pathway-committing biosynthetic intermediate for the anticancer drug Taxol, through the heterologous and homologous expression of several genes related to isoprenoid biosynthesis. To date, GC/MS- and LC/MS-based high-throughput methods have been inherently difficult to adapt to the screening of isoprenoid-producing microbial strains due to the need for extensive sample preparation of these often highly lipophilic compounds. In the current work, we examined different approaches to the high-throughput analysis of taxa-4(5),11(12)-diene biosynthesizing yeast strains in a 96-deep-well format. Carbon plasma coating of standard 96-deep-well polypropylene plates allowed us to circumvent the inherent solvent instability of commonly used deep-well plates. In addition, efficient adsorption of the target isoprenoid product by the coated plates allowed rapid and simple qualitative and quantitative analyses of the individual cultures. Copyright 2010 Elsevier Inc. All rights reserved.

  20. Sensitivity analysis of effective fluid and rock bulk modulus due to changes in pore pressure, temperature and saturation

    Science.gov (United States)

    Bhakta, Tuhin; Avseth, Per; Landrø, Martin

    2016-12-01

    Fluid substitution plays a vital role in time-lapse seismic modeling and interpretation. It is, therefore, very important to quantify as exactly as possible the changes in fluid bulk modulus due to changes in reservoir parameters. In this paper, we analyze the sensitivities in effective fluid bulk modulus due to changes in reservoir parameters like saturation, pore-pressure and temperature. The sensitivities are analyzed for two extreme bounds, i.e. the Voigt average and the Reuss average, for various fluid combinations (i.e. oil-water, gas-water and gas-oil). We quantify that the effects of pore-pressure and saturation changes are highest in the case of the gas-water combination, while the effect of temperature is highest for the oil-gas combination. Our results show that sensitivities vary with the bounds, even for the same amount of change in any reservoir parameter. In 4D rock physics studies, we often neglect the effects of pore-pressure or temperature changes assuming that those effects are negligible compared to the effect due to saturation change. Our analysis shows that pore-pressure and temperature changes can be vital and sometimes higher than the effect of saturation change. We investigate these effects on saturated rock bulk modulus. We first compute frame bulk modulus using the Modified Hashin Shtrikman (MHS) model for carbonate rocks and then perform fluid substitution using the Gassmann equation. We consider the upper bound of the MHS as the elastic behavior for stiffer rocks and the lower bound of the MHS as the elastic behavior for softer rocks. We then investigate four combinations: stiff rock with upper bound (the Voigt bound) as effective fluid modulus, stiff rock with lower bound (Reuss bound) as effective fluid modulus, soft rock with upper bound as effective fluid modulus and soft rock with lower bound as effective fluid modulus. Our results show that the effect of any reservoir parameter change is highest for the soft rock and lower bound combination and lowest
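
    The bounds-plus-Gassmann workflow described above rests on standard rock-physics relations, sketched below in Python: the Voigt and Reuss averages give the upper and lower effective fluid modulus for a gas-water mix, and the Gassmann equation converts the dry-rock modulus to a saturated modulus. All numerical inputs are illustrative, not values from the study.

```python
def voigt_fluid_modulus(saturations, moduli):
    """Upper (Voigt) bound: arithmetic average of the fluid bulk moduli."""
    return sum(s * k for s, k in zip(saturations, moduli))

def reuss_fluid_modulus(saturations, moduli):
    """Lower (Reuss) bound: harmonic average of the fluid bulk moduli."""
    return 1.0 / sum(s / k for s, k in zip(saturations, moduli))

def gassmann_ksat(k_dry, k_mineral, k_fluid, porosity):
    """Gassmann equation for the saturated-rock bulk modulus."""
    b = (1 - k_dry / k_mineral) ** 2
    d = porosity / k_fluid + (1 - porosity) / k_mineral - k_dry / k_mineral ** 2
    return k_dry + b / d

# Illustrative gas-water case (moduli in GPa, hypothetical numbers)
sats, kf = [0.3, 0.7], [0.04, 2.6]          # gas and water saturations and bulk moduli
for label, kfl in [("Voigt", voigt_fluid_modulus(sats, kf)),
                   ("Reuss", reuss_fluid_modulus(sats, kf))]:
    ksat = gassmann_ksat(k_dry=12.0, k_mineral=71.0, k_fluid=kfl, porosity=0.25)
    print(f"{label:5s} fluid modulus = {kfl:6.3f} GPa -> K_sat = {ksat:6.2f} GPa")
```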

  1. A discrete time model for the analysis of medium-throughput C. elegans growth data.

    Directory of Open Access Journals (Sweden)

    Marjolein V Smith

    Full Text Available BACKGROUND: As part of a program to predict the toxicity of environmental agents on human health using alternative methods, several in vivo high- and medium-throughput assays are being developed that use C. elegans as a model organism. C. elegans-based toxicological assays utilize the COPAS Biosort flow sorting system that can rapidly measure size, extinction (EXT) and time-of-flight (TOF) of individual nematodes. The use of this technology requires the development of mathematical and statistical tools to properly analyze the large volumes of biological data. METHODOLOGY/PRINCIPAL FINDINGS: A Markov model was developed that predicts the growth of populations of C. elegans. The model was developed using observations from a 60 h growth study in which five cohorts of 300 nematodes each were aspirated and measured every 12 h. Frequency distributions of log(EXT) measurements that were made when loading C. elegans L1 larvae into 96 well plates (t = 0 h) were used by the model to predict the frequency distributions of the same set of nematodes when measured at 12 h intervals. The model prediction coincided well with the biological observations confirming the validity of the model. The model was also applied to log(TOF) measurements following an adaptation. The adaptation accounted for variability in TOF measurements associated with potential curling or shortening of the nematodes as they passed through the flow cell of the Biosort. By providing accurate estimates of frequencies of EXT or TOF measurements following varying growth periods, the model was able to estimate growth rates. Best model fits showed that C. elegans did not grow at a constant exponential rate. Growth was best described with three different rates. Microscopic observations indicated that the points where the growth rates changed corresponded to specific developmental events: the L1/L2 molt and the start of oogenesis in young adult C. elegans. CONCLUSIONS: Quantitative analysis
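
    The discrete-time Markov idea can be illustrated with a toy example: bin the population into log(EXT) size classes and propagate the class frequencies with a stochastic transition matrix, one step per 12-h interval. The matrix and class count below are hypothetical and much coarser than the fitted model described in the paper.

```python
import numpy as np

# Hypothetical transition matrix: rows = current size class, columns = class
# after one 12-h growth step; each row sums to 1.
P = np.array([
    [0.2, 0.6, 0.2, 0.0, 0.0],
    [0.0, 0.3, 0.5, 0.2, 0.0],
    [0.0, 0.0, 0.4, 0.5, 0.1],
    [0.0, 0.0, 0.0, 0.6, 0.4],
    [0.0, 0.0, 0.0, 0.0, 1.0],
])

freq = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # all L1 larvae start in the smallest class
for step in range(5):                         # five 12-h intervals = 60 h of growth
    freq = freq @ P                           # one Markov step
    print(f"t = {12 * (step + 1):2d} h:", np.round(freq, 3))
```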

  2. Rapid High-throughput Species Identification of Botanical Material Using Direct Analysis in Real Time High Resolution Mass Spectrometry.

    Science.gov (United States)

    Lesiak, Ashton D; Musah, Rabi A

    2016-10-02

    We demonstrate that direct analysis in real time-high resolution mass spectrometry can be used to produce mass spectral profiles of botanical material, and that these chemical fingerprints can be used for plant species identification. The mass spectral data can be acquired rapidly and in a high throughput manner without the need for sample extraction, derivatization or pH adjustment steps. The use of this technique bypasses challenges presented by more conventional techniques including lengthy chromatography analysis times and resource intensive methods. The high throughput capabilities of the direct analysis in real time-high resolution mass spectrometry protocol, coupled with multivariate statistical analysis processing of the data, provide not only class characterization of plants, but also yield species and varietal information. Here, the technique is demonstrated with two psychoactive plant products, Mitragyna speciosa (Kratom) and Datura (Jimsonweed), which were subjected to direct analysis in real time-high resolution mass spectrometry followed by statistical analysis processing of the mass spectral data. The application of these tools in tandem enabled the plant materials to be rapidly identified at the level of variety and species.

  3. Post-high-throughput screening analysis: an empirical compound prioritization scheme.

    Science.gov (United States)

    Oprea, Tudor I; Bologa, Cristian G; Edwards, Bruce S; Prossnitz, Eric R; Sklar, Larry A

    2005-08-01

    An empirical scheme to evaluate and prioritize screening hits from high-throughput screening (HTS) is proposed. Negative scores are given when chemotypes found in the HTS hits are present in annotated databases such as MDDR and WOMBAT or for testing positive in toxicity-related experiments reported in TOXNET. Positive scores were given for higher measured biological activities, for testing negative in toxicity-related literature, and for good overlap when profiled against drug-related properties. Particular emphasis is placed on estimating aqueous solubility to prioritize in vivo experiments. This empirical scheme is given as an illustration to assist the decision-making process in selecting chemotypes and individual compounds for further experimentation, when confronted with multiple hits from high-throughput experiments. The decision-making process is discussed for a set of G-protein coupled receptor antagonists and validated on a literature example for dihydrofolate reductase inhibition.
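
    A sketch of this kind of empirical post-HTS scoring is shown below. The criteria mirror those named in the abstract (chemotype matches in annotated databases such as MDDR/WOMBAT, toxicity flags from TOXNET, measured potency, estimated aqueous solubility, drug-like property overlap), but the weights, field names and thresholds are hypothetical, not the published scheme.

```python
def prioritize(hit):
    """Toy additive score for an HTS hit; higher means more attractive for follow-up."""
    score = 0
    score -= 2 if hit.get("chemotype_in_annotated_db") else 0     # e.g. MDDR/WOMBAT match
    score -= 3 if hit.get("toxicity_alert") else 0                 # e.g. TOXNET-positive chemotype
    score += {"nM": 3, "uM": 1}.get(hit.get("potency_band"), 0)    # higher measured activity scores higher
    score += 1 if hit.get("predicted_solubility_ok") else -1       # estimated aqueous solubility
    score += 1 if hit.get("druglike_profile_ok") else 0            # drug-related property overlap
    return score

hits = [
    {"id": "cmpd-A", "potency_band": "nM", "predicted_solubility_ok": True,
     "druglike_profile_ok": True},
    {"id": "cmpd-B", "potency_band": "uM", "chemotype_in_annotated_db": True,
     "toxicity_alert": True, "predicted_solubility_ok": False},
]
for h in sorted(hits, key=prioritize, reverse=True):
    print(h["id"], prioritize(h))
```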

  4. HiCTMap: Detection and analysis of chromosome territory structure and position by high-throughput imaging.

    Science.gov (United States)

    Jowhar, Ziad; Gudla, Prabhakar R; Shachar, Sigal; Wangsa, Darawalee; Russ, Jill L; Pegoraro, Gianluca; Ried, Thomas; Raznahan, Armin; Misteli, Tom

    2018-02-10

    The spatial organization of chromosomes in the nuclear space is an extensively studied field that relies on measurements of structural features and 3D positions of chromosomes with high precision and robustness. However, no tools are currently available to image and analyze chromosome territories in a high-throughput format. Here, we have developed High-throughput Chromosome Territory Mapping (HiCTMap), a method for the robust and rapid analysis of 2D and 3D chromosome territory positioning in mammalian cells. HiCTMap is a high-throughput imaging-based chromosome detection method which enables routine analysis of chromosome structure and nuclear position. Using an optimized FISH staining protocol in a 384-well plate format in conjunction with a bespoke automated image analysis workflow, HiCTMap faithfully detects chromosome territories and their position in 2D and 3D in a large population of cells per experimental condition. We apply this novel technique to visualize chromosomes 18, X, and Y in male and female primary human skin fibroblasts, and show accurate detection of the correct number of chromosomes in the respective genotypes. Given the ability to visualize and quantitatively analyze large numbers of nuclei, we use HiCTMap to measure chromosome territory area and volume with high precision and determine the radial position of chromosome territories using either centroid or equidistant-shell analysis. The HiCTMap protocol is also compatible with RNA FISH as demonstrated by simultaneous labeling of X chromosomes and Xist RNA in female cells. We suggest HiCTMap will be a useful tool for routine precision mapping of chromosome territories in a wide range of cell types and tissues. Published by Elsevier Inc.

  5. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Directory of Open Access Journals (Sweden)

    Craig A Gedye

    Full Text Available Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are, however, currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell

  6. Stress Analysis of CFG Pile Composite Foundation in Consolidating Saturated Mine Tailings Dam

    Directory of Open Access Journals (Sweden)

    Jinxing Lai

    2016-01-01

    Full Text Available Cement fly-ash gravel (CFG) pile is a widely used ground reinforcement technique. This paper aims to address the mechanical characteristics of CFG composite foundation in consolidating saturated mine tailings (MTs) dam. The field static load tests were employed to explore the bearing capacity of the CFG composite foundation, and finite element (FE) models in three dimensions validated through comparison with experimental results were used to discuss the pile-soil stress distribution and pile-soil stress ratio of the CFG composite foundation. The results indicate that the distribution of earth pressure and pile stress is relatively homogeneous and stable over depth and load, while the development of CFG composite foundation bearing capacity is insufficient, in which the developed bearing capacity of CFG piles is less than 50% of its characteristic value. Additionally, compared with the laboratory model test results, the FEM result that the pile-soil stress ratio decreases with increasing load proved to conform better to the actual engineering conditions. Furthermore, the deformation modulus and thickness of cushion exert significant influence on pile-soil stress ratio and integral bearing capacity of CFG composite foundation.

  7. DAVID Knowledgebase: a gene-centered database integrating heterogeneous gene annotation resources to facilitate high-throughput gene functional analysis

    Directory of Open Access Journals (Sweden)

    Baseler Michael W

    2007-11-01

    Full Text Available Abstract Background Due to the complex and distributed nature of biological research, our current biological knowledge is spread over many redundant annotation databases maintained by many independent groups. Analysts usually need to visit many of these bioinformatics databases in order to integrate comprehensive annotation information for their genes, which becomes one of the bottlenecks, particularly for the analytic task associated with a large gene list. Thus, a highly centralized and ready-to-use gene-annotation knowledgebase is in demand for high throughput gene functional analysis. Description The DAVID Knowledgebase is built around the DAVID Gene Concept, a single-linkage method to agglomerate tens of millions of gene/protein identifiers from a variety of public genomic resources into DAVID gene clusters. The grouping of such identifiers improves the cross-reference capability, particularly across NCBI and UniProt systems, enabling more than 40 publicly available functional annotation sources to be comprehensively integrated and centralized by the DAVID gene clusters. The simple, pair-wise, text format files which make up the DAVID Knowledgebase are freely downloadable for various data analysis uses. In addition, a well organized web interface allows users to query different types of heterogeneous annotations in a high-throughput manner. Conclusion The DAVID Knowledgebase is designed to facilitate high throughput gene functional analysis. For a given gene list, it not only provides the quick accessibility to a wide range of heterogeneous annotation data in a centralized location, but also enriches the level of biological information for an individual gene. Moreover, the entire DAVID Knowledgebase is freely downloadable or searchable at http://david.abcc.ncifcrf.gov/knowledgebase/.

  8. EVALUATION OF SILICA MONOLITHS IN AFFINITY MICROCOLUMNS FOR HIGH-THROUGHPUT ANALYSIS OF DRUG-PROTEIN INTERACTIONS

    OpenAIRE

    Yoo, Michelle J.; Hage, David S.

    2009-01-01

    Silica monoliths in affinity microcolumns were tested for the high-throughput analysis of drug-protein interactions. Human serum albumin (HSA) was used as a model protein for this work, while carbamazepine and R-warfarin were used as model analytes. A comparison of HSA silica monoliths of various lengths indicated columns as short as 1 to 3 mm could be used to provide reproducible estimates of retention factors or plate heights. Benefits of using smaller columns for this work included the low...

  9. Quantitative digital image analysis of chromogenic assays for high throughput screening of alpha-amylase mutant libraries.

    Science.gov (United States)

    Shankar, Manoharan; Priyadharshini, Ramachandran; Gunasekaran, Paramasamy

    2009-08-01

    An image analysis-based method for high throughput screening of an alpha-amylase mutant library using chromogenic assays was developed. Assays were performed in microplates and high resolution images of the assay plates were read using the Virtual Microplate Reader (VMR) script to quantify the concentration of the chromogen. This method is fast and sensitive in quantifying 0.025-0.3 mg starch/ml as well as 0.05-0.75 mg glucose/ml. It was also an effective screening method for improved alpha-amylase activity with a coefficient of variance of 18%.

  10. Intelligent Approach for Analysis of Respiratory Signals and Oxygen Saturation in the Sleep Apnea/Hypopnea Syndrome

    Science.gov (United States)

    Moret-Bonillo, Vicente; Alvarez-Estévez, Diego; Fernández-Leal, Angel; Hernández-Pereira, Elena

    2014-01-01

    This work deals with the development of an intelligent approach for clinical decision making in the diagnosis of the Sleep Apnea/Hypopnea Syndrome, SAHS, from the analysis of respiratory signals and oxygen saturation in arterial blood, SaO2. In order to accomplish the task the proposed approach makes use of different artificial intelligence techniques and reasoning processes being able to deal with imprecise data. These reasoning processes are based on fuzzy logic and on temporal analysis of the information. The developed approach also takes into account the possibility of artifacts in the monitored signals. Detection and characterization of signal artifacts allows detection of false positives. Identification of relevant diagnostic patterns and temporal correlation of events is performed through the implementation of temporal constraints. PMID:25035712

  11. Packet Throughput Analysis of Static and Dynamic TDD in Small Cell Networks

    OpenAIRE

    Yang, Howard H.; Geraci, Giovanni; Zhong, Yi; Quek, Tony Q. S.

    2017-01-01

    We develop an analytical framework for the performance comparison of small cell networks operating under static time division duplexing (S-TDD) and dynamic TDD (D-TDD). While in S-TDD downlink/uplink (DL/UL) cell transmissions are synchronized, in D-TDD each cell dynamically allocates resources to the most demanding direction. By leveraging stochastic geometry and queuing theory, we derive closed-form expressions for the UL and DL packet throughput, also capturing the impact of random tra...

  12. Worst-case Throughput Analysis for Parametric Rate and Parametric Actor Execution Time Scenario-Aware Dataflow Graphs

    Directory of Open Access Journals (Sweden)

    Mladen Skelin

    2014-03-01

    Full Text Available Scenario-aware dataflow (SADF) is a prominent tool for modeling and analysis of dynamic embedded dataflow applications. In SADF the application is represented as a finite collection of synchronous dataflow (SDF) graphs, each of which represents one possible application behaviour or scenario. A finite state machine (FSM) specifies the possible orders of scenario occurrences. The SADF model renders the tightest possible performance guarantees, but is limited by its finiteness. This means that from a practical point of view, it can only handle dynamic dataflow applications that are characterized by a reasonably sized set of possible behaviours or scenarios. In this paper we remove this limitation for a class of SADF graphs by means of SADF model parametrization in terms of graph port rates and actor execution times. First, we formally define the semantics of the model relevant for throughput analysis based on (max,+) linear system theory and (max,+) automata. Second, by generalizing some of the existing results, we give the algorithms for worst-case throughput analysis of parametric rate and parametric actor execution time acyclic SADF graphs with a fully connected, possibly infinite state transition system. Third, we demonstrate our approach on a few realistic applications from the digital signal processing (DSP) domain mapped onto an embedded multi-processor architecture.
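
    For a single scenario (an ordinary SDF graph), the (max,+) machinery reduces to a classical result: the worst-case throughput is the reciprocal of the maximum cycle mean of the scenario's timing matrix. The sketch below computes that quantity with Karp's algorithm on a small hypothetical three-actor matrix (entries are delays; -inf marks absent dependencies); it illustrates the underlying algebra, not the parametric analysis developed in the paper.

```python
NEG_INF = float("-inf")

def max_cycle_mean(M, source=0):
    """Karp's maximum cycle mean of a (max,+) timing matrix (strongly connected graph)."""
    n = len(M)
    # D[k][v] = maximum weight of a k-edge path from `source` to v (NEG_INF if none)
    D = [[NEG_INF] * n for _ in range(n + 1)]
    D[0][source] = 0.0
    for k in range(1, n + 1):
        for v in range(n):
            for u in range(n):
                if D[k - 1][u] > NEG_INF and M[u][v] > NEG_INF:
                    D[k][v] = max(D[k][v], D[k - 1][u] + M[u][v])
    best = NEG_INF
    for v in range(n):
        if D[n][v] == NEG_INF:
            continue
        ratios = [(D[n][v] - D[k][v]) / (n - k) for k in range(n) if D[k][v] > NEG_INF]
        if ratios:
            best = max(best, min(ratios))
    return best

# Hypothetical 3-actor timing matrix: M[u][v] = delay between a firing of actor u
# and the next enabled firing of actor v; NEG_INF means no direct dependency.
M = [[2.0, NEG_INF, 1.0],
     [3.0, 1.0, NEG_INF],
     [NEG_INF, 4.0, 2.0]]
lam = max_cycle_mean(M)
print(f"maximum cycle mean = {lam:.3f}, worst-case throughput = {1 / lam:.3f} iterations per time unit")
```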

  13. High-throughput dental biofilm growth analysis for multiparametric microenvironmental biochemical conditions using microfluidics.

    Science.gov (United States)

    Lam, Raymond H W; Cui, Xin; Guo, Weijin; Thorsen, Todd

    2016-04-26

    Dental biofilm formation is not only a precursor to tooth decay, but also induces more serious systematic health problems such as cardiovascular disease and diabetes. Understanding the conditions promoting colonization and subsequent biofilm development involving complex bacteria coaggregation is particularly important. In this paper, we report a high-throughput microfluidic 'artificial teeth' device offering controls of multiple microenvironmental factors (e.g. nutrients, growth factors, dissolved gases, and seeded cell populations) for quantitative characteristics of long-term dental bacteria growth and biofilm development. This 'artificial teeth' device contains multiple (up to 128) incubation chambers to perform parallel cultivation and analyses (e.g. biofilm thickness, viable-dead cell ratio, and spatial distribution of multiple bacterial species) of bacteria samples under a matrix of different combinations of microenvironmental factors, further revealing possible developmental mechanisms of dental biofilms. Specifically, we applied the 'artificial teeth' to investigate the growth of two key dental bacteria, Streptococci species and Fusobacterium nucleatum, in the biofilm under different dissolved gas conditions and sucrose concentrations. Together, this high-throughput microfluidic platform can provide extended applications for general biofilm research, including screening of the biofilm properties developing under combinations of specified growth parameters such as seeding bacteria populations, growth medium compositions, medium flow rates and dissolved gas levels.

  14. Unraveling long non-coding RNAs through analysis of high-throughput RNA-sequencing data

    Directory of Open Access Journals (Sweden)

    Rashmi Tripathi

    2017-06-01

    Full Text Available Extensive genome-wide transcriptome studies mediated by high-throughput sequencing have revolutionized the study of genetics and epigenetics at unprecedented resolution. This research has revealed that, besides protein-coding RNAs, a large proportion of the mammalian transcriptome comprises regulatory non-protein-coding RNAs, the number encoded within the human genome still being uncertain. In the past, these non-coding RNAs were often dismissed as "dark matter" or "junk". Breaking that myth, RNA-seq, a recently developed experimental technique, is now widely used to study non-coding RNAs, which have acquired the limelight due to their physiological and pathological significance. The longest members of the ncRNA family, long non-coding RNAs, act as stable and functional parts of the genome, providing important clues about varied biological events such as the cellular and structural processes governing the complexity of an organism. Here, we review the most recent and influential computational approaches developed to identify and quantify long non-coding RNAs, helping users choose appropriate tools for their specific research. Keywords: Transcriptome, High throughput sequencing, Genetic and epigenetic, Long non-coding RNA, RNA-sequencing, RNA-seq

  15. Mining the structural genomics pipeline: identification of protein properties that affect high-throughput experimental analysis.

    Science.gov (United States)

    Goh, Chern-Sing; Lan, Ning; Douglas, Shawn M; Wu, Baolin; Echols, Nathaniel; Smith, Andrew; Milburn, Duncan; Montelione, Gaetano T; Zhao, Hongyu; Gerstein, Mark

    2004-02-06

    Structural genomics projects represent major undertakings that will change our understanding of proteins. They generate unique datasets that, for the first time, present a standardized view of proteins in terms of their physical and chemical properties. By analyzing these datasets here, we are able to discover correlations between a protein's characteristics and its progress through each stage of the structural genomics pipeline, from cloning, expression, purification, and ultimately to structural determination. First, we use tree-based analyses (decision trees and random forest algorithms) to discover the most significant protein features that influence a protein's amenability to high-throughput experimentation. Based on this, we identify potential bottlenecks in various stages of the structural genomics process through specialized "pipeline schematics". We find that the properties of a protein that are most significant are: (i) whether it is conserved across many organisms; (ii) the percentage composition of charged residues; (iii) the occurrence of hydrophobic patches; (iv) the number of binding partners it has; and (v) its length. Conversely, a number of other properties that might have been thought to be important, such as nuclear localization signals, are not significant. Thus, using our tree-based analyses, we are able to identify combinations of features that best differentiate the small group of proteins for which a structure has been determined from all the currently selected targets. This information may prove useful in optimizing high-throughput experimentation. Further information is available from http://mining.nesg.org/.
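
    A hedged sketch of this kind of tree-based analysis is given below, using scikit-learn's RandomForestClassifier on a toy table whose feature names mirror those called out in the abstract. The data, labels and model settings are synthetic placeholders, not the authors' dataset or pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([
    rng.integers(0, 2, n),          # conserved across many organisms (0/1)
    rng.uniform(0.1, 0.5, n),       # fraction of charged residues
    rng.integers(0, 5, n),          # number of hydrophobic patches
    rng.integers(0, 20, n),         # number of binding partners
    rng.integers(50, 1500, n),      # sequence length (residues)
])
# Toy target ("structure determined"): a synthetic rule plus 10% label noise
y = ((X[:, 0] == 1) & (X[:, 2] < 2) & (X[:, 4] < 800)).astype(int)
flip = rng.random(n) < 0.1
y = np.where(flip, 1 - y, y)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
names = ["conserved", "charged_fraction", "hydrophobic_patches", "binding_partners", "length"]
for name, imp in sorted(zip(names, clf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:20s} importance = {imp:.3f}")
```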

  16. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    Science.gov (United States)

    Gaber, Rok; Majerle, Andreja; Jerala, Roman; Benčina, Mojca

    2013-01-01

    To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs a ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity. PMID:24287545

  17. Theoretical analysis of quantum dot amplifiers with high saturation power and low noise figure

    DEFF Research Database (Denmark)

    Berg, Tommy Winther; Mørk, Jesper

    2002-01-01

    Semiconductor quantum dot amplifiers are predicted to exhibit superior characteristics such as high gain and output power, and low noise. The analysis provides criteria and design guidelines for the realization of high-quality amplifiers.

  18. Bulk segregant analysis by high-throughput sequencing reveals a novel xylose utilization gene from Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Jared W Wenger

    2010-05-01

    Full Text Available Fermentation of xylose is a fundamental requirement for the efficient production of ethanol from lignocellulosic biomass sources. Although they aggressively ferment hexoses, it has long been thought that native Saccharomyces cerevisiae strains cannot grow fermentatively or non-fermentatively on xylose. Population surveys have uncovered a few naturally occurring strains that are weakly xylose-positive, and some S. cerevisiae have been genetically engineered to ferment xylose, but no strain, either natural or engineered, has yet been reported to ferment xylose as efficiently as glucose. Here, we used a medium-throughput screen to identify Saccharomyces strains that can increase in optical density when xylose is presented as the sole carbon source. We identified 38 strains that have this xylose utilization phenotype, including strains of S. cerevisiae, other sensu stricto members, and hybrids between them. All the S. cerevisiae xylose-utilizing strains we identified are wine yeasts, and for those that could produce meiotic progeny, the xylose phenotype segregates as a single gene trait. We mapped this gene by Bulk Segregant Analysis (BSA) using tiling microarrays and high-throughput sequencing. The gene is a putative xylitol dehydrogenase, which we name XDH1, and is located in the subtelomeric region of the right end of chromosome XV in a region not present in the S288c reference genome. We further characterized the xylose phenotype by performing gene expression microarrays and by genetically dissecting the endogenous Saccharomyces xylose pathway. We have demonstrated that natural S. cerevisiae yeasts are capable of utilizing xylose as the sole carbon source, characterized the genetic basis for this trait as well as the endogenous xylose utilization pathway, and demonstrated the feasibility of BSA using high-throughput sequencing.

  19. Health effects of saturated and trans-fatty acid intake in children and adolescents: Systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Lisa Te Morenga

    Full Text Available Elevated cholesterol has been linked to cardiovascular disease in adults and preclinical markers of atherosclerosis in children; thus, reducing saturated fatty acid (SFA) and trans-fatty acid (TFA) intake from an early age may help to reduce cholesterol and the risk of cardiovascular disease later in life. The aim of this review is to examine the evidence for health effects associated with reducing SFA and TFA intake in free-living children, adolescents and young adults between 2 and 19 years of age. Systematic review and meta-analysis of randomised controlled trials (RCTs) and prospective cohort studies. Study selection, assessment, validity, data extraction, and analysis were undertaken as specified by the Cochrane Collaboration and the GRADE working group. Data were pooled using inverse variance models with random effects. EMBASE; PubMed; Cochrane Central Register of Controlled Trials; LILACS; and WHO Clinical Trial Registry (up to July 2016). RCTs involving dietary interventions aiming to reduce SFA or TFA intakes and a control group, and cohort studies reporting the effects of SFA or TFA exposures, on outcomes including blood lipids; measures of growth; blood pressure; insulin resistance; and potential adverse effects. Minimum duration was 13 days for RCTs and one year for cohort studies. Trials of weight loss or confounded by additional medical or lifestyle interventions were excluded. Compared with control diets, there was a highly statistically significant effect of reduced SFA intake on total cholesterol (mean difference (MD) -0.16 mmol/l [95% confidence interval (CI): -0.25 to -0.07]), LDL cholesterol (MD -0.13 mmol/l [95% CI: -0.22 to -0.03]) and diastolic blood pressure (MD -1.45 mm Hg [95% CI: -2.34 to -0.56]). There were no significant effects on any other risk factors and no evidence of adverse effects. Advice to reduce saturated fatty acid intake of children results in a significant reduction in total and LDL-cholesterol levels as well as diastolic blood

  20. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging

    Directory of Open Access Journals (Sweden)

    Piyush Pandey

    2017-08-01

    Full Text Available Image-based high-throughput plant phenotyping in greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract reflectance spectrum from each plant and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best followed by P, K, and S. The micronutrients group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient groups. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2). This study suggested
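
    The PLS-regression-plus-RPD evaluation described above can be sketched with scikit-learn on synthetic "reflectance spectra". The band count, component number and the nitrogen values below are placeholders, not data from the study; RPD is computed as the standard deviation of the reference values divided by the RMSE of prediction.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_plants, n_bands = 120, 200                      # e.g. 550-1,700 nm resampled to 200 bands
spectra = rng.normal(size=(n_plants, n_bands))
true_coeffs = rng.normal(size=n_bands) * 0.05
nitrogen = spectra @ true_coeffs + rng.normal(scale=0.2, size=n_plants)   # synthetic leaf N

X_tr, X_te, y_tr, y_te = train_test_split(spectra, nitrogen, random_state=0)
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

r2 = 1 - np.sum((y_te - pred) ** 2) / np.sum((y_te - y_te.mean()) ** 2)
rpd = np.std(y_te) / np.sqrt(np.mean((y_te - pred) ** 2))   # ratio of performance to deviation
print(f"R^2 = {r2:.2f}, RPD = {rpd:.2f}")
```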

  1. A method for high-throughput quantitative analysis of yeast chronological life span.

    Science.gov (United States)

    Murakami, Christopher J; Burtner, Christopher R; Kennedy, Brian K; Kaeberlein, Matt

    2008-02-01

    Chronological aging in yeast has been studied by maintaining cells in a quiescent-like stationary phase culture and monitoring cell survival over time. The composition of the growth medium can have a profound influence on chronological aging. For example, dietary restriction accomplished by lowering the glucose concentration of the medium significantly increases life span. Here we report a novel high-throughput method for measuring yeast chronological life span by monitoring outgrowth of aging cells using a Bioscreen C MBR machine. We show that this method provides survival data comparable to traditional methods, but with decreased variability. In addition to reducing the glucose concentration, we find that elevated amino acid levels or increased osmolarity of the growth medium is sufficient to increase chronological life span. We also report that life-span extension from dietary restriction does not require any of the five yeast sirtuins (Sir2, Hst1, Hst2, Hst3, or Hst4) either alone or in combination.
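
    In outgrowth-based chronological life span assays of this kind, viability at each age point is commonly inferred from how much later an aged culture reaches a reference OD than a fully viable one, since each doubling time of delay corresponds to roughly a two-fold loss of viable cells. A hedged sketch of that conversion (not the authors' exact software); the threshold times and doubling time are illustrative.

```python
import numpy as np

def survival_from_outgrowth(t_age, t_ref, doubling_time_h):
    """Estimate fractional survival from the delay in reaching a threshold OD.

    t_age, t_ref : time (h) for the aged and reference cultures to reach the
                   threshold OD in an outgrowth reader (e.g. a Bioscreen C).
    A delay of one doubling time implies ~2-fold fewer viable cells at inoculation.
    """
    delay = np.asarray(t_age, float) - t_ref
    return 2.0 ** (-delay / doubling_time_h)

# Hypothetical times-to-threshold for day 3, 7, and 11 cultures vs. a day-3 reference.
print(survival_from_outgrowth([10.0, 13.0, 18.0], t_ref=10.0, doubling_time_h=1.5))
# -> approximately [1.00, 0.25, 0.02]
```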

  2. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer

    2017-07-31

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used by the central node for data transmission to any remote node in case of the failure of any one of the FSO links. We develop a cross-layer Markov chain model to study the throughput from the central node to a tagged remote node. Numerical examples are presented to compare the performance of the proposed P2MP hybrid FSO/RF network with that of a P2MP FSO-only network and show that the P2MP hybrid FSO/RF network achieves considerable performance improvement over the P2MP FSO-only network.
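
    The paper's cross-layer Markov chain model is more detailed than can be reconstructed from the abstract. As a rough illustration of the idea only, the sketch below computes the long-run throughput of a single remote node whose FSO link alternates between up and down states, with the shared RF backup (assumed available with some probability and running at a lower rate) used while the FSO link is down. All rates and parameters are assumptions.

```python
def hybrid_throughput(lam_fail, mu_repair, r_fso, r_rf, p_rf_available):
    """Two-state availability sketch for one remote node.

    lam_fail       : FSO outage rate (1/h)
    mu_repair      : FSO recovery rate (1/h)
    r_fso, r_rf    : data rates on the FSO link and the backup RF link (b/s)
    p_rf_available : probability the shared RF backup is free when this node needs it
    """
    p_fso_up = mu_repair / (lam_fail + mu_repair)   # stationary probability FSO is "up"
    return p_fso_up * r_fso + (1 - p_fso_up) * p_rf_available * r_rf

# Illustrative numbers only: 1 Gb/s FSO link, 100 Mb/s shared RF backup.
print(hybrid_throughput(lam_fail=0.2, mu_repair=2.0,
                        r_fso=1.0e9, r_rf=1.0e8, p_rf_available=0.7))
```

    A FSO-only comparison is obtained by setting p_rf_available to zero, which mirrors the qualitative gap the paper reports between the hybrid and FSO-only networks.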

  3. High throughput holographic imaging-in-flow for the analysis of a wide plankton size range.

    Science.gov (United States)

    Yourassowsky, Catherine; Dubois, Frank

    2014-03-24

    We developed a Digital Holographic Microscope (DHM) working with a partially coherent source specifically adapted to perform high-throughput recording of holograms of plankton organisms in-flow, in a size range of 3 µm-300 µm, which is important for this kind of application. This wide size range is achieved with the same flow cell and the same microscope magnification. The DHM configuration combines a high magnification with a large field of view and provides high-resolution intensity and quantitative phase images with refocusing at high sample flow rates. Specific algorithms were developed to automatically detect and extract the particles and organisms present in the samples in order to build holograms of each one, which are used for holographic refocusing and quantitative phase contrast imaging. Experimental results are shown and discussed.

  4. Link Analysis of High Throughput Spacecraft Communication Systems for Future Science Missions

    Science.gov (United States)

    Simons, Rainee N.

    2015-01-01

    NASA's plan to launch several spacecraft into low Earth orbit (LEO) to support science missions in the next ten years and beyond requires downlink throughput on the order of several terabits per day. The ability to handle such a large volume of data far exceeds the capabilities of current systems. This paper proposes two solutions: first, a high data rate link between the LEO spacecraft and ground via relay satellites in geostationary orbit (GEO); second, a high data rate direct-to-ground link from LEO. Next, the paper presents results from computer simulations carried out for both types of links, taking into consideration spacecraft transmitter frequency, EIRP, and waveform; elevation-angle-dependent path loss through Earth's atmosphere; and ground station receiver G/T.
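
    The link analysis described here boils down to a standard budget in which the received carrier-to-noise-density ratio follows from the transmitter EIRP, the path and atmospheric losses, and the ground station G/T. A generic sketch of that calculation; the frequency, range, and loss values are placeholders, not the paper's cases.

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def c_over_n0_dbhz(eirp_dbw, path_loss_db, atm_loss_db, gt_dbk):
    """C/N0 [dB-Hz] = EIRP - losses + G/T - 10*log10(k_Boltzmann), with -10*log10(k) = 228.6."""
    return eirp_dbw - path_loss_db - atm_loss_db + gt_dbk + 228.6

# Illustrative LEO direct-to-ground link: 2,000 km slant range at 26 GHz.
fspl = free_space_path_loss_db(2.0e6, 26e9)
print(f"FSPL = {fspl:.1f} dB")
print(f"C/N0 = {c_over_n0_dbhz(eirp_dbw=50.0, path_loss_db=fspl, atm_loss_db=3.0, gt_dbk=30.0):.1f} dB-Hz")
```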

  5. High-throughput quantitative biochemical characterization of algal biomass by NIR spectroscopy; multiple linear regression and multivariate linear regression analysis.

    Science.gov (United States)

    Laurens, L M L; Wolfrum, E J

    2013-12-18

    One of the challenges associated with microalgal biomass characterization and the comparison of microalgal strains and conversion processes is the rapid determination of the composition of algae. We have developed and applied a high-throughput screening technology based on near-infrared (NIR) spectroscopy for the rapid and accurate determination of algal biomass composition. We show that NIR spectroscopy can accurately predict the full composition using multivariate linear regression analysis of varying lipid, protein, and carbohydrate content of algal biomass samples from three strains. We also demonstrate a high quality of predictions of an independent validation set. A high-throughput 96-well configuration for spectroscopy gives equally good prediction relative to a ring-cup configuration, and thus, spectra can be obtained from as little as 10-20 mg of material. We found that lipids exhibit a dominant, distinct, and unique fingerprint in the NIR spectrum that allows for the use of single and multiple linear regression of respective wavelengths for the prediction of the biomass lipid content. This is not the case for carbohydrate and protein content, and thus, the use of multivariate statistical modeling approaches remains necessary.
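
    The observation that lipid content can be predicted from a few characteristic NIR wavelengths, whereas protein and carbohydrate need full-spectrum multivariate models, can be illustrated with an ordinary multiple linear regression restricted to selected bands. The wavelengths, spectra, and reference values below are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
wavelengths = np.arange(900, 2500, 2)              # nm, hypothetical NIR grid
spectra = rng.normal(size=(80, wavelengths.size))  # stand-in absorbance spectra

# Assumed lipid-related bands (e.g. C-H overtone/combination regions), illustrative only.
selected_nm = [1210, 1720, 2310]
cols = [int(np.argmin(np.abs(wavelengths - nm))) for nm in selected_nm]

# Synthetic lipid content driven by those bands, standing in for reference composition data.
lipid = 2.0 * spectra[:, cols[0]] - 1.2 * spectra[:, cols[1]] + rng.normal(scale=0.2, size=80)

mlr = LinearRegression().fit(spectra[:, cols], lipid)
print("R2 of the selected-band model:", round(mlr.score(spectra[:, cols], lipid), 2))
```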

  6. High-throughput sequencing and graph-based cluster analysis facilitate microsatellite development from a highly complex genome.

    Science.gov (United States)

    Shah, Abhijeet B; Schielzeth, Holger; Albersmeier, Andreas; Kalinowski, Joern; Hoffman, Joseph I

    2016-08-01

    Despite recent advances in high-throughput sequencing, difficulties are often encountered when developing microsatellites for species with large and complex genomes. This probably reflects the close association in many species of microsatellites with cryptic repetitive elements. We therefore developed a novel approach for isolating polymorphic microsatellites from the club-legged grasshopper (Gomphocerus sibiricus), an emerging quantitative genetic and behavioral model system. Whole genome shotgun Illumina MiSeq sequencing was used to generate over three million 300 bp paired-end reads, of which 67.75% were grouped into 40,548 clusters within RepeatExplorer. Annotations of the top 468 clusters, which represent 60.5% of the reads, revealed homology to satellite DNA and a variety of transposable elements. Evaluating 96 primer pairs in eight wild-caught individuals, we found that primers mined from singleton reads were six times more likely to amplify a single polymorphic microsatellite locus than primers mined from clusters. Our study provides experimental evidence in support of the notion that microsatellites associated with repetitive elements are less likely to successfully amplify. It also reveals how advances in high-throughput sequencing and graph-based repetitive DNA analysis can be leveraged to isolate polymorphic microsatellites from complex genomes.

  7. A high-throughput screening system for barley/powdery mildew interactions based on automated analysis of light micrographs

    Directory of Open Access Journals (Sweden)

    Schweizer Patrick

    2008-01-01

    Full Text Available Abstract Background To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. Results We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Conclusion Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.

  8. Misconceptions in Reporting Oxygen Saturation

    NARCIS (Netherlands)

    Toffaletti, John; Zijlstra, Willem G.

    2007-01-01

    BACKGROUND: We describe some misconceptions that have become common practice in reporting blood gas and cooximetry results. In 1980, oxygen saturation was incorrectly redefined in a report of a new instrument for analysis of hemoglobin (Hb) derivatives. Oxygen saturation (sO(2)) was redefined as the

  9. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging.

    Science.gov (United States)

    Pandey, Piyush; Ge, Yufeng; Stoerger, Vincent; Schnable, James C

    2017-01-01

    Image-based high-throughput plant phenotyping in greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract reflectance spectrum from each plant and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R(2) = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R(2) from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best followed by P, K, and S. The micronutrients group showed lower prediction accuracy (R(2) from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient groups. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R(2) < 0.3 and RPD < 1.2) among the measured chemical traits. Future research is

  10. Gene Expression Analysis of Escherichia coli Grown in Miniaturized Bioreactor Platforms for High-Throughput Analysis of Growth and Genomic Data

    DEFF Research Database (Denmark)

    Boccazzi, P.; Zanzotto, A.; Szita, Nicolas

    2005-01-01

    Combining high-throughput growth physiology and global gene expression data analysis is of significant value for integrating metabolism and genomics. We compared global gene expression using 500 ng of total RNA from Escherichia coli cultures grown in rich or defined minimal media in a miniaturized....... In general, these changes in gene expression levels were similar to those observed in 1,000-fold larger cultures. The increasing rate at which complete genomic sequences of microorganisms are becoming available offers an unprecedented opportunity for investigating these organisms. Our results from microscale...... cultures using just 500 ng of total RNA indicate that high-throughput integration of growth physiology and genomics will be possible with novel biochemical platforms and improved detection technologies....

  11. Analysis of JC virus DNA replication using a quantitative and high-throughput assay.

    Science.gov (United States)

    Shin, Jong; Phelan, Paul J; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert; Gagnon, David; Gjoerup, Ole; Archambault, Jacques; Bullock, Peter A

    2014-11-01

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. High-throughput nucleotide sequence analysis of diverse bacterial communities in leachates of decomposing pig carcasses

    Directory of Open Access Journals (Sweden)

    Seung Hak Yang

    2015-09-01

    Full Text Available The leachate generated by the decomposition of animal carcasses has been implicated as an environmental contaminant surrounding the burial site. High-throughput nucleotide sequencing was conducted to investigate the bacterial communities in leachates from the decomposition of pig carcasses. We acquired 51,230 reads from six different samples (1-, 2-, 3-, 4-, 6- and 14-week-old carcasses) and found that sequences representing the phylum Firmicutes predominated. The diversity of bacterial 16S rRNA gene sequences in the leachate was the highest at 6 weeks, in contrast to those at 2 and 14 weeks. The relative abundance of Firmicutes was reduced, while the proportion of Bacteroidetes and Proteobacteria increased from 3–6 weeks. The representation of phyla was restored after 14 weeks. However, the community structures between the samples taken at 1–2 and 14 weeks differed at the bacterial classification level. The trend in pH was similar to the changes seen in bacterial communities, indicating that the pH of the leachate could be related to the shift in the microbial community. The results indicate that the composition of bacterial communities in leachates of decomposing pig carcasses shifted continuously during the study period and might be influenced by the burial site.

  13. Perchlorate reduction by hydrogen autotrophic bacteria and microbial community analysis using high-throughput sequencing.

    Science.gov (United States)

    Wan, Dongjin; Liu, Yongde; Niu, Zhenhua; Xiao, Shuhu; Li, Daorong

    2016-02-01

    Hydrogen autotrophic reduction of perchlorate has the advantages of high removal efficiency and being harmless to drinking water. However, the reported information about the microbial community structure has been comparatively limited, and changes in the biodiversity and the dominant bacteria during the acclimation process required detailed study. In this study, perchlorate-reducing hydrogen autotrophic bacteria were acclimated by hydrogen aeration from activated sludge. For the first time, high-throughput sequencing was applied to analyze changes in biodiversity and the dominant bacteria during the acclimation process. The Michaelis-Menten model described the perchlorate reduction kinetics well. Model parameters q(max) and K(s) were 2.521-3.245 (mg ClO4(-)/gVSS h) and 5.44-8.23 (mg/l), respectively. Microbial perchlorate reduction occurred across the pH range 5.0-11.0; removal was highest at pH 9.0. The enriched mixed bacteria could use perchlorate, nitrate and sulfate as electron acceptors, and the order of preference was: NO3(-) > ClO4(-) > SO4(2-). Compared to the feed culture, biodiversity decreased greatly during the acclimation process, and the microbial community structure gradually stabilized after 9 acclimation cycles. The Thauera genus, related to Rhodocyclales, was the dominant perchlorate-reducing bacteria (PRB) in the mixed culture.
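
    The kinetic description used here is the Michaelis-Menten form q = q_max * S / (K_s + S). A small sketch that fits those two parameters to rate-versus-concentration data with scipy; the batch data points are hypothetical, chosen only to fall in the reported parameter ranges.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, q_max, k_s):
    """Specific perchlorate reduction rate q (mg ClO4-/gVSS h) at substrate concentration s (mg/l)."""
    return q_max * s / (k_s + s)

# Hypothetical batch data: perchlorate concentration (mg/l) vs. observed specific rate.
s_obs = np.array([1, 2, 5, 10, 20, 40, 80], dtype=float)
q_obs = michaelis_menten(s_obs, q_max=3.0, k_s=6.5) \
        + np.random.default_rng(2).normal(0, 0.05, s_obs.size)

(q_max_fit, k_s_fit), _ = curve_fit(michaelis_menten, s_obs, q_obs, p0=[1.0, 1.0])
print(f"q_max = {q_max_fit:.2f} mg/gVSS h, K_s = {k_s_fit:.2f} mg/l")
```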

  14. Automating proteome analysis: improvements in throughput, quality and accuracy of protein identification by peptide mass fingerprinting.

    Science.gov (United States)

    Canelle, Ludovic; Pionneau, Cédric; Marie, Arul; Bousquet, Jordane; Bigeard, Jean; Lutomski, Didier; Kadri, Tewfik; Caron, Michel; Joubert-Caron, Raymonde

    2004-01-01

    The use of robots has major effects on maximizing the proteomic workflow required in an increasing number of high-throughput projects and on increasing the quality of the data. In peptide mass fingerprinting (PMF), automation of steps downstream of two-dimensional gel electrophoresis is essential. To achieve this goal, the workflow must be fluid. We have developed tools using macros written in Microsoft Excel and Word to complete the automation of our platform. Additionally, because sample preparation is crucial for identification of proteins by matrix-assisted laser desorption/ionization (MALDI) mass spectrometry, we optimized a sandwich method usable by any robot for spotting digests on a MALDI target. This procedure enables further efficient automated washing steps directly on the MALDI target. The success rate of PMF identification was evaluated for the automated sandwich method, and for the dried-droplet method implemented on the robot as recommended by the manufacturer. Of the two methods, the sandwich method achieved the highest identification success rate and sequence coverage of proteins. 2004 John Wiley & Sons, Ltd.

  15. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    Science.gov (United States)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert; Gagnon, David; Gjoerup, Ole; Archambault, Jacques; Bullock, Peter A.

    2015-01-01

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. PMID:25155200

  16. FLIC: high-throughput, continuous analysis of feeding behaviors in Drosophila.

    Directory of Open Access Journals (Sweden)

    Jennifer Ro

    Full Text Available We present a complete hardware and software system for collecting and quantifying continuous measures of feeding behaviors in the fruit fly, Drosophila melanogaster. The FLIC (Fly Liquid-Food Interaction Counter) detects analog electronic signals as brief as 50 µs that occur when a fly makes physical contact with liquid food. Signal characteristics effectively distinguish between different types of behaviors, such as feeding and tasting events. The FLIC system performs as well as or better than popular methods for simple assays, and it provides an unprecedented opportunity to study novel components of feeding behavior, such as time-dependent changes in food preference and individual levels of motivation and hunger. Furthermore, FLIC experiments can persist indefinitely without disturbance, and we highlight this ability by establishing a detailed picture of circadian feeding behaviors in the fly. We believe that the FLIC system will work hand-in-hand with modern molecular techniques to facilitate mechanistic studies of feeding behaviors in Drosophila using modern, high-throughput technologies.

  17. Next generation MUT-MAP, a high-sensitivity high-throughput microfluidics chip-based mutation analysis panel.

    Directory of Open Access Journals (Sweden)

    Erica B Schleifman

    Full Text Available Molecular profiling of tumor tissue to detect alterations, such as oncogenic mutations, plays a vital role in determining treatment options in oncology. Hence, there is an increasing need for a robust and high-throughput technology to detect oncogenic hotspot mutations. Although commercial assays are available to detect genetic alterations in single genes, only a limited amount of tissue is often available from patients, requiring multiplexing to allow for simultaneous detection of mutations in many genes using low DNA input. Even though next-generation sequencing (NGS) platforms provide powerful tools for this purpose, they face challenges such as high cost, large DNA input requirement, complex data analysis, and long turnaround times, limiting their use in clinical settings. We report the development of the next generation mutation multi-analyte panel (MUT-MAP), a high-throughput microfluidic panel for detecting 120 somatic mutations across eleven genes of therapeutic interest (AKT1, BRAF, EGFR, FGFR3, FLT3, HRAS, KIT, KRAS, MET, NRAS, and PIK3CA) using allele-specific PCR (AS-PCR) and Taqman technology. This mutation panel requires as little as 2 ng of high quality DNA from fresh frozen or 100 ng of DNA from formalin-fixed paraffin-embedded (FFPE) tissues. Mutation calling, including an automated data analysis process, has been implemented to run 88 samples per day. Validation of this platform using plasmids showed robust signal and low cross-reactivity in all of the newly added assays, and mutation calls in cell line samples were found to be consistent with the Catalogue of Somatic Mutations in Cancer (COSMIC) database, allowing for direct comparison of our platform to Sanger sequencing. High correlation with NGS when compared to the SuraSeq500 panel run on the Ion Torrent platform in an FFPE dilution experiment showed assay sensitivity down to 0.45%. This multiplexed mutation panel is a valuable tool for high-throughput biomarker discovery in

  18. Genomic saturation mutagenesis and polygenic analysis identify novel yeast genes affecting ethyl acetate production, a non-selectable polygenic trait

    Science.gov (United States)

    Abt, Tom Den; Souffriau, Ben; Foulquié-Moreno, Maria R.; Duitama, Jorge; Thevelein, Johan M.

    2016-01-01

    Isolation of mutants in populations of microorganisms has been a valuable tool in experimental genetics for decades. The main disadvantage, however, is the inability of isolating mutants in non-selectable polygenic traits. Most traits of organisms, however, are non-selectable and polygenic, including industrially important properties of microorganisms. The advent of powerful technologies for polygenic analysis of complex traits has allowed simultaneous identification of multiple causative mutations among many thousands of irrelevant mutations. We now show that this also applies to haploid strains of which the genome has been loaded with induced mutations so as to affect as many non-selectable, polygenic traits as possible. We have introduced about 900 mutations into single haploid yeast strains using multiple rounds of EMS mutagenesis, while maintaining the mating capacity required for genetic mapping. We screened the strains for defects in flavor production, an important non-selectable, polygenic trait in yeast alcoholic beverage production. A haploid strain with multiple induced mutations showing reduced ethyl acetate production in semi-anaerobic fermentation, was selected and the underlying quantitative trait loci (QTLs) were mapped using pooled-segregant whole-genome sequence analysis after crossing with an unrelated haploid strain. Reciprocal hemizygosity analysis and allele exchange identified PMA1 and CEM1 as causative mutant alleles and TPS1 as a causative genetic background allele. The case of CEM1 revealed that relevant mutations without observable effect in the haploid strain with multiple induced mutations (in this case due to defective mitochondria) can be identified by polygenic analysis as long as the mutations have an effect in part of the segregants (in this case those that regained fully functional mitochondria). Our results show that genomic saturation mutagenesis combined with complex trait polygenic analysis could be used successfully to

  19. HTSSIP: An R package for analysis of high throughput sequencing data from nucleic acid stable isotope probing (SIP) experiments.

    Science.gov (United States)

    Youngblut, Nicholas D; Barnett, Samuel E; Buckley, Daniel H

    2018-01-01

    Combining high throughput sequencing with stable isotope probing (HTS-SIP) is a powerful method for mapping in situ metabolic processes to thousands of microbial taxa. However, accurately mapping metabolic processes to taxa is complex and challenging. Multiple HTS-SIP data analysis methods have been developed, including high-resolution stable isotope probing (HR-SIP), multi-window high-resolution stable isotope probing (MW-HR-SIP), quantitative stable isotope probing (qSIP), and ΔBD. Currently, there is no publicly available software designed specifically for analyzing HTS-SIP data. To address this shortfall, we have developed the HTSSIP R package, an open-source, cross-platform toolset for conducting HTS-SIP analyses in a straightforward and easily reproducible manner. The HTSSIP package, along with full documentation and examples, is available from CRAN at https://cran.r-project.org/web/packages/HTSSIP/index.html and Github at https://github.com/buckleylab/HTSSIP.

  20. Pathway Processor 2.0: a web resource for pathway-based analysis of high-throughput data.

    Science.gov (United States)

    Beltrame, Luca; Bianco, Luca; Fontana, Paolo; Cavalieri, Duccio

    2013-07-15

    Pathway Processor 2.0 is a web application designed to analyze high-throughput datasets, including but not limited to microarray and next-generation sequencing, using a pathway centric logic. In addition to well-established methods such as the Fisher's test and impact analysis, Pathway Processor 2.0 offers innovative methods that convert gene expression into pathway expression, leading to the identification of differentially regulated pathways in a dataset of choice. Pathway Processor 2.0 is available as a web service at http://compbiotoolbox.fmach.it/pathwayProcessor/. Sample datasets to test the functionality can be used directly from the application. duccio.cavalieri@fmach.it Supplementary data are available at Bioinformatics online.
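
    The Fisher's test mentioned here is the classic pathway over-representation test: a 2x2 table of differentially expressed versus background genes, inside versus outside a pathway. A minimal sketch with scipy; the counts are made up for illustration.

```python
from scipy.stats import fisher_exact

def pathway_enrichment(de_in_pathway, de_total, pathway_size, genome_size):
    """One-sided Fisher's exact test for pathway over-representation."""
    table = [
        [de_in_pathway, pathway_size - de_in_pathway],              # in pathway: DE / not DE
        [de_total - de_in_pathway,
         genome_size - pathway_size - (de_total - de_in_pathway)],  # outside pathway
    ]
    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    return odds_ratio, p_value

# Hypothetical counts: 30 of 400 DE genes fall in a 120-gene pathway of a 6,000-gene genome.
print(pathway_enrichment(de_in_pathway=30, de_total=400, pathway_size=120, genome_size=6000))
```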

  1. High-throughput analysis of lipid hydroperoxides in edible oils and fats using the fluorescent reagent diphenyl-1-pyrenylphosphine.

    Science.gov (United States)

    Santas, Jonathan; Guzmán, Yeimmy J; Guardiola, Francesc; Rafecas, Magdalena; Bou, Ricard

    2014-11-01

    A fluorometric method for the determination of hydroperoxides (HP) in edible oils and fats using the reagent diphenyl-1-pyrenylphosphine (DPPP) was developed and validated. Two solvent media containing 100% butanol or a mixture of chloroform/methanol (2:1, v/v) can be used to solubilise lipid samples. Regardless of the solvent used to solubilise the sample, the DPPP method was precise, accurate, sensitive and easy to perform. The HP content of 43 oil and fat samples was determined and the results were compared with those obtained by means of the AOCS Official Method for the determination of peroxide value (PV) and the ferrous oxidation-xylenol orange (FOX) method. The proposed method not only correlates well with the PV and FOX methods, but also presents some advantages such as requiring low sample and solvent amounts and being suitable for high-throughput sample analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
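
    Fluorometric hydroperoxide quantification of this kind typically rests on a linear calibration of fluorescence intensity against HP standards, which is then inverted for unknown samples. A generic sketch of that calibration step; the standard concentrations, readings, and blank handling are invented, not the paper's values.

```python
import numpy as np

# Hypothetical calibration: HP standard (nmol per well) vs. blank-corrected fluorescence.
std_nmol = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
fluorescence = np.array([5.0, 55.0, 104.0, 210.0, 395.0, 810.0])

slope, intercept = np.polyfit(std_nmol, fluorescence, 1)   # linear calibration curve

def hp_in_sample(sample_fluorescence, sample_mass_g):
    """Convert a sample reading to hydroperoxide content (nmol per g of oil)."""
    nmol = (sample_fluorescence - intercept) / slope
    return nmol / sample_mass_g

print(round(hp_in_sample(sample_fluorescence=320.0, sample_mass_g=0.02), 1))
```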

  2. ImagePlane: an automated image analysis pipeline for high-throughput screens using the planarian Schmidtea mediterranea.

    Science.gov (United States)

    Flygare, Steven; Campbell, Michael; Ross, Robert Mars; Moore, Barry; Yandell, Mark

    2013-08-01

    ImagePlane is a modular pipeline for automated, high-throughput image analysis and information extraction. Designed to support planarian research, ImagePlane offers a self-parameterizing adaptive thresholding algorithm; an algorithm that can automatically segment animals into anterior-posterior/left-right quadrants for automated identification of region-specific differences in gene and protein expression; and a novel algorithm for quantification of morphology of animals, independent of their orientations and sizes. ImagePlane also provides methods for automatic report generation, and its outputs can be easily imported into third-party tools such as R and Excel. Here we demonstrate the pipeline's utility for identification of genes involved in stem cell proliferation in the planarian Schmidtea mediterranea. Although designed to support planarian studies, ImagePlane will prove useful for cell-based studies as well.

  3. High-throughput analysis of sub-visible mAb aggregate particles using automated fluorescence microscopy imaging.

    Science.gov (United States)

    Paul, Albert Jesuran; Bickel, Fabian; Röhm, Martina; Hospach, Lisa; Halder, Bettina; Rettich, Nina; Handrick, René; Herold, Eva Maria; Kiefer, Hans; Hesse, Friedemann

    2017-07-01

    Aggregation of therapeutic proteins is a major concern as aggregates lower the yield and can impact the efficacy of the drug as well as the patient's safety. It can occur in all production stages; thus, it is essential to perform a detailed analysis for protein aggregates. Several methods such as size exclusion high-performance liquid chromatography (SE-HPLC), light scattering, turbidity, light obscuration, and microscopy-based approaches are used to analyze aggregates. None of these methods allows determination of all types of higher molecular weight (HMW) species due to a limited size range. Furthermore, quantification and specification of different HMW species are often not possible. Moreover, automation is a prospective challenge arising with the advent of automated robotic laboratory systems. Hence, there is a need for a fast, high-throughput-compatible method, which can detect a broad size range and enable quantification and classification. We describe a novel approach for the detection of aggregates in the size range 1 to 1000 μm combining fluorescent dyes for protein aggregate labelling and automated fluorescence microscope imaging (aFMI). After appropriate selection of the dye and method optimization, our method enabled us to detect various types of HMW species of monoclonal antibodies (mAbs). Using 10 μmol L(-1) 4,4'-dianilino-1,1'-binaphthyl-5,5'-disulfonate (Bis-ANS) in combination with aFMI allowed the analysis of mAb aggregates induced by different stresses occurring during downstream processing, storage, and administration. Validation of our results was performed by SE-HPLC, UV-Vis spectroscopy, and dynamic light scattering. With this new approach, we could not only reliably detect different HMW species but also quantify and classify them in an automated approach. Our method meets high-throughput requirements, and the selection of various fluorescent dyes enables a broad range of applications.
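
    The core of a fluorescence-image particle count of this sort is thresholding the dye channel, labelling connected components, and converting each component's area to an equivalent circular diameter so particles can be binned into size classes. A hedged sketch with numpy/scipy; the threshold, pixel size, and image are placeholders, not the authors' pipeline.

```python
import numpy as np
from scipy import ndimage

def particle_sizes(image, threshold, um_per_pixel):
    """Return equivalent circular diameters (µm) of bright particles in a 2-D fluorescence image."""
    mask = image > threshold                        # simple global threshold on intensity
    labels, n = ndimage.label(mask)                 # connected-component labelling
    areas_px = ndimage.sum(mask, labels, index=range(1, n + 1))
    areas_um2 = np.asarray(areas_px) * um_per_pixel**2
    return np.sqrt(4.0 * areas_um2 / np.pi)         # area -> equivalent diameter

# Synthetic stand-in image with two bright blobs.
img = np.zeros((200, 200))
img[50:60, 50:60] = 1.0
img[120:160, 100:140] = 1.0
print(np.round(particle_sizes(img, threshold=0.5, um_per_pixel=2.0), 1))
# The resulting diameters would then be binned into size classes (e.g. 1-1000 µm).
```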

  4. Strategies for high-throughput comparative modeling: applications to leverage analysis in structural genomics and protein family organization.

    Science.gov (United States)

    Mirkovic, Nebojsa; Li, Zhaohui; Parnassa, Andrew; Murray, Diana

    2007-03-01

    The technological breakthroughs in structural genomics were designed to facilitate the solution of a sufficient number of structures, so that as many protein sequences as possible can be structurally characterized with the aid of comparative modeling. The leverage of a solved structure is the number and quality of the models that can be produced using the structure as a template for modeling and may be viewed as the "currency" with which the success of a structural genomics endeavor can be measured. Moreover, the models obtained in this way should be valuable to all biologists. To this end, at the Northeast Structural Genomics Consortium (NESG), a modular computational pipeline for automated high-throughput leverage analysis was devised and used to assess the leverage of the 186 unique NESG structures solved during the first phase of the Protein Structure Initiative (January 2000 to July 2005). Here, the results of this analysis are presented. The number of sequences in the nonredundant protein sequence database covered by quality models produced by the pipeline is approximately 39,000, so that the average leverage is approximately 210 models per structure. Interestingly, only 7900 of these models fulfill the stringent modeling criterion of being at least 30% sequence-identical to the corresponding NESG structures. This study shows how high-throughput modeling increases the efficiency of structure determination efforts by providing enhanced coverage of protein structure space. In addition, the approach is useful in refining the boundaries of structural domains within larger protein sequences, subclassifying sequence diverse protein families, and defining structure-based strategies specific to a particular family. (c) 2006 Wiley-Liss, Inc.

  5. Facts about saturated fats

    Science.gov (United States)

    ... fat dairy with low-fat or nonfat milk, yogurt, and cheese. Eat more fruits, vegetables, whole grains, and other foods with low or no saturated fat. Alternative Names Cholesterol - saturated fat; Atherosclerosis - saturated fat; Hardening of the ...

  6. Saturated fat (image)

    Science.gov (United States)

    Saturated fat can raise blood cholesterol and can put you at risk for heart disease and stroke. You should ... limit any foods that are high in saturated fat. Sources of saturated fat include whole-milk dairy ...

  7. Construction and analysis of an integrated regulatory network derived from high-throughput sequencing data.

    Directory of Open Access Journals (Sweden)

    Chao Cheng

    2011-11-01

    Full Text Available We present a network framework for analyzing multi-level regulation in higher eukaryotes based on systematic integration of various high-throughput datasets. The network, namely the integrated regulatory network, consists of three major types of regulation: TF→gene, TF→miRNA and miRNA→gene. We identified the target genes and target miRNAs for a set of TFs based on ChIP-Seq binding profiles, and the predicted targets of miRNAs using annotated 3'UTR sequences and conservation information. Making use of the system-wide RNA-Seq profiles, we classified transcription factors into positive and negative regulators and assigned a sign for each regulatory interaction. Other types of edges such as protein-protein interactions and potential intra-regulations between miRNAs based on the embedding of miRNAs in their host genes were further incorporated. We examined the topological structures of the network, including its hierarchical organization and motif enrichment. We found that transcription factors downstream of the hierarchy distinguish themselves by expressing more uniformly across tissues, having more interacting partners, and being more likely to be essential. We found an over-representation of notable network motifs, including a FFL in which a miRNA cost-effectively shuts down a transcription factor and its target. We used data of C. elegans from the modENCODE project as a primary model to illustrate our framework, but further verified the results using two other data sets. As more and more genome-wide ChIP-Seq and RNA-Seq data become available in the near future, our methods of data integration have various potential applications.
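
    The motif search described here, for example the feed-forward loop in which a miRNA targets both a transcription factor and that factor's target gene, amounts to counting triads with edges A→B, B→C and A→C in the typed regulatory graph. A minimal sketch with networkx; the node names, types, and edges are invented.

```python
import networkx as nx

# Toy integrated network: edge types TF->gene, TF->miRNA, miRNA->gene, miRNA->TF.
g = nx.DiGraph()
g.add_edges_from([
    ("miR-1", "TF-A"), ("miR-1", "gene-X"), ("TF-A", "gene-X"),   # miRNA-driven FFL
    ("TF-B", "miR-2"), ("TF-B", "gene-Y"), ("miR-2", "gene-Y"),   # TF-driven FFL
])
node_type = {"miR-1": "miRNA", "miR-2": "miRNA", "TF-A": "TF", "TF-B": "TF",
             "gene-X": "gene", "gene-Y": "gene"}

def feed_forward_loops(graph):
    """Yield (a, b, c) triples with edges a->b, b->c and a->c."""
    for a, b in graph.edges():
        for c in graph.successors(b):
            if c != a and graph.has_edge(a, c):
                yield (a, b, c)

ffls = list(feed_forward_loops(g))
mirna_ffls = [t for t in ffls if node_type[t[0]] == "miRNA"]
print(ffls)        # both loops
print(mirna_ffls)  # the loop in which a miRNA regulates both a TF and its target gene
```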

  8. Analysis of high-throughput sequencing and annotation strategies for phage genomes.

    Directory of Open Access Journals (Sweden)

    Matthew R Henn

    Full Text Available BACKGROUND: Bacterial viruses (phages) play a critical role in shaping microbial populations as they influence both host mortality and horizontal gene transfer. As such, they have a significant impact on local and global ecosystem function and human health. Despite their importance, little is known about the genomic diversity harbored in phages, as methods to capture complete phage genomes have been hampered by the lack of knowledge about the target genomes, and difficulties in generating sufficient quantities of genomic DNA for sequencing. Of the approximately 550 phage genomes currently available in the public domain, fewer than 5% are marine phage. METHODOLOGY/PRINCIPAL FINDINGS: To advance the study of phage biology through comparative genomic approaches we used marine cyanophage as a model system. We compared DNA preparation methodologies (DNA extraction directly from either phage lysates or CsCl purified phage particles), and sequencing strategies that utilize either Sanger sequencing of a linker amplification shotgun library (LASL) or of a whole genome shotgun library (WGSL), or 454 pyrosequencing methods. We demonstrate that genomic DNA sample preparation directly from a phage lysate, combined with 454 pyrosequencing, is best suited for phage genome sequencing at scale, as this method is capable of capturing complete continuous genomes with high accuracy. In addition, we describe an automated annotation informatics pipeline that delivers high-quality annotation and yields few false positives and negatives in ORF calling. CONCLUSIONS/SIGNIFICANCE: These DNA preparation, sequencing and annotation strategies enable a high-throughput approach to the burgeoning field of phage genomics.

  9. High-throughput mutational analysis of TOR1A in primary dystonia.

    Science.gov (United States)

    Xiao, Jianfeng; Bastian, Robert W; Perlmutter, Joel S; Racette, Brad A; Tabbal, Samer D; Karimi, Morvarid; Paniello, Randal C; Blitzer, Andrew; Batish, Sat Dev; Wszolek, Zbigniew K; Uitti, Ryan J; Hedera, Peter; Simon, David K; Tarsy, Daniel; Truong, Daniel D; Frei, Karen P; Pfeiffer, Ronald F; Gong, Suzhen; Zhao, Yu; LeDoux, Mark S

    2009-03-11

    Although the c.904_906delGAG mutation in Exon 5 of TOR1A typically manifests as early-onset generalized dystonia, DYT1 dystonia is genetically and clinically heterogeneous. Recently, another Exon 5 mutation (c.863G>A) has been associated with early-onset generalized dystonia and some DeltaGAG mutation carriers present with late-onset focal dystonia. The aim of this study was to identify TOR1A Exon 5 mutations in a large cohort of subjects with mainly non-generalized primary dystonia. High resolution melting (HRM) was used to examine the entire TOR1A Exon 5 coding sequence in 1014 subjects with primary dystonia (422 spasmodic dysphonia, 285 cervical dystonia, 67 blepharospasm, 41 writer's cramp, 16 oromandibular dystonia, 38 other primary focal dystonia, 112 segmental dystonia, 16 multifocal dystonia, and 17 generalized dystonia) and 250 controls (150 neurologically normal and 100 with other movement disorders). Diagnostic sensitivity and specificity were evaluated in an additional 8 subjects with known DeltaGAG DYT1 dystonia and 88 subjects with DeltaGAG-negative dystonia. HRM of TOR1A Exon 5 showed high (100%) diagnostic sensitivity and specificity. HRM was rapid and economical. HRM reliably differentiated the TOR1A DeltaGAG and c.863G>A mutations. Melting curves were normal in 250/250 controls and 1012/1014 subjects with primary dystonia. The two subjects with shifted melting curves were found to harbor the classic DeltaGAG deletion: 1) a non-Jewish Caucasian female with childhood-onset multifocal dystonia and 2) an Ashkenazi Jewish female with adolescent-onset spasmodic dysphonia. First, HRM is an inexpensive, diagnostically sensitive and specific, high-throughput method for mutation discovery. Second, Exon 5 mutations in TOR1A are rarely associated with non-generalized primary dystonia.

  10. High-throughput mutational analysis of TOR1A in primary dystonia

    Directory of Open Access Journals (Sweden)

    Truong Daniel D

    2009-03-01

    Full Text Available Abstract Background Although the c.904_906delGAG mutation in Exon 5 of TOR1A typically manifests as early-onset generalized dystonia, DYT1 dystonia is genetically and clinically heterogeneous. Recently, another Exon 5 mutation (c.863G>A) has been associated with early-onset generalized dystonia and some ΔGAG mutation carriers present with late-onset focal dystonia. The aim of this study was to identify TOR1A Exon 5 mutations in a large cohort of subjects with mainly non-generalized primary dystonia. Methods High resolution melting (HRM) was used to examine the entire TOR1A Exon 5 coding sequence in 1014 subjects with primary dystonia (422 spasmodic dysphonia, 285 cervical dystonia, 67 blepharospasm, 41 writer's cramp, 16 oromandibular dystonia, 38 other primary focal dystonia, 112 segmental dystonia, 16 multifocal dystonia, and 17 generalized dystonia) and 250 controls (150 neurologically normal and 100 with other movement disorders). Diagnostic sensitivity and specificity were evaluated in an additional 8 subjects with known ΔGAG DYT1 dystonia and 88 subjects with ΔGAG-negative dystonia. Results HRM of TOR1A Exon 5 showed high (100%) diagnostic sensitivity and specificity. HRM was rapid and economical. HRM reliably differentiated the TOR1A ΔGAG and c.863G>A mutations. Melting curves were normal in 250/250 controls and 1012/1014 subjects with primary dystonia. The two subjects with shifted melting curves were found to harbor the classic ΔGAG deletion: 1) a non-Jewish Caucasian female with childhood-onset multifocal dystonia and 2) an Ashkenazi Jewish female with adolescent-onset spasmodic dysphonia. Conclusion First, HRM is an inexpensive, diagnostically sensitive and specific, high-throughput method for mutation discovery. Second, Exon 5 mutations in TOR1A are rarely associated with non-generalized primary dystonia.

  11. Transcriptomic analysis of Petunia hybrida in response to salt stress using high throughput RNA sequencing.

    Directory of Open Access Journals (Sweden)

    Gonzalo H Villarino

    Full Text Available Salinity and drought stress are the primary causes of crop losses worldwide. In sodic saline soils, sodium chloride (NaCl) disrupts normal plant growth and development. The complex interactions of plant systems with abiotic stress have made RNA sequencing a more holistic and appealing approach to study transcriptome-level responses in a single cell and/or tissue. In this work, we determined the Petunia transcriptome response to NaCl stress by sequencing leaf samples and assembling 196 million Illumina reads with Trinity software. Using our reference transcriptome, we identified more than 7,000 genes that were differentially expressed within 24 h of acute NaCl stress. The proposed transcriptome can also be used as an excellent tool for biological and bioinformatics analyses in the absence of an available Petunia genome, and it is available at the SOL Genomics Network (SGN; http://solgenomics.net). Genes related to regulation of reactive oxygen species, transport, and signal transduction as well as novel and undescribed transcripts were among those differentially expressed in response to salt stress. The candidate genes identified in this study can be applied as markers for breeding or to genetically engineer plants to enhance salt tolerance. Gene Ontology analyses indicated that most of the NaCl damage happened at 24 h, inducing genotoxicity and affecting transport and organelles due to the high concentration of Na+ ions. Finally, we report a modification to the library preparation protocol whereby cDNA samples were bar-coded with non-HPLC purified primers, without affecting the quality and quantity of the RNA-seq data. The methodological improvement presented here could substantially reduce the cost of sample preparation for future high-throughput RNA sequencing experiments.

  12. High-throughput transcriptome analysis of barley (Hordeum vulgare) exposed to excessive boron.

    Science.gov (United States)

    Tombuloglu, Guzin; Tombuloglu, Huseyin; Sakcali, M Serdal; Unver, Turgay

    2015-02-15

    Boron (B) is an essential micronutrient for optimum plant growth. However, above a certain threshold B is toxic and causes yield loss in agricultural lands. While a number of studies have been conducted to understand the B tolerance mechanism, a transcriptome-wide approach for B-tolerant barley is performed here for the first time. A high-throughput RNA-Seq (cDNA) sequencing technology (Illumina) was used with barley (Hordeum vulgare), yielding 208 million clean reads. In total, 256,874 unigenes were generated and assigned to known peptide databases: Gene Ontology (GO) (99,043), Swiss-Prot (38,266), Clusters of Orthologous Groups (COG) (26,250), and the Kyoto Encyclopedia of Genes and Genomes (KEGG) (36,860), as determined by BLASTx search. According to the digital gene expression (DGE) analyses, 16% and 17% of the transcripts were found to be differentially regulated in root and leaf tissues, respectively. Most of them were involved in cell wall, stress response, membrane, protein kinase and transporter mechanisms. Some of the genes detected as highly expressed in root tissue are phospholipases, predicted divalent heavy-metal cation transporters, formin-like proteins and calmodulin/Ca(2+)-binding proteins. In addition, chitin-binding lectin precursor, ubiquitin carboxyl-terminal hydrolase, and serine/threonine-protein kinase AFC2 genes were indicated to be highly regulated in leaf tissue upon excess B treatment. Some pathways, such as the Ca(2+)-calmodulin system, are activated in response to B toxicity. The differential regulation of 10 transcripts was confirmed by qRT-PCR, revealing the tissue-specific responses against B toxicity and their putative function in B-tolerance mechanisms. Copyright © 2014. Published by Elsevier B.V.

  13. Rapid screening of classic galactosemia patients: a proof-of-concept study using high-throughput FTIR analysis of plasma.

    Science.gov (United States)

    Lacombe, Caroline; Untereiner, Valérie; Gobinet, Cyril; Zater, Mokhtar; Sockalingum, Ganesh D; Garnotel, Roselyne

    2015-04-07

    Classic galactosemia is an autosomal recessive metabolic disease involving the galactose pathway, caused by the deficiency of galactose-1-phosphate uridyltransferase. Galactose accumulation induces many symptoms in newborns, such as liver disease, cataracts, and sepsis leading to death if untreated. Neonatal screening has been developed and applied in many countries using several methods to detect the accumulation of galactose or its derived products in blood or urine. High-throughput FTIR spectroscopy was investigated as a potential addition to the current screening methods. IR spectra were obtained from blood plasma of healthy, diabetic, and galactosemic patients. The major spectral differences were in the carbohydrate region, which was first analysed in an exploratory manner using principal component analysis (PCA). PCA score plots showed a clear discrimination between diabetic and galactosemic patients, and this was more marked with increasing glucose and galactose concentrations in these patients' plasma, respectively. Then, a support vector machine leave-one-out cross-validation (SVM-LOOCV) classifier was built with the PCA scores as the input and the model was tested on median, mean and all spectra from the three population groups. This classifier was able to discriminate healthy/diabetic, healthy/galactosemic, and diabetic/galactosemic patients with sensitivity and specificity rates ranging from 80% to 94%. The total accuracy rate ranged from 87% to 96%. High-throughput FTIR spectroscopy combined with the SVM-LOOCV classification procedure appears to be a promising tool in the screening of galactosemia patients, with good sensitivity and specificity. Furthermore, this approach presents the advantages of being cost-effective, fast, and straightforward in the screening of galactosemic patients.
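
    The classification step described here, PCA scores of the FTIR carbohydrate region fed to an SVM evaluated by leave-one-out cross-validation, can be sketched with scikit-learn. The spectra below are synthetic stand-ins, and the number of components and kernel are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(3)
# Synthetic "carbohydrate region" spectra for healthy (0), diabetic (1), galactosemic (2) plasma.
n_per_class, n_points = 20, 300
X = np.vstack([rng.normal(loc=mu, scale=1.0, size=(n_per_class, n_points))
               for mu in (0.0, 0.6, -0.6)])
y = np.repeat([0, 1, 2], n_per_class)

model = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="linear"))
scores = cross_val_score(model, X, y, cv=LeaveOneOut())   # one held-out spectrum per fold
print(f"LOOCV accuracy: {scores.mean():.2f}")
```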

  14. A functional analysis of the CREB signaling pathway using HaloCHIP-chip and high throughput reporter assays

    Directory of Open Access Journals (Sweden)

    Aldred Shelley F

    2009-10-01

    Full Text Available Abstract Background Regulation of gene expression is essential for normal development and cellular growth. Transcriptional events are tightly controlled both spatially and temporally by specific DNA-protein interactions. In this study we finely map the genome-wide targets of the CREB protein across all known and predicted human promoters, and characterize the functional consequences of a subset of these binding events using high-throughput reporter assays. To measure CREB binding, we used HaloCHIP, an antibody-free alternative to the ChIP method that utilizes the HaloTag fusion protein, and also high-throughput promoter-luciferase reporter assays, which provide rapid and quantitative screening of promoters for transcriptional activation or repression in living cells. Results In analysis of CREB genome-wide binding events using a comprehensive DNA microarray of human promoters, we observe for the first time that CREB has a strong preference for binding at bidirectional promoters and unlike unidirectional promoters, these binding events often occur downstream of transcription start sites. Comparison between HaloCHIP-chip and ChIP-chip data reveal this to be true for both methodologies, indicating it is not a bias of the technology chosen. Transcriptional data obtained from promoter-luciferase reporter arrays also show an unprecedented, high level of activation of CREB-bound promoters in the presence of the co-activator protein TORC1. Conclusion These data suggest for the first time that TORC1 provides directional information when CREB is bound at bidirectional promoters and possible pausing of the CREB protein after initial transcriptional activation. Also, this combined approach demonstrates the ability to more broadly characterize CREB protein-DNA interactions wherein not only DNA binding sites are discovered, but also the potential of the promoter sequence to respond to CREB is evaluated.

  15. Toward high-throughput, multicriteria protein-structure comparison and analysis.

    Science.gov (United States)

    Shah, Azhar Ali; Folino, Gianluigi; Krasnogor, Natalio

    2010-06-01

    Protein-structure comparison (PSC) is an essential component of biomedical research as it impacts on, e.g., drug design, molecular docking, protein folding and structure prediction algorithms as well as being essential to the assessment of these predictions. Each of these applications, as well as many others where molecular comparison plays an important role, requires a different notion of similarity that naturally lead to the multicriteria PSC (MC-PSC) problem. Protein (Structure) Comparison, Knowledge, Similarity, and Information (ProCKSI) (www.procksi.org) provides algorithmic solutions for the MC-PSC problem by means of an enhanced structural comparison that relies on the principled application of information fusion to similarity assessments derived from multiple comparison methods. Current MC-PSC works well for moderately sized datasets and it is time consuming as it provides public service to multiple users. Many of the structural bioinformatics applications mentioned above would benefit from the ability to perform, for a dedicated user, thousands or tens of thousands of comparisons through multiple methods in real time, a capacity beyond our current technology. In this paper, we take a key step into that direction by means of a high-throughput distributed reimplementation of ProCKSI for very large datasets. The core of the proposed framework lies in the design of an innovative distributed algorithm that runs on each compute node in a cluster/grid environment to perform structure comparison of a given subset of input structures using some of the most popular PSC methods [e.g., universal similarity metric (USM), maximum contact map overlap (MaxCMO), fast alignment and search tool (FAST), distance alignment (DaliLite), combinatorial extension (CE), template modeling alignment (TMAlign)]. We follow this with a procedure of distributed consensus building. Thus, the new algorithms proposed here achieve ProCKSI's similarity assessment quality but with a fraction of

  16. pBaSysBioII: an integrative plasmid generating gfp transcriptional fusions for high-throughput analysis of gene expression in Bacillus subtilis

    NARCIS (Netherlands)

    Botella, Eric; Fogg, Mark; Jules, Matthieu; Piersma, Sjouke; Doherty, Geoff; Hansen, Annette; Denham, Emma. L.; Le Chat, Ludovic; Veiga, Patrick; Bailey, Kirra; Lewis, Peter J.; van Dijl, Jan Maarten; Aymerich, Stephane; Wilkinson, Anthony J.; Devine, Kevin M.

    Plasmid pBaSysBioII was constructed for high-throughput analysis of gene expression in Bacillus subtilis. It is an integrative plasmid with a ligation-independent cloning (LIC) site, allowing the generation of transcriptional gfpmut3 fusions with desired promoters. Integration is by a Campbell-type

  17. High-throughput analysis reveals novel maternal germline RNAs crucial for primordial germ cell preservation and proper migration.

    Science.gov (United States)

    Owens, Dawn A; Butler, Amanda M; Aguero, Tristan H; Newman, Karen M; Van Booven, Derek; King, Mary Lou

    2017-01-15

    During oogenesis, hundreds of maternal RNAs are selectively localized to the animal or vegetal pole, including determinants of somatic and germline fates. Although microarray analysis has identified localized determinants, it is not comprehensive and is limited to known transcripts. Here, we utilized high-throughput RNA-sequencing analysis to comprehensively interrogate animal and vegetal pole RNAs in the fully grown Xenopus laevis oocyte. We identified 411 (198 annotated) and 27 (15 annotated) enriched mRNAs at the vegetal and animal pole, respectively. Ninety were novel mRNAs over 4-fold enriched at the vegetal pole and six were over 10-fold enriched at the animal pole. Unlike mRNAs, microRNAs were not asymmetrically distributed. Whole-mount in situ hybridization confirmed that all 17 selected mRNAs were localized. Biological function and network analysis of vegetally enriched transcripts identified protein-modifying enzymes, receptors, ligands, RNA-binding proteins, transcription factors and co-factors with five defining hubs linking 47 genes in a network. Initial functional studies of maternal vegetally localized mRNAs show that sox7 plays a novel and important role in primordial germ cell (PGC) development and that ephrinB1 (efnb1) is required for proper PGC migration. We propose potential pathways operating at the vegetal pole that highlight where future investigations might be most fruitful. © 2017. Published by The Company of Biologists Ltd.

  18. High-throughput analysis of ammonia oxidiser community composition via a novel, amoA-based functional gene array.

    Directory of Open Access Journals (Sweden)

    Guy C J Abell

    Full Text Available Advances in microbial ecology research are more often than not limited by the capabilities of available methodologies. Aerobic autotrophic nitrification is one of the most important and well studied microbiological processes in terrestrial and aquatic ecosystems. We have developed and validated a microbial diagnostic microarray based on the ammonia-monooxygenase subunit A (amoA) gene, enabling the in-depth analysis of the community structure of bacterial and archaeal ammonia oxidisers. The amoA microarray has been successfully applied to analyse nitrifier diversity in marine, estuarine, soil and wastewater treatment plant environments. The microarray has moderate costs for labour and consumables and enables the analysis of hundreds of environmental DNA or RNA samples per week per person. The array has been thoroughly validated with a range of individual and complex targets (amoA clones and environmental samples, respectively), combined with parallel analysis using traditional sequencing methods. The moderate cost and high throughput of the microarray makes it possible to adequately address broader questions of the ecology of microbial ammonia oxidation requiring high sample numbers and high resolution of the community composition.

  19. NIR and Py-mbms coupled with multivariate data analysis as a high-throughput biomass characterization technique: a review

    Science.gov (United States)

    Xiao, Li; Wei, Hui; Himmel, Michael E.; Jameel, Hasan; Kelley, Stephen S.

    2014-01-01

    The use of lignocellulosic biomass as a feedstock for renewable energy production is currently being developed globally. Biomass is a complex mixture of cellulose, hemicelluloses, lignins, extractives, and proteins, as well as inorganic salts. Cell wall compositional analysis for biomass characterization is laborious and time consuming. In order to characterize biomass quickly and efficiently, several high-throughput technologies have been successfully developed. Among them, near infrared spectroscopy (NIR) and pyrolysis-molecular beam mass spectrometry (Py-mbms) are complementary tools capable of evaluating a large number of raw or modified biomass samples in a short period of time. NIR shows vibrations associated with specific chemical structures whereas Py-mbms depicts the full range of fragments from the decomposition of biomass. Both NIR vibrations and Py-mbms peaks are assigned to possible chemical functional groups and molecular structures. They provide complementary chemical insight into biomaterials. However, it is challenging to interpret the informative results because of the large number of overlapping bands or decomposition fragments contained in the spectra. In order to improve the efficiency of data analysis, multivariate analysis tools have been adapted to define the significant correlations among data variables, so that the large number of bands/peaks can be replaced by a small number of reconstructed variables representing the original variation. Reconstructed data variables are used for sample comparison (principal component analysis) and for building regression models (partial least squares regression) between biomass chemical structures and properties of interest. In this review, the important biomass chemical structures measured by NIR and Py-mbms are summarized. The advantages and disadvantages of conventional data analysis methods and multivariate data analysis methods are introduced, compared and evaluated. This review
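    The workflow summarized in this record, reducing overlapping spectral variables with principal component analysis and regressing properties of interest with partial least squares, can be sketched in a few lines of Python. The snippet below is only an illustration of that generic PCA/PLS pattern, not the authors' code; the spectra and the property values are random placeholders.

```python
# Minimal PCA + PLS sketch for spectral data (illustration only; random placeholder data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))   # 60 samples x 500 spectral variables (NIR bands or Py-mbms peaks)
y = rng.normal(size=60)          # property of interest, e.g. a composition value (placeholder)

# Sample comparison: project spectra onto a few principal components.
scores = PCA(n_components=3).fit_transform(X)

# Property prediction: partial least squares regression on the full spectra.
pls = PLSRegression(n_components=5).fit(X, y)
print(scores.shape, round(pls.score(X, y), 3))
```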

  20. NIR and Py-mbms coupled with multivariate data analysis as a high-throughput biomass characterization technique: a review

    Directory of Open Access Journals (Sweden)

    Li eXiao

    2014-08-01

    Full Text Available The use of lignocellulosic biomass as a feedstock for renewable energy production is currently being developed globally. Biomass is a complex mixture of cellulose, hemicelluloses, lignins, extractives, and proteins, as well as inorganic salts. Cell wall compositional analysis for biomass characterization is laborious and time consuming. In order to characterize biomass quickly and efficiently, several high-throughput technologies have been successfully developed. Among them, near infrared spectroscopy (NIR) and pyrolysis-molecular beam mass spectrometry (Py-mbms) are complementary tools capable of evaluating a large number of raw or modified biomass samples in a short period of time. NIR shows vibrations associated with specific chemical structures whereas Py-mbms depicts the full range of fragments from the decomposition of biomass. Both NIR vibrations and Py-mbms peaks are assigned to possible chemical functional groups and molecular structures. They provide complementary chemical insight into biomaterials. However, it is challenging to interpret the informative results because of the large number of overlapping bands or decomposition fragments contained in the spectra. In order to improve the efficiency of data analysis, multivariate analysis tools have been adapted to define the significant correlations among data variables, so that the large number of bands/peaks can be replaced by a small number of reconstructed variables representing the original variation. Reconstructed data variables are used for sample comparison (principal component analysis) and for building regression models (partial least squares regression) between biomass chemical structures and properties of interest. In this review, the important biomass chemical structures measured by NIR and Py-mbms are summarized. The advantages and disadvantages of conventional data analysis methods and multivariate data analysis methods are introduced, compared and evaluated

  1. ANALYSIS OF THE SPECIAL FEATURES OF THE THERMAL PROCESS IN AN INDUCTION GENERATOR AT HIGH SATURATION OF THE MAGNETIC SYSTEM

    Directory of Open Access Journals (Sweden)

    V. Chenchevoi

    2017-06-01

    Full Text Available Purpose. Development of a method for assessing the thermal operation modes of an autonomous electrical power system with an induction generator, aiming at improving the reliability of electricity supply and the quality of electric energy. Methodology. Induction generator mathematical modeling taking into account the magnetic system saturation was used in the research. A heat model accounting for the temperature rise of the induction generator units in the mode of high saturation was developed. The obtained results were compared with experimental data. Results. The paper solves the problem of improving the mathematical models and methods for steel loss determination in the study of the operation modes of an autonomous uncontrolled induction generator, taking into consideration the properties of the magnetic system in the mode of high saturation. An expression for determining steel losses in the mode of high saturation is obtained; it enables assessment of the induction generator thermal condition. Originality. The analytical dependence for calculating the steel losses in the mode of magnetic system saturation has been obtained for the first time. Practical value. The obtained expression for calculating the steel losses can be used to determine the admissible time of generator operation at overload. It will help avoid winding insulation damage while allowing full use of the generator overload capacity. As a result, it will reduce possible interruptions of electricity supply due to premature generator cutoff.
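    The record summarizes, but does not reproduce, the new steel-loss expression derived for high saturation. For orientation only, the classical two-term separation of iron losses that such expressions typically extend can be written as follows; this is the textbook form, not the formula obtained in the paper.

```latex
P_{\mathrm{Fe}} \;\approx\; k_h \, f \, B_m^{\alpha} \;+\; k_e \, f^{2} B_m^{2}
```

    Here $k_h$ and $k_e$ are material constants for hysteresis and eddy-current losses, $f$ is the supply frequency, $B_m$ is the flux-density amplitude, and $\alpha \approx 1.6$-$2$; deep saturation is precisely the regime where this simple form becomes inaccurate, which motivates the refined expression reported in the paper.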

  2. Genome-wide analysis of microRNAs in rubber tree (Hevea brasiliensis L.) using high-throughput sequencing.

    Science.gov (United States)

    Lertpanyasampatha, Manassawe; Gao, Lei; Kongsawadworakul, Panida; Viboonjun, Unchera; Chrestin, Hervé; Liu, Renyi; Chen, Xuemei; Narangajavana, Jarunya

    2012-08-01

    MicroRNAs (miRNAs) are short RNAs with essential roles in gene regulation in various organisms including higher plants. In contrast to the vast information on miRNAs from many economically important plants, almost nothing has been reported on the identification or analysis of miRNAs from rubber tree (Hevea brasiliensis L.), the most important natural rubber-producing crop. To identify miRNAs and their target genes in rubber tree, high-throughput sequencing combined with a computational approach was performed. Four small RNA libraries were constructed for deep sequencing from mature and young leaves of two rubber tree clones, PB 260 and PB 217, which provide high and low latex yield, respectively. 115 miRNAs belonging to 56 known miRNA families were identified, and northern hybridization validated miRNA expression and revealed developmental stage-dependent and clone-specific expression for some miRNAs. We took advantage of the newly released rubber tree genome assembly and predicted 20 novel miRNAs. Further, computational analysis uncovered potential targets of the known and novel miRNAs. Predicted target genes included not only transcription factors but also genes involved in various biological processes including stress responses, primary and secondary metabolism, and signal transduction. In particular, genes with roles in rubber biosynthesis are predicted targets of miRNAs. This study provides a basic catalog of miRNAs and their targets in rubber tree to facilitate future improvement and exploitation of rubber tree.

  3. Evaluation of silica monoliths in affinity microcolumns for high-throughput analysis of drug-protein interactions.

    Science.gov (United States)

    Yoo, Michelle J; Hage, David S

    2009-08-01

    Silica monoliths in affinity microcolumns were tested for the high-throughput analysis of drug-protein interactions. HSA was used as a model protein for this work, while carbamazepine and R-warfarin were used as model analytes. A comparison of HSA silica monoliths of various lengths indicated that columns as short as 1-3 mm could be used to provide reproducible estimates of retention factors or plate heights. Benefits of using smaller columns for this work included the lower retention times and lower back pressures that could be obtained versus traditional HPLC affinity columns, as well as the smaller amount of protein required for column preparation. One disadvantage of decreasing column length was the lower precision that resulted in retention factor and plate height measurements. A comparison was also made between microcolumns containing silica particles and silica monoliths. It was demonstrated with R-warfarin that either support could be used in HSA microcolumns for the determination of retention factors or plate heights. However, the higher efficiency of the silica monolith made it the preferred support for work at higher flow rates or when a larger number of plates is needed during the rapid analysis of drug-protein interactions.
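    The retention factors and plate heights discussed here follow the standard chromatographic definitions; as a brief reminder (general relations, not values or expressions specific to this study):

```latex
k = \frac{t_R - t_M}{t_M}, \qquad
N = 5.54\left(\frac{t_R}{w_{1/2}}\right)^{2}, \qquad
H = \frac{L}{N}
```

    where $t_R$ is the analyte retention time, $t_M$ the column void time, $w_{1/2}$ the peak width at half height, $N$ the plate number, and $L$ the column length; shorter columns reduce $t_R$ and back pressure at the cost of fewer total plates.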

  4. ImmuneDB: a system for the analysis and exploration of high-throughput adaptive immune receptor sequencing data.

    Science.gov (United States)

    Rosenfeld, Aaron M; Meng, Wenzhao; Luning Prak, Eline T; Hershberg, Uri

    2017-01-15

    As high-throughput sequencing of B cells becomes more common, the need for tools to analyze the large quantity of data also increases. This article introduces ImmuneDB, a system for analyzing vast amounts of heavy chain variable region sequences and exploring the resulting data. It can take as input raw FASTA/FASTQ data, identify genes, determine clones, construct lineages, as well as provide information such as selection pressure and mutation analysis. It uses an industry-leading database, MySQL, to provide fast analysis and avoid the complexities of using error-prone flat files. ImmuneDB is freely available at http://immunedb.com. A demo of the ImmuneDB web interface is available at http://immunedb.com/demo. Contact: Uh25@drexel.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Analysis of saturated and aromatic hydrocarbons migrating from a polyolefin-based hot-melt adhesive into food.

    Science.gov (United States)

    Lommatzsch, Martin; Biedermann, Maurus; Grob, Koni; Simat, Thomas J

    2016-01-01

    Hot-melt adhesives are widely utilised to glue cardboard boxes used as food packaging material. They have to comply with the requirements of Article 3 of the European Framework Regulation for food contact materials (1935/2004). The hot-melt raw materials analysed mainly consisted of paraffinic waxes, hydrocarbon resins and polyolefins. The hydrocarbon resins, functioning as tackifiers, were the predominant source of hydrocarbons of sufficient volatility to migrate into dry foods: the 18 hydrocarbon resins analysed contained 8.2-118 g kg⁻¹ saturated and up to 59 g kg⁻¹ aromatic hydrocarbons eluted from GC between n-C16 and n-C24, substantially more than the paraffinic waxes and the polyolefins. These tackifier resins, especially the oligomers ≤ C24, have been characterised structurally by GC×GC-MS and ¹H-NMR spectroscopy. Migration into food was estimated using a simulating system with polenta as food simulant, which was verified by the analysis of a commercial risotto rice sample packed in a virgin-fibre folding box sealed with a hot melt. About 0.5-1.5% of the potentially migrating substances (between n-C16 and n-C24) of a hot melt were found to be transferred into food under storage conditions, which can result in a food contamination in the order of 1 mg kg⁻¹ food (depending on the amount of potentially migrating substances from the hot melt, the hot-melt surface, the amount of food, the contact time, etc.). Migrates from hot melts are easily mistaken for mineral oil hydrocarbons from recycled cardboard.
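    A rough, purely illustrative calculation shows how a transfer of about 1% can translate into roughly 1 mg kg⁻¹ in food; all of the numbers below are hypothetical and chosen only to make the arithmetic transparent, they are not measurements from the study.

```latex
1\ \mathrm{g\ hot\ melt} \times 50\ \mathrm{g\,kg^{-1}}\ \text{migratable hydrocarbons} = 50\ \mathrm{mg};\qquad
50\ \mathrm{mg} \times 1\% = 0.5\ \mathrm{mg};\qquad
\frac{0.5\ \mathrm{mg}}{0.5\ \mathrm{kg\ food}} = 1\ \mathrm{mg\,kg^{-1}}
```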

  6. Micro-scaled high-throughput digestion of plant tissue samples for multi-elemental analysis

    OpenAIRE

    Husted Søren; Pedas Pai; Persson Daniel P; Laursen Kristian H; Hansen Thomas H; Schjoerring Jan K

    2009-01-01

    Background: Quantitative multi-elemental analysis by inductively coupled plasma (ICP) spectrometry depends on a complete digestion of solid samples. However, fast and thorough sample digestion is a challenging analytical task which constitutes a bottleneck in modern multi-elemental analysis. Additional obstacles may be that sample quantities are limited and elemental concentrations low. In such cases, digestion in small volumes with minimum dilution and contamination is required in or...

  7. High Throughput Method for Analysis of Repeat Number for 28 Phase Variable Loci of Campylobacter jejuni Strain NCTC11168.

    Directory of Open Access Journals (Sweden)

    Lea Lango-Scholey

    Full Text Available Mutations in simple sequence repeat tracts are a major mechanism of phase variation in several bacterial species including Campylobacter jejuni. Changes in repeat number of tracts located within the reading frame can produce a high frequency of reversible switches in gene expression between ON and OFF states. The genome of C. jejuni strain NCTC11168 contains 29 loci with polyG/polyC tracts of seven or more repeats. This protocol outlines a method, the 28-locus-CJ11168 PV-analysis assay, for rapidly determining ON/OFF states of 28 of these phase-variable loci in a large number of individual colonies from C. jejuni strain NCTC11168. The method combines a series of multiplex PCR assays with a fragment analysis assay and automated extraction of fragment length, repeat number and expression state. This high-throughput, multiplex assay has utility for detecting shifts in phase variation states within and between populations over time and for exploring the effects of phase variation on adaptation to differing selective pressures. Application of this method to analysis of the 28 polyG/polyC tracts in 90 C. jejuni colonies detected a 2.5-fold increase in slippage products as tracts lengthened from G8 to G11 but no difference between tracts of similar length, indicating that flanking sequence does not influence slippage rates. Comparison of this observed slippage to previously measured mutation rates for G8 and G11 tracts in C. jejuni indicates that PCR amplification of a DNA sample will over-estimate phase variation frequencies by 20-35-fold. An important output of the 28-locus-CJ11168 PV-analysis assay is combinatorial expression states that cannot be determined by other methods. This method can be adapted to analysis of phase variation in other C. jejuni strains and in a diverse range of bacterial species.
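    The final step described in the record, converting an automatically extracted fragment length into a repeat number and an ON/OFF call, can be sketched as below. This is a hedged illustration of the general logic only: the locus names, reference fragment lengths and in-frame repeat numbers are hypothetical placeholders, not the calibration used in the published assay.

```python
# Illustrative ON/OFF caller for a phase-variable polyG tract (hypothetical calibration values).

# Hypothetical per-locus table: repeat number that keeps the reading frame intact (ON state).
IN_FRAME_REPEATS = {"locusA": 9, "locusB": 10}   # names and values are placeholders

def call_state(locus: str, fragment_len: int, ref_fragment_len: int, ref_repeats: int) -> str:
    """Infer the repeat number from fragment length, then call ON/OFF by reading frame."""
    repeats = ref_repeats + (fragment_len - ref_fragment_len)  # each extra base = one extra G
    in_frame = IN_FRAME_REPEATS[locus]
    # Length changes that are a multiple of 3 relative to the in-frame tract keep the frame (ON).
    return "ON" if (repeats - in_frame) % 3 == 0 else "OFF"

print(call_state("locusA", fragment_len=152, ref_fragment_len=151, ref_repeats=9))  # -> "OFF"
```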

  8. High-throughput metabolic state analysis: The missing link in integrated functional genomics of yeasts

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Moxley, Joel. F; Åkesson, Mats Fredrik

    2005-01-01

    The lack of comparable metabolic state assays severely limits understanding the metabolic changes caused by genetic or environmental perturbations. The present study reports the application of a novel derivatization method for metabolome analysis of yeast, coupled to data-mining software...

  9. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yonghua [Iowa State Univ., Ames, IA (United States)

    2000-01-01

    Sample preparation has been one of the major bottlenecks for many high-throughput analyses. The purpose of this research was to develop a new sample preparation and integration approach for DNA sequencing, PCR-based DNA analysis and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser-induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks into DNA capillary-array sequencers. Protocols for directly sequencing plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or a heating block. Upon heating, the plasmids were released while chromosomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR-based DNA analysis, the author demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical samples without purification. After PCR using cheek cells, blood or HIV-1 gag DNA, the reaction mixtures were injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection, and the elimination of DNA purification before or after the PCR reaction, will make this approach an

  10. CSReport: A New Computational Tool Designed for Automatic Analysis of Class Switch Recombination Junctions Sequenced by High-Throughput Sequencing.

    Science.gov (United States)

    Boyer, François; Boutouil, Hend; Dalloul, Iman; Dalloul, Zeinab; Cook-Moreau, Jeanne; Aldigier, Jean-Claude; Carrion, Claire; Herve, Bastien; Scaon, Erwan; Cogné, Michel; Péron, Sophie

    2017-05-15

    B cells ensure humoral immune responses due to the production of Ag-specific memory B cells and Ab-secreting plasma cells. In secondary lymphoid organs, Ag-driven B cell activation induces terminal maturation and Ig isotype class switch (class switch recombination [CSR]). CSR creates a virtually unique IgH locus in every B cell clone by intrachromosomal recombination between two switch (S) regions upstream of each C region gene. The amount and structural features of CSR junctions reveal valuable information about the CSR mechanism, and analysis of CSR junctions is useful in basic and clinical research studies of B cell functions. To provide an automated tool able to analyze large data sets of CSR junction sequences produced by high-throughput sequencing (HTS), we designed CSReport, a software program dedicated to supporting analysis of CSR recombination junctions sequenced with an HTS-based protocol (Ion Torrent technology). CSReport was assessed using simulated data sets of CSR junctions and then used for analysis of Sμ-Sα and Sμ-Sγ1 junctions from CH12F3 cells and primary murine B cells, respectively. CSReport identifies junction segment breakpoints on reference sequences and junction structure (blunt-ended junctions or junctions with insertions or microhomology). Besides the ability to analyze unprecedentedly large libraries of junction sequences, CSReport will provide a unified framework for CSR junction studies. Our results show that CSReport is an accurate tool for analysis of sequences from our HTS-based protocol for CSR junctions, thereby facilitating and accelerating their study. Copyright © 2017 by The American Association of Immunologists, Inc.

  11. Metabolic profiling of recombinant Escherichia coli cultivations based on high-throughput FT-MIR spectroscopic analysis.

    Science.gov (United States)

    Sales, Kevin C; Rosa, Filipa; Cunha, Bernardo R; Sampaio, Pedro N; Lopes, Marta B; Calado, Cecília R C

    2017-03-01

    Escherichia coli is one of the most used host microorganisms for the production of recombinant products, such as heterologous proteins and plasmids. However, genetic, physiological and environmental factors influence plasmid replication and cloned gene expression in a highly complex way. To control and optimize the recombinant expression system performance, it is very important to understand this complexity. Therefore, the development of rapid, highly sensitive and economic analytical methodologies, which enable the simultaneous characterization of the heterologous product synthesis and physiologic cell behavior under a variety of culture conditions, is highly desirable. For that, the metabolic profile of recombinant E. coli cultures producing the pVAX-lacZ plasmid model was analyzed by rapid, economic and high-throughput Fourier Transform Mid-Infrared (FT-MIR) spectroscopy. The main goal of the present work is to show how simultaneous multivariate data analysis by principal component analysis (PCA) and direct spectral analysis can represent a very useful tool to monitor E. coli culture processes and acquire relevant information according to current quality regulatory guidelines. While PCA allowed capturing the energetic metabolic state of the cell, e.g. by identifying different C-source consumption phases, direct FT-MIR spectral analysis provided valuable biochemical and metabolic information along the cell culture, e.g. on lipids, RNA, protein synthesis and turnover metabolism. The information obtained by multivariate and direct spectral analyses complements each other and may contribute to understanding the complex interrelationships between recombinant cell metabolism and the bioprocess environment, towards more economic and robust process design according to the Quality by Design framework. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 33:285-298, 2017. © 2016 American Institute of Chemical Engineers.

  12. TANGO: a generic tool for high-throughput 3D image analysis for studying nuclear organization.

    Science.gov (United States)

    Ollion, Jean; Cochennec, Julien; Loll, François; Escudé, Christophe; Boudier, Thomas

    2013-07-15

    The cell nucleus is a highly organized cellular organelle that contains the genetic material. The study of nuclear architecture has become an important field of cellular biology. Extracting quantitative data from 3D fluorescence imaging helps understand the functions of different nuclear compartments. However, such approaches are limited by the requirement for processing and analyzing large sets of images. Here, we describe Tools for Analysis of Nuclear Genome Organization (TANGO), an image analysis tool dedicated to the study of nuclear architecture. TANGO is a coherent framework allowing biologists to perform the complete analysis process of 3D fluorescence images by combining two environments: ImageJ (http://imagej.nih.gov/ij/) for image processing and quantitative analysis and R (http://cran.r-project.org) for statistical processing of measurement results. It includes an intuitive user interface providing the means to precisely build a segmentation procedure and set up analyses, without possessing programming skills. TANGO is a versatile tool able to process large sets of images, allowing quantitative study of nuclear organization. TANGO is composed of two programs: (i) an ImageJ plug-in and (ii) a package (rtango) for R. They are both free and open source, available (http://biophysique.mnhn.fr/tango) for Linux, Microsoft Windows and Macintosh OSX. Distribution is under the GPL v.2 licence. Contact: thomas.boudier@snv.jussieu.fr. Supplementary data are available at Bioinformatics online.

  13. A new high-throughput LC-MS method for the analysis of complex fructan mixtures

    DEFF Research Database (Denmark)

    Verspreet, Joran; Hansen, Anders Holmgaard; Dornez, Emmie

    2014-01-01

    In this paper, a new liquid chromatography-mass spectrometry (LC-MS) method for the analysis of complex fructan mixtures is presented. In this method, columns with a trifunctional C18 alkyl stationary phase (T3) were used and their performance compared with that of a porous graphitized carbon (PGC)

  14. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles.

    Directory of Open Access Journals (Sweden)

    Hong Zhou

    Full Text Available Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for evaluating toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrated the usefulness of a hydrogel-based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable for measuring fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased
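    The authors measured fibrin degradation with a custom MATLAB® routine that converts brightfield micrographs to binary images. Below is a rough Python/scikit-image sketch of the same general idea, thresholding an image and estimating the cleared-area fraction; it is an illustration of the concept under stated assumptions, not the authors' code or calibration, and the file name is hypothetical.

```python
# Illustrative thresholding sketch for estimating a cleared (degraded) area fraction.
from skimage import io, filters
from skimage.color import rgb2gray

def cleared_area_fraction(path: str) -> float:
    """Fraction of pixels classified as 'cleared' fibrin in a brightfield micrograph."""
    img = io.imread(path)
    gray = rgb2gray(img) if img.ndim == 3 else img.astype(float)
    thresh = filters.threshold_otsu(gray)   # global Otsu threshold
    binary = gray > thresh                  # cleared regions assumed brighter than intact fibrin
    return float(binary.mean())

# Example (hypothetical file name):
# print(cleared_area_fraction("follicle_day6.tif"))
```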

  15. Sensitivity analysis of tracer transport in variably saturated soils at USDA-ARS OPE3 field site

    Science.gov (United States)

    The objective of this study was to assess the effects of uncertainties in hydrologic and geochemical parameters on the results of simulations of tracer transport in variably saturated soils at the USDA-ARS OPE3 field site. A tracer experiment with a pulse of KCl solution applied to an irrigatio...

  16. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    Science.gov (United States)

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-11-01

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Ultrafast and high-throughput N-glycan analysis for monoclonal antibodies

    OpenAIRE

    Yang, Xiaoyu; Kim, Sunnie Myung; Ruzanski, Richard; Chen, Yuetian; Moses, Sarath; Ling, Wai Lam; Li, Xiaojuan; Wang, Shao-Chun; Li, Huijuan; Ambrogelly, Alexandre; Richardson, Daisy; Shameem, Mohammed

    2016-01-01

    Glycosylation is a critical attribute for development and manufacturing of therapeutic monoclonal antibodies (mAbs) in the pharmaceutical industry. Conventional antibody glycan analysis is usually achieved by the 2-aminobenzamide (2-AB) hydrophilic interaction liquid chromatography (HILIC) method following the release of glycans. Although this method produces satisfactory results, it has limited use for screening a large number of samples because it requires expensive reagents and takes sever...

  18. Sample Preservation, DNA or RNA Extraction and Data Analysis for High-Throughput Phytoplankton Community Sequencing

    Directory of Open Access Journals (Sweden)

    Anita Mäki

    2017-09-01

    Full Text Available Phytoplankton is the basis for aquatic food webs and mirrors the water quality. Conventionally, phytoplankton analysis has been done using time-consuming and partly subjective microscopic observations, but next-generation sequencing (NGS) technologies provide promising potential for rapid automated examination of environmental samples. Because many phytoplankton species have tough cell walls, methods for cell lysis and DNA or RNA isolation need to be efficient to allow unbiased nucleic acid retrieval. Here, we analyzed how two phytoplankton preservation methods, three commercial DNA extraction kits and their improvements, three RNA extraction methods, and two data analysis procedures affected the results of the NGS analysis. A mock community was pooled from phytoplankton species with variation in nucleus size and cell wall hardness. Although the study showed potential for studying Lugol-preserved sample collections, it demonstrated critical challenges in DNA-based phytoplankton analysis overall. The 18S rRNA gene sequencing output was highly affected by the variation in rRNA gene copy numbers per cell, while sample preservation and nucleic acid extraction methods formed another source of variation. On top of this, sequence-specific variation in data quality introduced unexpected bioinformatics bias when the sliding-window method was used for quality trimming of the Ion Torrent data. While DNA-based analyses did not correlate with the biomasses or cell numbers of the mock community, rRNA-based analyses were less affected by different RNA extraction procedures, matched the biomasses, dry weights and carbon contents better, and are therefore recommended for quantitative phytoplankton analyses.
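    One reason, noted above, that 18S read counts do not track biomass or cell numbers is variation in rRNA gene copy number per cell. The toy calculation below illustrates that skew with hypothetical numbers; neither the species labels nor the copy numbers come from the study.

```python
# Toy example: read fractions are skewed by per-cell rRNA gene copy number (hypothetical values).
cells = {"species_A": 1000, "species_B": 1000}        # equal cell numbers
copies_per_cell = {"species_A": 1, "species_B": 10}   # hypothetical 18S copy numbers

reads = {sp: cells[sp] * copies_per_cell[sp] for sp in cells}
total = sum(reads.values())
for sp, n in reads.items():
    print(sp, f"{n / total:.0%} of reads")            # A: ~9%, B: ~91% despite equal abundance
```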

  19. In situ analysis and structural elucidation of sainfoin (Onobrychis viciifolia) tannins for high-throughput germplasm screening.

    Science.gov (United States)

    Gea, An; Stringano, Elisabetta; Brown, Ron H; Mueller-Harvey, Irene

    2011-01-26

    A rapid thiolytic degradation and cleanup procedure was developed for analyzing tannins directly in chlorophyll-containing sainfoin (Onobrychis viciifolia) plants. The technique proved suitable for complex tannin mixtures containing catechin, epicatechin, gallocatechin, and epigallocatechin flavan-3-ol units. The reaction time was standardized at 60 min to minimize the loss of structural information as a result of epimerization and degradation of terminal flavan-3-ol units. The results were evaluated by separate analysis of extractable and unextractable tannins, which accounted for 63.6-113.7% of the in situ plant tannins. It is of note that 70% aqueous acetone extracted tannins with a lower mean degree of polymerization (mDP) than was found for tannins analyzed in situ. Extractable tannins had mDP values that were between 4 and 29 units lower. The method was validated by comparing results from individual and mixed sample sets. The tannin composition of different sainfoin accessions covered a range of mDP values from 16 to 83, procyanidin/prodelphinidin (PC/PD) ratios from 19.2/80.8 to 45.6/54.4, and cis/trans ratios from 74.1/25.9 to 88.0/12.0. This is the first high-throughput screening method that is suitable for analyzing condensed tannin contents and structural composition directly in green plant tissue.

  20. Development and Evaluation of Quality Metrics for Bioinformatics Analysis of Viral Insertion Site Data Generated Using High Throughput Sequencing.

    Science.gov (United States)

    Gao, Hongyu; Hawkins, Troy; Jasti, Aparna; Chen, Yu-Hsiang; Mockaitis, Keithanne; Dinauer, Mary; Cornetta, Kenneth

    2014-05-06

    Integration of viral vectors into a host genome is associated with insertional mutagenesis and subjects in clinical gene therapy trials must be monitored for this adverse event. Several PCR based methods such as ligase-mediated (LM) PCR, linear-amplification-mediated (LAM) PCR and non-restrictive (nr) LAM PCR were developed to identify sites of vector integration. Coupling the power of next-generation sequencing technologies with various PCR approaches will provide a comprehensive and genome-wide profiling of insertion sites and increase throughput. In this bioinformatics study, we aimed to develop and apply quality metrics to viral insertion data obtained using next-generation sequencing. We developed five simple metrics for assessing next-generation sequencing data from different PCR products and showed how the metrics can be used to objectively compare runs performed with the same methodology as well as data generated using different PCR techniques. The results will help researchers troubleshoot complex methodologies, understand the quality of sequencing data, and provide a starting point for developing standardization of vector insertion site data analysis.

  1. High-throughput metagenomic analysis of petroleum-contaminated soil microbiome reveals the versatility in xenobiotic aromatics metabolism.

    Science.gov (United States)

    Bao, Yun-Juan; Xu, Zixiang; Li, Yang; Yao, Zhi; Sun, Jibin; Song, Hui

    2017-06-01

    Petroleum-contaminated soil is one of the most studied soil ecosystems due to its wealth of hydrocarbon-degrading microorganisms and its broad applications in bioremediation. However, our understanding of the genomic properties and functional traits of the soil microbiome is limited. In this study, we used high-throughput metagenomic sequencing to comprehensively study the microbial community from petroleum-contaminated soils near the Tianjin Dagang oilfield in eastern China. The analysis reveals that the soil metagenome is characterized by a high level of community diversity and metabolic versatility. The metagenome community is dominated by γ-Proteobacteria and α-Proteobacteria, which are key players in petroleum hydrocarbon degradation. The functional study demonstrates over-represented enzyme groups and pathways involved in the degradation of a broad set of xenobiotic aromatic compounds, including toluene, xylene, chlorobenzoate, aminobenzoate, DDT, methylnaphthalene, and bisphenol. A composite metabolic network is proposed for the identified pathways, thus consolidating our identification of the pathways. The overall data demonstrate the great potential of the studied soil microbiome in xenobiotic aromatics degradation. The results not only establish a rich reservoir for novel enzyme discovery but also provide putative applications in bioremediation. Copyright © 2016. Published by Elsevier B.V.

  2. Automated high throughput nucleic acid purification from formalin-fixed paraffin-embedded tissue samples for next generation sequence analysis.

    Science.gov (United States)

    Haile, Simon; Pandoh, Pawan; McDonald, Helen; Corbett, Richard D; Tsao, Philip; Kirk, Heather; MacLeod, Tina; Jones, Martin; Bilobram, Steve; Brooks, Denise; Smailus, Duane; Steidl, Christian; Scott, David W; Bala, Miruna; Hirst, Martin; Miller, Diane; Moore, Richard A; Mungall, Andrew J; Coope, Robin J; Ma, Yussanne; Zhao, Yongjun; Holt, Rob A; Jones, Steven J; Marra, Marco A

    2017-01-01

    Curation and storage of formalin-fixed, paraffin-embedded (FFPE) samples are standard procedures in hospital pathology laboratories around the world. Many thousands of such samples exist and could be used for next generation sequencing analysis. Retrospective analyses of such samples are important for identifying molecular correlates of carcinogenesis, treatment history and disease outcomes. Two major hurdles in using FFPE material for sequencing are the damaged nature of the nucleic acids and the labor-intensive nature of nucleic acid purification. These limitations and a number of other issues that span multiple steps from nucleic acid purification to library construction are addressed here. We optimized and automated a 96-well magnetic bead-based extraction protocol that can be scaled to large cohorts and is compatible with automation. Using sets of 32 and 91 individual FFPE samples respectively, we generated libraries from 100 ng of total RNA and DNA starting amounts with 95-100% success rate. The use of the resulting RNA in micro-RNA sequencing was also demonstrated. In addition to offering the potential of scalability and rapid throughput, the yield obtained with lower input requirements makes these methods applicable to clinical samples where tissue abundance is limiting.

  3. HTSSIP: An R package for analysis of high throughput sequencing data from nucleic acid stable isotope probing (SIP) experiments.

    Directory of Open Access Journals (Sweden)

    Nicholas D Youngblut

    Full Text Available Combining high throughput sequencing with stable isotope probing (HTS-SIP) is a powerful method for mapping in situ metabolic processes to thousands of microbial taxa. However, accurately mapping metabolic processes to taxa is complex and challenging. Multiple HTS-SIP data analysis methods have been developed, including high-resolution stable isotope probing (HR-SIP), multi-window high-resolution stable isotope probing (MW-HR-SIP), quantitative stable isotope probing (qSIP), and ΔBD. Currently, there is no publicly available software designed specifically for analyzing HTS-SIP data. To address this shortfall, we have developed the HTSSIP R package, an open-source, cross-platform toolset for conducting HTS-SIP analyses in a straightforward and easily reproducible manner. The HTSSIP package, along with full documentation and examples, is available from CRAN at https://cran.r-project.org/web/packages/HTSSIP/index.html and Github at https://github.com/buckleylab/HTSSIP.

  4. Transcriptome analysis of Emiliania huxleyi cells grown under different conditions using high-throughput sequencing data

    Science.gov (United States)

    Andreson, R.; Anlauf, H.; Mackinder, L.; Iglesias-Rodriguez, D.; LaRoche, J.; Lenhard, B.

    2012-04-01

    Coccolithophores are ideal for studying genes responsible for biomineralization processes due to their relatively small genome sizes, their ability to grow in culture, and their role as a natural model system for measuring expression of calcification-related genes in two life stages. As Emiliania huxleyi has several annotated calcification-related proteins, we have concentrated on analyzing its genes and promoter areas. Many recent studies have focused primarily on transcriptome analysis of E. huxleyi using nutrient-limited conditions to get more information about up-regulated genes involved in biomineralization and calcification processes. Although there are more than 100,000 EST sequences for E. huxleyi available from these projects in public databases, these data are often insufficient to identify the exact position of the transcription start site (TSS) and to perform precise analysis (nucleotide content, motif search) of core promoters and regulatory mechanisms in the immediate flanking areas. ESTs are not ideal for these kinds of analyses because the standard technologies for producing 5' EST libraries do not guarantee that the exact 5' end of the transcript will be captured. To determine the extent and accurate positions of the 5' ends of transcripts, and therefore the positions of core promoters, the Cap analysis of gene expression (CAGE) sequencing method was used to sequence RNA of E. huxleyi in both stages, calcifying and non-calcifying. In addition, gene expression levels for 21 samples were obtained by whole-transcriptome shotgun sequencing (RNA-Seq). The collections of reads these methods produced were used to map and annotate genes in several samples and to measure RNA expression levels under different conditions. Although little data is available for closely related organisms, it is possible to compare these results with other species to find conserved regulatory mechanisms between genes related to calcification. Visualization tools allowing browsing of annotated genes

  5. DNA Sudoku—harnessing high-throughput sequencing for multiplexed specimen analysis

    Science.gov (United States)

    Erlich, Yaniv; Chang, Kenneth; Gordon, Assaf; Ronen, Roy; Navon, Oron; Rooks, Michelle; Hannon, Gregory J.

    2009-01-01

    Next-generation sequencers have sufficient power to simultaneously analyze DNAs from many different specimens, a practice known as multiplexing. Such schemes rely on the ability to associate each sequence read with the specimen from which it was derived. The current practice of appending molecular barcodes prior to pooling is practical for parallel analysis of up to many dozens of samples. Here, we report a strategy that permits simultaneous analysis of tens of thousands of specimens. Our approach relies on the use of combinatorial pooling strategies in which pools rather than individual specimens are assigned barcodes. Thus, the identity of each specimen is encoded within the pooling pattern rather than by its association with a particular sequence tag. Decoding the pattern allows the sequence of an original specimen to be inferred with high confidence. We verified the ability of our encoding and decoding strategies to accurately report the sequence of individual samples within a large number of mixed specimens in two ways. First, we simulated data both from a clone library and from a human population in which a sequence variant associated with cystic fibrosis was present. Second, we actually pooled, sequenced, and decoded identities within two sets of 40,000 bacterial clones comprising approximately 20,000 different artificial microRNAs targeting Arabidopsis or human genes. We achieved greater than 97% accuracy in these trials. The strategies reported here can be applied to a wide variety of biological problems, including the determination of genotypic variation within large populations of individuals. PMID:19447965
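    A much-simplified sketch of the pooling idea described above: if each specimen is placed into one row pool and one column pool of a grid, a variant seen in exactly one positive row pool and one positive column pool can be traced back to a single specimen. The published scheme uses more sophisticated combinatorial (Chinese-remainder-style) pooling patterns to resolve ambiguity at much larger scale; the grid below is only meant to illustrate decoding by pattern, and all names are hypothetical.

```python
# Simplified row/column pooling and decoding (illustration only; not the paper's pooling design).
from itertools import product

ROWS, COLS = 4, 5
specimens = {(r, c): f"specimen_{r}_{c}" for r, c in product(range(ROWS), range(COLS))}

def decode(positive_row_pools, positive_col_pools):
    """Return the specimens consistent with the observed positive pools."""
    return [specimens[(r, c)] for r, c in product(positive_row_pools, positive_col_pools)]

# A variant detected only in row pool 2 and column pool 3 decodes to one specimen:
print(decode({2}, {3}))          # ['specimen_2_3']
# Two positive rows and two positive columns are ambiguous with a plain grid,
# which is why the real design layers additional pooling patterns:
print(decode({0, 2}, {1, 3}))    # four candidate specimens
```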

  6. PlantCV v2: Image analysis software for high-throughput plant phenotyping.

    Science.gov (United States)

    Gehan, Malia A; Fahlgren, Noah; Abbasi, Arash; Berry, Jeffrey C; Callen, Steven T; Chavez, Leonardo; Doust, Andrew N; Feldman, Max J; Gilbert, Kerrigan B; Hodge, John G; Hoyer, J Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.
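    PlantCV itself provides dedicated modules for these steps. Purely as a generic illustration of the kind of multi-plant measurement described above, the sketch below labels plants in a thresholded tray image and reports per-plant pixel areas using scikit-image; it does not reproduce the PlantCV API or its function signatures, and the foreground assumption and file name are hypothetical.

```python
# Generic multi-plant measurement sketch (scikit-image; not the PlantCV API).
from skimage import io, filters, measure
from skimage.color import rgb2gray

def per_plant_areas(path: str):
    """Segment a tray image into foreground objects (plants) and return their pixel areas."""
    gray = rgb2gray(io.imread(path))
    mask = gray < filters.threshold_otsu(gray)   # plants assumed darker than background
    labels = measure.label(mask)                 # connected components = individual plants
    return {region.label: region.area
            for region in measure.regionprops(labels)
            if region.area > 100}                # drop small noise objects

# Example (hypothetical file name):
# print(per_plant_areas("tray_2017-06-01.png"))
```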

  7. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    Directory of Open Access Journals (Sweden)

    Malia A. Gehan

    2017-12-01

    Full Text Available Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  8. Protein surface analysis for function annotation in high-throughput structural genomics pipeline

    Science.gov (United States)

    Binkowski, T. Andrew; Joachimiak, Andrzej; Liang, Jie

    2005-01-01

    Structural genomics (SG) initiatives are expanding the universe of protein fold space by rapidly determining structures of proteins that were intentionally selected on the basis of low sequence similarity to proteins of known structure. Often these proteins have no associated biochemical or cellular functions. The SG success has resulted in an accelerated deposition of novel structures. In some cases the structural bioinformatics analysis applied to these novel structures has provided specific functional assignment. However, this approach has also uncovered limitations in the functional analysis of uncharacterized proteins using traditional sequence and backbone structure methodologies. A novel method, named pvSOAR (pocket and void Surface of Amino Acid Residues), for comparing the protein surfaces of geometrically defined pockets and voids was developed. pvSOAR was able to detect previously unrecognized and novel functional relationships between surface features of proteins. In this study, pvSOAR is applied to several structural genomics proteins. We examined the surfaces of YecM, BioH, and RpiB from Escherichia coli as well as the CBS domains from inosine-5′-monophosphate dehydrogenase from Streptococcus pyogenes, the conserved hypothetical protein Ta549 from Thermoplasma acidophilum, and the CBS domain protein mt1622 from Methanobacterium thermoautotrophicum, with the goal of inferring information about their biochemical function. PMID:16322579

  9. Unifying the analysis of high-throughput sequencing datasets: characterizing RNA-seq, 16S rRNA gene sequencing and selective growth experiments by compositional data analysis.

    Science.gov (United States)

    Fernandes, Andrew D; Reid, Jennifer Ns; Macklaim, Jean M; McMurrough, Thomas A; Edgell, David R; Gloor, Gregory B

    2014-01-01

    Experimental designs that take advantage of high-throughput sequencing to generate datasets include RNA sequencing (RNA-seq), chromatin immunoprecipitation sequencing (ChIP-seq), sequencing of 16S rRNA gene fragments, metagenomic analysis and selective growth experiments. In each case the underlying data are similar and are composed of counts of sequencing reads mapped to a large number of features in each sample. Despite this underlying similarity, the data analysis methods used for these experimental designs are all different, and do not translate across experiments. Alternative methods have been developed in the physical and geological sciences that treat similar data as compositions. Compositional data analysis methods transform the data to relative abundances with the result that the analyses are more robust and reproducible. Data from an in vitro selective growth experiment, an RNA-seq experiment and the Human Microbiome Project 16S rRNA gene abundance dataset were examined by ALDEx2, a compositional data analysis tool that uses Bayesian methods to infer technical and statistical error. The ALDEx2 approach is shown to be suitable for all three types of data: it correctly identifies both the direction and differential abundance of features in the differential growth experiment, it identifies a substantially similar set of differentially expressed genes in the RNA-seq dataset as the leading tools and it identifies as differential the taxa that distinguish the tongue dorsum and buccal mucosa in the Human Microbiome Project dataset. The design of ALDEx2 reduces the number of false positive identifications that result from datasets composed of many features in few samples. Statistical analysis of high-throughput sequencing datasets composed of per feature counts showed that the ALDEx2 R package is a simple and robust tool, which can be applied to RNA-seq, 16S rRNA gene sequencing and differential growth datasets, and by extension to other techniques that use a
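    ALDEx2 is an R package; as a language-agnostic illustration of the compositional-data idea it builds on, the centred log-ratio (CLR) transform converts per-feature counts into values expressed relative to the whole sample, which is what makes such analyses comparable across sequencing depths. The sketch below implements the standard CLR definition only, not ALDEx2's Monte Carlo/Bayesian error model, and the counts are hypothetical.

```python
# Centred log-ratio (CLR) transform of a count vector (standard definition; illustration only).
import numpy as np

def clr(counts, pseudocount=0.5):
    """CLR transform: log of each feature relative to the geometric mean of the sample."""
    x = np.asarray(counts, dtype=float) + pseudocount   # pseudocount avoids log(0)
    logx = np.log(x)
    return logx - logx.mean()                           # subtract the log geometric mean

sample = [120, 30, 0, 850]   # read counts for four features (hypothetical)
print(np.round(clr(sample), 2))
```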

  10. μTAS (micro total analysis systems) for the high-throughput measurement of nanomaterial solubility

    Science.gov (United States)

    Tantra, R.; Jarman, J.

    2013-04-01

    There is a consensus in the nanoecotoxicology community that better analytical tools, i.e. faster and more accurate ones, are needed for the physicochemical characterisation of nanomaterials in environmentally/biologically relevant media. In this study, we introduce the concept of μTAS (Micro Total Analysis Systems), a term coined to encapsulate the integration of laboratory processes on a single microchip. Our focus here is on the use of a capillary electrophoresis (CE) microchip with conductivity detection and how it may be used for measuring the dissolution of metal oxide nanomaterials. Our preliminary results clearly show promise in that the device is able to: (a) measure ionic zinc in various ecotox media with high selectivity; and (b) track the dynamic dissolution of zinc oxide (ZnO) nanomaterial when dispersed in fish medium.

  11. The significance, development and progress of high-throughput combinatorial histone code analysis.

    Science.gov (United States)

    Young, Nicolas L; Dimaggio, Peter A; Garcia, Benjamin A

    2010-12-01

    The physiological state of eukaryotic DNA is chromatin. Nucleosomes, which consist of DNA in complex with histones, are the fundamental unit of chromatin. The post-translational modifications (PTMs) of histones play a critical role in the control of gene transcription, epigenetics and other DNA-templated processes. It has been known for several years that these PTMs function in concert to allow for the storage and transduction of highly specific signals through combinations of modifications. This code, the combinatorial histone code, functions much like a bar code or combination lock providing the potential for massive information content. The capacity to directly measure these combinatorial histone codes has mostly been laborious and challenging, thus limiting efforts often to one or two samples. Recently, progress has been made in determining such information quickly, quantitatively and sensitively. Here we review both the historical and recent progress toward routine and rapid combinatorial histone code analysis.

  12. Peptide Pattern Recognition for high-throughput protein sequence analysis and clustering

    DEFF Research Database (Denmark)

    Busk, Peter Kamp

    2017-01-01

    Large collections of divergent protein sequences are tedious to analyze for understanding their phylogenetic or structure-function relations. Peptide Pattern Recognition is an algorithm that was developed to facilitate this task, but the previous version only allows a limited number of sequences as input. I implemented Peptide Pattern Recognition as multithreaded software designed to handle large numbers of sequences and perform the analysis in a reasonable time frame. Benchmarking showed that the new implementation of Peptide Pattern Recognition is twenty times faster than the previous implementation on a small protein collection with 673 MAP kinase sequences. In addition, the new implementation could analyze a large protein collection with 48,570 Glycosyl Transferase family 20 sequences without reaching its upper limit on a desktop computer. Peptide Pattern Recognition

  13. High-throughput LC/MS/MS analysis of ruscogenin and neoruscogenin in Ruscus aculeatus L.

    Science.gov (United States)

    Vlase, Laurian; Kiss, Béla; Balica, Georgeta; Tămas, Mircea; Crisan, Gianina; Leucuta, Sorin E

    2009-01-01

    A new, sensitive LC/MS/MS method was developed for the quantification of ruscogenin and neoruscogenin in hydrolyzed extracts from Ruscus aculeatus L. (Liliaceae). The two sapogenins were separated on a Zorbax SB-C18 column under isocratic conditions. The detection was performed in the multiple reaction monitoring mode using an ion trap mass spectrometer with an electrospray ionization source operated in positive ionization mode. For the quantification of the ruscogenin and neoruscogenin, calibration curves were constructed over the range of 2-1000 ng/mL. This is the first reported LC/MS/MS method for the simultaneous analysis of ruscogenin and neoruscogenin, and it showed superior sensitivity when compared with other assays described in the literature. The method has been successfully applied to quantify the two sapogenins in aerial (phylloclades) and underground parts (rhizomes, roots) of Ruscus aculeatus L.

  14. A Scalable Epitope Tagging Approach for High Throughput ChIP-Seq Analysis.

    Science.gov (United States)

    Xiong, Xiong; Zhang, Yanxiao; Yan, Jian; Jain, Surbhi; Chee, Sora; Ren, Bing; Zhao, Huimin

    2017-06-16

    Eukaryotic transcriptional factors (TFs) typically recognize short genomic sequences alone or together with other proteins to modulate gene expression. Mapping of TF-DNA interactions in the genome is crucial for understanding the gene regulatory programs in cells. While chromatin immunoprecipitation followed by sequencing (ChIP-Seq) is commonly used for this purpose, its application is severely limited by the availability of suitable antibodies for TFs. To overcome this limitation, we developed an efficient and scalable strategy named cmChIP-Seq that combines the clustered regularly interspaced short palindromic repeats (CRISPR) technology with microhomology mediated end joining (MMEJ) to genetically engineer a TF with an epitope tag. We demonstrated the utility of this tool by applying it to four TFs in a human colorectal cancer cell line. The highly scalable procedure makes this strategy ideal for ChIP-Seq analysis of TFs in diverse species and cell types.

  15. MCAM: multiple clustering analysis methodology for deriving hypotheses and insights from high-throughput proteomic datasets.

    Directory of Open Access Journals (Sweden)

    Kristen M Naegle

    2011-07-01

    Full Text Available Advances in proteomic technologies continue to substantially accelerate capability for generating experimental data on protein levels, states, and activities in biological samples. For example, studies on receptor tyrosine kinase signaling networks can now capture the phosphorylation state of hundreds to thousands of proteins across multiple conditions. However, little is known about the function of many of these protein modifications, or the enzymes responsible for modifying them. To address this challenge, we have developed an approach that enhances the power of clustering techniques to infer functional and regulatory meaning of protein states in cell signaling networks. We have created a new computational framework for applying clustering to biological data in order to overcome the typical dependence on specific a priori assumptions and expert knowledge concerning the technical aspects of clustering. Multiple clustering analysis methodology ('MCAM') employs an array of diverse data transformations, distance metrics, set sizes, and clustering algorithms, in a combinatorial fashion, to create a suite of clustering sets. These sets are then evaluated based on their ability to produce biological insights through statistical enrichment of metadata relating to knowledge concerning protein functions, kinase substrates, and sequence motifs. We applied MCAM to a set of dynamic phosphorylation measurements of the ERBB network to explore the relationships between algorithmic parameters and the biological meaning that could be inferred and report on interesting biological predictions. Further, we applied MCAM to multiple phosphoproteomic datasets for the ERBB network, which allowed us to compare independent and incompletely overlapping measurements of phosphorylation sites in the network. We report specific and global differences of the ERBB network stimulated with different ligands and with changes in HER2 expression. Overall, we offer MCAM as a broadly
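
    As a minimal sketch of the combinatorial idea described in this record (not the MCAM implementation itself), the fragment below sweeps a few data transformations, clustering algorithms and set sizes with scikit-learn and keeps one labeling per parameter combination for downstream evaluation; the specific transformations, algorithms and the synthetic data are illustrative assumptions.

```python
# A minimal sketch (not the MCAM package) of the combinatorial idea: generate many
# clustering solutions by sweeping data transformations, algorithms and set sizes,
# then keep every labeling for downstream enrichment-based evaluation.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.preprocessing import StandardScaler

def clustering_suite(X, cluster_sizes=(3, 5, 8)):
    transforms = {
        "raw": lambda d: d,
        "log": lambda d: np.log2(d + 1.0),                 # assumes non-negative data
        "zscore": lambda d: StandardScaler().fit_transform(d),
    }
    algorithms = {
        "kmeans": lambda k: KMeans(n_clusters=k, n_init=10, random_state=0),
        "agglomerative": lambda k: AgglomerativeClustering(n_clusters=k),
    }
    suite = {}
    for t_name, t in transforms.items():
        Xt = t(X)
        for a_name, make in algorithms.items():
            for k in cluster_sizes:
                labels = make(k).fit_predict(Xt)
                suite[(t_name, a_name, k)] = labels   # one clustering set per combination
    return suite

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.gamma(shape=2.0, size=(60, 10))   # stand-in for phosphorylation measurements
    suite = clustering_suite(X)
    print(f"{len(suite)} clustering sets generated")
```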

  16. Quantitative neuroanatomy of all Purkinje cells with light sheet microscopy and high-throughput image analysis

    Directory of Open Access Journals (Sweden)

    Ludovico eSilvestri

    2015-05-01

    Full Text Available Characterizing the cytoarchitecture of the mammalian central nervous system on a brain-wide scale is becoming a compelling need in neuroscience. For example, realistic modeling of brain activity requires the definition of quantitative features of large neuronal populations in the whole brain. Quantitative anatomical maps will also be crucial to classify the cytoarchitectonic abnormalities associated with neuronal pathologies in a highly reproducible and reliable manner. In this paper, we apply recent advances in optical microscopy and image analysis to characterize the spatial distribution of Purkinje cells across the whole cerebellum. Light sheet microscopy was used to image with micron-scale resolution a fixed and cleared cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are fluorescently labeled. A fast and scalable algorithm for fully automated cell identification was applied to the image to extract the position of all the fluorescent Purkinje cells. This vectorized representation of the cell population allows a thorough characterization of the complex three-dimensional distribution of the neurons, highlighting the presence of gaps inside the lamellar organization of Purkinje cells, whose density is believed to play a significant role in autism spectrum disorders. Furthermore, clustering analysis of the localized somata permits dividing the whole cerebellum into groups of Purkinje cells with high spatial correlation, suggesting new possibilities of anatomical partition. The quantitative approach presented here can be extended to study the distribution of different types of cell in many brain regions and across the whole encephalon, providing a robust base for building realistic computational models of the brain, and for unbiased morphological tissue screening in the presence of pathologies and/or drug treatments.
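
    The record describes fully automated localization of fluorescent somata in a cleared-brain image volume. The sketch below is not the authors' algorithm; it only illustrates, under simple assumptions (a global intensity threshold and a minimum object size), how bright objects can be localized with connected-component labeling in scipy.ndimage.

```python
# Illustrative sketch of automated cell localization (not the authors' algorithm):
# threshold a fluorescence volume, label connected components, report their centroids.
import numpy as np
from scipy import ndimage

def localize_somata(volume, threshold, min_voxels=20):
    """Return centroid coordinates of bright objects larger than min_voxels."""
    mask = volume > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_voxels]
    return np.array(ndimage.center_of_mass(mask, labels, keep))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    vol = rng.normal(0.0, 1.0, size=(40, 40, 40))
    vol[10:14, 10:14, 10:14] += 8.0   # synthetic "cell"
    vol[25:29, 30:34, 5:9] += 8.0     # another synthetic "cell"
    print(localize_somata(vol, threshold=4.0))
```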

  17. High throughput phenotypic analysis of Mycobacterium tuberculosis and Mycobacterium bovis strains' metabolism using biolog phenotype microarrays.

    Directory of Open Access Journals (Sweden)

    Bhagwati Khatri

    Full Text Available Tuberculosis is a human and animal disease of major importance worldwide. Genetically, the closely related strains within the Mycobacterium tuberculosis complex which cause disease are well-characterized but there is an urgent need to better understand their phenotypes. To search rapidly for metabolic differences, a working method using Biolog Phenotype MicroArray analysis was developed. Of 380 substrates surveyed, 71 permitted tetrazolium dye reduction, the readout over 7 days in the method. By looking for ≥5-fold differences in dye reduction, 12 substrates differentiated M. tuberculosis H37Rv and Mycobacterium bovis AF2122/97. H37Rv and a Beijing strain of M. tuberculosis could also be distinguished in this way, as could field strains of M. bovis; even pairs of strains within one spoligotype could be distinguished by 2 to 3 substrates. Cluster analysis gave three clear groups: H37Rv, Beijing, and all the M. bovis strains. The substrates used agreed well with prior knowledge, though an unexpected finding that AF2122/97 gave greater dye reduction than H37Rv with hexoses was investigated further, in culture flasks, revealing that hexoses and Tween 80 were synergistic for growth and used simultaneously rather than in a diauxic fashion. Potential new substrates for growth media were revealed, too, most promisingly N-acetyl glucosamine. Osmotic and pH arrays divided the mycobacteria into two groups with different salt tolerance, though in contrast to the substrate arrays the groups did not entirely correlate with taxonomic differences. More interestingly, these arrays suggested differences between the amines used by the M. tuberculosis complex and enteric bacteria in acid tolerance, with some hydrophobic amino acids being highly effective. In contrast, γ-aminobutyrate, used in the enteric bacteria, had no effect in the mycobacteria. This study proved the principle that Phenotype MicroArrays can be used with slow-growing pathogenic mycobacteria

  18. DETERMINATION OF SATURATION VAPOR PRESSURE OF LOW VOLATILE SUBSTANCES THROUGH THE STUDY OF EVAPORATION RATE BY THERMOGRAVIMETRIC ANALYSIS

    Directory of Open Access Journals (Sweden)

    R. V. Ralys

    2015-11-01

    Full Text Available Subject of Study. Determining the vapor pressure of low-volatility substances is a complicated problem, owing both to the difficulty of the direct experiment and, most significantly, to issues with the correctness of the analysis and processing of the experimental data. For this reason reference substances (with well-studied vapor pressures) are usually required, which drastically reduces the effectiveness of the experimental methods used and narrows their applicability. The paper presents an approach to describing the evaporation (sublimation) of low-volatility substances based on a molecular-kinetic description that accounts for diffusive and convective processes. The proposed approach relies on experimental thermogravimetric findings over a wide range of temperatures, purge-gas flow rates and time. Method. The new approach calculates the vapor pressure from thermogravimetric measurements of the evaporation rate as a function of temperature, purge-gas flow rate, and evaporation time. The basis for the calculation is the diffusion-kinetic description of the evaporation process (mass loss of the substance from the exposed surface). The method is applicable to determining the thermodynamic characteristics of both evaporation (the liquid-vapor equilibrium) and sublimation (the solid-vapor equilibrium). We proposed an appropriate experimental method and data analysis to find the saturated vapor pressure of individual substances of low volatility. Main Results. The method has been tested on substances whose thermodynamic characteristics are insufficiently reliable and incompletely studied but which, despite this, are often used (because of the limitations of other data) as reference substances. The vaporization process (liquid-vapor) has been studied for di-n-butyl phthalate C16H22O4 at 323,15–443,15 K, and sublimation for benzoic acid C7H6O2 at 303,15–183,15 K. Both processes have
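
    The record derives saturation vapor pressure from thermogravimetric evaporation rates via a diffusion-kinetic model. That model is not reproduced here; the sketch below only shows the classical Langmuir relation, a common starting point for TGA-based vapor-pressure estimates, p_sat = (dm/dt)/(alpha*A) * sqrt(2*pi*R*T/M), evaluated with made-up numbers.

```python
# Classical Langmuir relation often used to estimate saturation vapor pressure from
# thermogravimetric mass-loss rates; NOT the diffusion-kinetic model of the paper,
# just an illustration of how an evaporation rate maps to a vapor pressure.
import math

R = 8.314  # J/(mol*K)

def langmuir_vapor_pressure(mass_loss_rate, area, molar_mass, temperature, alpha=1.0):
    """p_sat = (dm/dt) / (alpha * A) * sqrt(2*pi*R*T / M)   [SI units throughout]"""
    return (mass_loss_rate / (alpha * area)) * math.sqrt(2 * math.pi * R * temperature / molar_mass)

if __name__ == "__main__":
    # hypothetical numbers: 2e-9 kg/s loss from a 5e-5 m^2 crucible of benzoic acid at 330 K
    p = langmuir_vapor_pressure(2e-9, 5e-5, 0.12212, 330.0)
    print(f"estimated p_sat ~ {p:.3g} Pa")
```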

  19. A novel approach for transcription factor analysis using SELEX with high-throughput sequencing (TFAST.

    Directory of Open Access Journals (Sweden)

    Daniel J Reiss

    Full Text Available BACKGROUND: In previous work, we designed a modified aptamer-free SELEX-seq protocol (afSELEX-seq) for the discovery of transcription factor binding sites. Here, we present original software, TFAST, designed to analyze afSELEX-seq data, validated against our previously generated afSELEX-seq dataset and a model dataset. TFAST is designed with a simple graphical interface (Java) so that it can be installed and executed without extensive expertise in bioinformatics. TFAST completes analysis within minutes on most personal computers. METHODOLOGY: Once afSELEX-seq data are aligned to a target genome, TFAST identifies peaks and, uniquely, compares peak characteristics between cycles. TFAST generates a hierarchical report of graded peaks, their associated genomic sequences, binding site length predictions, and dummy sequences. PRINCIPAL FINDINGS: Including additional cycles of afSELEX-seq improved TFAST's ability to selectively identify peaks, leading to 7,274, 4,255, and 2,628 peaks identified in two-, three-, and four-cycle afSELEX-seq. Inter-round analysis by TFAST identified 457 peaks as the strongest candidates for true binding sites. Separating peaks by TFAST into classes of worst, second-best and best candidate peaks revealed a trend of increasing significance (e-values 4.5 × 10(12), 2.9 × 10(-46), and 1.2 × 10(-73)) and informational content (11.0, 11.9, and 12.5 bits over 15 bp) of discovered motifs within each respective class. TFAST also predicted a binding site length (28 bp) consistent with non-computational experimentally derived results for the transcription factor PapX (22 to 29 bp). CONCLUSIONS/SIGNIFICANCE: TFAST offers a novel and intuitive approach for determining DNA binding sites of proteins subjected to afSELEX-seq. Here, we demonstrate that TFAST, using afSELEX-seq data, rapidly and accurately predicted sequence length and motif for a putative transcription factor's binding site.

  20. A community resource for high-throughput quantitative RT-PCR analysis of transcription factor gene expression in Medicago truncatula

    Directory of Open Access Journals (Sweden)

    Redman Julia C

    2008-07-01

    Full Text Available Abstract Background Medicago truncatula is a model legume species that is currently the focus of an international genome sequencing effort. Although several different oligonucleotide and cDNA arrays have been produced for genome-wide transcript analysis of this species, intrinsic limitations in the sensitivity of hybridization-based technologies mean that transcripts of genes expressed at low levels cannot be measured accurately with these tools. Amongst such genes are many encoding transcription factors (TFs), which are arguably the most important class of regulatory proteins. Quantitative reverse transcription-polymerase chain reaction (qRT-PCR) is the most sensitive method currently available for transcript quantification, and one that can be scaled up to analyze transcripts of thousands of genes in parallel. Thus, qRT-PCR is an ideal method to tackle the problem of TF transcript quantification in Medicago and other plants. Results We established a bioinformatics pipeline to identify putative TF genes in Medicago truncatula and to design gene-specific oligonucleotide primers for qRT-PCR analysis of TF transcripts. We validated the efficacy and gene-specificity of over 1000 TF primer pairs and utilized these to identify sets of organ-enhanced TF genes that may play important roles in organ development or differentiation in this species. This community resource will be developed further as more genome sequence becomes available, with the ultimate goal of producing validated, gene-specific primers for all Medicago TF genes. Conclusion High-throughput qRT-PCR using a 384-well plate format enables rapid, flexible, and sensitive quantification of all predicted Medicago transcription factor mRNAs. This resource has been utilized recently by several groups in Europe, Australia, and the USA, and we expect that it will become the 'gold-standard' for TF transcript profiling in Medicago truncatula.

  1. Comparison of analysis tools for miRNA high throughput sequencing using nerve crush as a model

    Directory of Open Access Journals (Sweden)

    Raghu Prasad Rao Metpally

    2013-03-01

    Full Text Available Recent advances in sample preparation and analysis for next generation sequencing have made it possible to profile and discover new miRNAs in a high throughput manner. In the case of neurological disease and injury, these types of experiments have been more limited, possibly because tissues such as the brain and spinal cord are inaccessible for direct sampling in living patients, and indirect sampling of blood and cerebrospinal fluid is affected by low amounts of RNA. We used a mouse model to examine changes in miRNA expression in response to acute nerve crush. We assayed miRNA from both muscle tissue and blood plasma. We examined how the depth of coverage (the number of mapped reads) changed the number of detectable miRNAs in each sample type. We also found that samples with very low starting amounts of RNA (mouse plasma) made high depth of mature miRNA coverage more difficult to obtain. Each tissue must be assessed independently for the depth of coverage required to adequately power detection of differential expression, weighed against the cost of sequencing that sample to the adequate depth. We explored the changes in total mapped reads and differential expression results generated by three different software packages: miRDeep2, miRNAKey, and miRExpress, and two different analysis packages, DESeq and EdgeR. We also examined the accuracy of using miRDeep2 to predict novel miRNAs and subsequently detect them in the samples using qRT-PCR.

  2. High-throughput screening of suppression subtractive hybridization cDNA libraries using DNA microarray analysis.

    Science.gov (United States)

    van den Berg, Noëlani; Crampton, Bridget G; Hein, Ingo; Birch, Paul R J; Berger, Dave K

    2004-11-01

    Efficient construction of cDNA libraries enriched for differentially expressed transcripts is an important first step in many biological investigations. We present a quantitative procedure for screening cDNA libraries constructed by suppression subtractive hybridization (SSH). The methodology was applied to two independent SSHs from pearl millet and banana. Following two-color cyanine dye labeling and hybridization of subtracted tester with either unsubtracted driver or unsubtracted tester cDNAs to the SSH libraries arrayed on glass slides, two values were calculated for each clone, an enrichment ratio 1 (ER1) and an enrichment ratio 2 (ER2). Graphical representation of ER1 and ER2 enabled the identification of clones that were likely to represent up-regulated transcripts. Normalization of each clone by the SSH process was determined from the ER2 values, thereby indicating whether clones represented rare or abundant transcripts. Differential expression of pearl millet and banana clones identified from both libraries by this quantitative approach was verified by inverse Northern blot analysis.

  3. High throughput genetic analysis of congenital myasthenic syndromes using resequencing microarrays.

    Directory of Open Access Journals (Sweden)

    Lisa Denning

    Full Text Available BACKGROUND: The use of resequencing microarrays for screening multiple candidate disease loci is a promising alternative to conventional capillary sequencing. We describe the performance of a custom resequencing microarray for mutational analysis of Congenital Myasthenic Syndromes (CMSs), a group of disorders in which the normal process of neuromuscular transmission is impaired. METHODOLOGY/PRINCIPAL FINDINGS: Our microarray was designed to assay the exons and flanking intronic regions of 8 genes linked to CMSs. A total of 31 microarrays were hybridized with genomic DNA from either individuals with known CMS mutations or from healthy controls. We estimated an overall microarray call rate of 93.61%, and we found the percentage agreement between the microarray and capillary sequencing techniques to be 99.95%. In addition, our microarray exhibited 100% specificity and 99.99% reproducibility. Finally, the microarray detected 22 out of the 23 known missense mutations, but it failed to detect all 7 known insertion and deletion (indel) mutations, indicating an overall sensitivity of 73.33% and a sensitivity with respect to missense mutations of 95.65%. CONCLUSIONS/SIGNIFICANCE: Overall, our microarray prototype exhibited strong performance and proved highly efficient for screening genes associated with CMSs. Until indels can be efficiently assayed with this technology, however, we recommend using resequencing microarrays for screening CMS mutations after common indels have been first assayed by capillary sequencing.

  4. Differentiation and identification of filamentous fungi by high-throughput FTIR spectroscopic analysis of mycelia.

    Science.gov (United States)

    Lecellier, A; Mounier, J; Gaydou, V; Castrec, L; Barbier, G; Ablain, W; Manfait, M; Toubas, D; Sockalingum, G D

    2014-01-03

    Routine identification of fungi based on phenotypic and genotypic methods can be fastidious and time-consuming. In this context, there is a constant need for new approaches allowing the rapid identification of molds. Fourier-transform infrared (FTIR) spectroscopy appears to be such a method. The objective of this work was to evaluate the potential of FTIR spectroscopy for an early differentiation and identification of filamentous fungi. One hundred and thirty-one strains, identified using DNA sequencing, were analyzed using FTIR spectroscopy of the mycelia obtained after a reduced culture time of 48 h compared to current conventional methods. Partial least squares discriminant analysis was used as a chemometric method to analyze the spectral data and for identification of the fungal strains from the phylum to the species level. Calibration models were constructed using 106 strains pertaining to 14 different genera and 32 species and were used to identify 25 fungal strains in a blind manner. Correct identification rates of 98.97% and 98.77% were achieved at the genus and species levels, respectively. FTIR spectroscopy with its high discriminating power and rapidity therefore shows strong promise for routine fungal identification. Upgrading of our database is ongoing to test the technique's robustness. © 2013.

  5. Designing small universal k-mer hitting sets for improved analysis of high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Yaron Orenstein

    2017-10-01

    Full Text Available With the rapidly increasing volume of deep sequencing data, more efficient algorithms and data structures are needed. Minimizers are a central recent paradigm that has improved various sequence analysis tasks, including hashing for faster read overlap detection, sparse suffix arrays for creating smaller indexes, and Bloom filters for speeding up sequence search. Here, we propose an alternative paradigm that can lead to substantial further improvement in these and other tasks. For integers k and L > k, we say that a set of k-mers is a universal hitting set (UHS) if every possible L-long sequence must contain a k-mer from the set. We develop a heuristic called DOCKS to find a compact UHS, which works in two phases: The first phase is solved optimally, and for the second we propose several efficient heuristics, trading set size for speed and memory. The use of heuristics is motivated by showing the NP-hardness of a closely related problem. We show that DOCKS works well in practice and produces UHSs that are very close to a theoretical lower bound. We present results for various values of k and L and by applying them to real genomes show that UHSs indeed improve over minimizers. In particular, DOCKS uses less than 30% of the 10-mers needed to span the human genome compared to minimizers. The software and computed UHSs are freely available at github.com/Shamir-Lab/DOCKS/ and acgt.cs.tau.ac.il/docks/, respectively.
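
    The universal hitting set definition quoted in this record can be checked directly for tiny parameters. The sketch below is not the DOCKS heuristic; it only verifies, by brute force over all 4^L sequences, whether a given k-mer set hits every L-long sequence, which is feasible only for very small k and L.

```python
# A minimal sketch of the universal hitting set (UHS) definition: a set of k-mers is a
# UHS for (k, L) if every possible L-long sequence contains at least one k-mer from it.
from itertools import product

ALPHABET = "ACGT"

def is_universal_hitting_set(kmers, k, L):
    """Return True if every L-long DNA sequence contains a k-mer from `kmers`."""
    kmers = set(kmers)
    for seq in product(ALPHABET, repeat=L):
        s = "".join(seq)
        if not any(s[i:i + k] in kmers for i in range(L - k + 1)):
            return False   # found an L-mer that no k-mer in the set hits
    return True

if __name__ == "__main__":
    # Toy example with k=2, L=4: 2-mers starting with 'A' miss e.g. "CCCC",
    # while the set of all 2-mers trivially hits every 4-mer.
    partial = {"AA", "AC", "AG", "AT"}
    print(is_universal_hitting_set(partial, k=2, L=4))   # False
    full = {a + b for a in ALPHABET for b in ALPHABET}
    print(is_universal_hitting_set(full, k=2, L=4))      # True
```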

  6. Palm Oil Consumption Increases LDL Cholesterol Compared with Vegetable Oils Low in Saturated Fat in a Meta-Analysis of Clinical Trials.

    Science.gov (United States)

    Sun, Ye; Neelakantan, Nithya; Wu, Yi; Lote-Oke, Rashmi; Pan, An; van Dam, Rob M

    2015-07-01

    Palm oil contains a high amount of saturated fat compared with most other vegetable oils, but studies have reported inconsistent effects of palm oil on blood lipids. We systematically reviewed the effect of palm oil consumption on blood lipids compared with other cooking oils using data from clinical trials. We searched PubMed and the Cochrane Library for trials of at least 2 wk duration that compared the effects of palm oil consumption with any of the predefined comparison oils: vegetable oils low in saturated fat, trans fat-containing partially hydrogenated vegetable oils, and animal fats. Data were pooled by using random-effects meta-analysis. Palm oil significantly increased LDL cholesterol by 0.24 mmol/L (95% CI: 0.13, 0.35 mmol/L; I(2) = 83.2%) compared with vegetable oils low in saturated fat. This effect was observed in randomized trials (0.31 mmol/L; 95% CI: 0.20, 0.42 mmol/L) but not in nonrandomized trials (0.03 mmol/L; 95% CI: -0.15, 0.20 mmol/L; P-difference = 0.02). Among randomized trials, only modest heterogeneity in study results remained after considering the test oil dose and the comparison oil type (I(2) = 27.5%). Palm oil increased HDL cholesterol by 0.02 mmol/L (95% CI: 0.01, 0.04 mmol/L; I(2) = 49.8%) compared with vegetable oils low in saturated fat and by 0.09 mmol/L (95% CI: 0.06, 0.11 mmol/L; I(2) = 47.8%) compared with trans fat-containing oils. Palm oil consumption results in higher LDL cholesterol than do vegetable oils low in saturated fat and higher HDL cholesterol than do trans fat-containing oils in humans. The effects of palm oil on blood lipids are as expected on the basis of its high saturated fat content, which supports the reduction in palm oil use by replacement with vegetable oils low in saturated and trans fat. This systematic review was registered with the PROSPERO registry at http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42012002601#.VU3wvSGeDRZ as CRD42012002601. © 2015 American Society for Nutrition.
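
    The record states that trial results were pooled with a random-effects meta-analysis. The sketch below implements the standard DerSimonian-Laird estimator as one common way to do such pooling; it is not necessarily the software the authors used, and the per-trial effect sizes and variances are hypothetical placeholders, not data from the review.

```python
# Standard DerSimonian-Laird random-effects pooling, shown to illustrate how trial-level
# mean differences in LDL cholesterol (mmol/L) could be combined.
import math

def dersimonian_laird(effects, variances):
    """Pool study effects with a random-effects model; returns (pooled, 95% CI, I^2 %)."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

if __name__ == "__main__":
    effects = [0.30, 0.18, 0.41, 0.10]        # hypothetical per-trial LDL differences (mmol/L)
    variances = [0.004, 0.006, 0.010, 0.003]  # hypothetical within-trial variances
    print(dersimonian_laird(effects, variances))
```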

  7. High-Throughput Genetic Analysis and Combinatorial Chiral Separations Based on Capillary Electrophoresis

    Energy Technology Data Exchange (ETDEWEB)

    Zhong, Wenwan [Iowa State Univ., Ames, IA (United States)

    2003-01-01

    Capillary electrophoresis (CE) offers many advantages over conventional analytical methods, such as speed, simplicity, high resolution, low cost, and small sample consumption, especially for the separation of enantiomers. However, chiral method development can still be time-consuming and tedious. They designed a comprehensive enantioseparation protocol employing neutral and sulfated cyclodextrins as chiral selectors for common basic, neutral, and acidic compounds with a 96-capillary array system. By using only four judiciously chosen separation buffers, successful enantioseparations were achieved for 49 out of 54 test compounds spanning a large variety of pKs and structures. Therefore, unknown compounds can be screened in this manner to identify optimal enantioselective conditions in just one run. In addition to superior separation efficiency for small molecules, CE is also the most powerful technique for DNA separations. Using the same multiplexed capillary system with UV absorption detection, the sequence of a short DNA template can be acquired without any dye-labels. Two internal standards were utilized to adjust the migration time variations among capillaries, so that the four electropherograms for the A, T, C, G Sanger reactions can be aligned and base calling can be completed with a high level of confidence. The CE separation of DNA can be applied to study differential gene expression as well. Combined with pattern recognition techniques, small variations among electropherograms obtained by the separation of cDNA fragments produced from the total RNA samples of different human tissues can be revealed. These variations reflect the differences in total RNA expression among tissues. Thus, this CE-based approach can serve as an alternative to the DNA array techniques in gene expression analysis.

  8. High-throughput multiplex single-nucleotide polymorphism analysis for red cell and platelet antigen genotypes.

    Science.gov (United States)

    Denomme, G A; Van Oene, M

    2005-05-01

    Transfusion recipients who become alloimmunized to red cell or platelet (PLT) antigens require antigen-negative blood to limit adverse transfusion reactions. Blood collection facilities use regulated and unregulated antibodies to phenotype blood, the cost of which can be prohibitive depending on the antisera and demand. An alternative strategy is to screen blood for these antigens with genomic DNA and the associated single-nucleotide polymorphisms (SNPs). A multiplex polymerase chain reaction (PCR)-oligonucleotide extension assay was developed with genomic DNA and a SNP genotyping platform (GenomeLab SNPstream, Beckman Coulter) to identify SNPs related to D, C/c, E, S/s, K/k, Kp(a/b), Fy(a/b), FY0 (-33 promoter silencing polymorphism), Jk(a/b), Di(a/b), and human PLT antigen (HPA)-1a/1b. A total of 372 samples were analyzed for 12 SNPs. The genotypes were compared to the blood group and PLT antigen phenotypes. Individual sample results varied from 98 to 100 percent for 11 of 12 SNPs. D was correctly identified in 292 of 296 (98.6%) D+ donors. The RHCE exon 5 E/e SNP analysis had the lowest concordance (89.5%). Thirty-three R(1)R(1) and 1 r″r were correctly identified. PCR-restriction fragment length polymorphism (RFLP) on selected samples confirmed the presence of the FY0 silencing polymorphism in nine donors. Homozygous HPA-1b/1b was identified in four donors, which was confirmed by PCR-RFLP (n = 4) and anti-HPA-1a serology (n = 2). The two HPA-1a-negative donors were recruited into the plateletpheresis program. The platform has the capacity to genotype thousands of samples per day. The suite of SNPs provides genotype data for all blood donors within 36 hours of the start of testing.

  9. Multiple Legionella pneumophila effector virulence phenotypes revealed through high-throughput analysis of targeted mutant libraries.

    Science.gov (United States)

    Shames, Stephanie R; Liu, Luying; Havey, James C; Schofield, Whitman B; Goodman, Andrew L; Roy, Craig R

    2017-11-28

    Legionella pneumophila is the causative agent of a severe pneumonia called Legionnaires' disease. A single strain of L. pneumophila encodes a repertoire of over 300 different effector proteins that are delivered into host cells by the Dot/Icm type IV secretion system during infection. The large number of L. pneumophila effectors has been a limiting factor in assessing the importance of individual effectors for virulence. Here, a transposon insertion sequencing technology called INSeq was used to analyze replication of a pool of effector mutants in parallel both in a mouse model of infection and in cultured host cells. Loss-of-function mutations in genes encoding effector proteins resulted in host-specific or broad virulence phenotypes. Screen results were validated for several effector mutants displaying different virulence phenotypes using genetic complementation studies and infection assays. Specifically, loss-of-function mutations in the gene encoding LegC4 resulted in enhanced L. pneumophila replication in the lungs of infected mice but not within cultured host cells, which indicates LegC4 augments bacterial clearance by the host immune system. The effector proteins RavY and Lpg2505 were important for efficient replication within both mammalian and protozoan hosts. Further analysis of Lpg2505 revealed that this protein functions as a metaeffector that counteracts host cytotoxicity displayed by the effector protein SidI. Thus, this study identified a large cohort of effectors that contribute to L. pneumophila virulence positively or negatively and has demonstrated that regulation of effector protein activities by cognate metaeffectors is critical for host pathogenesis.

  10. Extended and saturation prostatic biopsy in the diagnosis and characterisation of prostate cancer: a critical analysis of the literature.

    Science.gov (United States)

    Scattoni, Vincenzo; Zlotta, Alexandre; Montironi, Rodolfo; Schulman, Claude; Rigatti, Patrizio; Montorsi, Francesco

    2007-11-01

    To review and critically analyse all the recent literature on the detection and characterisation of prostate cancer by means of extended and saturation protocols. A systematic review of the literature was performed by searching MedLine from January 1995 to April 2007. Electronic searches were limited to the English language, and the key words "prostate cancer," "diagnosis," "transrectal ultrasound (TRUS)," "prostate biopsy," and "prognosis" were used. The prostate biopsy technique has changed significantly since the original Hodge sextant biopsy protocol. Several types of local anaesthesia are now available, but periprostatic nerve block (PPNB) has proved to be the most effective method to reduce pain during TRUS biopsy. It remains controversial whether PPNB should be associated with other medications. The optimal extended protocol (sextant template with at least four additional cores) should include six standard sextant biopsies, with additional biopsies (up to 12 cores) taken more laterally (anterior horn) to the base and medially to the apex. Repeat biopsies should be based on saturation biopsies (number of cores ≥ 20) and should include the transition zone, especially in a patient with an initial negative biopsy. As a means of increasing accuracy of prostatic biopsy and reducing unnecessary prostate biopsy, colour and power Doppler imaging, with or without contrast enhancement, and elastography now can be successfully adopted, but their routine use is still controversial. Extended and saturation biopsy schemes should be performed at first and repeat biopsy, respectively. The widespread use of local anaesthesia makes the procedures more comfortable.

  11. Introduction of High Throughput Magnetic Resonance T2-Weighted Image Texture Analysis for WHO Grade 2 and 3 Gliomas.

    Directory of Open Access Journals (Sweden)

    Manabu Kinoshita

    Full Text Available Reports have suggested that tumor textures presented on T2-weighted images correlate with the genetic status of glioma. Therefore, development of an image analyzing framework that is capable of objective and high throughput image texture analysis for large scale image data collection is needed. The current study aimed to address the development of such a framework by introducing two novel parameters for image textures on T2-weighted images, i.e., Shannon entropy and Prewitt filtering. Twenty-two WHO grade 2 and 28 grade 3 glioma patients whose pre-surgical MRI and IDH1 mutation status were available were included. Heterogeneous lesions showed statistically higher Shannon entropy than homogeneous lesions (p = 0.006) and ROC curve analysis proved that Shannon entropy on T2WI was a reliable indicator for discrimination of homogeneous and heterogeneous lesions (p = 0.015, AUC = 0.73). Lesions with well-defined borders exhibited statistically higher Edge mean and Edge median values using Prewitt filtering than those with vague lesion borders (p = 0.0003 and p = 0.0005, respectively). ROC curve analysis also proved that both Edge mean and median values were promising indicators for discrimination of lesions with vague and well defined borders, and both Edge mean and median values performed in a comparable manner (p = 0.0002, AUC = 0.81 and p < 0.0001, AUC = 0.83, respectively). Finally, IDH1 wild type gliomas showed statistically lower Shannon entropy on T2WI than IDH1 mutated gliomas (p = 0.007), but no difference was observed between IDH1 wild type and mutated gliomas in Edge median values using Prewitt filtering. The current study introduced two image metrics that reflect lesion texture described on T2WI. These two metrics were validated by readings of a neuro-radiologist who was blinded to the results. This observation will facilitate further use of this technique in future large scale image analysis of glioma.
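
    The two texture metrics named in this record, Shannon entropy of the lesion intensities and edge statistics after Prewitt filtering, can be computed with standard tools. The sketch below is not the authors' pipeline; the histogram bin count, the synthetic "lesions" and the use of scipy.ndimage.prewitt are illustrative assumptions.

```python
# Minimal sketch of the two texture metrics: Shannon entropy of the ROI intensity
# histogram, and mean/median Prewitt gradient magnitude over the ROI.
import numpy as np
from scipy import ndimage

def shannon_entropy(roi, bins=64):
    """Shannon entropy (bits) of the ROI intensity histogram."""
    hist, _ = np.histogram(roi, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def prewitt_edge_stats(roi):
    """Mean and median Prewitt gradient magnitude over the ROI."""
    gx = ndimage.prewitt(roi.astype(float), axis=0)
    gy = ndimage.prewitt(roi.astype(float), axis=1)
    mag = np.hypot(gx, gy)
    return float(mag.mean()), float(np.median(mag))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    homogeneous = rng.normal(100, 2, size=(64, 64))       # synthetic "homogeneous" lesion
    heterogeneous = rng.normal(100, 25, size=(64, 64))    # synthetic "heterogeneous" lesion
    print(shannon_entropy(homogeneous), shannon_entropy(heterogeneous))
    print(prewitt_edge_stats(homogeneous), prewitt_edge_stats(heterogeneous))
```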

  12. High-throughput phenotyping of lateral expansion and regrowth of spaced Lolium perenne plants using on-field image analysis.

    Science.gov (United States)

    Lootens, Peter; Ruttink, Tom; Rohde, Antje; Combes, Didier; Barre, Philippe; Roldán-Ruiz, Isabel

    2016-01-01

    Genetic studies and breeding of agricultural crops frequently involve phenotypic characterization of large collections of genotypes grown in field conditions. These evaluations are typically based on visual observations and manual (destructive) measurements. Robust image capture and analysis procedures that allow phenotyping large collections of genotypes in time series during developmental phases represent a clear advantage as they allow non-destructive monitoring of plant growth and performance. A L. perenne germplasm panel including wild accessions, breeding material and commercial varieties has been used to develop a low-cost, high-throughput phenotyping tool for determining plant growth based on images of individual plants during two consecutive growing seasons. Further we have determined the correlation between image analysis-based estimates of the plant's base area and the capacity to regrow after cutting, with manual counts of tiller number and measurements of leaf growth 2 weeks after cutting, respectively. When working with field-grown plants, image acquisition and image segmentation are particularly challenging as outdoor light conditions vary throughout the day and the season, and variable soil colours hamper the delineation of the object of interest in the image. Therefore we have used several segmentation methods including colour-, texture- and edge-based approaches, and factors derived after a fast Fourier transformation. The performance of the procedure developed has been analysed in terms of effectiveness across different environmental conditions and time points in the season. The procedure developed was able to analyse correctly 77.2 % of the 24,048 top view images processed. High correlations were found between plant's base area (image analysis-based) and tiller number (manual measurement) and between regrowth after cutting (image analysis-based) and leaf growth 2 weeks after cutting (manual measurement), with r values up to 0.792 and 0

  13. High-throughput multi-parameter flow-cytometric analysis from micro-quantities of plasmodium-infected blood.

    Science.gov (United States)

    Apte, Simon H; Groves, Penny L; Roddick, Joanne S; P da Hora, Vanusa; Doolan, Denise L

    2011-10-01

    Despite significant technological and conceptual advances over the last century, evaluation of the efficacy of anti-malarial vaccines or drugs continues to rely principally on direct microscopic visualisation of parasites on thick and/or thin Giemsa-stained blood smears. This requires technical expertise of the microscopist, is highly subjective and error-prone, and does not account for aberrations such as anaemia. Many published methods have shown that flow cytometric analysis of blood is a highly versatile method that can readily detect nucleic acid-stained parasitised red blood cells within cultured cell populations and in ex-vivo samples. However several impediments, including the difficulty in distinguishing reticulocytes from infected red blood cells and the fickle nature of red blood cells, have precluded the development and universal adoption of flow-cytometric based assays for ex-vivo sample analysis. We have developed a novel high-throughput assay for the flow cytometric assessment of blood that overcomes these impediments by utilising the unique properties of the nucleic acid stain DAPI to differentially stain RNA and DNA, combined with novel fixation and analysis protocols. The assay allows the rapid and reliable analysis of multiple parameters from micro-volumes of blood, including: parasitaemia, platelet count, reticulocyte count, normocyte count, white blood cell count and delineation of subsets and phenotypic markers including, but not limited to, CD4(+) and CD8(+) T cells, and the expression of phenotypic markers such as PD-L1 or intracellular cytokines. The assay requires less than one drop of blood and is therefore suitable for short interval time-course experiments and allows the progression of infection and immune responses to be closely monitored in the laboratory or cytometer-equipped field locations. Herein, we describe the technique and demonstrate its application in vaccinology and with a range of rodent and human parasite species

  14. Next-generation phage display: integrating and comparing available molecular tools to enable cost-effective high-throughput analysis.

    Directory of Open Access Journals (Sweden)

    Emmanuel Dias-Neto

    Full Text Available BACKGROUND: Combinatorial phage display has been used in the last 20 years in the identification of protein-ligands and protein-protein interactions, uncovering relevant molecular recognition events. Rate-limiting steps of combinatorial phage display library selection are (i) the counting of transducing units and (ii) the sequencing of the encoded displayed ligands. Here, we adapted emerging genomic technologies to minimize such challenges. METHODOLOGY/PRINCIPAL FINDINGS: We gained efficiency by applying in tandem real-time PCR for rapid quantification to enable bacteria-free phage display library screening, and added phage DNA next-generation sequencing for large-scale ligand analysis, reporting a fully integrated set of high-throughput quantitative and analytical tools. The approach is far less labor-intensive and allows rigorous quantification; for medical applications, including selections in patients, it also represents an advance for quantitative distribution analysis and ligand identification of hundreds of thousands of targeted particles from patient-derived biopsy or autopsy in a longer timeframe post library administration. Additional advantages over current methods include increased sensitivity, less variability, enhanced linearity, scalability, and accuracy at much lower cost. Sequences obtained by qPhage plus pyrosequencing were similar to a dataset produced from conventional Sanger-sequenced transducing-units (TU), with no biases due to GC content, codon usage, and amino acid or peptide frequency. These tools allow phage display selection and ligand analysis at a >1,000-fold faster rate, and reduce costs approximately 250-fold for generating 10(6) ligand sequences. CONCLUSIONS/SIGNIFICANCE: Our analyses demonstrate that whereas this approach correlates with the traditional colony-counting, it is also capable of a much larger sampling, allowing a faster, less expensive, more accurate and consistent analysis of phage enrichment. Overall

  15. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data.

    Science.gov (United States)

    Correia, Damien; Doppelt-Azeroual, Olivia; Denis, Jean-Baptiste; Vandenbogaert, Mathias; Caro, Valérie

    2015-01-01

    The detection and characterization of emerging infectious agents have been a continuing public health concern. High Throughput Sequencing (HTS) or Next-Generation Sequencing (NGS) technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS), solely to provide perspective regarding the available information. We developed an easily deployable web interface, facilitating management and bioinformatics analysis of metagenomic data samples. It was engineered to run associated and dedicated Galaxy workflows for the detection and, eventually, classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows, facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allowing users to determine their relative abundance and associate them with the most closely related organism or pathogen. The user-friendly Django-based interface associates the users' input data and its metadata with a bio-IT provided set of resources (a Galaxy instance, and both sufficient storage and grid computing power). Galaxy is used to handle and analyze the user's input data from loading and indexing through mapping, assembly and DB searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy's main features. Metadata about samples, runs, as well as the workflow results are stored in the LIMS. For metagenomic classification and exploration purposes, we show, as a proof of concept, that integration of intuitive
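
    The record notes that the application drives Galaxy through the BioBlend library's API access to Galaxy's main features. The fragment below is a hypothetical illustration of that pattern only, not MetaGenSense code: the server URL, API key, file name, workflow name and input-step index are all placeholders, and the calls follow the BioBlend client API as documented.

```python
# Hypothetical illustration of driving a Galaxy workflow through BioBlend, the pattern
# the record describes; URL, API key, and the workflow/input names are placeholders.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

# Create a history for this metagenomic sample and upload the reads into it.
history = gi.histories.create_history(name="sample_run_001")
upload = gi.tools.upload_file("sample_001.fastq.gz", history["id"])
dataset_id = upload["outputs"][0]["id"]

# Pick a (previously imported) metagenomic classification workflow and invoke it.
workflow = gi.workflows.get_workflows(name="metagenomic_classification")[0]
invocation = gi.workflows.invoke_workflow(
    workflow["id"],
    inputs={"0": {"src": "hda", "id": dataset_id}},   # assumes the input step has index "0"
    history_id=history["id"],
)
print("invocation id:", invocation["id"])
```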

  16. Intestinal microbiota in healthy U.S. young children and adults--a high throughput microarray analysis.

    Directory of Open Access Journals (Sweden)

    Tamar Ringel-Kulka

    Full Text Available It is generally believed that the infant's microbiota is established during the first 1-2 years of life. However, there is scarce data on its characterization and its comparison to the adult-like microbiota in consecutive years. To characterize and compare the intestinal microbiota in healthy young children (1-4 years) and healthy adults from the North Carolina region in the U.S. using high-throughput bacterial phylogenetic microarray analysis. Detailed characterization and comparison of the intestinal microbiota of healthy children aged 1-4 years old (n = 28) and healthy adults of 21-60 years (n = 23) was carried out using the Human Intestinal Tract Chip (HITChip) phylogenetic microarray targeting the V1 and V6 regions of 16S rRNA and quantitative PCR. The HITChip microarray data indicate that Actinobacteria, Bacilli, Clostridium cluster IV and Bacteroidetes are the predominant phylum-like groups that exhibit differences between young children and adults. The phylum-like group Clostridium cluster XIVa was equally predominant in young children and adults and is thus considered to be established at an early age. The genus-like level show significant 3.6 fold (higher or lower) differences in the abundance of 26 genera between young children and adults. Young U.S. children have a significantly 3.5-fold higher abundance of Bifidobacterium species than the adults from the same location. However, the microbiota of young children is less diverse than that of adults. We show that the establishment of an adult-like intestinal microbiota occurs at a later age than previously reported. Characterizing the microbiota and its development in the early years of life may help identify 'windows of opportunity' for interventional strategies that may promote health and prevent or mitigate disease processes.

  17. Intestinal Microbiota in Healthy U.S. Young Children and Adults—A High Throughput Microarray Analysis

    Science.gov (United States)

    Ringel, Yehuda; Salojärvi, Jarkko; Carroll, Ian; Palva, Airi; de Vos, Willem M.; Satokari, Reetta

    2013-01-01

    It is generally believed that the infant's microbiota is established during the first 1–2 years of life. However, there is scarce data on its characterization and its comparison to the adult-like microbiota in consecutive years. Aim To characterize and compare the intestinal microbiota in healthy young children (1–4 years) and healthy adults from the North Carolina region in the U.S. using high-throughput bacterial phylogenetic microarray analysis. Methods Detailed characterization and comparison of the intestinal microbiota of healthy children aged 1–4 years old (n = 28) and healthy adults of 21–60 years (n = 23) was carried out using the Human Intestinal Tract Chip (HITChip) phylogenetic microarray targeting the V1 and V6 regions of 16S rRNA and quantitative PCR. Results The HITChip microarray data indicate that Actinobacteria, Bacilli, Clostridium cluster IV and Bacteroidetes are the predominant phylum-like groups that exhibit differences between young children and adults. The phylum-like group Clostridium cluster XIVa was equally predominant in young children and adults and is thus considered to be established at an early age. The genus-like level show significant 3.6 fold (higher or lower) differences in the abundance of 26 genera between young children and adults. Young U.S. children have a significantly 3.5-fold higher abundance of Bifidobacterium species than the adults from the same location. However, the microbiota of young children is less diverse than that of adults. Conclusions We show that the establishment of an adult-like intestinal microbiota occurs at a later age than previously reported. Characterizing the microbiota and its development in the early years of life may help identify ‘windows of opportunity’ for interventional strategies that may promote health and prevent or mitigate disease processes. PMID:23717595

  18. Intestinal microbiota in healthy U.S. young children and adults--a high throughput microarray analysis.

    Science.gov (United States)

    Ringel-Kulka, Tamar; Cheng, Jing; Ringel, Yehuda; Salojärvi, Jarkko; Carroll, Ian; Palva, Airi; de Vos, Willem M; Satokari, Reetta

    2013-01-01

    It is generally believed that the infant's microbiota is established during the first 1-2 years of life. However, there is scarce data on its characterization and its comparison to the adult-like microbiota in consecutive years. To characterize and compare the intestinal microbiota in healthy young children (1-4 years) and healthy adults from the North Carolina region in the U.S. using high-throughput bacterial phylogenetic microarray analysis. Detailed characterization and comparison of the intestinal microbiota of healthy children aged 1-4 years old (n = 28) and healthy adults of 21-60 years (n = 23) was carried out using the Human Intestinal Tract Chip (HITChip) phylogenetic microarray targeting the V1 and V6 regions of 16S rRNA and quantitative PCR. The HITChip microarray data indicate that Actinobacteria, Bacilli, Clostridium cluster IV and Bacteroidetes are the predominant phylum-like groups that exhibit differences between young children and adults. The phylum-like group Clostridium cluster XIVa was equally predominant in young children and adults and is thus considered to be established at an early age. The genus-like level show significant 3.6 fold (higher or lower) differences in the abundance of 26 genera between young children and adults. Young U.S. children have a significantly 3.5-fold higher abundance of Bifidobacterium species than the adults from the same location. However, the microbiota of young children is less diverse than that of adults. We show that the establishment of an adult-like intestinal microbiota occurs at a later age than previously reported. Characterizing the microbiota and its development in the early years of life may help identify 'windows of opportunity' for interventional strategies that may promote health and prevent or mitigate disease processes.

  19. High-throughput identification and screening of novel Methylobacterium species using whole-cell MALDI-TOF/MS analysis.

    Science.gov (United States)

    Tani, Akio; Sahin, Nurettin; Matsuyama, Yumiko; Enomoto, Takashi; Nishimura, Naoki; Yokota, Akira; Kimbara, Kazuhide

    2012-01-01

    Methylobacterium species are ubiquitous α-proteobacteria that reside in the phyllosphere and are fed by methanol that is emitted from plants. In this study, we applied whole-cell matrix-assisted laser desorption/ionization time-of-flight mass spectrometry analysis (WC-MS) to evaluate the diversity of Methylobacterium species collected from a variety of plants. The WC-MS spectrum was reproducible through two weeks of cultivation on different media. WC-MS spectrum peaks of M. extorquens strain AM1 cells were attributed to ribosomal proteins, but peaks that were not attributed to ribosomal proteins were also found. We developed a simple method for rapid identification based on spectrum similarity. Using all available type strains of Methylobacterium species, the method provided a certain threshold similarity value for species-level discrimination, although the genus contains some type strains that could not be easily discriminated solely by 16S rRNA gene sequence similarity. Next, we evaluated the WC-MS data of approximately 200 methylotrophs isolated from various plants with MALDI Biotyper software (Bruker Daltonics). Isolates representing each cluster were further identified by 16S rRNA gene sequencing. In most cases, the identification by WC-MS matched that by sequencing, and isolates with unique spectra represented possible novel species. The strains belonging to M. extorquens, M. adhaesivum, M. marchantiae, M. komagatae, M. brachiatum, M. radiotolerans, and novel lineages close to M. adhaesivum, many of which were isolated from bryophytes, were found to be the most frequent phyllospheric colonizers. The WC-MS technique provides high throughput in the identification of known/novel species of bacteria, enabling the selection of novel species in a library and identification without 16S rRNA gene sequencing.
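
    Identification in this record relies on whole-cell spectrum similarity (scored with the MALDI Biotyper software). As a hedged illustration of the underlying idea rather than that proprietary scoring, the sketch below bins two made-up peak lists onto a common m/z grid and compares them with a cosine similarity.

```python
# Minimal sketch (not the MALDI Biotyper scoring) of comparing whole-cell mass spectra
# by binning m/z peaks onto a fixed grid and computing a cosine similarity.
import numpy as np

def binned_vector(mz, intensity, mz_min=2000, mz_max=20000, bin_width=5):
    """Bin a peak list onto a fixed m/z grid so spectra become comparable vectors."""
    edges = np.arange(mz_min, mz_max + bin_width, bin_width)
    vec, _ = np.histogram(mz, bins=edges, weights=intensity)
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def cosine_similarity(a, b):
    return float(np.dot(a, b))   # vectors are already unit-normalized

if __name__ == "__main__":
    # hypothetical peak lists (m/z, intensity) for two isolates
    s1 = binned_vector(np.array([4365.0, 5096.0, 7274.0]), np.array([1.0, 0.6, 0.9]))
    s2 = binned_vector(np.array([4366.0, 5098.0, 9632.0]), np.array([0.9, 0.7, 0.5]))
    print(cosine_similarity(s1, s2))
```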

  20. High-throughput identification and screening of novel Methylobacterium species using whole-cell MALDI-TOF/MS analysis.

    Directory of Open Access Journals (Sweden)

    Akio Tani

    Full Text Available Methylobacterium species are ubiquitous α-proteobacteria that reside in the phyllosphere and are fed by methanol that is emitted from plants. In this study, we applied whole-cell matrix-assisted laser desorption/ionization time-of-flight mass spectrometry analysis (WC-MS) to evaluate the diversity of Methylobacterium species collected from a variety of plants. The WC-MS spectrum was reproducible through two weeks of cultivation on different media. WC-MS spectrum peaks of M. extorquens strain AM1 cells were attributed to ribosomal proteins, but peaks that were not attributed to ribosomal proteins were also found. We developed a simple method for rapid identification based on spectrum similarity. Using all available type strains of Methylobacterium species, the method provided a certain threshold similarity value for species-level discrimination, although the genus contains some type strains that could not be easily discriminated solely by 16S rRNA gene sequence similarity. Next, we evaluated the WC-MS data of approximately 200 methylotrophs isolated from various plants with MALDI Biotyper software (Bruker Daltonics). Isolates representing each cluster were further identified by 16S rRNA gene sequencing. In most cases, the identification by WC-MS matched that by sequencing, and isolates with unique spectra represented possible novel species. The strains belonging to M. extorquens, M. adhaesivum, M. marchantiae, M. komagatae, M. brachiatum, M. radiotolerans, and novel lineages close to M. adhaesivum, many of which were isolated from bryophytes, were found to be the most frequent phyllospheric colonizers. The WC-MS technique provides high throughput in the identification of known/novel species of bacteria, enabling the selection of novel species in a library and identification without 16S rRNA gene sequencing.

  1. MG-RAST version 4-lessons learned from a decade of low-budget ultra-high-throughput metagenome analysis.

    Science.gov (United States)

    Meyer, Folker; Bagchi, Saurabh; Chaterji, Somali; Gerlach, Wolfgang; Grama, Ananth; Harrison, Travis; Paczian, Tobias; Trimble, William L; Wilke, Andreas

    2017-09-26

    As technologies change, MG-RAST is adapting. Newly available software is being included to improve accuracy and performance. As a computational service constantly running large-volume scientific workflows, MG-RAST is the right location to perform benchmarking and implement algorithmic or platform improvements, in many cases involving trade-offs between specificity, sensitivity and run-time cost. The work in [Glass EM, Dribinsky Y, Yilmaz P, et al. ISME J 2014;8:1-3] is an example; we use existing well-studied data sets as gold standards representing different environments and different technologies to evaluate any changes to the pipeline. Currently, we use well-understood data sets in MG-RAST as a platform for benchmarking. The use of artificial data sets for pipeline performance optimization has not added value, as these data sets do not present the same challenges as real-world data sets. In addition, the MG-RAST team welcomes suggestions for improvements of the workflow. We are currently working on versions 4.02 and 4.1, both of which contain significant input from the community and our partners that will enable double barcoding, stronger inferences supported by longer-read technologies, and will increase throughput while maintaining sensitivity by using Diamond and SortMeRNA. On the technical platform side, the MG-RAST team intends to support the Common Workflow Language as a standard to specify bioinformatics workflows, both to facilitate development and efficient high-performance implementation of the community's data analysis tasks. Published by Oxford University Press on behalf of Entomological Society of America 2017. This work is written by US Government employees and is in the public domain in the US.

  2. Metabolomic and high-throughput sequencing analysis – modern approach for the assessment of biodeterioration of materials from historic buildings

    Directory of Open Access Journals (Sweden)

    Beata eGutarowska

    2015-09-01

    Full Text Available Preservation of cultural heritage is of paramount importance worldwide. Microbial colonization of construction materials, such as wood, brick, mortar and stone in historic buildings, can lead to severe deterioration. The aim of the present study was to give modern insight into the phylogenetic diversity and activated metabolic pathways of microbial communities colonizing historic objects located in the former Auschwitz II-Birkenau concentration and extermination camp in Oświęcim, Poland. For this purpose we combined molecular, microscopic and chemical methods. Selected specimens were examined using Field Emission Scanning Electron Microscopy (FESEM), metabolomic analysis and high-throughput Illumina sequencing. FESEM imaging revealed the presence of complex microbial communities comprising diatoms, fungi and bacteria, mainly cyanobacteria and actinobacteria, on sample surfaces. Microbial diversity of brick specimens appeared higher than that of the wood and was dominated by algae and cyanobacteria, while wood was mainly colonized by fungi. DNA sequences documented the presence of 15 bacterial phyla representing 99 genera including Halomonas, Halorhodospira, Salinisphaera, Salinibacterium, Rubrobacter, Streptomyces, Arthrobacter and 9 fungal classes represented by 113 genera including Cladosporium, Acremonium, Alternaria, Engyodontium, Penicillium, Rhizopus and Aureobasidium. Most of the identified sequences were characteristic of organisms implicated in deterioration of wood and brick. Metabolomic data indicated the activation of numerous metabolic pathways, including those regulating the production of primary and secondary metabolites, for example, metabolites associated with the production of antibiotics, organic acids and deterioration of organic compounds. The study demonstrated that a combination of electron microscopy imaging with metabolomic and genomic techniques allows linking the phylogenetic information and metabolic profiles of

  3. Foundations of data-intensive science: Technology and practice for high throughput, widely distributed, data management and analysis systems

    Science.gov (United States)

    Johnston, William; Ernst, M.; Dart, E.; Tierney, B.

    2014-04-01

    Today's large-scale science projects involve world-wide collaborations that depend on moving massive amounts of data from an instrument to potentially thousands of computing and storage systems at hundreds of collaborating institutions to accomplish their science. This is true for ATLAS and CMS at the LHC, and it is true for the climate sciences, Belle-II at the KEK collider, genome sciences, the SKA radio telescope, and ITER, the international fusion energy experiment. DOE's Office of Science has been collecting science discipline and instrument requirements for network-based data management and analysis for more than a decade. As a result, certain key issues are seen across essentially all science disciplines that rely on the network for significant data transfer, even if the data quantities are modest compared to projects like the LHC experiments. These issues are what this talk will address; to wit: 1. Optical signal transport advances enabling 100 Gb/s circuits that span the globe on optical fiber, with each fiber carrying 100 such channels; 2. Network router and switch requirements to support high-speed international data transfer; 3. Data transport (TCP is still the norm) requirements to support high-speed international data transfer (e.g. error-free transmission); 4. Network monitoring and testing techniques and infrastructure to maintain the required error-free operation of the many R&E networks involved in international collaborations; 5. Operating system evolution to support very high-speed network I/O; 6. New network architectures and services in the LAN (campus) and WAN networks to support data-intensive science; 7. Data movement and management techniques and software that can maximize the throughput on the network connections between distributed data handling systems; and 8. New approaches to widely distributed workflow systems that can support the data movement and analysis required by the science. All of these areas must be addressed to enable large

  4. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    Science.gov (United States)

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  5. Identification of microRNAs in the Toxigenic Dinoflagellate Alexandrium catenella by High-Throughput Illumina Sequencing and Bioinformatic Analysis.

    Directory of Open Access Journals (Sweden)

    Huili Geng

    Full Text Available Micro-ribonucleic acids (miRNAs) are a large group of endogenous, tiny, non-coding RNAs consisting of 19-25 nucleotides that regulate gene expression at either the transcriptional or post-transcriptional level by mediating gene silencing in eukaryotes. They are considered to be important regulators that affect growth, development, and response to various stresses in plants. Alexandrium catenella is an important marine toxic phytoplankton species that can cause harmful algal blooms (HABs). To date, identification and function analysis of miRNAs in A. catenella remain largely unexamined. In this study, high-throughput sequencing was performed on A. catenella to identify and quantitatively profile the repertoire of small RNAs from two different growth phases. A total of 38,092,056 and 32,969,156 raw reads were obtained from the two small RNA libraries, respectively. In total, 88 mature miRNAs belonging to 32 miRNA families were identified. Significant differences were found in the member number, expression level of various families, and expression abundance of each member within a family. A total of 15 potentially novel miRNAs were identified. Comparative profiling showed that 12 known miRNAs exhibited differential expression between the lag phase and the logarithmic phase. Real-time quantitative RT-PCR (qPCR) was performed to confirm the expression of two differentially expressed miRNAs, which were one up-regulated novel miRNA (aca-miR-3p-456915) and one down-regulated conserved miRNA (tae-miR159a). The expression trend of the qPCR assay was generally consistent with the deep sequencing result. Target predictions of the 12 differentially expressed miRNAs resulted in 1813 target genes. Gene ontology (GO) analysis and the Kyoto Encyclopedia of Genes and Genomes pathway database (KEGG) annotations revealed that some miRNAs were associated with growth and developmental processes of the alga. These results provide insights into the roles that miRNAs play in

  6. Dynamic analysis of slab track on multi-layered transversely isotropic saturated soils subjected to train loads

    Science.gov (United States)

    Zhan, Yongxiang; Yao, Hailin; Lu, Zheng; Yu, Dongming

    2014-12-01

    The dynamic responses of a slab track on transversely isotropic saturated soils subjected to moving train loads are investigated by a semi-analytical approach. The track model is described as an upper Euler beam to simulate the rails and a lower Euler beam to model the slab. Rail pads between the rails and slab are represented by a continuous layer of springs and dashpots. A series of point loads is formulated to describe the moving train loads. The governing equations of the track-ground system are solved using the double Fourier transform, and the dynamic responses in the time domain are obtained by the inverse Fourier transform. The results show that a train load moving at high velocity generates a larger response in transversely isotropic saturated soil than one moving at lower velocity, and special attention should be paid to the pore pressure in the vicinity of the ground surface. The anisotropic parameters of a surface soil layer have a greater influence on the displacement and excess pore water pressure than those of the subsoil layer. The traditional design method, which treats the ground soil as homogeneous isotropic soil, is unsafe for the case of RE < 1 and RG < 1, so a transversely isotropic foundation model is of great significance to the design for high train velocities.
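
    As a hedged illustration of the upper (rail) beam described above, a generic Euler-Bernoulli beam equation under a sequence of moving axle loads, coupled to the slab through a spring-dashpot rail-pad layer, is sketched below; the notation is assumed here, and the paper's exact formulation and its coupling to the transversely isotropic saturated layers may differ.

```latex
% Rail beam (bending stiffness E_r I_r, mass per unit length m_r) coupled to the
% slab deflection w_s through rail pads of stiffness k_p and damping c_p,
% loaded by N axle loads P_j moving at speed v from initial positions a_j.
\begin{equation}
E_r I_r \frac{\partial^4 w_r}{\partial x^4}
  + m_r \frac{\partial^2 w_r}{\partial t^2}
  + k_p\,(w_r - w_s)
  + c_p\left(\frac{\partial w_r}{\partial t} - \frac{\partial w_s}{\partial t}\right)
  = \sum_{j=1}^{N} P_j\,\delta\!\left(x - a_j - v t\right)
\end{equation}
```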

  7. IEEE 802.11e (EDCA analysis in the presence of hidden stations

    Directory of Open Access Journals (Sweden)

    Xijie Liu

    2011-07-01

    Full Text Available The key contribution of this paper is a combined analysis of both the saturated and the non-saturated throughput of IEEE 802.11e networks in the presence of hidden stations. This approach extends earlier works by other authors, which provided Markov chain analyses of the IEEE 802.11 family under various assumptions. Our approach also modifies earlier expressions for the probability that a station transmits a packet in a vulnerable period. The numerical results show the impact of the access categories on the channel throughput. Various throughput results under different mechanisms are presented.
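
    For orientation, Markov-chain treatments of this kind typically build on Bianchi's classical saturation-throughput expression for the 802.11 distributed coordination function; a sketch of that baseline formula (not the hidden-station extension derived in the paper) follows.

```latex
% n contending stations, each transmitting in a randomly chosen slot with
% probability tau (obtained from the backoff Markov chain).
\begin{align}
P_{tr} &= 1 - (1 - \tau)^{n}, &
P_{s}  &= \frac{n\,\tau\,(1 - \tau)^{n-1}}{P_{tr}}, \\
S &= \frac{P_{s}\,P_{tr}\,E[P]}
          {(1 - P_{tr})\,\sigma + P_{tr}\,P_{s}\,T_{s} + P_{tr}\,(1 - P_{s})\,T_{c}}
\end{align}
% E[P]: mean payload transmission time, sigma: empty-slot duration,
% T_s / T_c: durations of a successful transmission and of a collision.
```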

  8. Saturated Zone Colloid Transport

    Energy Technology Data Exchange (ETDEWEB)

    H. S. Viswanathan

    2004-10-07

    This scientific analysis provides retardation factors for colloids transporting in the saturated zone (SZ) and the unsaturated zone (UZ). These retardation factors represent the reversible chemical and physical filtration of colloids in the SZ. The value of the colloid retardation factor, R{sub col} is dependent on several factors, such as colloid size, colloid type, and geochemical conditions (e.g., pH, Eh, and ionic strength). These factors are folded into the distributions of R{sub col} that have been developed from field and experimental data collected under varying geochemical conditions with different colloid types and sizes. Attachment rate constants, k{sub att}, and detachment rate constants, k{sub det}, of colloids to the fracture surface have been measured for the fractured volcanics, and separate R{sub col} uncertainty distributions have been developed for attachment and detachment to clastic material and mineral grains in the alluvium. Radionuclides such as plutonium and americium sorb mostly (90 to 99 percent) irreversibly to colloids (BSC 2004 [DIRS 170025], Section 6.3.3.2). The colloid retardation factors developed in this analysis are needed to simulate the transport of radionuclides that are irreversibly sorbed onto colloids; this transport is discussed in the model report ''Site-Scale Saturated Zone Transport'' (BSC 2004 [DIRS 170036]). Although it is not exclusive to any particular radionuclide release scenario, this scientific analysis especially addresses those scenarios pertaining to evidence from waste-degradation experiments, which indicate that plutonium and americium may be irreversibly attached to colloids for the time scales of interest. A section of this report will also discuss the validity of using microspheres as analogs to colloids in some of the lab and field experiments used to obtain the colloid retardation factors. In addition, a small fraction of colloids travels with the groundwater without any significant
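
    As a hedged illustration of how the attachment and detachment rate constants mentioned above enter a retardation factor for reversible colloid filtration, the commonly used form is sketched below; the report's R{sub col} distributions are ultimately built from field and laboratory estimates rather than from this single expression.

```latex
% Retardation of colloids undergoing reversible first-order attachment (k_att)
% and detachment (k_det) during transport through fractures or alluvium.
\begin{equation}
R_{\mathrm{col}} \;=\; 1 + \frac{k_{\mathrm{att}}}{k_{\mathrm{det}}}
\end{equation}
```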

  9. Healthcare Costs Associated with an Adequate Intake of Sugars, Salt and Saturated Fat in Germany: A Health Econometrical Analysis.

    Directory of Open Access Journals (Sweden)

    Toni Meier

    Full Text Available Non-communicable diseases (NCDs) represent not only the major driver for quality-restricted and lost life years; NCDs and their related medical treatment costs also pose a substantial economic burden on healthcare and intra-generational tax distribution systems. The main objective of this study was therefore to quantify the economic burden of unbalanced nutrition in Germany--in particular the effects of an excessive consumption of fat, salt and sugar--and to examine different reduction scenarios on this basis. In this study, the avoidable direct cost savings in the German healthcare system attributable to an adequate intake of saturated fatty acids (SFA), salt and sugar (mono- & disaccharides, MDS) were calculated. To this end, disease-specific healthcare cost data from the official Federal Health Monitoring for the years 2002-2008 and disease-related risk factors, obtained by thoroughly searching the literature, were used. A total of 22 clinical endpoints with 48 risk-outcome pairs were considered. Direct healthcare costs attributable to an unbalanced intake of fat, salt and sugar are calculated to be 16.8 billion EUR (CI95%: 6.3-24.1 billion EUR) in the year 2008, which represents 7% (CI95%: 2%-10%) of the total treatment costs in Germany (254 billion EUR). This is equal to 205 EUR per person annually. The excessive consumption of sugar poses the highest burden, at 8.6 billion EUR (CI95%: 3.0-12.1); salt ranks 2nd at 5.3 billion EUR (CI95%: 3.2-7.3) and saturated fat ranks 3rd at 2.9 billion EUR (CI95%: 32 million-4.7 billion). Predicted direct healthcare cost savings by means of a balanced intake of sugars, salt and saturated fat are substantial. However, as this study solely considered direct medical treatment costs regarding an adequate consumption of fat, salt and sugars, the actual societal and economic gains, resulting both from direct and indirect cost savings, may easily exceed 16.8 billion EUR.

  10. Healthcare Costs Associated with an Adequate Intake of Sugars, Salt and Saturated Fat in Germany: A Health Econometrical Analysis.

    Science.gov (United States)

    Meier, Toni; Senftleben, Karolin; Deumelandt, Peter; Christen, Olaf; Riedel, Katja; Langer, Martin

    2015-01-01

    Non-communicable diseases (NCDs) represent not only the major driver for quality-restricted and lost life years; NCDs and their related medical treatment costs also pose a substantial economic burden on healthcare and intra-generational tax distribution systems. The main objective of this study was therefore to quantify the economic burden of unbalanced nutrition in Germany--in particular the effects of an excessive consumption of fat, salt and sugar--and to examine different reduction scenarios on this basis. In this study, the avoidable direct cost savings in the German healthcare system attributable to an adequate intake of saturated fatty acids (SFA), salt and sugar (mono- & disaccharides, MDS) were calculated. To this end, disease-specific healthcare cost data from the official Federal Health Monitoring for the years 2002-2008 and disease-related risk factors, obtained by thoroughly searching the literature, were used. A total of 22 clinical endpoints with 48 risk-outcome pairs were considered. Direct healthcare costs attributable to an unbalanced intake of fat, salt and sugar are calculated to be 16.8 billion EUR (CI95%: 6.3-24.1 billion EUR) in the year 2008, which represents 7% (CI95% 2%-10%) of the total treatment costs in Germany (254 billion EUR). This is equal to 205 EUR per person annually. The excessive consumption of sugar poses the highest burden, at 8.6 billion EUR (CI95%: 3.0-12.1); salt ranks 2nd at 5.3 billion EUR (CI95%: 3.2-7.3) and saturated fat ranks 3rd at 2.9 billion EUR (CI95%: 32 million-4.7 billion). Predicted direct healthcare cost savings by means of a balanced intake of sugars, salt and saturated fat are substantial. However, as this study solely considered direct medical treatment costs regarding an adequate consumption of fat, salt and sugars, the actual societal and economic gains, resulting both from direct and indirect cost savings, may easily exceed 16.8 billion EUR.
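
    A minimal Python sketch of the kind of attributable-cost calculation such comparative risk assessments rest on, using the population attributable fraction; the prevalence, relative risk and cost figures below are placeholders, not values from the study, and the authors' exact model may differ.

```python
def population_attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Levin's formula: share of a disease burden attributable to an exposure."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)


def attributable_cost(prevalence: float, relative_risk: float, treatment_cost: float) -> float:
    """Direct treatment cost attributable to the exposure for one risk-outcome pair."""
    return population_attributable_fraction(prevalence, relative_risk) * treatment_cost


if __name__ == "__main__":
    # Hypothetical risk-outcome pair: 30% of the population exceeds the sugar
    # threshold, relative risk 1.3, annual treatment cost 2.0 billion EUR.
    paf = population_attributable_fraction(0.30, 1.3)
    print(f"PAF = {paf:.3f}")
    print(f"attributable cost = {attributable_cost(0.30, 1.3, 2.0e9) / 1e9:.2f} billion EUR")
```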

  11. Leveraging the power of high performance computing for next generation sequencing data analysis: tricks and twists from a high throughput exome workflow.

    Directory of Open Access Journals (Sweden)

    Amit Kawalia

    Full Text Available Next generation sequencing (NGS has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files.

  12. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    Science.gov (United States)

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  13. Leveraging the power of high performance computing for next generation sequencing data analysis: tricks and twists from a high throughput exome workflow.

    Science.gov (United States)

    Kawalia, Amit; Motameny, Susanne; Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files.

  14. Metagenomic analysis of taxa associated with Lutzomyia longipalpis, vector of visceral leishmaniasis, using an unbiased high-throughput approach.

    Directory of Open Access Journals (Sweden)

    Christina B McCarthy

    2011-09-01

    Full Text Available BACKGROUND: Leishmaniasis is one of the most diverse and complex of all vector-borne diseases worldwide. It is caused by parasites of the genus Leishmania, obligate intramacrophage protists characterised by diversity and complexity. Its most severe form is visceral leishmaniasis (VL), a systemic disease that is fatal if left untreated. In Latin America VL is caused by Leishmania infantum chagasi and transmitted by Lutzomyia longipalpis. This phlebotomine sandfly is only found in the New World, from Mexico to Argentina. In South America, migration and urbanisation have largely contributed to the increase of VL as a public health problem. Moreover, the first VL outbreak was recently reported in Argentina, which has already caused 7 deaths and 83 reported cases. METHODOLOGY/PRINCIPAL FINDINGS: An inventory of the microbiota associated with insect vectors, especially of wild specimens, would aid in the development of novel strategies for controlling insect vectors. Given the recent VL outbreak in Argentina and the compelling need to develop appropriate control strategies, this study focused on wild male and female Lu. longipalpis from an Argentine endemic (Posadas, Misiones) and a Brazilian non-endemic (Lapinha Cave, Minas Gerais) VL location. Previous studies on wild and laboratory reared female Lu. longipalpis have described gut bacteria using standard bacteriological methods. In this study, total RNA was extracted from the insects and submitted to high-throughput pyrosequencing. The analysis revealed the presence of sequences from bacteria, fungi, protist parasites, plants and metazoans. CONCLUSIONS/SIGNIFICANCE: This is the first time an unbiased and comprehensive metagenomic approach has been used to survey taxa associated with an infectious disease vector. The identification of gregarines suggested they are a possible efficient control method under natural conditions. Ongoing studies are determining the significance of the associated taxa found

  15. A simple dual online ultra-high pressure liquid chromatography system (sDO-UHPLC) for high throughput proteome analysis.

    Science.gov (United States)

    Lee, Hangyeore; Mun, Dong-Gi; Bae, Jingi; Kim, Hokeun; Oh, Se Yeon; Park, Young Soo; Lee, Jae-Hyuk; Lee, Sang-Won

    2015-08-21

    We report a new and simple design of a fully automated dual-online ultra-high pressure liquid chromatography system. The system employs only two nano-volume switching valves (a two-position four port valve and a two-position ten port valve) that direct solvent flows from two binary nano-pumps for parallel operation of two analytical columns and two solid phase extraction (SPE) columns. Despite the simple design, the sDO-UHPLC offers many advantageous features that include high duty cycle, back flushing sample injection for fast and narrow zone sample injection, online desalting, high separation resolution and high intra/inter-column reproducibility. This system was applied to analyze proteome samples not only in high throughput deep proteome profiling experiments but also in high throughput MRM experiments.

  16. Reconstruction of metabolic networks from high-throughput metabolite profiling data: in silico analysis of red blood cell metabolism

    OpenAIRE

    Nemenman, Ilya; Escola, G. Sean; Hlavacek, William S.; Unkefer, Pat J.; Unkefer, Clifford J.; Wall, Michael E.

    2007-01-01

    We investigate the ability of algorithms developed for reverse engineering of transcriptional regulatory networks to reconstruct metabolic networks from high-throughput metabolite profiling data. For this, we generate synthetic metabolic profiles for benchmarking purposes based on a well-established model for red blood cell metabolism. A variety of data sets is generated, accounting for different properties of real metabolic networks, such as experimental noise, metabolite correlations, and t...

  17. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection.

  18. MAPPI-DAT: data management and analysis for protein-protein interaction data from the high-throughput MAPPIT cell microarray platform.

    Science.gov (United States)

    Gupta, Surya; De Puysseleyr, Veronic; Van der Heyden, José; Maddelein, Davy; Lemmens, Irma; Lievens, Sam; Degroeve, Sven; Tavernier, Jan; Martens, Lennart

    2017-05-01

    Protein-protein interaction (PPI) studies have dramatically expanded our knowledge about cellular behaviour and development in different conditions. A multitude of high-throughput PPI techniques have been developed to achieve proteome-scale coverage for PPI studies, including the microarray based Mammalian Protein-Protein Interaction Trap (MAPPIT) system. Because such high-throughput techniques typically report thousands of interactions, managing and analysing the large amounts of acquired data is a challenge. We have therefore built the MAPPIT cell microArray Protein Protein Interaction-Data management & Analysis Tool (MAPPI-DAT) as an automated data management and analysis tool for MAPPIT cell microarray experiments. MAPPI-DAT stores the experimental data and metadata in a systematic and structured way, automates data analysis and interpretation, and enables the meta-analysis of MAPPIT cell microarray data across all stored experiments. MAPPI-DAT is developed in Python, using R for data analysis and MySQL as the data management system. MAPPI-DAT is cross-platform and can be run on Microsoft Windows, Linux and OS X/macOS. The source code and a Microsoft Windows executable are freely available under the permissive Apache2 open source license at https://github.com/compomics/MAPPI-DAT. jan.tavernier@vib-ugent.be or lennart.martens@vib-ugent.be. Supplementary data are available at Bioinformatics online.

  19. Analysis of nuclear organization with TANGO, software for high-throughput quantitative analysis of 3D fluorescence microscopy images.

    Science.gov (United States)

    Ollion, Jean; Cochennec, Julien; Loll, François; Escudé, Christophe; Boudier, Thomas

    2015-01-01

    The cell nucleus is a highly organized cellular organelle that contains the genome. An important step towards understanding the relationships between genome positioning and genome functions is to extract quantitative data from three-dimensional (3D) fluorescence imaging. However, such approaches are limited by the requirement for processing and analyzing large sets of images. Here we present a practical approach using TANGO (Tools for Analysis of Nuclear Genome Organization), an image analysis tool dedicated to the study of nuclear architecture. TANGO is a generic tool able to process large sets of images, allowing quantitative study of nuclear organization. In this chapter a practical description of the software is given in order to provide an overview of its different concepts and functionalities. This description is illustrated with a precise example that can be performed step-by-step on experimental data provided on the website http://biophysique.mnhn.fr/tango/HomePage.

  20. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, the author introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, the author demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up can also screen solid catalysts via near infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study single cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, the author identified individual peaks in the fluorescence electropherograms as serotonin released from the granular core on contact with the surrounding fluid.

  1. Product Chemistry and Process Efficiency of Biomass Torrefaction, Pyrolysis and Gasification Studied by High-Throughput Techniques and Multivariate Analysis

    Science.gov (United States)

    Xiao, Li

    Despite great enthusiasm and sustained efforts on the development of renewable energy from biomass, the commercialization and scale-up of biofuel production remain under pressure and face challenges. New ideas and facilities are being tested around the world with the aim of reducing cost and improving product value. Cutting-edge technologies involving analytical chemistry, statistical analysis, industrial engineering, computer simulation, and mathematical modeling continue to integrate modern elements into this classic research area. One of the challenges of commercializing biofuel production is the complexity of the chemical composition of biomass feedstocks and their products. Because of this, feedstock selection and process optimization cannot be conducted efficiently. This dissertation further evaluates the biomass thermal decomposition process using both traditional methods and an advanced technique (Pyrolysis Molecular Beam Mass Spectrometry). Focus has been placed on generating a database of thermal decomposition products from biomass at different temperatures, establishing the relationship between traditional methods and advanced techniques, evaluating process efficiency and optimizing reaction conditions, and comparing typically utilized biomass feedstocks with innovative species for economically viable feedstock preparation concepts. Lab-scale quartz tube reactors and 80 μl stainless steel sample cups coupled with an auto-sampling system were utilized to simulate the complicated reactions that occur in real fluidized or entrained flow reactors. The two main high-throughput analytical techniques used are Near Infrared Spectroscopy (NIR) and Pyrolysis Molecular Beam Mass Spectrometry (Py-MBMS). Mass balance, carbon balance, and product distribution are presented in detail. Thermal decomposition temperatures range from 200°C to 950°C. Feedstocks used in the study include typical hardwood and softwood (red oak, white oak, yellow poplar, loblolly pine

  2. Metagenomic analysis and functional characterization of the biogas microbiome using high throughput shotgun sequencing and a novel binning strategy

    DEFF Research Database (Denmark)

    Campanaro, Stefano; Treu, Laura; Kougias, Panagiotis

    2016-01-01

    Biogas production is an economically attractive technology that has gained momentum worldwide over the past years. Biogas is produced by a biologically mediated process, widely known as "anaerobic digestion." This process is performed by a specialized and complex microbial community, in which...... dissect the bioma involved in anaerobic digestion by means of high throughput Illumina sequencing (~51 gigabases of sequence data), disclosing nearly one million genes and extracting 106 microbial genomes by a novel strategy combining two binning processes. Microbial phylogeny and putative taxonomy...

  3. Multiplexed ChIP-Seq Using Direct Nucleosome Barcoding: A Tool for High-Throughput Chromatin Analysis.

    Science.gov (United States)

    Chabbert, Christophe D; Adjalley, Sophie H; Steinmetz, Lars M; Pelechano, Vicent

    2018-01-01

    Chromatin immunoprecipitation followed by sequencing (ChIP-Seq) or microarray hybridization (ChIP-on-chip) are standard methods for the study of transcription factor binding sites and histone chemical modifications. However, these approaches only allow profiling of a single factor or protein modification at a time. In this chapter, we present Bar-ChIP, a higher throughput version of ChIP-Seq that relies on the direct ligation of molecular barcodes to chromatin fragments. Bar-ChIP enables the concurrent profiling of multiple DNA-protein interactions and is therefore amenable to experimental scale-up, without the need for any robotic instrumentation.

  4. The transport behaviour of elemental mercury DNAPL in saturated porous media: analysis of field observations and two-phase flow modelling.

    Science.gov (United States)

    Sweijen, Thomas; Hartog, Niels; Marsman, Annemieke; Keijzer, Thomas J S

    2014-06-01

    Mercury is a contaminant of global concern. The use of elemental mercury in various (former) industrial processes, such as chlorine production at chlor-alkali plants, is known to have resulted in soil and groundwater contaminations worldwide. However, the subsurface transport behaviour of elemental mercury as an immiscible dense non-aqueous phase liquid (DNAPL) in porous media has received minimal attention to date, even though such insight would aid in the remediation effort at mercury-contaminated sites. Therefore, in this study a detailed field characterization of the elemental mercury DNAPL distribution with depth was performed together with two-phase flow modelling, using STOMP, to evaluate the dynamics of mercury DNAPL migration and the controls on its distribution in saturated porous media. Using a CPT-probe mounted with a digital camera, the in-situ mercury DNAPL depth distribution was obtained at a former chlor-alkali plant, down to 9 m below ground surface. Images revealing the presence of silvery mercury DNAPL droplets were used to quantify its distribution, characteristics and saturation, using an image analysis method. These field observations with depth were compared with results from a one-dimensional two-phase flow model simulation for the same transect. Considering the limitations of this approach, the simulations reasonably reflected the variability and range of the mercury DNAPL distribution. To further explore the impact of mercury's physical properties in comparison with more common DNAPLs, the migration of mercury and PCE DNAPL in several typical hydrological scenarios was simulated. Comparison of the simulations suggests that mercury's higher density is the dominant factor controlling its penetration in saturated porous media, despite its higher resistance to flow due to its higher viscosity. Based on these results, the hazard of spilled mercury DNAPL causing deep contamination of groundwater systems seems larger than for any other

  5. The transport behaviour of elemental mercury DNAPL in saturated porous media: Analysis of field observations and two-phase flow modelling

    Science.gov (United States)

    Sweijen, Thomas; Hartog, Niels; Marsman, Annemieke; Keijzer, Thomas J. S.

    2014-06-01

    Mercury is a contaminant of global concern. The use of elemental mercury in various (former) industrial processes, such as chlorine production at chlor-alkali plants, is known to have resulted in soil and groundwater contaminations worldwide. However, the subsurface transport behaviour of elemental mercury as an immiscible dense non-aqueous phase liquid (DNAPL) in porous media has received minimal attention to date, even though such insight would aid in the remediation effort at mercury-contaminated sites. Therefore, in this study a detailed field characterization of the elemental mercury DNAPL distribution with depth was performed together with two-phase flow modelling, using STOMP, to evaluate the dynamics of mercury DNAPL migration and the controls on its distribution in saturated porous media. Using a CPT-probe mounted with a digital camera, the in-situ mercury DNAPL depth distribution was obtained at a former chlor-alkali plant, down to 9 m below ground surface. Images revealing the presence of silvery mercury DNAPL droplets were used to quantify its distribution, characteristics and saturation, using an image analysis method. These field observations with depth were compared with results from a one-dimensional two-phase flow model simulation for the same transect. Considering the limitations of this approach, the simulations reasonably reflected the variability and range of the mercury DNAPL distribution. To further explore the impact of mercury's physical properties in comparison with more common DNAPLs, the migration of mercury and PCE DNAPL in several typical hydrological scenarios was simulated. Comparison of the simulations suggests that mercury's higher density is the dominant factor controlling its penetration in saturated porous media, despite its higher resistance to flow due to its higher viscosity. Based on these results, the hazard of spilled mercury DNAPL causing deep contamination of groundwater systems seems larger than for any other
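
    For context, a hedged sketch of the generic two-phase (water-DNAPL) Darcy flow equations that simulators such as STOMP solve is given below; the notation is assumed and the constitutive relations actually used in the study may differ.

```latex
% Mass balance and extended Darcy law for each phase alpha in {w (water), n (DNAPL)}:
\begin{align}
\frac{\partial}{\partial t}\bigl(\phi\,\rho_\alpha S_\alpha\bigr)
  + \nabla\!\cdot\!\bigl(\rho_\alpha \mathbf{q}_\alpha\bigr) &= 0, &
\mathbf{q}_\alpha &= -\,\frac{k_{r\alpha}(S_\alpha)}{\mu_\alpha}\,\mathbf{K}
  \bigl(\nabla p_\alpha - \rho_\alpha\,\mathbf{g}\bigr),
\end{align}
% closed by S_w + S_n = 1 and a capillary pressure relation p_n - p_w = p_c(S_w).
% Mercury's very high density enters through the gravity term, and its higher
% viscosity through mu_n in the phase mobility k_r / mu.
```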

  6. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    Science.gov (United States)

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  7. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  8. Analysis of the laminar Newtonian fluid flow through a thin fracture modelled as a fluid-saturated sparsely packed porous medium

    Energy Technology Data Exchange (ETDEWEB)

    Pazanin, Igor [Zagreb Univ. (Croatia). Dept. of Mathematics; Siddheshwar, Pradeep G. [Bangalore Univ., Bengaluru (India). Dept. of Mathematics

    2017-06-01

    In this article we investigate the fluid flow through a thin fracture modelled as a fluid-saturated porous medium. We assume that the fracture has constrictions and that the flow is governed by the prescribed pressure drop between the edges of the fracture. The problem is described by the Darcy-Lapwood-Brinkman model acknowledging the Brinkman extension of the Darcy law as well as the flow inertia. Using asymptotic analysis with respect to the thickness of the fracture, we derive the explicit higher-order approximation for the velocity distribution. We make an error analysis to comment on the order of accuracy of the method used and also to provide rigorous justification for the model.
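
    For reference, a commonly quoted form of the Darcy-Lapwood-Brinkman momentum balance that such an asymptotic analysis starts from is sketched below; the notation is assumed here and the paper's exact non-dimensionalised statement may differ.

```latex
% Steady Darcy-Lapwood-Brinkman flow in a fluid-saturated, sparsely packed
% porous medium: Lapwood's convective inertia term, the Darcy drag and the
% Brinkman viscous correction all appear, together with incompressibility.
\begin{equation}
\frac{\rho}{\varphi^{2}}\,(\mathbf{u}\cdot\nabla)\mathbf{u}
  \;=\; -\,\nabla p \;-\; \frac{\mu}{K}\,\mathbf{u} \;+\; \tilde{\mu}\,\nabla^{2}\mathbf{u},
\qquad \nabla\cdot\mathbf{u}=0
\end{equation}
% u: filtration velocity, phi: porosity, K: permeability,
% mu: fluid viscosity, mu~: effective (Brinkman) viscosity.
```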

  9. High-throughput two-dimensional root system phenotyping platform facilitates genetic analysis of root growth and development.

    Science.gov (United States)

    Clark, Randy T; Famoso, Adam N; Zhao, Keyan; Shaff, Jon E; Craft, Eric J; Bustamante, Carlos D; McCouch, Susan R; Aneshansley, Daniel J; Kochian, Leon V

    2013-02-01

    High-throughput phenotyping of root systems requires a combination of specialized techniques and adaptable plant growth, root imaging and software tools. A custom phenotyping platform was designed to capture images of whole root systems, and novel software tools were developed to process and analyse these images. The platform and its components are adaptable to a wide range of root phenotyping studies using diverse growth systems (hydroponics, paper pouches, gel and soil) involving several plant species, including, but not limited to, rice, maize, sorghum, tomato and Arabidopsis. The RootReader2D software tool is free and publicly available and was designed with both user-guided and automated features that increase flexibility and enhance efficiency when measuring root growth traits from specific roots or entire root systems during large-scale phenotyping studies. To demonstrate the unique capabilities and high-throughput capacity of this phenotyping platform for studying root systems, genome-wide association studies on rice (Oryza sativa) and maize (Zea mays) root growth were performed, and root traits related to aluminium (Al) tolerance were analysed in the parents of the maize nested association mapping (NAM) population. © 2012 Blackwell Publishing Ltd.

  10. Human Genome Sequencing at the Population Scale: A Primer on High-Throughput DNA Sequencing and Analysis.

    Science.gov (United States)

    Goldfeder, Rachel L; Wall, Dennis P; Khoury, Muin J; Ioannidis, John P A; Ashley, Euan A

    2017-10-15

    Most human diseases have underlying genetic causes. To better understand the impact of genes on disease and its implications for medicine and public health, researchers have pursued methods for determining the sequences of individual genes, then all genes, and now complete human genomes. Massively parallel high-throughput sequencing technology, where DNA is sheared into smaller pieces, sequenced, and then computationally reordered and analyzed, enables fast and affordable sequencing of full human genomes. As the price of sequencing continues to decline, more and more individuals are having their genomes sequenced. This may facilitate better population-level disease subtyping and characterization, as well as individual-level diagnosis and personalized treatment and prevention plans. In this review, we describe several massively parallel high-throughput DNA sequencing technologies and their associated strengths, limitations, and error modes, with a focus on applications in epidemiologic research and precision medicine. We detail the methods used to computationally process and interpret sequence data to inform medical or preventative action. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Full Text Available Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next generation, high throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard polymerase chain reactions (qPCR) and high throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments--regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency corrected analysis. When compared to other commonly used algorithms the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
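
    A minimal Python sketch of an efficiency-corrected, input-quantity-normalised expression estimate of the sort described above; the function and variable names are illustrative assumptions and the published algorithm may differ in its details.

```python
def relative_expression(cq_sample: float, n_cells: float,
                        cq_reference: float, efficiency: float = 1.0) -> float:
    """Efficiency-corrected qPCR expression per unit of input sample.

    cq_sample:    quantification cycle of the target assay in the sample
    n_cells:      input quantity (e.g. number of cells in the sample)
    cq_reference: Cq of the same assay on a universal reference cDNA, used to
                  normalise results between batches and instruments
    efficiency:   amplification efficiency E (1.0 = perfect doubling per cycle)
    """
    amplification_base = 1.0 + efficiency
    target = amplification_base ** (-cq_sample)        # proportional to input copies
    reference = amplification_base ** (-cq_reference)  # same assay on reference cDNA
    return (target / reference) / n_cells


# Hypothetical example: Cq 24.1 measured from 10,000 cells,
# reference cDNA Cq 20.0, amplification efficiency 0.95.
print(relative_expression(24.1, 10_000, 20.0, efficiency=0.95))
```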

  12. Dissecting the Phenotypic Components of Crop Plant Growth and Drought Responses Based on High-Throughput Image Analysis

    Science.gov (United States)

    Chen, Dijun; Neumann, Kerstin; Friedel, Swetlana; Kilian, Benjamin; Chen, Ming; Altmann, Thomas; Klukas, Christian

    2014-01-01

    Significantly improved crop varieties are urgently needed to feed the rapidly growing human population under changing climates. While genome sequence information and excellent genomic tools are in place for major crop species, the systematic quantification of phenotypic traits or components thereof in a high-throughput fashion remains an enormous challenge. In order to help bridge the genotype to phenotype gap, we developed a comprehensive framework for high-throughput phenotype data analysis in plants, which enables the extraction of an extensive list of phenotypic traits from nondestructive plant imaging over time. As a proof of concept, we investigated the phenotypic components of the drought responses of 18 different barley (Hordeum vulgare) cultivars during vegetative growth. We analyzed dynamic properties of trait expression over growth time based on 54 representative phenotypic features. The data are highly valuable to understand plant development and to further quantify growth and crop performance features. We tested various growth models to predict plant biomass accumulation and identified several relevant parameters that support biological interpretation of plant growth and stress tolerance. These image-based traits and model-derived parameters are promising for subsequent genetic mapping to uncover the genetic basis of complex agronomic traits. Taken together, we anticipate that the analytical framework and analysis results presented here will be useful to advance our views of phenotypic trait components underlying plant development and their responses to environmental cues. PMID:25501589
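
    As one example of the growth-model fitting mentioned above, a short Python sketch fitting a logistic curve to image-derived biomass estimates; the data points below are placeholders and the authors evaluated several alternative models as well.

```python
import numpy as np
from scipy.optimize import curve_fit


def logistic(t, k, b0, r):
    """Logistic growth: carrying capacity k, initial biomass b0, growth rate r."""
    return k / (1.0 + (k / b0 - 1.0) * np.exp(-r * t))


# Placeholder time series: days after sowing vs. projected shoot area (a biomass proxy)
days = np.array([10, 14, 18, 22, 26, 30, 34, 38], dtype=float)
area = np.array([1.2, 2.9, 6.5, 13.0, 21.5, 28.0, 31.0, 32.0])

params, _ = curve_fit(logistic, days, area, p0=[35.0, 1.0, 0.2], maxfev=10_000)
k, b0, r = params
print(f"carrying capacity ~{k:.1f}, initial size ~{b0:.2f}, growth rate ~{r:.3f} / day")
```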

  13. High-Throughput Image Analysis of Fibrillar Materials: A Case Study on Polymer Nanofiber Packing, Alignment, and Defects in Organic Field Effect Transistors.

    Science.gov (United States)

    Persson, Nils E; Rafshoon, Joshua; Naghshpour, Kaylie; Fast, Tony; Chu, Ping-Hsun; McBride, Michael; Risteen, Bailey; Grover, Martha; Reichmanis, Elsa

    2017-10-18

    High-throughput discovery of process-structure-property relationships in materials through an informatics-enabled empirical approach is an increasingly utilized technique in materials research due to the rapidly expanding availability of data. Here, process-structure-property relationships are extracted for the nucleation, growth, and deposition of semiconducting poly(3-hexylthiophene) (P3HT) nanofibers used in organic field effect transistors, via high-throughput image analysis. This study is performed using an automated image analysis pipeline combining existing open-source software and new algorithms, enabling the rapid evaluation of structural metrics for images of fibrillar materials, including local orientational order, fiber length density, and fiber length distributions. We observe that microfluidic processing leads to fibers that pack with unusually high density, while sonication yields fibers that pack sparsely with low alignment. This is attributed to differences in their crystallization mechanisms. P3HT nanofiber packing during thin film deposition exhibits behavior suggesting that fibers are confined to packing in two-dimensional layers. We find that fiber alignment, a feature correlated with charge carrier mobility, is driven by increasing fiber length, and that shorter fibers tend to segregate to the buried dielectric interface during deposition, creating potentially performance-limiting defects in alignment. Another barrier to perfect alignment is the curvature of P3HT fibers; we propose a mechanistic simulation of fiber growth that reconciles both this curvature and the log-normal distribution of fiber lengths inherent to the fiber populations under consideration.
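
    A hedged Python sketch of the 2D orientational order metric that underpins fiber-alignment analyses like the one described; the exact estimator used in the study's pipeline may differ.

```python
import numpy as np


def orientational_order(angles_rad: np.ndarray) -> float:
    """2D nematic order parameter S in [0, 1] for a set of fiber orientations.

    Fiber orientations are head-tail symmetric, so angles are doubled before
    averaging; S = 1 means perfectly aligned fibers, S = 0 a random network.
    """
    c = np.cos(2.0 * angles_rad).mean()
    s = np.sin(2.0 * angles_rad).mean()
    return float(np.hypot(c, s))


# Example: a tightly aligned population vs. a random one (angles in radians)
rng = np.random.default_rng(0)
aligned = rng.normal(loc=0.3, scale=0.05, size=1000)
random_fibers = rng.uniform(0.0, np.pi, size=1000)
print(orientational_order(aligned))        # close to 1
print(orientational_order(random_fibers))  # close to 0
```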

  14. The application of a high throughput analysis method for the screening of potential biosurfactants from natural sources.

    Science.gov (United States)

    Chen, Chien-Yen; Baker, Simon C; Darton, Richard C

    2007-09-01

    The presence of biosurfactants in growth media can be evaluated by a variety of methods, none of which are suitable for high throughput studies. The method described here is based on the effect of meniscus shape on the image of a grid viewed through the wells of a 96-well plate. The efficacy of the method was demonstrated by the selection of a bacterium (producing a biosurfactant able to reduce the surface tension of pure water from 72 to 28.75 mN m(-1)) from a culture collection isolated from aviation fuel-contaminated land. The assay was found to be more sensitive, rapid and easy to perform than other published methods. It does not need specialised equipment or chemicals and excludes the bias which results from the surfactant properties of medium used for bacterial growth.

  15. Reconstruction of metabolic networks from high-throughput metabolite profiling data: in silico analysis of red blood cell metabolism.

    Science.gov (United States)

    Nemenman, Ilya; Escola, G Sean; Hlavacek, William S; Unkefer, Pat J; Unkefer, Clifford J; Wall, Michael E

    2007-12-01

    We investigate the ability of algorithms developed for reverse engineering of transcriptional regulatory networks to reconstruct metabolic networks from high-throughput metabolite profiling data. For benchmarking purposes, we generate synthetic metabolic profiles based on a well-established model for red blood cell metabolism. A variety of data sets are generated, accounting for different properties of real metabolic networks, such as experimental noise, metabolite correlations, and temporal dynamics. These data sets are made available online. We use ARACNE, a mainstream algorithm for reverse engineering of transcriptional regulatory networks from gene expression data, to predict metabolic interactions from these data sets. We find that the performance of ARACNE on metabolic data is comparable to that on gene expression data.
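
    A compact Python sketch of the core idea behind ARACNE-style network reconstruction (pairwise mutual information followed by data-processing-inequality pruning); this is a simplified illustration, not the ARACNE implementation used in the paper.

```python
import numpy as np
from itertools import combinations


def mutual_information(x, y, bins=8):
    """Histogram estimate of mutual information between two metabolite profiles."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))


def aracne_like(profiles, eps=0.0, bins=8):
    """profiles: array of shape (n_metabolites, n_samples).

    Returns an adjacency matrix of MI values after removing, in every triplet,
    the weakest edge (data processing inequality) when it falls below the others.
    """
    n = profiles.shape[0]
    mi = np.zeros((n, n))
    for i, j in combinations(range(n), 2):
        mi[i, j] = mi[j, i] = mutual_information(profiles[i], profiles[j], bins)
    adj = mi.copy()
    for i, j, k in combinations(range(n), 3):
        edges = {(i, j): mi[i, j], (j, k): mi[j, k], (i, k): mi[i, k]}
        (a, b), weakest = min(edges.items(), key=lambda e: e[1])
        others = [v for key, v in edges.items() if key != (a, b)]
        if weakest < min(others) - eps:   # weakest edge treated as indirect
            adj[a, b] = adj[b, a] = 0.0
    return adj


# Hypothetical use: 20 metabolites measured across 200 perturbation samples
data = np.random.default_rng(1).normal(size=(20, 200))
network = aracne_like(data)
```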

  16. Leishmania genome analysis and high-throughput immunological screening identifies tuzin as a novel vaccine candidate against visceral leishmaniasis.

    Science.gov (United States)

    Lakshmi, Bhavana Sethu; Wang, Ruobing; Madhubala, Rentala

    2014-06-24

    Leishmaniasis is a neglected tropical disease caused by Leishmania species. It is a major health concern affecting 88 countries and threatening 350 million people globally. Unfortunately, there are no vaccines, and there are limitations associated with the current therapeutic regimens for leishmaniasis. The emerging cases of drug resistance further aggravate the situation, demanding rapid drug and vaccine development. The genome sequence of Leishmania provides access to novel genes that hold potential as chemotherapeutic targets or vaccine candidates. In this study, we selected 19 antigenic genes from about 8000 common Leishmania genes based on the Leishmania major and Leishmania infantum genome information available in the pathogen databases. Potential vaccine candidates thus identified were screened using an in vitro high throughput immunological platform developed in the laboratory. Four candidate genes coding for tuzin, flagellar glycoprotein-like protein (FGP), phospholipase A1-like protein (PLA1) and potassium voltage-gated channel protein (K VOLT) showed a predominant protective Th1 response over a disease-exacerbating Th2 response. We report the immunogenic properties and protective efficacy of one of the four antigens, tuzin, as a DNA vaccine against Leishmania donovani challenge. Our results show that administration of tuzin DNA protected BALB/c mice against L. donovani challenge and that protective immunity was associated with higher levels of IFN-γ and IL-12 production in comparison to IL-4 and IL-10. Our study presents a simple approach to rapidly identify potential vaccine candidates using the exhaustive information stored in the genome and an in vitro high-throughput immunological platform. Copyright © 2014. Published by Elsevier Ltd.

  17. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from studies based on next-generation sequencing technology. By extending the theories for the tail probability of the range of a sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and find interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package, which now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
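
    A minimal Python sketch of a basic (no-delay) local trend score of the kind whose significance the approximation above targets: each series is converted into a sequence of up/down/flat changes and the highest-scoring contiguous run of agreement is located. The scoring and normalisation here are simplified assumptions; the permutation and approximation machinery of the paper is not reproduced.

```python
import numpy as np


def trend_series(x: np.ndarray) -> np.ndarray:
    """Map a time series to its trend (shape) series: +1 up, -1 down, 0 flat."""
    return np.sign(np.diff(x))


def local_trend_score(x: np.ndarray, y: np.ndarray) -> float:
    """Highest-scoring contiguous stretch of trend agreement (no time delay).

    Agreement at a step contributes +1, disagreement -1, flat steps 0; the
    maximum is found with a Kadane-style scan and normalised by series length.
    """
    products = trend_series(x) * trend_series(y)
    best = current = 0.0
    for p in products:
        current = max(0.0, current + p)
        best = max(best, current)
    return best / len(products)


# Hypothetical example with two partially co-varying abundance series
rng = np.random.default_rng(2)
base = np.cumsum(rng.normal(size=50))
other = base + rng.normal(scale=0.5, size=50)
print(local_trend_score(base, other))
```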

  18. Large-scale tracking and classification for automatic analysis of cell migration and proliferation, and experimental optimization of high-throughput screens of neuroblastoma cells.

    Science.gov (United States)

    Harder, Nathalie; Batra, Richa; Diessl, Nicolle; Gogolin, Sina; Eils, Roland; Westermann, Frank; König, Rainer; Rohr, Karl

    2015-06-01

    Computational approaches for automatic analysis of image-based high-throughput and high-content screens are gaining increased importance to cope with the large amounts of data generated by automated microscopy systems. Typically, automatic image analysis is used to extract phenotypic information once all images of a screen have been acquired. However, image analysis is also important in earlier stages of large-scale experiments, in particular to support and accelerate the tedious and time-consuming optimization of the experimental conditions and technical settings. We here present a novel approach for automatic, large-scale analysis and experimental optimization with application to a screen on neuroblastoma cell lines. Our approach consists of cell segmentation, tracking, feature extraction, classification, and model-based error correction. The approach can be used for experimental optimization by extracting quantitative information which allows experimentalists to optimally choose and to verify the experimental parameters. This involves systematically studying the global cell movement and proliferation behavior. Moreover, we performed a comprehensive phenotypic analysis of a large-scale neuroblastoma screen including the detection of rare division events such as multi-polar divisions. Major challenges of the analyzed high-throughput data are the relatively low spatio-temporal resolution in conjunction with densely growing cells as well as the high variability of the data. To account for the data variability we optimized feature extraction and classification, and introduced a gray value normalization technique as well as a novel approach for automatic model-based correction of classification errors. In total, we analyzed 4,400 real image sequences, covering observation periods of around 120 h each. We performed an extensive quantitative evaluation, which showed that our approach yields high accuracies of 92.2% for segmentation, 98.2% for tracking, and 86.5% for

  19. Combination of amplified rDNA restriction analysis and high-throughput sequencing revealed the negative effect of colistin sulfate on the diversity of soil microorganisms.

    Science.gov (United States)

    Fan, Tingli; Sun, Yongxue; Peng, Jinju; Wu, Qun; Ma, Yi; Zhou, Xiaohui

    2018-01-01

    Colistin sulfate is widely used in both human and veterinary medicine. However, its effect on the microbial ecology is unknown. In this study, we determined the effect of colistin sulfate on the diversity of soil microorganisms by amplified rDNA restriction analysis (ARDRA) and high-throughput sequencing. ARDRA showed that the diversity of DNA from soil microorganisms was reduced after soil was treated with colistin sulfate, with the most dramatic reduction observed after 35 days of treatment. High-throughput sequencing showed that the Chao1 and abundance-based coverage estimators (ACE) were reduced in the soils treated with colistin sulfate for 35 days compared to those treated with colistin sulfate for 7 days. Furthermore, Chao1 and ACE tended to be lower when higher concentrations of colistin sulfate were used, suggesting that microbial abundance is reduced by colistin sulfate in a dose-dependent manner. The Shannon index showed that the diversity of soil microorganisms was reduced upon treatment with colistin sulfate compared to the untreated control group. Following 7 days of treatment, Bacillus, Clostridium and Sphingomonas were sensitive to all the concentrations of colistin sulfate used in this study. Following 35 days of treatment, the abundance of Chloroplast, Haliangium, Pseudomonas, Lactococcus, and Clostridium was significantly decreased. Our results demonstrated that colistin sulfate, especially at high concentrations (≥5 mg/kg), could alter the population structure of microorganisms and consequently the microbial community function in soil. Copyright © 2017 Elsevier GmbH. All rights reserved.
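
    For reference, the Shannon and Chao1 indices cited above can be computed directly from per-taxon read counts. The sketch below uses the standard formulae (with the bias-corrected Chao1 estimator) and hypothetical counts, not data from the study.

```python
import numpy as np

def shannon(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over observed taxa."""
    c = np.asarray([x for x in counts if x > 0], dtype=float)
    p = c / c.sum()
    return float(-(p * np.log(p)).sum())

def chao1(counts):
    """Bias-corrected Chao1 richness: S_obs + F1*(F1-1) / (2*(F2+1)),
    where F1 and F2 are the numbers of singleton and doubleton taxa."""
    c = np.asarray(counts)
    s_obs = int((c > 0).sum())
    f1, f2 = int((c == 1).sum()), int((c == 2).sum())
    return s_obs + f1 * (f1 - 1) / (2.0 * (f2 + 1))

# Hypothetical per-OTU read counts from one soil sample
otu_counts = [120, 45, 3, 1, 1, 2, 9, 1, 0, 7]
print(round(shannon(otu_counts), 3), round(chao1(otu_counts), 1))
```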

  20. Parallel thermal analysis technology using an infrared camera for high-throughput evaluation of active pharmaceutical ingredients: a case study of melting point determination.

    Science.gov (United States)

    Kawakami, Kohsaku

    2010-09-01

    Various techniques for physical characterization of active pharmaceutical ingredients, including X-ray powder diffraction, birefringence observation, Raman spectroscopy, and high-performance liquid chromatography, can be conducted using 96-well plates. The only exception among the important characterization items is the thermal analysis, which can be a limiting step in many cases, notably when screening the crystal/salt form. In this study, infrared thermal camera technology was applied for thermal characterization of pharmaceutical compounds. The melting temperature of model compounds was determined typically within 5 min, and the obtained melting temperature values agreed well with those from differential scanning calorimetry measurements. Since many compounds can be investigated simultaneously in this infrared technology, it should be promising for high-throughput thermal analysis in the pharmaceutical developmental process.

  1. Transcriptome-Wide Analysis of Botrytis elliptica Responsive microRNAs and Their Targets in Lilium Regale Wilson by High-Throughput Sequencing and Degradome Analysis

    Directory of Open Access Journals (Sweden)

    Xue Gao

    2017-05-01

    Full Text Available MicroRNAs, as master regulators of gene expression, have been widely identified and play crucial roles in plant-pathogen interactions. A fatal pathogen, Botrytis elliptica, causes the serious foliar disease of lily, which reduces production because of the high susceptibility of most cultivated species. However, the miRNAs related to Botrytis infection of lily, and the miRNA-mediated gene regulatory networks providing resistance to B. elliptica in lily, remain largely unexplored. To systematically dissect B. elliptica-responsive miRNAs and their target genes, three small RNA libraries were constructed from the leaves of Lilium regale, a promising Chinese wild Lilium species, which had been subjected to mock B. elliptica treatment or B. elliptica infection for 6 and 24 h. By high-throughput sequencing, 71 known miRNAs belonging to 47 conserved families and 24 novel miRNAs were identified, of which 18 miRNAs were downregulated and 13 were upregulated in response to B. elliptica. Moreover, based on the lily mRNA transcriptome, 22 targets for 9 known and 1 novel miRNAs were identified by the degradome sequencing approach. Most target genes for B. elliptica-responsive miRNAs were involved in metabolic processes, with a few encoding different transcription factors, including ELONGATION FACTOR 1 ALPHA (EF1a) and TEOSINTE BRANCHED1/CYCLOIDEA/PROLIFERATING CELL FACTOR 2 (TCP2). Furthermore, the expression patterns of a set of B. elliptica-responsive miRNAs and their targets were validated by quantitative real-time PCR. This study represents the first transcriptome-based analysis of miRNAs responsive to B. elliptica and their targets in lily. The results reveal the possible regulatory roles of miRNAs and their targets in the B. elliptica interaction, which will extend our understanding of the mechanisms of this disease in lily.

  2. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes.

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop

  3. Metallurgical analysis and nanoindentation characterization of Ti-6Al-4V workpiece and chips in high-throughput drilling

    Energy Technology Data Exchange (ETDEWEB)

    Li Rui [Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI 48109 (United States); Riester, Laura; Watkins, Thomas R.; Blau, Peter J. [Materials Science and Technology Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Shih, Albert J. [Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI 48109 (United States)], E-mail: shiha@umich.edu

    2008-01-15

    The metallurgical analyses, including scanning electron microscopy (SEM), X-ray diffraction (XRD), electron microprobe, and nanoindentation characterization are conducted to study the Ti-6Al-4V hole surface and subsurface and the chips in high-throughput drilling tests. The influence of high temperature, large strain, and high strain rate deformation on the β → α phase transformation and mechanical properties is investigated. Diffusionless β → α phase transformation in the subsurface layer adjacent to the hole surface can be observed in dry drilling, but not in other drilling conditions with the supply of cutting fluid. Nanoindentation tests identify a 15-20 μm high hardness subsurface layer with peak hardness over 9 GPa, relative to the 4-5 GPa bulk material hardness, adjacent to the hole surface in dry drilling. For drilling chips, the β phase is retained under all conditions tested due to rapid cooling. On the chips, the saw-tooth feature and narrow shear bands are only formed at the outmost edge and no significant change of hardness across the shear bands can be found in nanoindentation.

  4. Metallurgical Analysis and Nanoindentation Characterization of Ti-6Al-4V Workpiece and Chips in High-Throughput Drilling

    Energy Technology Data Exchange (ETDEWEB)

    Li, Rui [ORNL; Riester, Laura [ORNL; Watkins, Thomas R [ORNL; Blau, Peter Julian [ORNL; Shih, Albert J. [University of Michigan

    2008-01-01

    The metallurgical analyses, including scanning electron microscopy (SEM), X-ray diffraction (XRD), electron microprobe, and nanoindentation characterization are conducted to study the Ti-6Al-4V hole surface and subsurface and the chips in high-throughput drilling tests. The influence of high temperature, large strain, and high strain rate deformation on the β → α phase transformation and mechanical properties is investigated. Diffusionless β → α phase transformation in the subsurface layer adjacent to the hole surface can be observed in dry drilling, but not in other drilling conditions with the supply of cutting fluid. Nanoindentation tests identify a 15-20 μm high hardness subsurface layer with peak hardness over 9 GPa, relative to the 4-5 GPa bulk material hardness, adjacent to the hole surface in dry drilling. For drilling chips, the β phase is retained under all conditions tested due to rapid cooling. On the chips, the saw-tooth feature and narrow shear bands are only formed at the outmost edge and no significant change of hardness across the shear bands can be found in nanoindentation.

  5. Automated, high-throughput, motility analysis in Caenorhabditis elegans and parasitic nematodes: Applications in the search for new anthelmintics

    Directory of Open Access Journals (Sweden)

    Steven D. Buckingham

    2014-12-01

    Full Text Available The scale of the damage worldwide to human health, animal health and agricultural crops resulting from parasitic nematodes, together with the paucity of treatments and the threat of developing resistance to the limited set of widely-deployed chemical tools, underlines the urgent need to develop novel drugs and chemicals to control nematode parasites. Robust chemical screens which can be automated are a key part of that discovery process. Hitherto, the successful automation of nematode behaviours has been a bottleneck in the chemical discovery process. As the measurement of nematode motility can provide a direct scalar readout of the activity of the neuromuscular system and an indirect measure of the health of the animal, this omission is acute. Motility offers a useful assay for high-throughput, phenotypic drug/chemical screening, and several recent developments have helped realise, at least in part, the potential of nematode-based drug screening. Here we review the challenges encountered in automating nematode motility and some important developments in the application of machine vision, statistical imaging and tracking approaches which enable the automated characterisation of nematode movement. Such developments facilitate automated screening for new drugs and chemicals aimed at controlling human and animal nematode parasites (anthelmintics) and plant nematode parasites (nematicides).

  6. Rapid high-throughput analysis of ochratoxin A by the self-assembly of DNAzyme-aptamer conjugates in wine.

    Science.gov (United States)

    Yang, Cheng; Lates, Vasilica; Prieto-Simón, Beatriz; Marty, Jean-Louis; Yang, Xiurong

    2013-11-15

    We report a new label-free colorimetric aptasensor based on a DNAzyme-aptamer conjugate for rapid and high-throughput detection of Ochratoxin A (OTA, a possible human carcinogen, group 2B) in wine. Two oligonucleotides were designed for this detection. One is N1 for biorecognition, which includes two adjacent sequences: the OTA-specific aptamer sequence and the horseradish peroxidase (HRP)-mimicking DNAzyme sequence. The other is a blocking DNA (B2), which is partially complementary to a part of the OTA aptamer and partially complementary to a part of the DNAzyme. The presence of OTA reduces the hybridization between N1 and B2. Thus, the activity of the non-hybridized DNAzyme is linearly correlated with the concentration of OTA up to 30 nM, with a limit of detection of 4 nM (3σ). Meanwhile, a double liquid-liquid extraction (LLE) method is accordingly developed to purify OTA from wine. Compared with the existing HPLC-FD or immunoassay methods, the proposed strategy offers the most appropriate balance between accuracy and ease of use, considerably improving real-time quality control and thereby helping to prevent chronic poisoning caused by OTA-contaminated red wine. Copyright © 2013 Elsevier B.V. All rights reserved.
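
    A linear calibration with a 3σ detection limit of the kind reported above can be reproduced in a few lines; the concentrations, signals and blank standard deviation below are hypothetical placeholders, not the authors' data.

```python
import numpy as np

# Hypothetical calibration: OTA concentration (nM) vs. colorimetric signal (a.u.)
conc   = np.array([0, 5, 10, 15, 20, 25, 30], dtype=float)   # reported linear range up to ~30 nM
signal = np.array([0.02, 0.11, 0.20, 0.31, 0.40, 0.52, 0.61])

slope, intercept = np.polyfit(conc, signal, 1)
sigma_blank = 0.005                      # SD of replicate blank measurements (assumed)
lod = 3 * sigma_blank / slope            # common 3-sigma definition of the detection limit
print(f"slope = {slope:.4f} a.u./nM, LOD ≈ {lod:.1f} nM")
```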

  7. Taxonomic analysis of the microbial community in stored sugar beets using high-throughput sequencing of different marker genes.

    Science.gov (United States)

    Liebe, Sebastian; Wibberg, Daniel; Winkler, Anika; Pühler, Alfred; Schlüter, Andreas; Varrelmann, Mark

    2016-02-01

    Post-harvest colonization of sugar beets accompanied by rot development is a serious problem due to sugar losses and negative impact on processing quality. Studies on the microbial community associated with rot development and factors shaping their structure are missing. Therefore, high-throughput sequencing was applied to describe the influence of environment, plant genotype and storage temperature (8°C and 20°C) on three different communities in stored sugar beets, namely fungi (internal transcribed spacers 1 and 2), Fusarium spp. (elongation factor-1α gene fragment) and oomycetes (internal transcribed spacers 1). The composition of the fungal community changed during storage mostly influenced by the storage temperature followed by a weak environmental effect. Botrytis cinerea was the prevalent species at 8°C whereas members of the fungal genera Fusarium and Penicillium became dominant at 20°C. This shift was independent of the plant genotype. Species richness within the genus Fusarium also increased during storage at both temperatures whereas the oomycetes community did not change. Moreover, oomycetes species were absent after storage at 20°C. The results of the present study clearly show that rot development during sugar beet storage is associated with pathogens well known as causal agents of post-harvest diseases in many other crops. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. Comparative analysis and validation of the malachite green assay for the high throughput biochemical characterization of terpene synthases.

    Science.gov (United States)

    Vardakou, Maria; Salmon, Melissa; Faraldos, Juan A; O'Maille, Paul E

    2014-01-01

    Terpenes are the largest group of natural products, with important and diverse biological roles, and are of tremendous economic value as fragrances, flavours and pharmaceutical agents. Class I terpene synthases (TPSs), the dominant type of TPS enzymes, catalyze the conversion of prenyl diphosphates to often structurally diverse bioactive terpene hydrocarbons and inorganic pyrophosphate (PPi). To measure their kinetic properties, current bio-analytical methods typically rely on the direct detection of hydrocarbon products by radioactivity measurements or gas chromatography-mass spectrometry (GC-MS). In this study we employed an established, rapid colorimetric assay, the pyrophosphate/malachite green assay (MG), as an alternative means for the biochemical characterization of class I TPS activity. • We describe the adaptation of the MG assay for turnover and catalytic efficiency measurements of TPSs. • We validate the method by direct comparison with established assays; the agreement of kcat/KM among methods makes this adaptation optimal for rapid evaluation of TPSs. • We demonstrate the application of the MG assay for the high-throughput screening of TPS gene libraries.
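
    As an illustration of how turnover and catalytic efficiency (kcat/KM) values are extracted from initial-rate data such as the PPi-release readout described above, here is a minimal Michaelis-Menten fit; the substrate concentrations, rates and enzyme concentration are hypothetical, and this is not the authors' analysis script.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)

# Hypothetical initial-rate data: [prenyl diphosphate] (uM) vs. rate of PPi release (uM/min)
s = np.array([1, 2, 5, 10, 20, 50, 100], dtype=float)
v = np.array([0.8, 1.5, 3.1, 4.9, 6.6, 8.2, 8.9])

(vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=[10.0, 10.0])
e_total = 0.1                                # total enzyme concentration (uM), assumed
kcat = vmax / e_total
print(f"kcat = {kcat:.1f} min^-1, KM = {km:.1f} uM, kcat/KM = {kcat/km:.2f} uM^-1 min^-1")
```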

  9. High-throughput analysis of stimulus-evoked behaviors in Drosophila larva reveals multiple modality-specific escape strategies.

    Science.gov (United States)

    Ohyama, Tomoko; Jovanic, Tihana; Denisov, Gennady; Dang, Tam C; Hoffmann, Dominik; Kerr, Rex A; Zlatic, Marta

    2013-01-01

    All organisms react to noxious and mechanical stimuli but we still lack a complete understanding of cellular and molecular mechanisms by which somatosensory information is transformed into appropriate motor outputs. The small number of neurons and excellent genetic tools make Drosophila larva an especially tractable model system in which to address this problem. We developed high throughput assays with which we can simultaneously expose more than 1,000 larvae per man-hour to precisely timed noxious heat, vibration, air current, or optogenetic stimuli. Using this hardware in combination with custom software we characterized larval reactions to somatosensory stimuli in far greater detail than possible previously. Each stimulus evoked a distinctive escape strategy that consisted of multiple actions. The escape strategy was context-dependent. Using our system we confirmed that the nociceptive class IV multidendritic neurons were involved in the reactions to noxious heat. Chordotonal (ch) neurons were necessary for normal modulation of head casting, crawling and hunching, in response to mechanical stimuli. Consistent with this we observed increases in calcium transients in response to vibration in ch neurons. Optogenetic activation of ch neurons was sufficient to evoke head casting and crawling. These studies significantly increase our understanding of the functional roles of larval ch neurons. More generally, our system and the detailed description of wild type reactions to somatosensory stimuli provide a basis for systematic identification of neurons and genes underlying these behaviors.

  10. Weak Interactions Govern the Viscosity of Concentrated Antibody Solutions: High-Throughput Analysis Using the Diffusion Interaction Parameter

    Science.gov (United States)

    Connolly, Brian D.; Petry, Chris; Yadav, Sandeep; Demeule, Barthélemy; Ciaccio, Natalie; Moore, Jamie M.R.; Shire, Steven J.; Gokarn, Yatin R.

    2012-01-01

    Weak protein-protein interactions are thought to modulate the viscoelastic properties of concentrated antibody solutions. Predicting the viscoelastic behavior of concentrated antibodies from their dilute solution behavior is of significant interest and remains a challenge. Here, we show that the diffusion interaction parameter (kD), a component of the osmotic second virial coefficient (B2) that is amenable to high-throughput measurement in dilute solutions, correlates well with the viscosity of concentrated monoclonal antibody (mAb) solutions. We measured the kD of 29 different mAbs (IgG1 and IgG4) in four different solvent conditions (low and high ion normality) and found a linear dependence between kD and the exponential coefficient that describes the viscosity concentration profiles (|R| ≥ 0.9). Through experimentally measured effective charge measurements, under low ion normality where the electroviscous effect can dominate, we show that the mAb solution viscosity is poorly correlated with the mAb net charge (|R| ≤ 0.6). With this large data set, our results provide compelling evidence in support of weak intermolecular interactions, in contrast to the notion that the electroviscous effect is important in governing the viscoelastic behavior of concentrated mAb solutions. Our approach is particularly applicable as a screening tool for selecting mAbs with desirable viscosity properties early during lead candidate selection. PMID:22828333
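
    The analysis summarized above amounts to two regressions: an exponential fit of viscosity versus concentration for each mAb, followed by a correlation of the fitted exponential coefficient with the dilute-solution kD. A minimal sketch with hypothetical numbers (not the published data set):

```python
import numpy as np

def exponential_coefficient(conc_mg_ml, viscosity_cP):
    """Fit eta(c) = eta0 * exp(k * c) by linear regression of ln(eta) on c; return k."""
    k, _ln_eta0 = np.polyfit(conc_mg_ml, np.log(viscosity_cP), 1)
    return k

# One (kD, k) pair per mAb; values are illustrative only
kD    = np.array([-12.0, -5.0, 2.0, 10.0, 25.0])         # mL/g, from dilute-solution measurements
k_exp = np.array([0.031, 0.027, 0.022, 0.018, 0.012])     # per (mg/mL), from fits as above

r = np.corrcoef(kD, k_exp)[0, 1]
print(f"|R| = {abs(r):.2f}")    # the study reports |R| >= 0.9 across 29 mAbs
```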

  11. Patient access in plastic surgery: an operational and financial analysis of service-based interventions to improve ambulatory throughput in an academic surgery practice.

    Science.gov (United States)

    Hultman, Charles Scott; Gilland, Wendell G; Weir, Samuel

    2015-06-01

    Inefficient patient throughput in a surgery practice can result in extended new patient backlogs, excessively long cycle times in the outpatient clinics, poor patient satisfaction, decreased physician productivity, and loss of potential revenue. This project assesses the efficacy of multiple throughput interventions in an academic plastic surgery practice at a public university. We implemented a Patient Access and Efficiency (PAcE) initiative, funded and sponsored by our health care system, to improve patient throughput in the outpatient surgery clinic. Interventions included: (1) creation of a multidisciplinary team, led by a project redesign manager, that met weekly; (2) definition of goals, metrics, and target outcomes; (3) revision of clinic templates to reflect actual demand; (4) working down patient backlog through group visits; (5) booking new patients across entire practice; (6) assigning a physician's assistant to the preoperative clinic; and (7) designating a central scheduler to coordinate flow of information. Main outcome measures included: patient satisfaction using Press-Ganey surveys; complaints reported to patient relations; time to third available appointment; size of patient backlog; monthly clinic volumes with utilization rates and supply/demand curves; "chaos" rate (cancellations plus reschedules, divided by supply, within 48 hours of booked clinic date); patient cycle times with bottleneck analysis; physician productivity measured by work Relative Value Units (wRVUs); and downstream financial effects on billing, collection, accounts receivable (A/R), and payer mix. We collected, managed, and analyzed the data prospectively, comparing the pre-PAcE period (6 months) with the PAcE period (6 months). The PAcE initiative resulted in multiple improvements across the entire plastic surgery practice. Patient satisfaction increased only slightly from 88.5% to 90.0%, but the quarterly number of complaints notably declined from 17 to 9. Time to third
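
    The "chaos" rate defined above reduces to a one-line calculation; the counts in this sketch are hypothetical.

```python
def chaos_rate(cancellations_48h, reschedules_48h, appointment_supply):
    """Cancellations plus reschedules within 48 h of the booked date, divided by supply."""
    return (cancellations_48h + reschedules_48h) / appointment_supply

print(f"{chaos_rate(14, 23, 420):.1%}")   # hypothetical month: 14 + 23 events over 420 slots
```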

  12. Image analysis method for the measurement of water saturation in a two-dimensional experimental flow tank

    Science.gov (United States)

    Belfort, Benjamin; Weill, Sylvain; Lehmann, François

    2017-07-01

    A novel, non-invasive imaging technique is proposed that determines 2D maps of water content in unsaturated porous media. This method directly relates digitally measured intensities to the water content of the porous medium. This method requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. The main advantages of this approach are that no calibration experiment is needed, because calibration curve relating water content and reflected light intensities is established during the main monitoring phase of each experiment and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm × 14 cm × 6 cm (L × W × D) is carried out to validate the methodology. The accuracy of the proposed approach is assessed using a statistical framework to perform an error analysis and numerical simulations with a state-of-the-art computational code that solves the Richards' equation. Comparison of the cumulative mass leaving and entering the flow tank and water content maps produced by the photographic measurement technique and the numerical simulations demonstrate the efficiency and high accuracy of the proposed method for investigating vadose zone flow processes. Finally, the photometric procedure has been developed expressly for its extension to heterogeneous media. Other processes may be investigated through different laboratory experiments which will serve as benchmark for numerical codes validation.
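
    A minimal sketch of the intensity-to-water-content mapping step is given below, assuming a simple linear interpolation between reference images of the dry and fully saturated states; the study instead builds its calibration curve during the monitoring phase of each experiment, so this is illustrative only.

```python
import numpy as np

def water_content_map(img, img_dry, img_wet, porosity):
    """Convert a normalized reflected-light image to volumetric water content.
    Assumes intensity varies monotonically (here: linearly) between the dry
    and water-saturated reference images."""
    s_eff = (img.astype(float) - img_dry) / (img_wet - img_dry)   # effective saturation in [0, 1]
    return porosity * np.clip(s_eff, 0.0, 1.0)
```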

  13. Venous oxygen saturation.

    Science.gov (United States)

    Hartog, Christiane; Bloos, Frank

    2014-12-01

    Early detection and rapid treatment of tissue hypoxia are important goals. Venous oxygen saturation is an indirect index of global oxygen supply-to-demand ratio. Central venous oxygen saturation (ScvO2) measurement has become a surrogate for mixed venous oxygen saturation (SvO2). ScvO2 is measured by a catheter placed in the superior vena cava. After results from a single-center study suggested that maintaining ScvO2 values >70% might improve survival rates in septic patients, international practice guidelines included this target in a bundle strategy to treat early sepsis. However, a recent multicenter study with >1500 patients found that the use of central hemodynamic and ScvO2 monitoring did not improve long-term survival when compared to the clinical assessment of the adequacy of circulation. It seems that if sepsis is recognized early, a rapid initiation of antibiotics and adequate fluid resuscitation are more important than measuring venous oxygen saturation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Metamaterial saturable absorber mirror.

    Science.gov (United States)

    Dayal, Govind; Ramakrishna, S Anantha

    2013-02-01

    We propose a metamaterial saturable absorber mirror at midinfrared wavelengths that can show a saturation of absorption with intensity of incident light and switch to a reflecting state. The design consists of an array of circular metallic disks separated by a thin film of vanadium dioxide (VO2) from a continuous metallic film. The heating due to the absorption in the absorptive state causes the VO2 to transit to a metallic phase from the low temperature insulating phase. The metamaterial switches from an absorptive state (R≃0.1%) to a reflective state (R>95%) for a specific threshold intensity of the incident radiation corresponding to the phase transition of VO2, resulting in the saturation of absorption in the metamaterial. The computer simulations show over 99.9% peak absorbance, a resonant bandwidth of about 0.8 μm at 10.22 μm wavelengths, and saturation intensity of 140 mW cm⁻² for undoped VO2 at room temperature. We also carried out numerical simulations to investigate the effects of localized heating and temperature distribution by solving the heat diffusion problem.

  15. On the Achievable Throughput Over TVWS Sensor Networks.

    Science.gov (United States)

    Caleffi, Marcello; Cacciapuoti, Angela Sara

    2016-03-30

    In this letter, we study the throughput achievable by an unlicensed sensor network operating over TV white space spectrum in the presence of coexistence interference. We first analytically derive the achievable throughput as a function of the channel ordering. Then, we show that the problem of deriving the maximum expected throughput through exhaustive search is computationally unfeasible. Finally, we derive a computationally efficient algorithm with polynomial-time complexity to compute the channel set maximizing the expected throughput and, stemming from this, we derive a closed-form expression for the maximum expected throughput. Numerical simulations validate the theoretical analysis.

  16. Synthetic Biomaterials to Rival Nature's Complexity-a Path Forward with Combinatorics, High-Throughput Discovery, and High-Content Analysis.

    Science.gov (United States)

    Zhang, Douglas; Lee, Junmin; Kilian, Kristopher A

    2017-10-01

    Cells in tissue receive a host of soluble and insoluble signals in a context-dependent fashion, where integration of these cues through a complex network of signal transduction cascades will define a particular outcome. Biomaterials scientists and engineers are tasked with designing materials that can at least partially recreate this complex signaling milieu towards new materials for biomedical applications. In this progress report, recent advances in high throughput techniques and high content imaging approaches that are facilitating the discovery of efficacious biomaterials are described. From microarrays of synthetic polymers, peptides and full-length proteins, to designer cell culture systems that present multiple biophysical and biochemical cues in tandem, it is discussed how the integration of combinatorics with high content imaging and analysis is essential to extracting biologically meaningful information from large scale cellular screens to inform the design of next generation biomaterials. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Uplink SDMA with Limited Feedback: Throughput Scaling

    Directory of Open Access Journals (Sweden)

    Jeffrey G. Andrews

    2008-01-01

    Full Text Available Combined space division multiple access (SDMA) and scheduling exploit both spatial multiplexing and multiuser diversity, increasing throughput significantly. Both SDMA and scheduling require feedback of multiuser channel state information (CSI). This paper focuses on uplink SDMA with limited feedback, which refers to efficient techniques for CSI quantization and feedback. To quantify the throughput of uplink SDMA and derive design guidelines, the throughput scaling with system parameters is analyzed. The specific parameters considered include the numbers of users, antennas, and feedback bits. Furthermore, different SNR regimes and beamforming methods are considered. The derived throughput scaling laws are observed to change for different SNR regimes. For instance, the throughput scales logarithmically with the number of users in the high SNR regime but double logarithmically in the low SNR regime. The analysis of throughput scaling suggests guidelines for scheduling in uplink SDMA. For example, to maximize throughput scaling, scheduling should use the criterion of minimum quantization errors for the high SNR regime and maximum channel power for the low SNR regime.
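
    Stated compactly, the scaling laws summarized above take the following schematic form; this simply restates the abstract, with the constants absorbing the dependence on antennas, feedback bits and beamforming method.

```latex
R_{\mathrm{sum}} \;\sim\;
\begin{cases}
  c_1 \,\log N,       & \text{high-SNR regime},\\
  c_2 \,\log\log N,   & \text{low-SNR regime},
\end{cases}
\qquad N = \text{number of users}.
```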

  18. Uncovering leaf rust responsive miRNAs in wheat (Triticum aestivum L.) using high-throughput sequencing and prediction of their targets through degradome analysis.

    Science.gov (United States)

    Kumar, Dhananjay; Dutta, Summi; Singh, Dharmendra; Prabhu, Kumble Vinod; Kumar, Manish; Mukhopadhyay, Kunal

    2017-01-01

    Deep sequencing identified 497 conserved and 559 novel miRNAs in wheat, while degradome analysis revealed 701 target genes. qRT-PCR demonstrated differential expression of miRNAs during stages of leaf rust progression. Bread wheat (Triticum aestivum L.) is an important cereal food crop feeding 30% of the world population. A major threat to wheat production is rust epidemics. This study was targeted towards identification and functional characterization of micro(mi)RNAs and their target genes in wheat in response to leaf rust ingression. High-throughput sequencing was used for transcriptome-wide identification of miRNAs and their expression profiling in response to leaf rust using mock- and pathogen-inoculated resistant and susceptible near-isogenic wheat plants. A total of 1056 mature miRNAs were identified, of which 497 miRNAs were conserved and 559 miRNAs were novel. The pathogen-inoculated resistant plants manifested more miRNAs compared with the pathogen-infected susceptible plants. The miRNA counts increased in the susceptible isoline due to leaf rust; conversely, the counts decreased in the resistant isoline in response to pathogenesis, illustrating precise spatial tuning of miRNAs during compatible and incompatible interaction. Stem-loop quantitative real-time PCR was used to profile 10 highly differentially expressed miRNAs obtained from the high-throughput sequencing data. The spatio-temporal profiling validated the differential expression of miRNAs between the isolines as well as in response to pathogen infection. Degradome analysis provided 701 predicted target genes associated with defense response, signal transduction, development, metabolism, and transcriptional regulation. The obtained results indicate that wheat isolines employ diverse arrays of miRNAs that modulate their target genes during compatible and incompatible interaction. Our findings contribute to increased knowledge of the roles of microRNAs in wheat-leaf rust interactions and could help in rust

  19. High-Throughput Analysis With 96-Capillary Array Electrophoresis and Integrated Sample Preparation for DNA Sequencing Based on Laser Induced Fluorescence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Gang [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    The purpose of this research was to improve the fluorescence detection for the multiplexed capillary array electrophoresis, extend its use beyond the genomic analysis, and to develop an integrated micro-sample preparation system for high-throughput DNA sequencing. The authors first demonstrated multiplexed capillary zone electrophoresis (CZE) and micellar electrokinetic chromatography (MEKC) separations in a 96-capillary array system with laser-induced fluorescence detection. Migration times of four kinds of fluoresceins and six polyaromatic hydrocarbons (PAHs) are normalized to one of the capillaries using two internal standards. The relative standard deviations (RSD) after normalization are 0.6-1.4% for the fluoresceins and 0.1-1.5% for the PAHs. Quantitative calibration of the separations based on peak areas is also performed, again with substantial improvement over the raw data. This opens up the possibility of performing massively parallel separations for high-throughput chemical analysis for process monitoring, combinatorial synthesis, and clinical diagnosis. The authors further improved the fluorescence detection by step laser scanning. A computer-controlled galvanometer scanner is adapted for scanning a focused laser beam across a 96-capillary array for laser-induced fluorescence detection. The signal at a single photomultiplier tube is temporally sorted to distinguish among the capillaries. The limit of detection for fluorescein is 3 × 10⁻¹¹ M (S/N = 3) for 5 mW of total laser power scanned at 4 Hz. The observed cross-talk among capillaries is 0.2%. Advantages include the efficient utilization of light due to the high duty-cycle of step scan, good detection performance due to the reduction of stray light, ruggedness due to the small mass of the galvanometer mirror, low cost due to the simplicity of components, and flexibility due to the independent paths for excitation and emission.
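
    One common way to map migration times onto a reference capillary using two internal standards is a linear transformation of the time axis; the exact normalization used in this work may differ, so the sketch below is illustrative only.

```python
def normalize_migration_time(t, t_std1, t_std2, t_ref1, t_ref2):
    """Linearly rescale a peak's migration time so that the two internal standards
    of this capillary land on the reference capillary's standard times."""
    return t_ref1 + (t - t_std1) * (t_ref2 - t_ref1) / (t_std2 - t_std1)

# Example: a peak at 312 s in a capillary whose standards elute at 100 s and 400 s,
# versus reference standard times of 95 s and 410 s (all numbers hypothetical)
print(round(normalize_migration_time(312, 100, 400, 95, 410), 1))
```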

  20. High throughput flow cytometry based yeast two-hybrid array approach for large-scale analysis of protein-protein interactions

    Science.gov (United States)

    Chen, Jun; Carter, Mark B.; Edwards, Bruce S.; Cai, Hong; Sklar, Larry A.

    2011-01-01

    The analysis of protein-protein-interactions is a key focus of proteomics efforts. The yeast two-hybrid system has been the most commonly used method in genome-wide searches for protein interaction partners. However, the throughput of the current yeast two-hybrid array approach is hampered by the involvement of the time-consuming LacZ assay and/or the incompatibility of liquid handling automation due to the requirement for selection of colonies/diploids on agar plates. To facilitate large-scale yeast two-hybrid assays, we report a novel array approach by coupling a GFP reporter based yeast two-hybrid system with high throughput flow cytometry that enables the processing of a 96 well plate in as little as 3 minutes. In this approach, the yEGFP reporter has been established in both AH109 (MATa) and Y187 (MATα) reporter cells. It not only allows the generation of two copies of GFP reporter genes in diploid cells, but also allows the convenient determination of self-activators generated from both bait and prey constructs by flow cytometry. We demonstrate a Y2H array assay procedure that is carried out completely in liquid media in 96-well plates by mating bait and prey cells in liquid YPD media, selecting the diploids containing positive interaction pairs in selective media and analyzing the GFP reporter directly by flow cytometry. We have evaluated this flow cytometry based array procedure by showing that the interaction of the positive control pair P53/T is able to be reproducibly detected at 72 hrs post-mating compared to the negative control pairs. We conclude that our flow cytometry based yeast two-hybrid approach is robust, convenient, quantitative, and is amenable to large-scale analysis using liquid-handling automation. PMID:21954189

  1. Detailed Analysis of the Surface Area and Elasticity in the Saturated 1,2-Diacylphosphatidylcholine/Cholesterol Binary Monolayer System.

    Science.gov (United States)

    Miyoshi, Tsubasa; Kato, Satoru

    2015-08-25

    The surface pressure-area (π-A) isotherms of DMPC, DPPC, and DSPC/cholesterol binary monolayers were systematically measured with great care to gain insight into the lateral molecular packing in these binary monolayer systems. The average molecular area A and the area elastic modulus C(s)⁻¹ at a given surface pressure were calculated as a function of cholesterol mole fraction x(chol). As a result, data reliable enough for the analysis of detailed phase behavior were obtained. We identified several characteristic phase regions and assigned the phase state in each region on the basis of the deviation of A(x(chol)) and C(s)⁻¹(x(chol)) from ideal additivity. We also estimated the partial molecular areas of DMPC, DPPC, DSPC, and cholesterol in the single-phase regions, where C(s)⁻¹(x(chol)) values fell on an ideal additivity curve. We found that the addition of cholesterol induces the formation of a highly condensed phase where the diacylphosphatidylcholine (diacyl PC) molecule has a surface area even smaller than that in the solid phase, irrespective of the surface pressure and the chain length of diacyl PC. Here, we call the cholesterol-induced condensed phase the CC phase. Furthermore, we demonstrated that the basic features of A(x(chol)) and C(s)⁻¹(x(chol)) profiles can be explained semiquantitatively by assuming the state of vicinity lipids surrounding sparsely distributed cholesterol molecules in the low x(chol) region as a third state of the diacyl PC molecule in addition to the states in the pure diacyl PC monolayer and in the CC phase.
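
    The area elastic modulus used above is the standard surface compressional modulus, Cs⁻¹ = −A(∂π/∂A); a short numerical sketch, with hypothetical isotherm points and an ideal-additivity helper for the mixing analysis, is given below.

```python
import numpy as np

def area_elastic_modulus(area, pressure):
    """Cs^-1 = -A * (d pi / d A), evaluated pointwise along a pi-A isotherm."""
    return -area * np.gradient(pressure, area)

def ideal_additivity(x_chol, a_pc, a_chol):
    """Ideal mixing line for the mean molecular area at a fixed surface pressure."""
    return (1.0 - x_chol) * a_pc + x_chol * a_chol

# Hypothetical isotherm points: area (Å^2 per molecule) and surface pressure (mN/m)
A  = np.array([110.0, 100.0, 90.0, 80.0, 70.0, 60.0, 55.0])
pi = np.array([1.0, 3.0, 7.0, 13.0, 22.0, 35.0, 45.0])
print(np.round(area_elastic_modulus(A, pi), 1))
```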

  2. Temporal dynamics of soil microbial communities under different moisture regimes: high-throughput sequencing and bioinformatics analysis

    Science.gov (United States)

    Semenov, Mikhail; Zhuravleva, Anna; Semenov, Vyacheslav; Yevdokimov, Ilya; Larionova, Alla

    2017-04-01

    Recent climate scenarios predict not only continued global warming but also an increased frequency and intensity of extreme climatic events, such as strong changes in temperature and precipitation regimes. Microorganisms are well known to respond more sensitively to changes in environmental conditions than soil chemical and physical parameters do. In this study, we determined the shifts in soil microbial community structure as well as indicative taxa in soils under three moisture regimes using high-throughput Illumina sequencing and a range of bioinformatics approaches for the assessment of sequence data. Incubation experiments were performed in soil-filled (Greyic Phaeozems Albic) rhizoboxes with maize and without plants. Three contrasting moisture regimes were simulated: (1) optimal wetting (OW), watering 2-3 times per week to maintain soil moisture of 20-25% by weight; (2) periodic wetting (PW), with alternating periods of wetting and drought; and (3) constant insufficient wetting (IW), in which soil moisture of 12% by weight was permanently maintained. Sampled fresh soils were homogenized, and the total DNA of three replicates was extracted using the FastDNA® SPIN kit for Soil. DNA replicates were combined in a pooled sample and the DNA was used for PCR with specific primers for the 16S V3 and V4 regions. In order to compare variability between different samples and between replicates within a single sample, some DNA replicates were treated separately. The products were purified and submitted to Illumina MiSeq sequencing. Sequence data were evaluated by alpha-diversity (Chao1 and Shannon H' diversity indexes), beta-diversity (UniFrac and Bray-Curtis dissimilarity), heatmap, tagcloud, and plot-bar analyses using the MiSeq Reporter Metagenomics Workflow and R packages (phyloseq, vegan, tagcloud). The Shannon index varied in a rather narrow range (4.4-4.9), with the lowest values for microbial communities under the PW treatment. The Chao1 index varied from 385 to 480, being a more flexible

  3. High-throughput pseudovirion-based neutralization assay for analysis of natural and vaccine-induced antibodies against human papillomaviruses.

    Directory of Open Access Journals (Sweden)

    Peter Sehr

    Full Text Available A highly sensitive, automated, purely add-on, high-throughput pseudovirion-based neutralization assay (HT-PBNA) with excellent repeatability and run-to-run reproducibility was developed for human papillomavirus (HPV) types 16, 18, 31, 45, 52 and 58 and bovine papillomavirus type 1. Preparation of 384-well assay plates with serially diluted sera and the actual cell-based assay are separated in time, therefore batches of up to one hundred assay plates can be processed sequentially. A mean coefficient of variation (CV) of 13% was obtained for anti-HPV 16 and HPV 18 titers for a standard serum tested in a total of 58 repeats on individual plates in seven independent runs. Natural antibody response was analyzed in 35 sera from patients with HPV 16 DNA positive cervical intraepithelial neoplasia grade 2+ lesions. The new HT-PBNA is based on Gaussia luciferase, with increased sensitivity compared to the previously described manual PBNA (manPBNA) based on secreted alkaline phosphatase as reporter. Titers obtained with HT-PBNA were generally higher than titers obtained with the manPBNA. A good linear correlation (R² = 0.7) was found between HT-PBNA titers and anti-HPV 16 L1 antibody levels determined by a Luminex bead-based GST-capture assay for these 35 sera, with a Kappa value of 0.72 and only 3 discordant sera in the low titer range. In addition to natural low-titer antibody responses, the high sensitivity of the HT-PBNA also allows detection of cross-neutralizing antibodies induced by commercial HPV L1 vaccines and experimental L2 vaccines. When analyzing the WHO international standards for HPV 16 and 18 we determined an analytical sensitivity of 0.864 and 1.105 mIU, respectively.

  4. An UPLC-MS/MS method for highly sensitive high-throughput analysis of phytohormones in plant tissues

    Directory of Open Access Journals (Sweden)

    Balcke Gerd Ulrich

    2012-11-01

    Full Text Available Background Phytohormones are the key metabolites participating in the regulation of multiple functions of the plant organism. Among them, jasmonates, as well as abscisic and salicylic acids, are responsible for triggering and modulating plant reactions targeted against pathogens and herbivores, as well as resistance to abiotic stress (drought, UV-irradiation and mechanical wounding). These factors induce dramatic changes in phytohormone biosynthesis and transport, leading to rapid local and systemic stress responses. Understanding of the underlying mechanisms is of principal interest for scientists working in various areas of plant biology. However, highly sensitive, precise and high-throughput methods for quantification of these phytohormones in small samples of plant tissues are still missing. Results Here we present an LC-MS/MS method for fast and highly sensitive determination of jasmonates, abscisic and salicylic acids. A single-step sample preparation procedure based on mixed-mode solid phase extraction was efficiently combined with essential improvements in mobile phase composition, yielding higher efficiency of chromatographic separation and MS sensitivity. This strategy resulted in a dramatic increase in overall sensitivity, allowing successful determination of phytohormones in small (less than 50 mg of fresh weight) tissue samples. The method was completely validated in terms of analyte recovery, sensitivity, linearity and precision. Additionally, it was cross-validated with a well-established GC-MS-based procedure and its applicability to a variety of plant species and organs was verified. Conclusion The method can be applied for the analyses of target phytohormones in small tissue samples obtained from any plant species and/or plant part relying on any commercially available (even less sensitive) tandem mass spectrometry instrumentation.

  5. High-throughput microarray technology in diagnostics of enterobacteria based on genome-wide probe selection and regression analysis.

    Science.gov (United States)

    Friedrich, Torben; Rahmann, Sven; Weigel, Wilfried; Rabsch, Wolfgang; Fruth, Angelika; Ron, Eliora; Gunzer, Florian; Dandekar, Thomas; Hacker, Jörg; Müller, Tobias; Dobrindt, Ulrich

    2010-10-21

    horizontal gene transfer. Moreover, a broad range of pathogens have been covered by an efficient probe set size enabling the design of high-throughput diagnostics.

  6. An UPLC-MS/MS method for highly sensitive high-throughput analysis of phytohormones in plant tissues

    Science.gov (United States)

    2012-01-01

    Background Phytohormones are the key metabolites participating in the regulation of multiple functions of plant organism. Among them, jasmonates, as well as abscisic and salicylic acids are responsible for triggering and modulating plant reactions targeted against pathogens and herbivores, as well as resistance to abiotic stress (drought, UV-irradiation and mechanical wounding). These factors induce dramatic changes in phytohormone biosynthesis and transport leading to rapid local and systemic stress responses. Understanding of underlying mechanisms is of principle interest for scientists working in various areas of plant biology. However, highly sensitive, precise and high-throughput methods for quantification of these phytohormones in small samples of plant tissues are still missing. Results Here we present an LC-MS/MS method for fast and highly sensitive determination of jasmonates, abscisic and salicylic acids. A single-step sample preparation procedure based on mixed-mode solid phase extraction was efficiently combined with essential improvements in mobile phase composition yielding higher efficiency of chromatographic separation and MS-sensitivity. This strategy resulted in dramatic increase in overall sensitivity, allowing successful determination of phytohormones in small (less than 50 mg of fresh weight) tissue samples. The method was completely validated in terms of analyte recovery, sensitivity, linearity and precision. Additionally, it was cross-validated with a well-established GC-MS-based procedure and its applicability to a variety of plant species and organs was verified. Conclusion The method can be applied for the analyses of target phytohormones in small tissue samples obtained from any plant species and/or plant part relying on any commercially available (even less sensitive) tandem mass spectrometry instrumentation. PMID:23173950

  7. High-throughput sequence analysis of small RNAs in grapevine (Vitis vinifera L.) affected by grapevine leafroll disease.

    Science.gov (United States)

    Alabi, Olufemi J; Zheng, Yun; Jagadeeswaran, Guru; Sunkar, Ramanjulu; Naidu, Rayapati A

    2012-12-01

    Grapevine leafroll disease (GLRD) is one of the most economically important virus diseases of grapevine (Vitis spp.) worldwide. In this study, we used high-throughput sequencing of cDNA libraries made from small RNAs (sRNAs) to compare profiles of sRNA populations recovered from own-rooted Merlot grapevines with and without GLRD symptoms. The data revealed the presence of sRNAs specific to Grapevine leafroll-associated virus 3, Hop stunt viroid (HpSVd), Grapevine yellow speckle viroid 1 (GYSVd-1) and Grapevine yellow speckle viroid 2 (GYSVd-2) in symptomatic grapevines and sRNAs specific only to HpSVd, GYSVd-1 and GYSVd-2 in nonsymptomatic grapevines. In addition to 135 previously identified conserved microRNAs in grapevine (Vvi-miRs), we identified 10 novel and several candidate Vvi-miRs in both symptomatic and nonsymptomatic grapevine leaves based on the cloning of miRNA star sequences. Quantitative real-time reverse transcriptase-polymerase chain reaction (RT-PCR) of selected conserved Vvi-miRs indicated that individual members of an miRNA family are differentially expressed in symptomatic and nonsymptomatic leaves. The high-resolution mapping of sRNAs specific to an ampelovirus and three viroids in mixed infections, the identification of novel Vvi-miRs and the modulation of certain conserved Vvi-miRs offers resources for the further elucidation of compatible host-pathogen interactions and for the provision of ecologically relevant information to better understand host-pathogen-environment interactions in a perennial fruit crop. © 2012 THE AUTHORS. MOLECULAR PLANT PATHOLOGY © 2012 BSPP AND BLACKWELL PUBLISHING LTD.

  8. High-Throughput All-Optical Analysis of Synaptic Transmission and Synaptic Vesicle Recycling in Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Sebastian Wabnig

    Full Text Available Synaptic vesicles (SVs) undergo a cycle of biogenesis and membrane fusion to release transmitter, followed by recycling. How exocytosis and endocytosis are coupled is intensively investigated. We describe an all-optical method for identification of neurotransmission genes that can directly distinguish SV recycling factors in C. elegans, by motoneuron photostimulation and muscular RCaMP Ca2+ imaging. We verified our approach on mutants affecting synaptic transmission. Mutation of genes affecting SV recycling (unc-26 synaptojanin, unc-41 stonin, unc-57 endophilin, itsn-1 intersectin, snt-1 synaptotagmin) showed a distinct 'signature' of muscle Ca2+ dynamics, induced by cholinergic motoneuron photostimulation, i.e. faster rise, and earlier decrease of the signal, reflecting increased synaptic fatigue during ongoing photostimulation. To facilitate high throughput, we measured (3-5 times) ~1000 nematodes for each gene. We explored if this method enables RNAi screening for SV recycling genes. Previous screens for synaptic function genes, based on behavioral or pharmacological assays, allowed no distinction of the stage of the SV cycle in which a protein might act. We generated a strain enabling RNAi specifically only in cholinergic neurons, thus resulting in healthier animals and avoiding lethal phenotypes resulting from knockdown elsewhere. RNAi of control genes resulted in Ca2+ measurements that were consistent with results obtained in the respective genomic mutants, albeit to a weaker extent in most cases, and could further be confirmed by opto-electrophysiological measurements for mutants of some of the genes, including synaptojanin. We screened 95 genes that were previously implicated in cholinergic transmission, and several controls. We identified genes that clustered together with known SV recycling genes, exhibiting a similar signature of their Ca2+ dynamics. Five of these genes (C27B7.7, erp-1, inx-8, inx-10, spp-10) were further assessed in

  9. Modeling and sensitivity analysis on the transport of aluminum oxide nanoparticles in saturated sand: effects of ionic strength, flow rate, and nanoparticle concentration.

    Science.gov (United States)

    Rahman, Tanzina; Millwater, Harry; Shipley, Heather J

    2014-11-15

    Aluminum oxide nanoparticles have been widely used in various consumer products and there are growing concerns regarding their exposure in the environment. This study deals with the modeling, sensitivity analysis and uncertainty quantification of one-dimensional transport of nano-sized (~82 nm) aluminum oxide particles in saturated sand. The transport of aluminum oxide nanoparticles was modeled using a two-kinetic-site model with a blocking function. The modeling was done at different ionic strengths, flow rates, and nanoparticle concentrations. The two sites representing fast and slow attachments along with a blocking term yielded good agreement with the experimental results from the column studies of aluminum oxide nanoparticles. The same model was used to simulate breakthrough curves under different conditions using experimental data and calculated 95% confidence bounds of the generated breakthroughs. The sensitivity analysis results showed that slow attachment was the most sensitive parameter for high influent concentrations (e.g. 150 mg/L Al2O3) and the maximum solid phase retention capacity (related to blocking function) was the most sensitive parameter for low concentrations (e.g. 50 mg/L Al2O3). Copyright © 2014 Elsevier B.V. All rights reserved.
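
    A common form of a two-kinetic-site attachment model with Langmuirian blocking, written here for illustration only (the exact parameterization used in the study may differ):

```latex
\frac{\partial(\theta c)}{\partial t} + \rho\,\frac{\partial(s_1 + s_2)}{\partial t}
  = \frac{\partial}{\partial x}\!\left(\theta D \frac{\partial c}{\partial x}\right)
    - \frac{\partial(q c)}{\partial x},
\qquad
\rho\,\frac{\partial s_i}{\partial t} = \theta\,k_{a,i}\,\psi_i\,c - \rho\,k_{d,i}\,s_i,
\qquad
\psi_i = 1 - \frac{s_i}{s_{\max,i}},
```

    where c is the aqueous nanoparticle concentration, s_1 and s_2 are the concentrations retained on the fast and slow attachment sites, θ the water content, ρ the bulk density, q the Darcy flux, D the dispersion coefficient, k_a,i and k_d,i the attachment and detachment rate coefficients, and ψ_i the blocking function that shuts off attachment as the maximum solid-phase retention capacity s_max,i is approached.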

  10. An analysis of sodium, total fat and saturated fat contents of packaged food products advertised in Bronx-based supermarket circulars.

    Science.gov (United States)

    Samuel, L; Basch, C H; Ethan, D; Hammond, R; Chiazzese, K

    2014-08-01

    Americans' consumption of sodium, fat, and saturated fat exceed federally recommended limits for these nutrients and has been identified as a preventable leading cause of hypertension and cardiovascular disease. More than 40% of the Bronx population comprises African-Americans, who have increased risk and earlier onset of hypertension and are also genetically predisposed to salt-sensitive hypertension. This study analyzed nutrition information for packaged foods advertised in Bronx-based supermarket circulars. Federally recommended limits for sodium, saturated fat and total fat contents were used to identify foods that were high in these nutrients. The proportion of these products with respect to the total number of packaged foods was calculated. More than a third (35%) and almost a quarter (24%) of the 898 advertised packaged foods were high in saturated fat and sodium respectively. Such foods predominantly included processed meat and fish products, fast foods, meals, entrees and side dishes. Dairy and egg products were the greatest contributors of high saturated fat. Pork and beef products, fast foods, meals, entrees and side dishes had the highest median values for sodium, total fat and saturated fat content. The high proportion of packaged foods that are high in sodium and/or saturated fat promoted through supermarket circulars highlights the need for nutrition education among consumers as well as collaborative public health measures by the food industry, community and government agencies to reduce the amounts of sodium and saturated fat in these products and limit the promotion of foods that are high in these nutrients.

  11. A simple, high throughput method to locate single copy sequences from Bacterial Artificial Chromosome (BAC libraries using High Resolution Melt analysis

    Directory of Open Access Journals (Sweden)

    Caligari Peter DS

    2010-05-01

    Full Text Available Abstract Background The high-throughput anchoring of genetic markers into contigs is required for many ongoing physical mapping projects. Multidimensional BAC pooling strategies for PCR-based screening of large insert libraries are a widely used alternative to high density filter hybridisation of bacterial colonies. To date, concerns over reliability have led most if not all groups engaged in high throughput physical mapping projects to favour BAC DNA isolation prior to amplification by conventional PCR. Results Here, we report the first combined use of Multiplex Tandem PCR (MT-PCR) and High Resolution Melt (HRM) analysis on bacterial stocks of BAC library superpools as a means of rapidly anchoring markers to BAC colonies and thereby to integrate genetic and physical maps. We exemplify the approach using a BAC library of the model plant Arabidopsis thaliana. Super pools of twenty-five 384-well plates and two-dimensional matrix pools of the BAC library were prepared for marker screening. The entire procedure only requires around 3 h to anchor one marker. Conclusions A pre-amplification step during MT-PCR allows high multiplexing and increases the sensitivity and reliability of subsequent HRM discrimination. This simple gel-free protocol is more reliable, faster and far less costly than conventional PCR screening. The option to screen in parallel 3 genetic markers in one MT-PCR-HRM reaction using templates from directly pooled bacterial stocks of BAC-containing bacteria further reduces time for anchoring markers in physical maps of species with large genomes.

  12. [Biological ingredient analysis of traditional Chinese medicines utilizing metagenomic approach based on high-throughput-sequencing and big-data-mining].

    Science.gov (United States)

    Bai, Hong; Ning, Kang; Wang, Chang-yun

    2015-03-01

    The quality of traditional Chinese medicines (TCMs) has been mainly evaluated based on chemical ingredients, yet recently more attention has been paid to biological ingredients, especially for pill-based preparations. Establishing a fast, accurate and systematic method of biological ingredient analysis is a key step toward the modernization, industrialization and internationalization of TCMs. The biological ingredient analysis of TCM preparations could be abstracted as the identification of multiple species from a biological mixture. The metagenomic approach based on high-throughput-sequencing (HTS) and big-data-mining has been considered as one of the most effective methods for multiple species analysis of a biological mixture, which would also be helpful for the analysis of biological ingredients in TCMs. Simultaneous identification of diverse species, including the prescribed species, adulterants, toxic species, protected species and even the biological impurities introduced through the production process, could be achieved by selecting appropriate DNA biomarkers, as well as applying large-scale sequence comparison and data mining. This approach is expected to offer an evaluation basis for the effectiveness, safety and legality of TCM preparations.

  13. Saturated and trans fats

    National Research Council Canada - National Science Library

    Shader, Richard I

    2014-01-01

    ... Original Pancake Mix plus ingredients suggested by the recipe: 2 g saturated fat (SF) and no trans fatty acids or trans fat (TFA); bacon, Oscar Mayer Lower Sodium Bacon: 2.5 g SF and no TFA; sausages, Jimmy Dean Original Pork Sausage Links: 8 g SF and no TFA; potatoes, Ore-Ida Mini Tater Tots: 2 g SF and no TFA; and nondairy creamer, Nestlé Coffee-...

  14. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic sequences or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high throughput analysis of protein expression levels, interactions, and functional states.

  15. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  16. Offline Solid-phase Extraction Large-volume Injection-Gas chromatography for the Analysis of Mineral Oil-saturated Hydrocarbons in Commercial Vegetable Oils.

    Science.gov (United States)

    Liu, Lingling; Huang, Hua; Wu, Yanwen; Li, Bingning; Ouyang, Jie

    2017-09-01

    An offline solid-phase extraction (SPE) approach combined with large-volume injection (LVI) gas chromatography with flame ionization detection (LVI-GC-FID) was improved for the routine analysis of mineral oil saturated hydrocarbons (MOSH) in vegetable oils. The key procedure of the method consists in using offline SPE columns for MOSH purification. The SPE column packed with 1% Ag-activated silica gel was used to separate MOSH from triglycerides and olefins in a variety of vegetable oils. The eluent of the MOSH fraction was only 3 mL and the concentration step was quick with little evaporation loss. The limit of quantification (LOQ) of the method was 2.5 mg/kg and the linearity ranged from 2 to 300 mg/kg. The accuracy was assessed by measuring the recoveries from spiked oil samples and was higher than 90%. Twenty-seven commercial vegetable oils were analyzed, and different levels of MOSH contamination were detected with the highest being 259.4 mg/kg. The results suggested that it is necessary to routinely detect mineral oil contamination in vegetable oils for food safety.

  17. Modelling the throughput capacity of a single-accelerator multitreatment room proton therapy centre

    Science.gov (United States)

    Aitkenhead, A H; Bugg, D; Rowbottom, C G; Smith, E; Mackay, R I

    2012-01-01

    Objective We describe a model for evaluating the throughput capacity of a single-accelerator multitreatment room proton therapy centre with the aims of (1) providing quantitative estimates of the throughput and waiting times and (2) providing insight into the sensitivity of the system to various physical parameters. Methods A Monte Carlo approach was used to compute various statistics about the modelled centre, including the throughput capacity, fraction times for different groups of patients and beam waiting times. A method of quantifying the saturation level is also demonstrated. Results Benchmarking against the MD Anderson Cancer Center showed good agreement between the modelled (140±4 fractions per day) and reported (133±35 fractions per day) throughputs. A sensitivity analysis of that system studied the impact of beam switch time, the number of treatment rooms, patient set-up times and the potential benefit of having a second accelerator. Finally, scenarios relevant to a potential UK facility were studied, finding that a centre with the same four-room, single-accelerator configuration as the MD Anderson Cancer Center but handling a more complex UK-type caseload would have a throughput reduced by approximately 19%, but still be capable of treating in excess of 100 fractions per 16-h treatment day. Conclusions The model provides a useful tool to aid in understanding the operating dynamics of a proton therapy facility, and for investigating potential scenarios for prospective centres. Advances in knowledge The model helps to identify which technical specifications should be targeted for future improvements. PMID:23175492
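
    A minimal sketch of the Monte Carlo idea is given below, assuming a single accelerator shared by several treatment rooms, a fixed beam switch time, and normally distributed set-up and beam-on times. All timings and the patient mix are hypothetical placeholders, not the parameters of the model benchmarked above.

```python
import random

# Sketch of a Monte Carlo throughput model for a single-accelerator,
# multi-room proton centre. All timings and the patient mix are hypothetical
# placeholders, not the parameters of the benchmarked model above.

def simulate_day(n_rooms=4, day_minutes=16 * 60, beam_switch=1.5, n_trials=1000):
    throughputs = []
    for _ in range(n_trials):
        beam_free_at = 0.0                    # single accelerator serves all rooms
        room_free_at = [0.0] * n_rooms        # when each room can take a new patient
        fractions = 0
        while True:
            room = min(range(n_rooms), key=lambda r: room_free_at[r])
            start = room_free_at[room]
            if start >= day_minutes:
                break
            setup = max(random.gauss(14.0, 4.0), 2.0)   # in-room patient set-up (min)
            beam = max(random.gauss(8.0, 2.0), 1.0)     # beam-on time (min)
            ready = start + setup
            beam_start = max(ready, beam_free_at + beam_switch)  # wait for the beam
            beam_free_at = beam_start + beam
            room_free_at[room] = beam_free_at + 3.0     # patient leaves the room
            fractions += 1
        throughputs.append(fractions)
    return sum(throughputs) / n_trials

print("mean fractions per treatment day:", round(simulate_day(), 1))
```

    Repeating the run while varying beam_switch or n_rooms gives the kind of sensitivity analysis described in the record, and the distribution over trials quantifies the day-to-day spread in throughput.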

  18. Development of Transmission Raman Spectroscopy towards the in line, high throughput and non-destructive quantitative analysis of pharmaceutical solid oral dose.

    Science.gov (United States)

    Griffen, Julia A; Owen, Andrew W; Matousek, Pavel

    2015-01-07

    Transmission Raman spectroscopy (TRS) is an analytical technique recently introduced to pharmaceutical analysis permitting volumetric sampling by non-destructive means. Here we demonstrate experimentally, for the first time, the enhanced speed of quantification of pharmaceutical tablets by an order of magnitude compared with conventional TRS. This is achieved using an enhancing element, a "photon diode", which avoids the loss of laser photons at the laser coupling interface. The proof-of-concept experiments were performed on a complex mixture consisting of 5 components (3 APIs and 2 excipients) with nominal concentrations ranging between 0.4 and 89%. Acquisition times as short as 0.01 s were reached with satisfactory quantification accuracy for all the sample components. Results suggest that even faster sampling speeds would be achievable for components with stronger Raman scattering cross sections or with higher laser powers. This major improvement in speed of volumetric analysis enables high throughput deployment of TRS for in line quality control applications within the batch or continuous manufacturing process and facilitates non-destructive analysis of large fractions.

  19. High-throughput sequencing and pathway analysis reveal alteration of the pituitary transcriptome by 17α-ethynylestradiol (EE2) in female coho salmon, Oncorhynchus kisutch

    Energy Technology Data Exchange (ETDEWEB)

    Harding, Louisa B. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Schultz, Irvin R. [Battelle, Marine Sciences Laboratory – Pacific Northwest National Laboratory, 1529 West Sequim Bay Road, Sequim, WA 98382 (United States); Goetz, Giles W. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Luckenbach, J. Adam [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Young, Graham [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Goetz, Frederick W. [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, Manchester Research Station, P.O. Box 130, Manchester, WA 98353 (United States); Swanson, Penny, E-mail: penny.swanson@noaa.gov [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States)

    2013-10-15

    Highlights: •Studied impacts of ethynylestradiol (EE2) exposure on salmon pituitary transcriptome. •High-throughput sequencing, RNAseq, and pathway analysis were performed. •EE2 altered mRNAs for genes in circadian rhythm, GnRH, and TGFβ signaling pathways. •LH and FSH beta subunit mRNAs were most highly up- and down-regulated by EE2, respectively. •Estrogens may alter processes associated with reproductive timing in salmon. -- Abstract: Considerable research has been done on the effects of endocrine disrupting chemicals (EDCs) on reproduction and gene expression in the brain, liver and gonads of teleost fish, but information on impacts to the pituitary gland is still limited despite its central role in regulating reproduction. The aim of this study was to further our understanding of the potential effects of natural and synthetic estrogens on the brain–pituitary–gonad axis in fish by determining the effects of 17α-ethynylestradiol (EE2) on the pituitary transcriptome. We exposed sub-adult coho salmon (Oncorhynchus kisutch) to 0 or 12 ng EE2/L for up to 6 weeks and effects on the pituitary transcriptome of females were assessed using high-throughput Illumina® sequencing, RNA-Seq and pathway analysis. After 1 or 6 weeks, 218 and 670 contiguous sequences (contigs), respectively, were differentially expressed in pituitaries of EE2-exposed fish relative to control. Two of the most highly up- and down-regulated contigs were luteinizing hormone β subunit (241-fold and 395-fold at 1 and 6 weeks, respectively) and follicle-stimulating hormone β subunit (−3.4-fold at 6 weeks). Additional contigs related to gonadotropin synthesis and release were differentially expressed in EE2-exposed fish relative to controls. These included contigs involved in gonadotropin releasing hormone (GNRH) and transforming growth factor-β signaling. There was an over-representation of significantly affected contigs in 33 and 18 canonical pathways at 1 and 6 weeks

  20. Saturation in nuclei

    CERN Document Server

    Lappi, T

    2010-01-01

    This talk discusses some recent studies of gluon saturation in nuclei. We stress the connection between the initial condition in heavy ion collisions and observables in deep inelastic scattering (DIS). The dominant degree of freedom in the small x nuclear wavefunction is a nonperturbatively strong classical gluon field, which determines the initial condition for the glasma fields in the initial stages of a heavy ion collision. A correlator of Wilson lines from the same classical fields, known as the dipole cross section, can be used to compute many inclusive and exclusive observables in DIS.

  1. Ultra-Sensitive, High-Resolution Liquid Chromatography Methods for the High-Throughput Quantitative Analysis of Bacterial Cell Wall Chemistry and Structure.

    Science.gov (United States)

    Alvarez, Laura; Hernandez, Sara B; de Pedro, Miguel A; Cava, Felipe

    2016-01-01

    High-performance liquid chromatography (HPLC) analysis has been critical for determining the structural and chemical complexity of the cell wall. However, this method is very time-consuming in terms of sample preparation and chromatographic separation. Here we describe (1) optimized methods for peptidoglycan isolation from both Gram-negative and Gram-positive bacteria that dramatically reduce the sample preparation time, and (2) the application of the fast and highly efficient ultra-performance liquid chromatography (UPLC) technology to muropeptide separation and quantification. The advances in both analytical instrumentation and stationary-phase chemistry have allowed for evolved protocols which cut run time from hours (2-3 h) to minutes (10-20 min), and sample demands by at least one order of magnitude. Furthermore, development of methods based on organic solvents permits in-line mass spectrometry (MS) of the UPLC-resolved muropeptides. Application of these technologies to high-throughput analysis will expedite a better understanding of cell wall biology.

  2. Association analysis of toluene exposure time with high-throughput mRNA expressions and methylation patterns using in vivo samples.

    Science.gov (United States)

    Hong, Ji Young; Yu, So Yeon; Kim, Seol Young; Ahn, Jeong Jin; Kim, Youngjoo; Kim, Gi Won; Son, Sang Wook; Park, Jong-Tae; Hwang, Seung Yong

    2016-04-01

    The emission of volatile organic compounds (VOCs) resulting from outdoor air pollution can contribute to major public health problems. However, there has been limited research on the health effects in humans from the inhalation of VOCs. Therefore, this study conducted an in vivo analysis of the effects of toluene, one of the most commonly used chemicals in many industries, on gene expression and methylation over time using the high-throughput technique of microarray analysis. We separated participants into three groups (control, short-term exposure, and long-term exposure) to investigate the influence of toluene exposure time on gene expression. We then comprehensively analyzed and investigated the correlation between variations in gene expression and the occurrence of methylation. Twenty-six genes were upregulated and hypomethylated, while 32 genes were downregulated and hypermethylated. The pathways of these genes were confirmed to be associated with cell survival and the immune system. Based on our findings, these genes can help predict the effects of time-dependent exposure to toluene on human health. Thus, observations from our data may have implications for the identification of biomarkers of toluene exposure. Copyright © 2015. Published by Elsevier Inc.

  3. A case study for cloud based high throughput analysis of NGS data using the globus genomics system.

    Science.gov (United States)

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha

    2015-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the "Globus Genomics" system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.

  4. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    Directory of Open Access Journals (Sweden)

    Krithika Bhuvaneshwar

    2015-01-01

    Full Text Available Next generation sequencing (NGS technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.

  5. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    Science.gov (United States)

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha

    2014-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205

  6. An efficient sample preparation method for high-throughput analysis of 15(S)-8-iso-PGF2α in plasma and urine by enzyme immunoassay.

    Science.gov (United States)

    Bielecki, A; Saravanabhavan, G; Blais, E; Vincent, R; Kumarathasan, P

    2012-01-01

    Although several methods have been reported on the analysis of the oxidative stress marker 15(S)-8-iso-prostaglandin-F2alpha (8-iso-PGF2α) in biological fluids, they either involve extensive sample preparation and costly technology or require high sample volume. This study presents a sample preparation method that utilizes low sample volume for 8-iso-PGF2α analysis in plasma and urine by an enzyme immunoassay (EIA). In brief, 8-iso-PGF2α in deproteinized plasma or native urine sample is complexed with an antibody and then captured by molecular weight cut-off filtration. This method was compared with two other sample preparation methods that are typically used in the analysis of 8-iso-PGF2α by EIA: Cayman's affinity column purification method and solid-phase extraction on C-18. The immunoaffinity purification method described here was superior to the other two sample preparation methods and yielded recovery values of 99.8 and 54.1% for 8-iso-PGF2α in plasma and urine, respectively. Analytical precision (relative standard deviation) was ±5% for plasma and ±15% for urine. The analysis of healthy human plasma and urine resulted in basal 8-iso-PGF2α levels of 31.8 ± 5.5 pg/mL and 2.9 ± 2.0 ng/mg creatinine, respectively. The robustness and analytical performance of this method makes it a promising tool for high-throughput screening of biological samples for 8-iso-PGF2α.
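
    For reference, spike recovery and precision (relative standard deviation) are typically computed as in the short sketch below; the replicate values are hypothetical and only illustrate the arithmetic behind figures such as the 99.8% recovery and ±5% RSD reported above.

```python
import numpy as np

# Arithmetic behind spike recovery and precision (RSD); the replicate values
# are hypothetical, not the study's data.

spiked_known = 100.0                                  # pg/mL of 8-iso-PGF2a spiked
measured = np.array([98.5, 102.1, 97.3, 101.4])       # hypothetical replicate results

recovery = measured.mean() / spiked_known * 100.0     # percent recovery
rsd = measured.std(ddof=1) / measured.mean() * 100.0  # relative standard deviation

print(f"recovery = {recovery:.1f}%, RSD = {rsd:.1f}%")
```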

  7. The simple fool's guide to population genomics via RNA-Seq: An introduction to high-throughput sequencing data analysis

    DEFF Research Database (Denmark)

    De Wit, P.; Pespeni, M.H.; Ladner, J.T.

    2012-01-01

    ://sfg.stanford.edu, that includes detailed protocols for data processing and analysis, along with a repository of custom-made scripts and sample files. Steps included in the SFG range from tissue collection to de novo assembly, blast annotation, alignment, gene expression, functional enrichment, SNP detection, principal components...

  8. Systematic Analysis of the Association between Gut Flora and Obesity through High-Throughput Sequencing and Bioinformatics Approaches

    Directory of Open Access Journals (Sweden)

    Chih-Min Chiu

    2014-01-01

    Full Text Available Eighty-one stool samples from Taiwanese were collected for analysis of the association between the gut flora and obesity. The supervised analysis showed that the most abundant genera of bacteria in normal samples (from people with a body mass index (BMI) ≤ 24) were Bacteroides (27.7%), Prevotella (19.4%), Escherichia (12%), Phascolarctobacterium (3.9%), and Eubacterium (3.5%). The most abundant genera of bacteria in case samples (with a BMI ≥ 27) were Bacteroides (29%), Prevotella (21%), Escherichia (7.4%), Megamonas (5.1%), and Phascolarctobacterium (3.8%). A principal coordinate analysis (PCoA) demonstrated that normal samples were clustered more compactly than case samples. An unsupervised analysis demonstrated that bacterial communities in the gut were clustered into two main groups: N-like and OB-like groups. Remarkably, most normal samples (78%) were clustered in the N-like group, and most case samples (81%) were clustered in the OB-like group (Fisher's P value = 1.61E-07). The results showed that bacterial communities in the gut were highly associated with obesity. This is the first study in Taiwan to investigate the association between human gut flora and obesity, and the results provide new insights into the correlation of bacteria with the rising trend in obesity.

  9. Preliminary Analysis of High-Throughput Expression Data and Small RNA in Soybean Stem Tissue Infected with Sclerotinia sclerotiorum

    Science.gov (United States)

    We recently published a report on transcriptome changes in soybean stem tissue challenged with Sclerotinia sclerotiorum based on cDNA microarray analysis. We are now advancing this study by examining the differential expression of small RNA (miRNAs and siRNAs) and gene transcripts using the Illumin...

  10. Systematic analysis of the association between gut flora and obesity through high-throughput sequencing and bioinformatics approaches.

    Science.gov (United States)

    Chiu, Chih-Min; Huang, Wei-Chih; Weng, Shun-Long; Tseng, Han-Chi; Liang, Chao; Wang, Wei-Chi; Yang, Ting; Yang, Tzu-Ling; Weng, Chen-Tsung; Chang, Tzu-Hao; Huang, Hsien-Da

    2014-01-01

    Eighty-one stool samples from Taiwanese were collected for analysis of the association between the gut flora and obesity. The supervised analysis showed that the most abundant genera of bacteria in normal samples (from people with a body mass index (BMI) ≤ 24) were Bacteroides (27.7%), Prevotella (19.4%), Escherichia (12%), Phascolarctobacterium (3.9%), and Eubacterium (3.5%). The most abundant genera of bacteria in case samples (with a BMI ≥ 27) were Bacteroides (29%), Prevotella (21%), Escherichia (7.4%), Megamonas (5.1%), and Phascolarctobacterium (3.8%). A principal coordinate analysis (PCoA) demonstrated that normal samples were clustered more compactly than case samples. An unsupervised analysis demonstrated that bacterial communities in the gut were clustered into two main groups: N-like and OB-like groups. Remarkably, most normal samples (78%) were clustered in the N-like group, and most case samples (81%) were clustered in the OB-like group (Fisher's P value = 1.61E-07). The results showed that bacterial communities in the gut were highly associated with obesity. This is the first study in Taiwan to investigate the association between human gut flora and obesity, and the results provide new insights into the correlation of bacteria with the rising trend in obesity.
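
    A sketch of the kind of 2 × 2 test reported above (cluster membership versus BMI group) is shown below. The counts are hypothetical, chosen only to be roughly consistent with the quoted 78%/81% figures, since the abstract does not give the group sizes.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table of cluster membership versus BMI group, chosen only
# to be roughly consistent with the 78%/81% figures quoted above; the actual
# group sizes are not given in the abstract.

#                 N-like  OB-like
table = [[32,       9],    # normal samples (BMI <= 24)
         [8,       32]]    # case samples  (BMI >= 27)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, Fisher's exact P = {p_value:.2e}")
```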

  11. Metabolomic and Lipidomic Analysis of Serum Samples following Curcuma longa Extract Supplementation in High-Fructose and Saturated Fat Fed Rats.

    Science.gov (United States)

    Tranchida, Fabrice; Shintu, Laetitia; Rakotoniaina, Zo; Tchiakpe, Léopold; Deyris, Valérie; Hiol, Abel; Caldarelli, Stefano

    2015-01-01

    We explored, using nuclear magnetic resonance (NMR) metabolomics and fatty acids profiling, the effects of a common nutritional complement, Curcuma longa, at a nutritionally relevant dose with human use, administered in conjunction with an unbalanced diet. Indeed, traditional food supplements have been long used to counter metabolic impairments induced by unbalanced diets. Here, rats were fed either a standard diet, a high level of fructose and saturated fatty acid (HFS) diet, a diet common to western countries and that certainly contributes to the epidemic of insulin resistance (IR) syndrome, or a HFS diet with a Curcuma longa extract (1% of curcuminoids in the extract) for ten weeks. Orthogonal projections to latent structures discriminant analysis (OPLS-DA) on the serum NMR profiles and fatty acid composition (determined by GC/MS) showed a clear discrimination between HFS groups and controls. This discrimination involved metabolites such as glucose, amino acids, pyruvate, creatine, phosphocholine/glycerophosphocholine, ketone bodies and glycoproteins as well as an increase of monounsaturated fatty acids (MUFAs) and a decrease of n-6 and n-3 polyunsaturated fatty acids (PUFAs). Although the administration of Curcuma longa did not prevent the observed increase of glucose, triglycerides, cholesterol and insulin levels, discriminating metabolites were observed between groups fed HFS alone or with addition of a Curcuma longa extract, namely some MUFA and n-3 PUFA, glycoproteins, glutamine, and methanol, suggesting that curcuminoids may act respectively on the fatty acid metabolism, the hexosamine biosynthesis pathway and alcohol oxidation. Curcuma longa extract supplementation appears to be beneficial in these metabolic pathways in rats. This metabolomic approach highlights important serum metabolites that could help in understanding further the metabolic mechanisms leading to IR.

  12. iMir: an integrated pipeline for high-throughput analysis of small non-coding RNA data obtained by smallRNA-Seq.

    Science.gov (United States)

    Giurato, Giorgio; De Filippo, Maria Rosaria; Rinaldi, Antonio; Hashim, Adnan; Nassa, Giovanni; Ravo, Maria; Rizzo, Francesca; Tarallo, Roberta; Weisz, Alessandro

    2013-12-13

    RNAs. In addition, iMir also allowed the identification of ~70 piRNAs (piwi-interacting RNAs), some of which were differentially expressed in proliferating vs growth-arrested cells. The integrated data analysis pipeline described here is based on a reliable, flexible and fully automated workflow, useful to rapidly and efficiently analyze high-throughput smallRNA-Seq data, such as those produced by the most recent high-performance next generation sequencers. iMir is available at http://www.labmedmolge.unisa.it/inglese/research/imir.

  13. Identification and characterization of cold-responsive microRNAs in tea plant (Camellia sinensis) and their targets using high-throughput sequencing and degradome analysis.

    Science.gov (United States)

    Zhang, Yue; Zhu, Xujun; Chen, Xuan; Song, Changnian; Zou, Zhongwei; Wang, Yuhua; Wang, Mingle; Fang, Wanping; Li, Xinghui

    2014-10-21

    MicroRNAs (miRNAs) are approximately 19 ~ 21 nucleotide noncoding RNAs produced by Dicer-catalyzed excision from stem-loop precursors. Many plant miRNAs have critical functions in development, nutrient homeostasis, abiotic stress responses, and pathogen responses via interaction with specific target mRNAs. Camellia sinensis is one of the most important commercial beverage crops in the world. However, miRNAs associated with cold stress tolerance in C. sinensis remains unexplored. The use of high-throughput sequencing can provide a much deeper understanding of miRNAs. To obtain more insight into the function of miRNAs in cold stress tolerance, Illumina sequencing of C. sinensis sRNA was conducted. Solexa sequencing technology was used for high-throughput sequencing of the small RNA library from the cold treatment of tea leaves. To align the sequencing data with known plant miRNAs, we characterized 106 conserved C. sinensis miRNAs. In addition, 215 potential candidate miRNAs were found, among which 98 candidates with star sequences were chosen as novel miRNAs. Both congruously and differentially regulated miRNAs were obtained, and cultivar-specific miRNAs were identified by microarray-based hybridization in response to cold stress. The results were also confirmed by quantitative real-time polymerase chain reaction. To confirm the targets of miRNAs, two degradome libraries from two treatments were constructed. According to degradome sequencing, 455 and 591 genes were identified as cleavage targets of miRNAs from cold treatments and control libraries, respectively, and 283 targets were present in both libraries. Functional analysis of these miRNA targets indicated their involvement in important activities, such as development, regulation of transcription, and stress response. We discovered 31 up-regulated miRNAs and 43 down-regulated miRNAs in 'Yingshuang', and 46 up-regulated miRNA and 45 down-regulated miRNAs in 'Baiye 1' in response to cold stress, respectively. A

  14. Identification and characterization of microRNAs related to salt stress in broccoli, using high-throughput sequencing and bioinformatics analysis.

    Science.gov (United States)

    Tian, Yunhong; Tian, Yunming; Luo, Xiaojun; Zhou, Tao; Huang, Zuoping; Liu, Ying; Qiu, Yihan; Hou, Bing; Sun, Dan; Deng, Hongyu; Qian, Shen; Yao, Kaitai

    2014-09-03

    MicroRNAs (miRNAs) are a new class of endogenous regulators of a broad range of physiological processes, which act by regulating gene expression post-transcriptionally. The brassica vegetable, broccoli (Brassica oleracea var. italica), is very popular with a wide range of consumers, but environmental stresses such as salinity are a problem worldwide in restricting its growth and yield. Little is known about the role of miRNAs in the response of broccoli to salt stress. In this study, broccoli subjected to salt stress and broccoli grown under control conditions were analyzed by high-throughput sequencing. Differential miRNA expression was confirmed by real-time reverse transcription polymerase chain reaction (RT-PCR). The prediction of miRNA targets was undertaken using the Kyoto Encyclopedia of Genes and Genomes (KEGG) Orthology (KO) database and Gene Ontology (GO)-enrichment analyses. Two libraries of small (or short) RNAs (sRNAs) were constructed and sequenced by high-throughput Solexa sequencing. A total of 24,511,963 and 21,034,728 clean reads, representing 9,861,236 (40.23%) and 8,574,665 (40.76%) unique reads, were obtained for control and salt-stressed broccoli, respectively. Furthermore, 42 putative known and 39 putative candidate miRNAs that were differentially expressed between control and salt-stressed broccoli were revealed by their read counts and confirmed by the use of stem-loop real-time RT-PCR. Amongst these, the putative conserved miRNAs, miR393 and miR855, and two putative candidate miRNAs, miR3 and miR34, were the most strongly down-regulated when broccoli was salt-stressed, whereas the putative conserved miRNA, miR396a, and the putative candidate miRNA, miR37, were the most up-regulated. Finally, analysis of the predicted gene targets of miRNAs using the GO and KO databases indicated that a range of metabolic and other cellular functions known to be associated with salt stress were up-regulated in broccoli treated with salt. A comprehensive

  15. GLINT: a user-friendly toolset for the analysis of high-throughput DNA-methylation array data.

    Science.gov (United States)

    Rahmani, Elior; Yedidim, Reut; Shenhav, Liat; Schweiger, Regev; Weissbrod, Omer; Zaitlen, Noah; Halperin, Eran

    2017-06-15

    GLINT is a user-friendly command-line toolset for fast analysis of genome-wide DNA methylation data generated using the Illumina human methylation arrays. GLINT, which does not require any programming proficiency, allows an easy execution of an Epigenome-Wide Association Study (EWAS) analysis pipeline under different models while accounting for known confounders in methylation data. GLINT is command-line software, freely available at https://github.com/cozygene/glint/releases. It requires Python 2.7 and several freely available Python packages. Further information and documentation as well as a quick start tutorial are available at http://glint-epigenetics.readthedocs.io. elior.rahmani@gmail.com or ehalperin@cs.ucla.edu.
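
    GLINT itself is run from the command line; as a generic illustration of the underlying per-CpG association model with adjustment for known confounders (not GLINT's implementation), a minimal scan over simulated data might look like this:

```python
import numpy as np
from scipy.stats import t as t_dist

# Generic per-CpG association scan with adjustment for known confounders,
# run on simulated data. This only illustrates the kind of model a toolset
# like GLINT automates; it is not GLINT's implementation.

rng = np.random.default_rng(0)
n_samples, n_cpgs = 200, 1000
methylation = rng.uniform(0.0, 1.0, size=(n_samples, n_cpgs))  # beta values
phenotype = rng.normal(size=n_samples)                         # trait of interest
covariates = rng.normal(size=(n_samples, 3))                   # e.g. age, sex, cell proportions

X = np.column_stack([np.ones(n_samples), phenotype, covariates])  # design matrix
dof = n_samples - X.shape[1]
xtx_inv = np.linalg.inv(X.T @ X)

p_values = np.empty(n_cpgs)
for j in range(n_cpgs):
    y = methylation[:, j]
    beta = xtx_inv @ X.T @ y                 # ordinary least squares fit
    resid = y - X @ beta
    sigma2 = resid @ resid / dof
    se = np.sqrt(sigma2 * xtx_inv[1, 1])     # standard error of the phenotype term
    p_values[j] = 2 * t_dist.sf(abs(beta[1] / se), dof)

print("smallest p-value across CpGs:", p_values.min())
```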

  16. High-throughput investigation of single and binary protein adsorption isotherms in anion exchange chromatography employing multivariate analysis.

    Science.gov (United States)

    Field, Nicholas; Konstantinidis, Spyridon; Velayudhan, Ajoy

    2017-08-11

    The combination of multi-well plates and automated liquid handling is well suited to the rapid measurement of the adsorption isotherms of proteins. Here, single and binary adsorption isotherms are reported for BSA, ovalbumin and conalbumin on a strong anion exchanger over a range of pH and salt levels. The impact of the main experimental factors at play on the accuracy and precision of the adsorbed protein concentrations is quantified theoretically and experimentally. In addition to the standard measurement of liquid concentrations before and after adsorption, the amounts eluted from the wells are measured directly. This additional measurement corroborates the calculation based on liquid concentration data, and improves precision especially under conditions of weak or moderate interaction strength. The traditional measurement of multicomponent isotherms is limited by the speed of HPLC analysis; this analytical bottleneck is alleviated by careful multivariate analysis of UV spectra. Copyright © 2017. Published by Elsevier B.V.
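
    A minimal sketch of the mass balance commonly used to obtain adsorbed protein concentrations from liquid-phase measurements before and after adsorption, followed by a Langmuir fit, is given below; all volumes and concentrations are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Mass balance used to compute adsorbed protein concentrations from liquid
# concentrations measured before and after adsorption, followed by a Langmuir
# fit. All volumes and concentrations are hypothetical.

c0 = np.array([0.5, 1.0, 2.0, 4.0, 8.0])     # initial liquid conc. (mg/mL)
c_eq = np.array([0.1, 0.3, 0.9, 2.6, 6.3])   # liquid conc. after adsorption (mg/mL)
v_liquid, v_resin = 0.3, 0.01                # liquid and resin volumes per well (mL)

q = (c0 - c_eq) * v_liquid / v_resin         # adsorbed conc. (mg protein / mL resin)

def langmuir(c, q_max, k):
    """Single-component Langmuir isotherm."""
    return q_max * c / (k + c)

(q_max, k), _ = curve_fit(langmuir, c_eq, q, p0=[50.0, 1.0])
print(f"q_max = {q_max:.1f} mg/mL resin, K = {k:.2f} mg/mL")
```

    The additional direct measurement of eluted amounts described in the record provides an independent estimate of q against which this mass-balance calculation can be cross-checked.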

  17. MSP-HTPrimer: a high-throughput primer design tool to improve assay design for DNA methylation analysis in epigenetics

    OpenAIRE

    Pandey, Ram Vinay; Pulverer, Walter; Kallmeyer, Rainer; Beikircher, Gabriel; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas

    2016-01-01

    Background Bisulfite (BS) conversion-based and methylation-sensitive restriction enzyme (MSRE)-based PCR methods have been the most commonly used techniques for locus-specific DNA methylation analysis. However, both methods have advantages and limitations. Thus, an integrated approach would be extremely useful to quantify the DNA methylation status successfully with great sensitivity and specificity. Designing specific and optimized primers for target regions is the most critical and challeng...

  18. A network-analysis-based comparative study of the throughput behavior of polymer melts in barrier screw geometries

    Science.gov (United States)

    Aigner, M.; Köpplmayr, T.; Kneidinger, C.; Miethlinger, J.

    2014-05-01

    Barrier screws are widely used in the plastics industry. Due to the extreme diversity of their geometries, describing the flow behavior is difficult and rarely done in practice. We present a systematic approach based on networks that uses tensor algebra and numerical methods to model and calculate selected barrier screw geometries in terms of pressure, mass flow, and residence time. In addition, we report the results of three-dimensional simulations using the commercially available ANSYS Polyflow software. The major drawbacks of three-dimensional finite-element-method (FEM) simulations are that they require vast computational power, large quantities of memory, and considerable time to create a computer-aided design (CAD) geometric model and complete a flow calculation. Consequently, a modified 2.5-dimensional finite volume method, termed network analysis, is preferable. The results obtained by network analysis and FEM simulations correlated well. Network analysis provides an efficient alternative to complex FEM software in terms of computing power and memory consumption. Furthermore, typical barrier screw geometries can be parameterized and used for flow calculations without time-consuming CAD constructions.
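
    The core of a network analysis of this kind is a linear(ized) system enforcing mass conservation at the nodes of a channel network. The toy below is Newtonian and isothermal with arbitrary conductances, so it is only a sketch of the idea rather than the non-Newtonian melt model described above.

```python
import numpy as np

# Toy flow network: channels between nodes are assigned conductances, mass
# conservation is enforced at every node, and nodal pressures follow from a
# linear solve. This Newtonian, isothermal toy only sketches the network
# idea; the method described above handles non-Newtonian melt behaviour.

nodes = 4                                    # 0 = inlet, 3 = outlet
edges = [(0, 1, 2.0), (0, 2, 1.0),           # (node_i, node_j, conductance)
         (1, 2, 0.5), (1, 3, 1.5), (2, 3, 2.5)]

A = np.zeros((nodes, nodes))                 # nodal conductance matrix
for i, j, g in edges:
    A[i, i] += g; A[j, j] += g
    A[i, j] -= g; A[j, i] -= g

known = {0: 2.0e7, 3: 0.0}                   # prescribed pressures (Pa)
free = [n for n in range(nodes) if n not in known]

A_ff = A[np.ix_(free, free)]
A_fk = A[np.ix_(free, list(known))]
p_k = np.array(list(known.values()))
p_f = np.linalg.solve(A_ff, -A_fk @ p_k)     # unknown (interior) pressures

pressure = np.zeros(nodes)
pressure[list(known)] = p_k
pressure[free] = p_f

throughput = sum(g * (pressure[i] - pressure[j]) for i, j, g in edges if i == 0)
print("nodal pressures (Pa):", pressure)
print("total throughput (arbitrary units):", throughput)
```

    In the actual method the conductance of each channel segment depends on the local geometry and the shear-rate-dependent viscosity, so the system is solved iteratively rather than in a single linear step.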

  19. Label-Free Analysis of Single Viruses with a Resolution Comparable to That of Electron Microscopy and the Throughput of Flow Cytometry.

    Science.gov (United States)

    Ma, Ling; Zhu, Shaobin; Tian, Ye; Zhang, Wenqiang; Wang, Shuo; Chen, Chaoxiang; Wu, Lina; Yan, Xiaomei

    2016-08-22

    Viruses are by far the most abundant biological entities on our planet, yet existing characterization methods are limited by either their speed or lack of resolution. By applying a laboratory-built high-sensitivity flow cytometer (HSFCM) to precisely quantify the extremely weak elastically scattered light from single viral particles, we herein report the label-free analysis of viruses with a resolution comparable to that of electron microscopy and the throughput of flow cytometry. The detection of single viruses with diameters down to 27 nm is described. T7 and lambda bacteriophages, which differ in size by as little as 4 nm, could be baseline-resolved. Moreover, subtle structural differences of the same viral particles can be discriminated. Using monodisperse silica nanoparticles as the size reference standards, the virus sizes measured by the HSFCM are in agreement with the equivalent particle diameters derived from their structural dimensions. The HSFCM opens a new avenue for virus characterization. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
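
    A sketch of how a size calibration with monodisperse silica standards can be built and inverted is shown below; the intensities are hypothetical, and in practice the refractive-index difference between silica and virions must also be accounted for (e.g. via Mie theory), which is not done here.

```python
import numpy as np

# Power-law size calibration built from monodisperse silica standards and
# inverted for unknown particles. Intensities are hypothetical, and the
# refractive-index difference between silica and virions (handled via Mie
# theory in practice) is ignored here.

std_diameter = np.array([47.0, 59.0, 74.0, 95.0, 123.0])            # nm
std_intensity = np.array([120.0, 470.0, 1800.0, 8200.0, 39000.0])   # a.u.

# log-log linear fit: log(I) = a * log(d) + b
a, b = np.polyfit(np.log(std_diameter), np.log(std_intensity), 1)

def diameter_from_intensity(intensity):
    """Invert the calibration for measured single-particle intensities."""
    return np.exp((np.log(intensity) - b) / a)

virus_intensity = np.array([85.0, 210.0])     # two hypothetical viral particles
print("estimated diameters (nm):",
      np.round(diameter_from_intensity(virus_intensity), 1))
```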

  20. Asynchronous progression through the lytic cascade and variations in intracellular viral loads revealed by high-throughput single-cell analysis of Kaposi's sarcoma-associated herpesvirus infection.

    Science.gov (United States)

    Adang, Laura A; Parsons, Christopher H; Kedes, Dean H

    2006-10-01

    Kaposi's sarcoma-associated herpesvirus (KSHV or human herpesvirus-8) is frequently tumorigenic in immunocompromised patients. The average intracellular viral copy number within infected cells, however, varies markedly by tumor type. Since the KSHV-encoded latency-associated nuclear antigen (LANA) tethers viral episomes to host heterochromatin and displays a punctate pattern by fluorescence microscopy, we investigated whether accurate quantification of individual LANA dots is predictive of intracellular viral genome load. Using a novel technology that integrates single-cell imaging with flow cytometry, we found that both the number and the summed immunofluorescence of individual LANA dots are directly proportional to the amount of intracellular viral DNA. Moreover, combining viral (immediate early lytic replication and transcription activator [RTA] and late lytic K8.1) and cellular (syndecan-1) staining with image-based flow cytometry, we were also able to rapidly and simultaneously distinguish among cells supporting latent, immediate early lytic, early lytic, late lytic, and a potential fourth "delayed late" category of lytic replication. Applying image-based flow cytometry to KSHV culture models, we found that de novo infection results in highly varied levels of intracellular viral load and that lytic induction of latently infected cells likewise leads to a heterogeneous population at various stages of reactivation. These findings additionally underscore the potential advantages of studying KSHV biology with high-throughput analysis of individual cells.

  1. Structure-based high-throughput epitope analysis of hexon proteins in B and C species human adenoviruses (HAdVs).

    Directory of Open Access Journals (Sweden)

    Xiao-Hui Yuan

    Full Text Available Human adenoviruses (HAdVs) are the etiologic agent of many human infectious diseases. The existence of at least 54 different serotypes of HAdVs has resulted in difficulties in clinical diagnosis. Acute respiratory tract disease (ARD) caused by some serotypes from B and C species is particularly serious. Hexon, the main coat protein of HAdV, contains the major serotype-specific B cell epitopes; however, few studies have addressed epitope mapping in most HAdV serotypes. In this study, we utilized a novel and rapid method for the modeling of homologous proteins based on the phylogenetic tree of protein families and built three-dimensional (3D) models of hexon proteins in B and C species HAdVs. Based on refined hexon structures, we used reverse evolutionary trace (RET) bioinformatics analysis combined with a specially designed hexon epitope screening algorithm to achieve high-throughput epitope mapping of all 13 hexon proteins in B and C species HAdVs. This study has demonstrated that all of the epitopes from the 13 hexon proteins are located in the proteins' tower regions; however, the exact number, location, and size of the epitopes differ among the HAdV serotypes.

  2. SATURATED ZONE IN-SITU TESTING

    Energy Technology Data Exchange (ETDEWEB)

    P.W. REIMUS

    2004-11-08

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain, Nevada. The test interpretations provide estimates of flow and transport parameters used in the development of parameter distributions for total system performance assessment (TSPA) calculations. These parameter distributions are documented in "Site-Scale Saturated Zone Flow Model" (BSC 2004 [DIRS 170037]), "Site-Scale Saturated Zone Transport" (BSC 2004 [DIRS 170036]), "Saturated Zone Colloid Transport" (BSC 2004 [DIRS 170006]), and "Saturated Zone Flow and Transport Model Abstraction" (BSC 2004 [DIRS 170042]). Specifically, this scientific analysis contributes the following to the assessment of the capability of the SZ to serve as part of a natural barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvial Testing Complex (ATC) located at the southwestern corner of the Nevada Test Site (NTS). The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass

  3. Comprehensive processing of high-throughput small RNA sequencing data including quality checking, normalization, and differential expression analysis using the UEA sRNA Workbench.

    Science.gov (United States)

    Beckers, Matthew; Mohorianu, Irina; Stocks, Matthew; Applegate, Christopher; Dalmay, Tamas; Moulton, Vincent

    2017-06-01

    Recently, high-throughput sequencing (HTS) has revealed compelling details about the small RNA (sRNA) population in eukaryotes. These 20 to 25 nt noncoding RNAs can influence gene expression by acting as guides for the sequence-specific regulatory mechanism known as RNA silencing. The increase in sequencing depth and number of samples per project enables a better understanding of the role sRNAs play by facilitating the study of expression patterns. However, the intricacy of the biological hypotheses coupled with a lack of appropriate tools often leads to inadequate mining of the available data and thus, an incomplete description of the biological mechanisms involved. To enable a comprehensive study of differential expression in sRNA data sets, we present a new interactive pipeline that guides researchers through the various stages of data preprocessing and analysis. This includes various tools, some of which we specifically developed for sRNA analysis, for quality checking and normalization of sRNA samples as well as tools for the detection of differentially expressed sRNAs and identification of the resulting expression patterns. The pipeline is available within the UEA sRNA Workbench, a user-friendly software package for the processing of sRNA data sets. We demonstrate the use of the pipeline on a H. sapiens data set; additional examples on a B. terrestris data set and on an A. thaliana data set are described in the Supplemental Information. A comparison with existing approaches is also included, which exemplifies some of the issues that need to be addressed for sRNA analysis and how the new pipeline may be used to do this. © 2017 Beckers et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
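
    To illustrate two of the preprocessing steps such a pipeline automates, the sketch below applies reads-per-million normalization and a simple offset fold-change filter to a toy count table; this is not the Workbench's exact algorithm.

```python
import numpy as np

# Reads-per-million normalization and a simple offset fold-change filter on a
# toy count table (rows: sRNAs; columns: 2 control and 2 treated samples).
# This illustrates the kind of steps such a pipeline automates; it is not the
# Workbench's exact algorithm.

counts = np.array([[150.0,  160.0,   10.0,   12.0],
                   [3000.0, 2800.0, 3100.0, 2900.0],
                   [5.0,      8.0,  400.0,  380.0]])

rpm = counts / counts.sum(axis=0) * 1e6        # normalize each sample to 1M reads

offset = 20.0                                  # damps ratios of low-count sRNAs
log2fc = np.log2((rpm[:, 2:].mean(axis=1) + offset) /
                 (rpm[:, :2].mean(axis=1) + offset))

differential = np.abs(log2fc) >= 1.0           # e.g. at least a two-fold change
print("log2 fold changes:", np.round(log2fc, 2))
print("called differential:", differential)
```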

  4. Integrated High Throughput Analysis Identifies GSK3 as a Crucial Determinant of p53-Mediated Apoptosis in Lung Cancer Cells.

    Science.gov (United States)

    Zhang, Yu; Zhu, Chenyang; Sun, Bangyao; Lv, Jiawei; Liu, Zhonghua; Liu, Shengwang; Li, Hai

    2017-01-01

    p53 dysfunction is frequently observed in lung cancer. Although restoring the tumour suppressor function of p53 is recently approved as a putative strategy for combating cancers, the lack of understanding of the molecular mechanism underlying p53-mediated lung cancer suppression has limited the application of p53-based therapies in lung cancer. Using RNA sequencing, we determined the transcriptional profile of human non-small cell lung carcinoma A549 cells after treatment with two p53-activating chemical compounds, nutlin and RITA, which could induce A549 cell cycle arrest and apoptosis, respectively. Bioinformatics analysis of genome-wide gene expression data showed that distinct transcription profiles were induced by nutlin and RITA and 66 pathways were differentially regulated by these two compounds. However, only two of these pathways, 'Adherens junction' and 'Axon guidance', were found to be synthetic lethal with p53 re-activation, as determined via integrated analysis of genome-wide gene expression profile and short hairpin RNA (shRNA) screening. Further functional protein association analysis of significantly regulated genes associated with these two synthetic lethal pathways indicated that GSK3 played a key role in p53-mediated A549 cell apoptosis, and then gene function study was performed, which revealed that GSK3 inhibition promoted p53-mediated A549 cell apoptosis in a p53 post-translational activity-dependent manner. Our findings provide us with new insights regarding the mechanism by which p53 mediates A549 apoptosis and may cast light on the development of more efficient p53-based strategies for treating lung cancer. © 201 The Author(s). Published by S. Karger AG, Basel.

  5. High-throughput de novo screening of receptor agonists with an automated single-cell analysis and isolation system.

    Science.gov (United States)

    Yoshimoto, Nobuo; Tatematsu, Kenji; Iijima, Masumi; Niimi, Tomoaki; Maturana, Andrés D; Fujii, Ikuo; Kondo, Akihiko; Tanizawa, Katsuyuki; Kuroda, Shun'ichi

    2014-02-28

    Reconstitution of signaling pathways involving single mammalian transmembrane receptors has not been accomplished in yeast cells. In this study, intact EGF receptor (EGFR) and a cell wall-anchored form of EGF were co-expressed on the yeast cell surface, which led to autophosphorylation of the EGFR in an EGF-dependent autocrine manner. After changing from EGF to a conformationally constrained peptide library, cells were fluorescently labeled with an anti-phospho-EGFR antibody. Each cell was subjected to an automated single-cell analysis and isolation system that analyzed the fluorescent intensity of each cell and automatically retrieved each cell with the highest fluorescence. In a ~3.2 × 10⁶ peptide library, we isolated six novel peptides with agonistic activity of the EGFR in human squamous carcinoma A431 cells. The combination of yeast cells expressing mammalian receptors, a cell wall-anchored peptide library, and an automated single-cell analysis and isolation system might facilitate a rational approach for de novo drug screening.

  6. Spheroid arrays for high-throughput single-cell analysis of spatial patterns and biomarker expression in 3D.

    Science.gov (United States)

    Ivanov, Delyan P; Grabowska, Anna M

    2017-01-30

    We describe and share a device, methodology and image analysis algorithms, which allow up to 66 spheroids to be arranged into a gel-based array directly from a culture plate for downstream processing and analysis. Compared to processing individual samples, the technique uses 11-fold less reagents, saves time and enables automated imaging. To illustrate the power of the technology, we showcase applications of the methodology for investigating 3D spheroid morphology and marker expression and for in vitro safety and efficacy screens. First, spheroid arrays of 11 cell-lines were rapidly assessed for differences in spheroid morphology. Second, highly-positive (SOX-2), moderately-positive (Ki-67) and weakly-positive (βIII-tubulin) protein targets were detected and quantified. Third, the arrays enabled screening of ten media compositions for inducing differentiation in human neurospheres. Last, the application of spheroid microarrays for spheroid-based drug screens was demonstrated by quantifying the dose-dependent drop in proliferation and increase in differentiation in etoposide-treated neurospheres.

  7. High-Throughput Analysis of Age-Dependent Protein Changes in Layer II/III of the Human Orbitofrontal Cortex

    Science.gov (United States)

    Kapadia, Fenika

    Studies on the orbitofrontal cortex (OFC) during normal aging have shown a decline in cognitive functions, a loss of spines/synapses in layer III and gene expression changes related to neural communication. Biological changes during the course of normal aging are summarized into 9 hallmarks based on aging in peripheral tissue. Whether these hallmarks apply to non-dividing brain tissue is not known. Therefore, we opted to perform large-scale proteomic profiling of the OFC layer II/III during normal aging from 15 young and 18 old male subjects. MaxQuant was utilized for label-free quantification and statistical analysis by the Random Intercept Model (RIM) identified 118 differentially expressed (DE) age-related proteins. Altered neural communication was the most represented hallmark of aging (54% of DE proteins), highlighting the importance of communication in the brain. Functional analysis showed enrichment in GABA/glutamate signaling and pro-inflammatory responses. The former may contribute to alterations in excitation/inhibition, leading to cognitive decline during aging.
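
    A minimal sketch of a random-intercept model of the kind referred to above (RIM) is shown below, fitted here to simulated peptide-level intensities with age group as a fixed effect and a random intercept per subject; the grouping structure and effect sizes in the actual study may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Random-intercept model on simulated peptide-level intensities: age group as
# a fixed effect, random intercept per subject. The data are simulated and
# the grouping structure in the actual study may differ.

rng = np.random.default_rng(1)
rows = []
for i in range(33):
    group = "young" if i < 15 else "old"
    subject_effect = rng.normal(0.0, 0.3)            # subject-level random intercept
    for peptide in range(4):                         # several peptides per subject
        intensity = (20.0 + (0.5 if group == "old" else 0.0)
                     + subject_effect + rng.normal(0.0, 0.2))
        rows.append({"subject": f"s{i}", "group": group, "log_intensity": intensity})
df = pd.DataFrame(rows)

model = smf.mixedlm("log_intensity ~ group", df, groups=df["subject"])
result = model.fit()
print(result.summary())
```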

  8. Implementation and Analysis of a Lean Six Sigma Program in Microsurgery to Improve Operative Throughput in Perforator Flap Breast Reconstruction.

    Science.gov (United States)

    Hultman, Charles Scott; Kim, Sendia; Lee, Clara N; Wu, Cindy; Dodge, Becky; Hultman, Chloe Elizabeth; Roach, S Tanner; Halvorson, Eric G

    2016-06-01

    Perforator flaps have become a preferred method of breast reconstruction but can consume considerable resources. We examined the impact of a Six Sigma program on microsurgical breast reconstruction at an academic medical center. Using methods developed by Motorola and General Electric, we applied critical pathway planning, workflow analysis, lean manufacturing, continuous quality improvement, and defect reduction to microsurgical breast reconstruction. Primary goals were to decrease preoperative-to-cut time and total operative time, through reduced variability and improved efficiency. Secondary goals were to reduce length of stay, complications, and reoperation. The project was divided into 3 phases: (1) Pre-Six Sigma (24 months), (2) Six Sigma (10 months), and (3) Post-Six Sigma (24 months). These periods (baseline, intervention, control) were compared by Student t test and χ² analysis. Over a 5-year period, 112 patients underwent 168 perforator flaps for breast reconstructions by experienced microsurgeons. Total operative time decreased from 714 to 607 minutes. Implementation of the Six Sigma program in microsurgical breast reconstruction was associated with better operational and financial outcomes. These incremental gains were maintained over the course of the study, suggesting that these benefits were due, in part, to process improvements. However, continued reductions in total operative time and length of stay, well after the intervention period, support the possibility that a "learning curve" phenomenon may have contributed to the improvement in these outcomes.
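
    For illustration, the phase comparisons described above (Student t test on operative times, chi-square on categorical outcomes) can be reproduced on hypothetical data as follows; none of the numbers below are the study's.

```python
import numpy as np
from scipy.stats import ttest_ind, chi2_contingency

# Phase comparisons on hypothetical data: a Student t test on operative times
# and a chi-square test on complication counts. None of these values are the
# study's data.

rng = np.random.default_rng(42)
pre_times = rng.normal(714, 80, size=40)     # total operative time, pre-Six Sigma (min)
post_times = rng.normal(607, 70, size=45)    # total operative time, post-Six Sigma (min)

t_stat, p_time = ttest_ind(pre_times, post_times)
print(f"operative time: t = {t_stat:.2f}, P = {p_time:.3g}")

#                 complication  no complication
complications = [[8,            32],             # pre-Six Sigma
                 [5,            40]]             # post-Six Sigma
chi2, p_comp, dof, expected = chi2_contingency(complications)
print(f"complications: chi2 = {chi2:.2f}, P = {p_comp:.3g}")
```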

  9. High-throughput analysis of promoter occupancy reveals new targets for Arx, a gene mutated in mental retardation and interneuronopathies.

    Directory of Open Access Journals (Sweden)

    Marie-Lise Quillé

    Full Text Available Genetic investigations of X-linked intellectual disabilities have implicated the ARX (Aristaless-related homeobox) gene in a wide spectrum of disorders extending from phenotypes characterised by severe neuronal migration defects such as lissencephaly, to mild or moderate forms of mental retardation without apparent brain abnormalities but with associated features of dystonia and epilepsy. Analysis of Arx spatio-temporal localisation profile in mouse revealed expression in telencephalic structures, mainly restricted to populations of GABAergic neurons at all stages of development. Furthermore, studies of the effects of ARX loss of function in humans and animal models revealed varying defects, suggesting multiple roles of this gene during brain development. However, to date, little is known about how ARX functions as a transcription factor and the nature of its targets. To better understand its role, we combined chromatin immunoprecipitation and mRNA expression with microarray analysis and identified a total of 1006 gene promoters bound by Arx in transfected neuroblastoma (N2a) cells and in mouse embryonic brain. Approximately 24% of Arx-bound genes were found to show expression changes following Arx overexpression or knock-down. Several of the Arx target genes we identified are known to be important for a variety of functions in brain development and some of them suggest new functions for Arx. Overall, these results identified multiple new candidate targets for Arx and should help to better understand the pathophysiological mechanisms of intellectual disability and epilepsy associated with ARX mutations.

  10. High-throughput gene expression analysis of intestinal intraepithelial lymphocytes after oral feeding of carvacrol, cinnamaldehyde, or Capsicum oleoresin.

    Science.gov (United States)

    Kim, D K; Lillehoj, H S; Lee, S H; Jang, S I; Bravo, D

    2010-01-01

    Among dietary phytonutrients, carvacrol, cinnamaldehyde, and Capsicum oleoresin are well known for their antiinflammatory and antibiotic effects in human and veterinary medicine. To further define the molecular and genetic mechanisms responsible for these properties, broiler chickens were fed a standard diet supplemented with either of the 3 phytochemicals and intestinal intraepithelial lymphocytes were examined for changes in gene expression by microarray analysis. When compared with chickens fed a nonsupplemented standard diet, carvacrol-fed chickens showed altered expression of 74 genes (26 upregulated, 48 downregulated) and cinnamaldehyde led to changes in the levels of mRNAs corresponding to 62 genes (31 upregulated, 31 downregulated). Most changes in gene expression were seen in the Capsicum-fed broilers with 98 upregulated and 156 downregulated genes compared with untreated controls. Results from the microarray analysis were confirmed by quantitative real-time PCR with a subset of selected genes. Among the genes that showed >2.0-fold altered mRNA levels, most were associated with metabolic pathways. In particular, with the genes altered by Capsicum oleoresin, the highest scored molecular network included genes associated with lipid metabolism, small molecule biochemistry, and cancer. In conclusion, this study provides a foundation to further investigate specific chicken genes that are expressed in response to a diet containing carvacrol, cinnamaldehyde, or Capsicum oleoresin.

  11. Accuracy of oxygen saturation and total hemoglobin estimates in the neonatal brain using the semi-infinite slab model for FD-NIRS data analysis.

    Science.gov (United States)

    Barker, Jeffrey W; Panigrahy, Ashok; Huppert, Theodore J

    2014-12-01

    Frequency domain near-infrared spectroscopy (FD-NIRS) is a non-invasive method for measuring optical absorption in the brain. Common data analysis procedures for FD-NIRS data assume the head is a semi-infinite, homogeneous medium. This assumption introduces bias in estimates of absorption (μa), scattering (μs′), tissue oxygen saturation (StO2), and total hemoglobin (HbT). Previous work has investigated the accuracy of recovered μa values under this assumption. The purpose of this study was to examine the accuracy of recovered StO2 and HbT values in FD-NIRS measurements of the neonatal brain. We used Monte Carlo methods to compute light propagation through a neonate head model in order to simulate FD-NIRS measurements at 690 nm and 830 nm. We recovered μa, μs′, StO2, and HbT using common analysis procedures that assume a semi-infinite, homogeneous medium and compared the recovered values to simulated values. Additionally, we characterized the effects of curvature via simulations on homogeneous spheres of varying radius. Lastly, we investigated the effects of varying amounts of extra-axial fluid. Curvature induced underestimation of μa, μs′, and HbT, but had minimal effects on StO2. For the morphologically normal neonate head model, the mean absolute percent errors (MAPE) of recovered μa values were 12% and 7% for 690 nm and 830 nm, respectively, when source-detector separation was at least 20 mm. The MAPE for recovered StO2 and HbT were 6% and 9%, respectively. Larger relative errors were observed (∼20-30%), especially as StO2 and HbT deviated from normal values. Excess CSF around the brain caused very large errors in μa, μs′, and HbT, but had little effect on StO2.
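
    The record does not restate how the two recovered absorption coefficients are converted into StO2 and HbT. Assuming the conventional two-chromophore decomposition (oxy- and deoxyhemoglobin only, with water and background absorption neglected, and molar extinction coefficients ε taken from standard tables), the step usually looks like this, written in LaTeX:

        \mu_a(\lambda_i) = \ln(10)\,\left[\varepsilon_{HbO_2}(\lambda_i)\,C_{HbO_2} + \varepsilon_{Hb}(\lambda_i)\,C_{Hb}\right], \qquad \lambda_i \in \{690\,\mathrm{nm},\ 830\,\mathrm{nm}\}

        \begin{pmatrix} C_{HbO_2} \\ C_{Hb} \end{pmatrix}
          = \frac{1}{\ln(10)}
            \begin{pmatrix} \varepsilon_{HbO_2}(\lambda_1) & \varepsilon_{Hb}(\lambda_1) \\
                            \varepsilon_{HbO_2}(\lambda_2) & \varepsilon_{Hb}(\lambda_2) \end{pmatrix}^{-1}
            \begin{pmatrix} \mu_a(\lambda_1) \\ \mu_a(\lambda_2) \end{pmatrix},
          \qquad \mathrm{HbT} = C_{HbO_2} + C_{Hb}, \qquad \mathrm{StO_2} = \frac{C_{HbO_2}}{\mathrm{HbT}}

    The ln(10) factor applies when ε is tabulated in decadic (log10) units. Any bias in the recovered μa values therefore propagates directly into HbT, while StO2, being a ratio, is partially protected, which is consistent with the error pattern reported above.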

  12. Network analysis of the microorganism in 25 Danish wastewater treatment plants over 7 years using high-throughput amplicon sequencing

    DEFF Research Database (Denmark)

    Albertsen, Mads; Larsen, Poul; Saunders, Aaron Marc

    Wastewater treatment is one of the world’s largest biotechnological processes and a perfect model system for microbial ecology, as the habitat is well defined and replicated all over the world. Extensive investigations on Danish wastewater treatment plants using fluorescent in situ hybridization have identified 38 probe-defined core genera, which are shared among all investigated Danish plants. A large body of knowledge exists on many of the core genera; however, few attempts have been made to integrate that knowledge into a system-level understanding of the process. In this work we aimed to integrate ... to link sludge and floc properties to the microbial communities. All data was subjected to extensive network analysis and multivariate statistics through R. The 16S amplicon results confirmed the findings of relatively few core groups of organisms shared by all the wastewater treatment plants ...

  13. A high-throughput analysis of the IDH1(R132H) protein expression in pituitary adenomas

    DEFF Research Database (Denmark)

    Casar-Borota, Olivera; Øystese, Kristin Astrid Berland; Sundström, Magnus

    2016-01-01

    PURPOSE: Inactivating mutations of isocitrate dehydrogenase (IDH) 1 and 2, mitochondrial enzymes participating in the Krebs tricarboxylic acid cycle, play a role in the tumorigenesis of gliomas and, less frequently, in acute myeloid leukemia and other malignancies. Inhibitors of mutant IDH1 ... and IDH2 may potentially be effective in the treatment of IDH mutation-driven tumors. Mutations in succinate dehydrogenase, the other enzyme complex participating in the Krebs cycle and in the electron transfer of oxidative phosphorylation, occur in paragangliomas, gastrointestinal stromal tumors ..., and occasionally in pituitary adenomas. We aimed to determine whether the IDH1(R132H) mutation, the most frequent IDH mutation in human malignancies, occurs in pituitary adenomas. METHODS: We performed immunohistochemical analysis by using a monoclonal anti-IDH1(R132H) antibody on tissue microarrays ...

  14. High throughput analysis reveals dissociable gene expression profiles in two independent neural systems involved in the regulation of social behavior

    Directory of Open Access Journals (Sweden)

    Stevenson Tyler J

    2012-10-01

    Background: Production of contextually appropriate social behaviors involves integrated activity across many brain regions. Many songbird species produce complex vocalizations called ‘songs’ that serve to attract potential mates, defend territories, and/or maintain flock cohesion. There is a series of discrete interconnected brain regions that are essential for the successful production of song. The probability and intensity of singing behavior is influenced by the reproductive state. The objectives of this study were to compare the broad changes in gene expression in brain regions that control song production with those in a brain region that governs the reproductive state. Results: We show, using microarray cDNA analysis, that two discrete brain systems that are both involved in governing singing behavior show markedly different gene expression profiles. We found that cortical and basal ganglia-like brain regions that control the socio-motor production of song in birds exhibit a categorical switch in gene expression that was dependent on their reproductive state. This pattern is in stark contrast to the pattern of expression observed in a hypothalamic brain region that governs the neuroendocrine control of reproduction. Subsequent gene ontology analysis revealed marked variation in the functional categories of active genes dependent on reproductive state and anatomical localization. HVC, one cortical-like structure, displayed significant gene expression changes associated with microtubule and neurofilament cytoskeleton organization, MAP kinase activity, and steroid hormone receptor complex activity. The transitions observed in the preoptic area, a nucleus that governs the motivation to engage in singing, exhibited variation in functional categories that included thyroid hormone receptor activity, epigenetic and angiogenetic processes. Conclusions: These findings highlight the importance of considering the temporal patterns of gene expression

  15. Optimized automated data analysis for the cytokinesis‐block micronucleus assay using imaging flow cytometry for high throughput radiation biodosimetry

    Science.gov (United States)

    Rodrigues, M. A.; Probst, C. E.; Beaton‐Green, L. A.

    2016-01-01

    Abstract The cytokinesis‐block micronucleus (CBMN) assay is a well‐established technique that can be employed in triage radiation biodosimetry to estimate whole body doses of radiation to potentially exposed individuals through quantitation of the frequency of micronuclei (MN) in binucleated lymphocyte cells (BNCs). The assay has been partially automated using traditional microscope‐based methods and most recently has been modified for application on the ImageStreamX (ISX) imaging flow cytometer. This modification has allowed for a similar number of BNCs to be automatically scored as compared to traditional microscopy in a much shorter time period. However, the MN frequency measured was much lower than both manual and automated slide‐based methods of performing the assay. This work describes the optimized analysis template which implements newly developed functions in the IDEAS® data analysis software for the ISX that enhances specificity for BNCs and increases the frequency of scored MN. A new dose response calibration curve is presented in which the average rate of MN per BNC is of similar magnitude to those presented in the literature using automated CBMN slide scoring methods. In addition, dose estimates were generated for nine irradiated, blinded samples and were found to be within ±0.5 Gy of the delivered dose. Results demonstrate that the improved identification accuracy for MN and BNCs in the ISX‐based version of the CBMN assay will translate to increased accuracy when estimating unknown radiation doses received by exposed individuals following large‐scale radiological or nuclear emergencies. © 2016 The Authors. Cytometry Part A published by Wiley Periodicals, Inc. on behalf of ISAC PMID:27272602
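
    The record reports a new dose-response calibration curve and dose estimates within ±0.5 Gy, but does not reproduce the curve itself. In cytogenetic biodosimetry the micronucleus yield per binucleated cell is conventionally fitted with a linear-quadratic model, and an unknown dose is then read back from the measured yield; a generic sketch in LaTeX (C, α and β are placeholders for fitted coefficients, not values from this study):

        Y = C + \alpha D + \beta D^{2}, \qquad
        \hat{D} = \frac{-\alpha + \sqrt{\alpha^{2} + 4\beta\,(Y_{\mathrm{obs}} - C)}}{2\beta}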

  16. Identification of DNA sequence variation in Campylobacter jejuni strains associated with the Guillain-Barré syndrome by high-throughput AFLP analysis

    Directory of Open Access Journals (Sweden)

    Endtz Hubert P

    2006-04-01

    Background: Campylobacter jejuni is the predominant cause of antecedent infection in post-infectious neuropathies such as the Guillain-Barré (GBS) and Miller Fisher (MFS) syndromes. GBS and MFS are probably induced by molecular mimicry between human gangliosides and bacterial lipo-oligosaccharides (LOS). This study describes a new C. jejuni-specific high-throughput AFLP (htAFLP) approach for detection and identification of DNA polymorphism in general, and of putative GBS/MFS markers in particular. Results: We compared 6 different isolates of the "genome strain" NCTC 11168 obtained from different laboratories. HtAFLP analysis generated approximately 3000 markers per strain, 19 of which were polymorphic. The DNA polymorphisms could not be confirmed by PCR-RFLP analysis, suggesting a baseline level of 0.6% AFLP artefacts. Comparison of NCTC 11168 with 4 GBS-associated strains revealed 23 potentially GBS-specific markers, 17 of which were identified by DNA sequencing. A collection of 27 GBS/MFS-associated and 17 enteritis control strains was analyzed with PCR-RFLP tests based on 11 of these markers. We identified 3 markers, located in the LOS biosynthesis genes cj1136, cj1138 and cj1139c, that were significantly associated with GBS (P = 0.024, P = 0.047 and P [...]). Conclusion: This study shows that bacterial GBS markers are limited in number and located in the LOS biosynthesis genes, which corroborates the current consensus that LOS mimicry may be the prime etiologic determinant of GBS. Furthermore, our results demonstrate that htAFLP, with its high reproducibility and resolution, is an effective technique for the detection and subsequent identification of putative bacterial disease markers.
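
    The abstract reports P values for the association between individual markers and GBS among 27 GBS/MFS-associated and 17 control strains but does not name the test. As a hypothetical illustration (placeholder counts, with Fisher's exact test chosen only as one standard option for small 2x2 tables), such an association could be computed as follows:

        # Hypothetical marker-versus-disease association test; the counts below are
        # placeholders, not data from the study, and the abstract does not state
        # which exact test was used.
        from scipy.stats import fisher_exact

        # Rows: marker present / marker absent.
        # Columns: GBS/MFS-associated strains (27 total) / enteritis controls (17 total).
        table = [[20, 5],
                 [7, 12]]

        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.4f}")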

  17. Identification of novel and conserved miRNAs involved in pollen development in Brassica campestris ssp. chinensis by high-throughput sequencing and degradome analysis

    Science.gov (United States)

    2014-01-01

    Background microRNAs (miRNAs) are endogenous, noncoding, small RNAs that have essential regulatory functions in plant growth, development, and stress response processes. However, limited information is available about their functions in sexual reproduction of flowering plants. Pollen development is an important process in the life cycle of a flowering plant and is a major factor that affects the yield and quality of crop seeds. Results This study aims to identify miRNAs involved in pollen development. Two independent small RNA libraries were constructed from the flower buds of the male sterile line (Bcajh97-01A) and male fertile line (Bcajh97-01B) of Brassica campestris ssp. chinensis. The libraries were subjected to high-throughput sequencing by using the Illumina Solexa system. Eight novel miRNAs on the other arm of known pre-miRNAs, 54 new conserved miRNAs, and 8 novel miRNA members were identified. Twenty-five pairs of novel miRNA/miRNA* were found. Among all the identified miRNAs, 18 differentially expressed miRNAs with over two-fold change between flower buds of male sterile line (Bcajh97-01A) and male fertile line (Bcajh97-01B) were identified. qRT-PCR analysis revealed that most of the differentially expressed miRNAs were preferentially expressed in flower buds of the male fertile line (Bcajh97-01B). Degradome analysis showed that a total of 15 genes were predicted to be the targets of seven miRNAs. Conclusions Our findings provide an overview of potential miRNAs involved in pollen development and interactions between miRNAs and their corresponding targets, which may provide important clues on the function of miRNAs in pollen development. PMID:24559317

  18. Identification and genetic characterization by high-throughput SNP analysis of intervarietal substitution lines of rapeseed (Brassica napus L.) with enhanced embryogenic potential.

    Science.gov (United States)

    Ecke, Wolfgang; Kampouridis, Anthimos; Ziese-Kubon, Katharina; Hirsch, Ann-Catrin

    2015-04-01

    Seven intervarietal substitution lines were identified with embryogenic potentials up to 40.4 times that of the recurrent parent, providing an ideal material for further in depth studies of this trait. To identify genomic regions that carry genetic factors controlling embryogenic potential of isolated microspores of rapeseed, marker segregations were analysed in a segregating population of haploid microspore-derived embryos and a BC1 population from a cross between 'Express 617' and 'RS239'. After map construction 15 intervarietal substitution lines from the same cross with 'Express 617' as recurrent parent were selected with donor segments covering five genomic regions that had shown skewed segregations in the population of microspore-derived embryos but not in the BC1 population. By comparing the embryogenic potential of microspores of the 15 substitution lines and 'Express 617', seven lines were identified with significantly enhanced embryogenic potential ranging from 4.1 to 40.4 times that of 'Express 617'. To improve the genetic characterization of the selected lines, they were subjected to a high-throughput SNP analysis using the Illumina Infinium 60K chip for rapeseed. Based on 7,960 mapped SNP markers, one to eight donor segments per line, which cover 0.64-6.79% of the 2,126.1 cM of the SNP map, were found. The SNP analysis also gave evidence that homoeologous exchanges had occurred during the development of the substitution line population, increasing the genetic diversity within this population. By comparing donor segments between lines with significantly enhanced embryogenic potential and non-significant lines, 12 genomic regions were identified that may contain genetic factors controlling embryogenic potential in rapeseed. These regions range in size from 0 (represented by just one marker) to 26.8 cM and cover together just 5.42% of the SNP map.

  19. Development of a simple fluorescence-based microplate method for the high-throughput analysis of proline in wine samples.

    Science.gov (United States)

    Robert-Peillard, Fabien; Boudenne, Jean-Luc; Coulomb, Bruno

    2014-05-01

    This paper presents a simple, accurate and multi-sample method for the determination of proline in wines using a 96-well microplate technique. Proline is the most abundant amino acid in wine and is an important parameter related to wine characteristics or maturation processes of the grape. In the current study, an improved application of the general method based on sodium hypochlorite oxidation and o-phthaldialdehyde (OPA)-thiol spectrofluorometric detection is described. The main interfering compounds for specific proline detection in wines are strongly reduced by selective reaction with OPA in a preliminary step under well-defined pH conditions. Application of the protocol after a 500-fold dilution of wine samples provides a working range between 0.02 and 2.90 g L(-1), with a limit of detection of 7.50 mg L(-1). Comparison and validation on real wine samples by ion-exchange chromatography prove that this procedure yields accurate results. The simplicity of the protocol, with no need for centrifugation or filtration, organic solvents or high temperature, enables its full implementation in plastic microplates and efficient application for routine analysis of proline in wines. Copyright © 2013 Elsevier Ltd. All rights reserved.
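
    As a rough illustration of how the 500-fold dilution and the stated working range translate back to the undiluted wine, the sketch below converts a plate-reader reading to a proline concentration through a linear calibration; the slope, intercept and readings are placeholders, not values from the paper:

        # Minimal sketch: microplate fluorescence -> proline in the original wine (g/L).
        # The stated working range of 0.02-2.90 g/L refers to the undiluted wine after
        # applying the 500-fold dilution; the calibration parameters here are hypothetical.
        DILUTION_FACTOR = 500

        def proline_g_per_l(fluorescence, blank, slope=1.0e-6, intercept=0.0):
            """Back-calculate proline (g/L) in the undiluted wine from a well reading."""
            conc_in_well = slope * (fluorescence - blank) + intercept  # g/L in the diluted sample
            return conc_in_well * DILUTION_FACTOR

        print(proline_g_per_l(fluorescence=2500, blank=120))  # placeholder readings -> 1.19 g/L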

  20. High-throughput analysis by SP-LDI-MS for fast identification of adulterations in commercial balsamic vinegars

    Energy Technology Data Exchange (ETDEWEB)

    Guerreiro, Tatiane Melina; Oliveira, Diogo Noin de; Ferreira, Mônica Siqueira; Catharino, Rodrigo Ramos, E-mail: rrc@fcm.unicamp.br

    2014-08-01

    Highlights: • Rapid identification of adulteration in balsamic vinegars. • Minimal sample preparation. • No matrix required for assisting laser desorption/ionization. • Fast sample discrimination by multivariate data analysis. - Abstract: Balsamic vinegar (BV) is a typical and valuable Italian product, appreciated worldwide thanks to its characteristic flavors and potential health benefits. Several studies have been conducted to assess the physicochemical and microbial compositions of BV, as well as its beneficial properties. Due to highly disseminated claims of antioxidant, antihypertensive and antiglycemic properties, BV is a known target for fraud and adulteration. For this reason, product authentication, certifying its origin (region or country) and thus the processing conditions, is becoming a growing concern. Striving for fraud reduction as well as quality and safety assurance, reliable analytical strategies to rapidly evaluate BV quality are of great interest, also from an economic point of view. This work employs silica plate laser desorption/ionization mass spectrometry (SP-LDI-MS) for fast chemical profiling of commercial BV samples with protected geographical indication (PGI) and for identification of samples adulterated with low-priced vinegars, namely apple, alcohol and red/white wine vinegars.

  1. Highly Sensitive and High-Throughput Method for the Analysis of Bisphenol Analogues and Their Halogenated Derivatives in Breast Milk.

    Science.gov (United States)

    Niu, Yumin; Wang, Bin; Zhao, Yunfeng; Zhang, Jing; Shao, Bing

    2017-12-06

    The structural analogs of bisphenol A (BPA) and their halogenated derivatives (together termed BPs) have been found in the environment, food, and even the human body. Limited research has shown that some of them exhibit toxicities similar to or even greater than that of BPA. Therefore, adverse health effects of BPs are expected for humans with low-dose exposure in early life. Breast milk is an excellent matrix and can reflect fetuses' and babies' exposure to contaminants. Some of the emerging BPs may be present at trace or ultratrace levels in humans. However, existing analytical methods for breast milk cannot quantify these BPs simultaneously with high sensitivity using a small sampling weight, which is important for human biomonitoring studies. In this paper, a method based on Bond Elut Enhanced Matrix Removal-Lipid purification, pyridine-3-sulfonyl chloride derivatization, and liquid chromatography electrospray tandem mass spectrometry was developed. The method requires only a small quantity of sample (200 μL) and allowed for the simultaneous determination of 24 BPs in breast milk with ultrahigh sensitivity. The limits of quantitation of the proposed method were 0.001-0.200 μg L-1, which were 1-6.7 times lower than those of the only previous study on simultaneous analysis of bisphenol analogs in breast milk, which was based on a 3 g sample weight. The mean recoveries ranged from 86.11% to 119.05% with relative standard deviation (RSD) ≤ 19.5% (n = 6). Matrix effects were within 20% with RSD [...]. Bisphenol F (BPF), bisphenol S (BPS), and bisphenol AF (BPAF) were detected. BPA was still the dominant BP, followed by BPF. This is the first report describing the occurrence of BPF and BPAF in breast milk.

  2. Effects on coronary heart disease of increasing polyunsaturated fat in place of saturated fat: a systematic review and meta-analysis of randomized controlled trials.

    Directory of Open Access Journals (Sweden)

    Dariush Mozaffarian

    2010-03-01

    Reduced saturated fat (SFA) consumption is recommended to reduce coronary heart disease (CHD), but there is an absence of strong supporting evidence from randomized controlled trials (RCTs) of clinical CHD events and few guidelines focus on any specific replacement nutrient. Additionally, some public health groups recommend lowering or limiting polyunsaturated fat (PUFA) consumption, a major potential replacement for SFA. We systematically investigated and quantified the effects of increased PUFA consumption, as a replacement for SFA, on CHD endpoints in RCTs. RCTs were identified by systematic searches of multiple online databases through June 2009, grey literature sources, hand-searching related articles and citations, and direct contacts with experts to identify potentially unpublished trials. Studies were included if they randomized participants to increased PUFA for at least 1 year without major concomitant interventions, had an appropriate control group, and reported incidence of CHD (myocardial infarction and/or cardiac death). Inclusions/exclusions were adjudicated and data were extracted independently and in duplicate by two investigators and included population characteristics, control and intervention diets, follow-up duration, types of events, risk ratios, and SEs. Pooled effects were calculated using inverse-variance-weighted random effects meta-analysis. From 346 identified abstracts, eight trials met inclusion criteria, totaling 13,614 participants with 1,042 CHD events. Average weighted PUFA consumption was 14.9% energy (range 8.0%-20.7%) in intervention groups versus 5.0% energy (range 4.0%-6.4%) in controls. The overall pooled risk reduction was 19% (RR = 0.81, 95% confidence interval [CI] 0.70-0.95, p = 0.008), corresponding to 10% reduced CHD risk (RR = 0.90, 95% CI = 0.83-0.97) for each 5% energy of increased PUFA, without evidence for statistical heterogeneity (Q-statistic p = 0.13; I(2) = 37%). Meta-regression identified study ...
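
    For reference, the "inverse-variance-weighted random effects meta-analysis" named above conventionally pools the log risk ratios y_i = ln(RR_i) with weights that include a between-study variance τ² (for example the DerSimonian-Laird estimate), and heterogeneity is summarised by Cochran's Q and I², the two statistics quoted in the record:

        \hat{\theta} = \frac{\sum_i w_i\, y_i}{\sum_i w_i}, \qquad w_i = \frac{1}{\mathrm{SE}_i^{2} + \hat{\tau}^{2}}, \qquad \text{pooled } RR = e^{\hat{\theta}}

        Q = \sum_i \frac{\left(y_i - \hat{\theta}_{\mathrm{FE}}\right)^{2}}{\mathrm{SE}_i^{2}}, \qquad I^{2} = \max\!\left(0,\ \frac{Q - (k - 1)}{Q}\right) \times 100\%

    where θ̂_FE is the fixed-effect (τ² = 0) pooled estimate and k is the number of trials.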

  3. Effects on coronary heart disease of increasing polyunsaturated fat in place of saturated fat: a systematic review and meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Mozaffarian, Dariush; Micha, Renata; Wallace, Sarah

    2010-03-23

    Reduced saturated fat (SFA) consumption is recommended to reduce coronary heart disease (CHD), but there is an absence of strong supporting evidence from randomized controlled trials (RCTs) of clinical CHD events and few guidelines focus on any specific replacement nutrient. Additionally, some public health groups recommend lowering or limiting polyunsaturated fat (PUFA) consumption, a major potential replacement for SFA. We systematically investigated and quantified the effects of increased PUFA consumption, as a replacement for SFA, on CHD endpoints in RCTs. RCTs were identified by systematic searches of multiple online databases through June 2009, grey literature sources, hand-searching related articles and citations, and direct contacts with experts to identify potentially unpublished trials. Studies were included if they randomized participants to increased PUFA for at least 1 year without major concomitant interventions, had an appropriate control group, and reported incidence of CHD (myocardial infarction and/or cardiac death). Inclusions/exclusions were adjudicated and data were extracted independently and in duplicate by two investigators and included population characteristics, control and intervention diets, follow-up duration, types of events, risk ratios, and SEs. Pooled effects were calculated using inverse-variance-weighted random effects meta-analysis. From 346 identified abstracts, eight trials met inclusion criteria, totaling 13,614 participants with 1,042 CHD events. Average weighted PUFA consumption was 14.9% energy (range 8.0%-20.7%) in intervention groups versus 5.0% energy (range 4.0%-6.4%) in controls. The overall pooled risk reduction was 19% (RR = 0.81, 95% confidence interval [CI] 0.70-0.95, p = 0.008), corresponding to 10% reduced CHD risk (RR = 0.90, 95% CI = 0.83-0.97) for each 5% energy of increased PUFA, without evidence for statistical heterogeneity (Q-statistic p = 0.13; I(2) = 37%). Meta-regression identified study duration as an

  4. Breast MRI at Very Short TE (minTE): Image Analysis of minTE Sequences on Non-Fat-Saturated, Subtracted T1-Weighted Images.

    Science.gov (United States)

    Wenkel, Evelyn; Janka, Rolf; Geppert, Christian; Kaemmerer, Nadine; Hartmann, Arndt; Uder, Michael; Hammon, Matthias; Brand, Michael

    2017-02-01

    Purpose: The aim was to evaluate a minimum echo time (minTE) protocol for breast magnetic resonance imaging (MRI) in patients with breast lesions, compared to a standard TE (nTE) protocol. Methods: The breasts of 144 women were examined with a 1.5 Tesla MRI scanner. In addition to the standard gradient-echo sequence with nTE (4.8 ms), a variant with minimum TE (1.2 ms) was used in an interleaved fashion, which leads to better temporal resolution and should reduce the scan time by approximately 50 %. Lesion sizes were measured and the signal-to-noise ratio (SNR) as well as the contrast-to-noise ratio (CNR) were calculated. Subjective confidence was evaluated using a 3-point scale before looking at the nTE sequences (1 = very sure that I can identify a lesion and classify it, 2 = quite sure that I can identify a lesion and classify it, 3 = definitely want to see nTE for final assessment), and the subjective image quality of all examinations was evaluated using a four-grade scale (1 = sharp, 2 = slight blur, 3 = moderate blur and 4 = severe blur/not evaluable) for lesion and skin sharpness. Lesion morphology and contrast enhancement were also evaluated. Results: With minTE sequences, no lesion was rated with "definitely want to see nTE sequences for final assessment". The longitudinal and transverse diameters did not differ significantly between the protocols (p > 0.05). With minTE, lesions and skin were rated as significantly more blurry (p [...]). Citation: Breast MRI at Very Short TE (minTE): Image Analysis of minTE Sequences on Non-Fat-Saturated, Subtracted T1-Weighted Images. Fortschr Röntgenstr 2017; 189: 137-145. © Georg Thieme Verlag KG Stuttgart · New York.
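
    The abstract states that SNR and CNR were calculated but does not give the definitions used; a common convention for subtracted breast images, assumed here only for illustration, is

        \mathrm{SNR} = \frac{S_{\mathrm{lesion}}}{\sigma_{\mathrm{noise}}}, \qquad
        \mathrm{CNR} = \frac{S_{\mathrm{lesion}} - S_{\mathrm{parenchyma}}}{\sigma_{\mathrm{noise}}}

    with S the mean signal in a region of interest and σ_noise the standard deviation of the background signal.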

  5. Methane hydrate pore saturation evaluation from geophysical logging and pressure core analysis, at the first offshore production test site in the eastern Nankai Trough, Japan

    Science.gov (United States)

    Fujii, T.; Suzuki, K.; Takayama, T.; Konno, Y.; Yoneda, J.; Egawa, K.; Ito, T.; Nagao, J.

    2013-12-01

    In March 2013, the first offshore production test from a methane hydrate (MH) concentrated zone (MHCZ) was conducted by the Research Consortium for Methane Hydrate Resource Development in Japan (MH21) at the AT1 site, located on the north-western slope of the Daini-Atsumi Knoll in the eastern Nankai Trough, Japan. Before the production test, extensive geophysical logging and pressure coring using the Hybrid Pressure Coring System were conducted in 2012 at the monitoring well (AT1-MC) and coring well (AT1-C), in order to obtain basic information for MH reservoir characterization. MH pore saturation (Sh) is one of the important basic parameters not only for reservoir characterization but also for resource assessment. However, precise evaluation of Sh from geophysical logging is still a challenging technical issue. The MHCZ confirmed by geophysical logging at AT1-MC is a turbidite assemblage (from several tens of centimeters to a few meters) with 60 m of gross thickness; it is composed of lobe/sheet type sequences in the upper part and relatively thick channel sand sequences in the lower part. In this study, the Sh values evaluated from geophysical logging data were compared with those evaluated from pressure core analysis. Resistivity logs and a nuclear magnetic resonance (NMR) log were used for the Sh evaluation by geophysical logging. The standard Archie equation was applied for Sh evaluation from the resistivity log, while the density magnetic resonance (DMR) method was used for Sh evaluation from the NMR log. The Sh values from pressure core samples were evaluated using the amount of dissociated gas volume, together with core sample bulk volume, measured porosity, net sand intervals, and assumed methane solubility in pore water. In the upper part of the MHCZ, Sh estimated from the resistivity log showed a more distinct difference in value between sand and mud layers than Sh from the NMR log. The resistivity log has higher vertical resolution than the NMR log, so it is favorable for these kinds of thin bed ...
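
    The "standard Archie equation" applied to the resistivity log, in its usual form and combined with the assumption that hydrate and water together fill the pore space, gives

        S_w = \left(\frac{a\, R_w}{\phi^{m}\, R_t}\right)^{1/n}, \qquad S_h = 1 - S_w

    where a is the tortuosity factor, m the cementation exponent, n the saturation exponent, φ the porosity, R_w the pore-water resistivity and R_t the measured formation resistivity; the parameter values used at AT1 are not stated in this record.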

  6. Third Generation (3G) Site Characterization: Cryogenic Core Collection and High Throughput Core Analysis - An Addendum to Basic Research Addressing Contaminants in Low Permeability Zones - A State of the Science Review

    Science.gov (United States)

    2016-07-29

    Hollow-Stem Auger; HTCA, High-Throughput Core Analysis; IC, Ion Chromatograph; ID, Inner Diameter; k, Permeability; LN, Liquid Nitrogen; LNAPL, Light ... liner and the cooling system. • Applying food-grade oil to the outside of the sample liner to limit direct contact of water with the sample liner ... strong magnetic field after radio frequency (RF) pulsing; the resulting data can be used to determine the proton densities spatially throughout the ...

  7. Diagnostic throughput factor analysis for en-route airspace and optimal aircraft trajectory generation based on capacity prediction and controller workload

    Science.gov (United States)

    Shin, Sanghyun

    Today's National Airspace System (NAS) is approaching its limit to efficiently cope with the increasing air traffic demand. The Next Generation Air Transportation System (NextGen), with its ambitious goals, aims to make air travel more predictable, with fewer delays and less time sitting on the ground and holding in the air, to improve the performance of the NAS. However, currently the performance of the NAS is mostly measured using delay-based metrics, which do not capture a whole range of important factors that determine the quality and level of utilization of the NAS. The factors affecting the performance of the NAS are themselves not well defined to begin with. To address these issues, motivated by the use of throughput-based metrics in many areas such as ground transportation, wireless communication and manufacturing, this thesis identifies the factors that most strongly affect the performance of the NAS as demand (split into flight cancellation and flight rerouting), safe separation (split into conflict and metering) and weather (studied as convective weather), through careful comparison with other applications and empirical sensitivity analysis. Additionally, the effects of the different factors on the NAS's performance are quantitatively studied using real traffic data with the Future ATM Concepts Evaluation Tool (FACET) for various sectors and centers of the NAS on different days. In this thesis we propose a diagnostic tool that can identify the factors bearing greater responsibility for regions of poor and of better performance of the NAS. Based on the throughput factor analysis for en-route airspace, it was found that weather and controller workload are the major factors that decrease the efficiency of the airspace. Also, since resources such as air traffic controllers, infrastructure and airspace are limited, it is becoming increasingly important to use the available resources efficiently. To alleviate the impact of the weather and controller ...

  8. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  9. Rhizoslides: paper-based growth system for non-destructive, high throughput phenotyping of root development by means of image analysis.

    Science.gov (United States)

    Le Marié, Chantal; Kirchgessner, Norbert; Marschall, Daniela; Walter, Achim; Hund, Andreas

    2014-01-01

    and precise evaluation of root lengths in diameter classes, but had weaknesses with respect to image segmentation and analysis of root system architecture. A new technique has been established for non-destructive root growth studies and quantification of architectural traits beyond seedlings stages. However, automation of the scanning process and appropriate software remains the bottleneck for high throughput analysis.

  10. AlphaScreen-based homogeneous assay using a pair of 25-residue artificial proteins for high-throughput analysis of non-native IgG.

    Science.gov (United States)

    Senga, Yukako; Imamura, Hiroshi; Miyafusa, Takamitsu; Watanabe, Hideki; Honda, Shinya

    2017-09-29

    Therapeutic IgG becomes unstable under various stresses in the manufacturing process. The resulting non-native IgG molecules tend to associate with each other and form aggregates. Because such aggregates not only decrease the pharmacological effect but also become a potential risk factor for immunogenicity, rapid analysis of aggregation is required for quality control of therapeutic IgG. In this study, we developed a homogeneous assay using AlphaScreen and AF.2A1. AF.2A1 is a 25-residue artificial protein that binds specifically to non-native IgG generated under chemical and physical stresses. This assay is performed in a short period of time. Our results show that AF.2A1-AlphaScreen may be used to evaluate the various types of IgG, as AF.2A1 recognizes the non-native structure in the constant region (Fc region) of IgG. The assay was effective for detection of non-native IgG, with particle size up to ca. 500 nm, generated under acid, heat, and stirring conditions. In addition, this technique is suitable for analyzing non-native IgG in CHO cell culture supernatant and mixed with large amounts of native IgG. These results indicate the potential of AF.2A1-AlphaScreen to be used as a high-throughput evaluation method for process monitoring as well as quality testing in the manufacturing of therapeutic IgG.

  11. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data [version 3; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Damien Correia

    2016-12-01

    The detection and characterization of emerging infectious agents has been a continuing public health concern. High Throughput Sequencing (HTS) or Next-Generation Sequencing (NGS) technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Most usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS), solely to provide perspective regarding the available information. We developed an easily deployable web interface, facilitating management and bioinformatics analysis of metagenomics data samples. It was engineered to run associated and dedicated Galaxy workflows for the detection and eventual classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows and facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allowing users to determine their relative abundance and associate them with the most closely related organism or pathogen. The user-friendly Django-based interface associates the users’ input data and its metadata through a bio-IT provided set of resources (a Galaxy instance, and both sufficient storage and grid computing power). Galaxy is used to handle and analyze the user’s input data from loading, indexing, mapping and assembly to DB searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy’s main features. Metadata about samples, runs, as well as the workflow results are stored in the LIMS. For metagenomic classification and exploration purposes, we show, as a proof of concept, that integration ...
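
    As an illustration of the BioBlend-mediated interaction described above, the following minimal sketch connects to a Galaxy instance, creates a history, uploads a read file and lists the available workflows. The URL, API key and file name are placeholders, and this is not the MetaGenSense code itself:

        # Minimal BioBlend sketch (placeholder URL, key and file; the actual
        # MetaGenSense workflows are not reproduced in this record).
        from bioblend.galaxy import GalaxyInstance

        gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

        # Create a history for one metagenomic sample and upload its reads.
        history = gi.histories.create_history(name="metagenomics-sample-01")
        gi.tools.upload_file("sample_reads.fastq", history["id"])

        # List the workflows visible to this account (e.g., detection/classification pipelines).
        for wf in gi.workflows.get_workflows():
            print(wf["id"], wf["name"])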

  12. High-throughput, non-invasive prenatal testing for fetal RHD genotype to guide antenatal prophylaxis with anti-D immunoglobulin: a cost-effectiveness analysis.

    Science.gov (United States)

    Saramago, Pedro; Yang, Huiqin; Llewellyn, Alexis; Palmer, Stephen; Simmonds, Mark; Griffin, Susan

    2018-02-07

    To evaluate the cost-effectiveness of high-throughput, non-invasive prenatal testing (HT-NIPT) for fetal RhD genotype to guide antenatal prophylaxis with anti-D immunoglobulin compared to routine antenatal anti-D immunoglobulin prophylaxis (RAADP). Cost-effectiveness decision-analytic modelling. Primary care. A simulated population of 100,000 RhD-negative women not known to be sensitised to the RhD antigen. A decision tree model was used to characterise the antenatal care pathway in England and the long-term consequences of sensitisation events. The diagnostic accuracy of HT-NIPT was derived from a systematic review and bivariate meta-analysis; estimates of other inputs were derived from relevant literature sources and databases. Women in whom the HT-NIPT was positive or inconclusive continued to receive RAADP, while women with a negative result received none. Five alternative strategies in which the use of HT-NIPT may affect the existing post-partum care pathway were considered. Costs were expressed in 2015 GBP and impact on health outcomes in terms of quality-adjusted life years (QALYs) over a lifetime. The results suggested that HT-NIPT appears cost saving but also less effective than current practice, irrespective of the post-partum strategy evaluated. A post-partum strategy in which inconclusive test results are distinguished from positive results performed best. HT-NIPT is only cost-effective when the overall test cost is £26.60 or less. HT-NIPT would reduce unnecessary treatment with routine anti-D immunoglobulin and is cost saving when compared to current practice. The extent of any savings and cost-effectiveness is sensitive to the overall test cost. This article is protected by copyright. All rights reserved.
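
    The trade-off described above (cost saving but less effective) is conventionally summarised with the incremental cost-effectiveness ratio or, equivalently, net monetary benefit at a willingness-to-pay threshold λ; the record does not restate these quantities, so they are given here only as the standard definitions:

        \mathrm{ICER} = \frac{C_{\mathrm{HT\text{-}NIPT}} - C_{\mathrm{RAADP}}}{E_{\mathrm{HT\text{-}NIPT}} - E_{\mathrm{RAADP}}}, \qquad
        \mathrm{NMB} = \lambda\, E - C

    When the new strategy is cheaper and slightly less effective, as here, the ICER is read in the south-west quadrant, that is, as the savings obtained per QALY forgone, and the strategy is preferred when those savings exceed λ per QALY.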

  13. High-throughput transcriptome analysis of the leafy flower transition of Catharanthus roseus induced by peanut witches'-broom phytoplasma infection.

    Science.gov (United States)

    Liu, Li-Yu Daisy; Tseng, Hsin-I; Lin, Chan-Pin; Lin, Yen-Yu; Huang, Yuan-Hung; Huang, Chien-Kang; Chang, Tean-Hsu; Lin, Shih-Shun

    2014-05-01

    Peanut witches'-broom (PnWB) phytoplasma are obligate bacteria that cause leafy flower symptoms in Catharanthus roseus. The PnWB-mediated leafy flower transitions were studied to understand the mechanisms underlying the pathogen-host interaction; however, our understanding is limited because of the lack of information on the C. roseus genome. In this study, the whole-transcriptome profiles from healthy flowers (HFs) and stage 4 (S4) PnWB-infected leafy flowers of C. roseus were investigated using next-generation sequencing (NGS). More than 60,000 contigs were generated using a de novo assembly approach, and 34.2% of the contigs (20,711 genes) were annotated as putative genes through name-calling, open reading frame determination and gene ontology analyses. Furthermore, a customized microarray based on this sequence information was designed and used to analyze samples further at various stages of PnWB infection. In the NGS profile, 87.8% of the genes showed expression levels that were consistent with those in the microarray profiles, suggesting that accurate gene expression levels can be detected using NGS. The data revealed that defense-related and flowering gene expression levels were altered in S4 PnWB-infected leafy flowers, indicating that the immunity and reproductive stages of C. roseus were compromised. The network analysis suggested that the expression levels of >1,000 candidate genes were highly associated with CrSVP1/2 and CrFT expression, which might be crucial in the leafy flower transition. In conclusion, this study provides a new perspective for understanding plant pathology and the mechanisms underlying the leafy flowering transition caused by host-pathogen interactions through analyzing bioinformatics data obtained using a powerful, rapid high-throughput technique.

  14. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data [version 2; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Damien Correia

    2016-08-01

    Full Text Available The detection and characterization of emerging infectious agents has been a continuing public health concern. High Throughput Sequencing (HTS or Next-Generation Sequencing (NGS technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Most usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS, solely to provide perspective regarding the available information. We developed an easily deployable web-interface, facilitating management and bioinformatics analysis of metagenomics data-samples. It was engineered to run associated and dedicated Galaxy workflows for the detection and eventually classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows, facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allowing them to determine their relative abundance, and associate them to the most closely related organism or pathogen. The user-friendly Django-Based interface, associates the users’ input data and its metadata through a bio-IT provided set of resources (a Galaxy instance, and both sufficient storage and grid computing power. Galaxy is used to handle and analyze the user’s input data from loading, indexing, mapping, assembly and DB-searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy’s main features. Metadata about samples, runs, as well as the workflow results are stored in the LIMS. For metagenomic classification and exploration purposes, we show, as a proof of concept, that integration

  15. The Danish tax on saturated fat

    DEFF Research Database (Denmark)

    Vallgårda, Signild; Holm, Lotte; Jensen, Jørgen Dejgård

    2015-01-01

    BACKGROUND/OBJECTIVES: Health promoters have repeatedly proposed using economic policy tools, taxes and subsidies, as a means of changing consumer behaviour. As the first country in the world, Denmark introduced a tax on saturated fat in 2011. It was repealed in 2012. In this paper, we present ... The tax on saturated fat had been suggested by two expert committees and was introduced with a majority in parliament, as a part of a larger economic reform package. Many actors, including representatives from the food industry and nutrition researchers, opposed the tax both before and after its introduction, claiming ... Research was published showing that consumption of saturated fat had declined in Denmark. CONCLUSIONS: The analysis indicates that the Danish tax on fat was introduced mainly to increase public revenue. As the tax had no strong proponents and many influential adversaries, it was repealed. New research ...

  16. Quantitative 1D saturation profiles on chalk by NMR

    DEFF Research Database (Denmark)

    Olsen, Dan; Topp, Simon; Stensgaard, Anders

    1996-01-01

    Quantitative one-dimensional saturation profiles showing the distribution of water and oil in chalk core samples are calculated from NMR measurements utilizing a 1D CSI spectroscopy pulse sequence. Saturation profiles may be acquired under conditions of fluid flow through the sample. Results reveal that strong saturation gradients exist in chalk core samples after core floods, due to capillary effects. The method is useful in analysis of corefloods, e.g., for determination of capillary pressure functions ...

  17. Brine Distribution after Vacuum Saturation

    DEFF Research Database (Denmark)

    Hedegaard, Kathrine; Andersen, Bertel Lohmann

    1999-01-01

    Experiments with the vacuum saturation method for brine in plugs of chalk showed that a homogeneous distribution of brine cannot be ensured at saturations below 20% volume. Instead of a homogeneous volume distribution the brine becomes concentrated close to the surfaces of the plugs...

  18. Associations between the human intestinal microbiota, Lactobacillus rhamnosus GG and serum lipids indicated by integrated analysis of high-throughput profiling data

    NARCIS (Netherlands)

    Lahti, L.M.; Salonen, A.; Kekkonen, R.A.; Salojärvi, J.; Jalanka-Tuovinen, J.; Palva, A.; Oresic, M.; Vos, de W.M.

    2013-01-01

    Accumulating evidence indicates that the intestinal microbiota regulates our physiology and metabolism. Bacteria marketed as probiotics confer health benefits that may arise from their ability to affect the microbiota. Here high-throughput screening of the intestinal microbiota was carried out and

  19. Analysis of small-sample clinical genomics studies using multi-parameter shrinkage: application to high-throughput RNA interference screening

    NARCIS (Netherlands)

    van de Wiel, M.; Menezes, R.; van Olst, E.; van Beusechem, V.W.

    2013-01-01

    High-throughput (HT) RNA interference (RNAi) screens are increasingly used for reverse genetics and drug discovery. These experiments are laborious and costly, hence sample sizes are often very small. Powerful statistical techniques to detect siRNAs that potentially enhance treatment are currently

  20. New Automated and High-Throughput Quantitative Analysis of Urinary Ketones by Multifiber Exchange-Solid Phase Microextraction Coupled to Fast Gas Chromatography/Negative Chemical-Electron Ionization/Mass Spectrometry

    Science.gov (United States)

    Pacenti, Marco; Dugheri, Stefano; Traldi, Pietro; Degli Esposti, Filippo; Perchiazzi, Nicola; Franchi, Elena; Calamante, Massimo; Kikic, Ireneo; Alessi, Paolo; Bonacchi, Alice; Salvadori, Edoardo; Arcangeli, Giulio; Cupelli, Vincenzo

    2010-01-01

    The present research is focused on automation, miniaturization, and system interaction with high throughput for multiple and specific Direct Immersion-Solid Phase Microextraction/Fast Gas Chromatography analysis of urinary ketones. The specific mass spectrometry instrumentation, capable of supporting the automated changeover from Negative Chemical to Electron Ionization mode, together with the automation of the preparation procedure by a new device called MultiFiber Exchange, through change of the fibers, allowed user-friendly operation of the mass spectrometry apparatus with a number of advantages, including reduced analyst time and greater reproducibility (2.01-5.32%). The detection limits for the seven ketones were less than 0.004 mg/L. With a view to high-throughput routine use, the generality of the structurally informative mass spectrometry fragmentation patterns, together with the chromatographic separation and software automation, is also investigated. PMID:20628512

  1. Spectrophotometric Analysis of Pigments: A Critical Assessment of a High-Throughput Method for Analysis of Algal Pigment Mixtures by Spectral Deconvolution.

    Directory of Open Access Journals (Sweden)

    Jan-Erik Thrane

    The Gauss-peak spectra (GPS) method represents individual pigment spectra as weighted sums of Gaussian functions, and uses these to model absorbance spectra of phytoplankton pigment mixtures. We here present several improvements for this type of methodology, including adaptation to plate reader technology and efficient model fitting by open source software. We use a one-step modeling of both pigment absorption and background attenuation with non-negative least squares, following a one-time instrument-specific calibration. The fitted background is shown to be higher than a solvent blank, with features reflecting contributions from both scatter and non-pigment absorption. We assessed pigment aliasing due to absorption spectra similarity by Monte Carlo simulation, and used this information to select a robust set of identifiable pigments that are also expected to be common in natural samples. To test the method's performance, we analyzed absorbance spectra of pigment extracts from sediment cores, 75 natural lake samples, and four phytoplankton cultures, and compared the estimated pigment concentrations with concentrations obtained using high performance liquid chromatography (HPLC). The deviance between observed and fitted spectra was generally very low, indicating that measured spectra could successfully be reconstructed as weighted sums of pigment and background components. Concentrations of total chlorophylls and total carotenoids could accurately be estimated for both sediment and lake samples, but individual pigment concentrations (especially carotenoids) proved difficult to resolve due to similarity between their absorbance spectra. In general, our modified-GPS method provides an improvement of the GPS method that is a fast, inexpensive, and high-throughput alternative for screening of pigment composition in samples of phytoplankton material.
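
    A minimal sketch of the one-step non-negative least squares fit described above, assuming a matrix of calibrated component spectra (pigment Gauss-peak spectra plus background terms) is already available; the design matrix and measured spectrum below are random placeholders standing in for the instrument-specific calibration and the blank-corrected plate-reader data:

        # Fit a measured absorbance spectrum as a non-negative weighted sum of
        # component spectra (pigments + background), as in the modified-GPS approach.
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        wavelengths = np.arange(400, 701)        # nm
        n_components = 12                        # pigments plus background terms (placeholder)

        # Placeholder design matrix: one column per calibrated component spectrum.
        A = rng.random((wavelengths.size, n_components))

        # Placeholder "measured" spectrum built from known weights, for demonstration only.
        true_weights = rng.random(n_components)
        measured = A @ true_weights

        weights, residual_norm = nnls(A, measured)   # non-negative component weights
        print(np.round(weights, 3), residual_norm)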

  2. Initial evaluation of protein throughput and yield characteristics on nylon 6 capillary-channeled polymer (C-CP) fiber stationary phases by frontal analysis.

    Science.gov (United States)

    Randunu, K Manoj; Marcus, R Kenneth

    2013-01-01

    Nylon 6 capillary-channeled polymer (C-CP) fibers are investigated as an alternative support/stationary phase for downstream processing of macromolecules. Ionizable amine and carboxylic acid end groups on the native fiber surface allow for ion exchange chromatography (IEC). The low cost and ability to operate at high linear velocities and low back pressures are practical advantages of C-CP fibers for preparative-scale macromolecule separations. The lack of fiber porosity ensures facile adsorption/desorption that is conducive to high throughput and recoveries/yields. Described here is a preliminary investigation of the processing characteristics of lysozyme on nylon 6 fibers with an eye toward downstream processing applications. Fibers were packed into microbore (0.8 mm i.d.) and analytical-size (2.1 mm i.d.) columns for the evaluation of the role of linear velocity on pressure drop, frontal throughput, and yield. Protein isolation by frontal development involved three steps: loading of the column to breakthrough, an aqueous wash, and a salt wash to recover the protein. Frontal throughput was evaluated with different salt concentrations (0-1000 mM NaCl) and different linear velocities (6-24 mm s(-1)). The observed throughput values are in the range of 0.12-0.20 mg min(-1) when 0.25 mg mL(-1) lysozyme (in 20 mM Tris-HCl) is loaded onto 78 mg of C-CP fiber in 0.52 mL volume analytical columns. Increased throughput and yield were found when protein was loaded and eluted at high linear velocity. Results of this study lend credence to the further development of C-CP fibers for biomacromolecule processing on larger scales. © 2013 American Institute of Chemical Engineers.

  3. Effects of Saturated Fat, Polyunsaturated Fat, Monounsaturated Fat, and Carbohydrate on Glucose-Insulin Homeostasis: A Systematic Review and Meta-analysis of Randomised Controlled Feeding Trials

    Science.gov (United States)

    Micha, Renata; Wu, Jason H. Y.; de Oliveira Otto, Marcia C.; Mozaffarian, Dariush

    2016-01-01

    Background Effects of major dietary macronutrients on glucose-insulin homeostasis remain controversial and may vary by the clinical measures examined. We aimed to assess how saturated fat (SFA), monounsaturated fat (MUFA), polyunsaturated fat (PUFA), and carbohydrate affect key metrics of glucose-insulin homeostasis. Methods and Findings We systematically searched multiple databases (PubMed, EMBASE, OVID, BIOSIS, Web-of-Knowledge, CAB, CINAHL, Cochrane Library, SIGLE, Faculty1000) for randomised controlled feeding trials published by 26 Nov 2015 that tested effects of macronutrient intake on blood glucose, insulin, HbA1c, insulin sensitivity, and insulin secretion in adults aged ≥18 years. We excluded trials with non-isocaloric comparisons and trials providing dietary advice or supplements rather than meals. Studies were reviewed and data extracted independently in duplicate. Among 6,124 abstracts, 102 trials, including 239 diet arms and 4,220 adults, met eligibility requirements. Using multiple-treatment meta-regression, we estimated dose-response effects of isocaloric replacements between SFA, MUFA, PUFA, and carbohydrate, adjusted for protein, trans fat, and dietary fibre. Replacing 5% energy from carbohydrate with SFA had no significant effect on fasting glucose (+0.02 mmol/L, 95% CI = -0.01, +0.04; n trials = 99), but lowered fasting insulin (-1.1 pmol/L; -1.7, -0.5; n = 90). Replacing carbohydrate with MUFA lowered HbA1c (-0.09%; -0.12, -0.05; n = 23), 2 h post-challenge insulin (-20.3 pmol/L; -32.2, -8.4; n = 11), and homeostasis model assessment for insulin resistance (HOMA-IR) (-2.4%; -4.6, -0.3; n = 30). Replacing carbohydrate with PUFA significantly lowered HbA1c (-0.11%; -0.17, -0.05) and fasting insulin (-1.6 pmol/L; -2.8, -0.4). Replacing SFA with PUFA significantly lowered glucose, HbA1c, C-peptide, and HOMA. Based on gold-standard acute insulin response in ten trials, PUFA significantly improved insulin secretion capacity (+0.5 pmol/L/min; 0.2, 0
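
    HOMA-IR, one of the outcomes pooled above, is not defined in the record; it is conventionally computed from fasting measurements as

        \mathrm{HOMA\text{-}IR} = \frac{\text{fasting glucose (mmol/L)} \times \text{fasting insulin } (\mu\mathrm{U/mL})}{22.5}

    so the percentage changes in HOMA-IR reported above reflect combined shifts in fasting glucose and fasting insulin.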

  4. Effects of Saturated Fat, Polyunsaturated Fat, Monounsaturated Fat, and Carbohydrate on Glucose-Insulin Homeostasis: A Systematic Review and Meta-analysis of Randomised Controlled Feeding Trials.

    Science.gov (United States)

    Imamura, Fumiaki; Micha, Renata; Wu, Jason H Y; de Oliveira Otto, Marcia C; Otite, Fadar O; Abioye, Ajibola I; Mozaffarian, Dariush

    2016-07-01

    Effects of major dietary macronutrients on glucose-insulin homeostasis remain controversial and may vary by the clinical measures examined. We aimed to assess how saturated fat (SFA), monounsaturated fat (MUFA), polyunsaturated fat (PUFA), and carbohydrate affect key metrics of glucose-insulin homeostasis. We systematically searched multiple databases (PubMed, EMBASE, OVID, BIOSIS, Web-of-Knowledge, CAB, CINAHL, Cochrane Library, SIGLE, Faculty1000) for randomised controlled feeding trials published by 26 Nov 2015 that tested effects of macronutrient intake on blood glucose, insulin, HbA1c, insulin sensitivity, and insulin secretion in adults aged ≥18 years. We excluded trials with non-isocaloric comparisons and trials providing dietary advice or supplements rather than meals. Studies were reviewed and data extracted independently in duplicate. Among 6,124 abstracts, 102 trials, including 239 diet arms and 4,220 adults, met eligibility requirements. Using multiple-treatment meta-regression, we estimated dose-response effects of isocaloric replacements between SFA, MUFA, PUFA, and carbohydrate, adjusted for protein, trans fat, and dietary fibre. Replacing 5% energy from carbohydrate with SFA had no significant effect on fasting glucose (+0.02 mmol/L, 95% CI = -0.01, +0.04; n trials = 99), but lowered fasting insulin (-1.1 pmol/L; -1.7, -0.5; n = 90). Replacing carbohydrate with MUFA lowered HbA1c (-0.09%; -0.12, -0.05; n = 23), 2 h post-challenge insulin (-20.3 pmol/L; -32.2, -8.4; n = 11), and homeostasis model assessment for insulin resistance (HOMA-IR) (-2.4%; -4.6, -0.3; n = 30). Replacing carbohydrate with PUFA significantly lowered HbA1c (-0.11%; -0.17, -0.05) and fasting insulin (-1.6 pmol/L; -2.8, -0.4). Replacing SFA with PUFA significantly lowered glucose, HbA1c, C-peptide, and HOMA. Based on gold-standard acute insulin response in ten trials, PUFA significantly improved insulin secretion capacity (+0.5 pmol/L/min; 0.2, 0.8) whether replacing

  5. Effects of Saturated Fat, Polyunsaturated Fat, Monounsaturated Fat, and Carbohydrate on Glucose-Insulin Homeostasis: A Systematic Review and Meta-analysis of Randomised Controlled Feeding Trials.

    Directory of Open Access Journals (Sweden)

    Fumiaki Imamura

    2016-07-01

    Full Text Available Effects of major dietary macronutrients on glucose-insulin homeostasis remain controversial and may vary by the clinical measures examined. We aimed to assess how saturated fat (SFA), monounsaturated fat (MUFA), polyunsaturated fat (PUFA), and carbohydrate affect key metrics of glucose-insulin homeostasis. We systematically searched multiple databases (PubMed, EMBASE, OVID, BIOSIS, Web-of-Knowledge, CAB, CINAHL, Cochrane Library, SIGLE, Faculty1000) for randomised controlled feeding trials published by 26 Nov 2015 that tested effects of macronutrient intake on blood glucose, insulin, HbA1c, insulin sensitivity, and insulin secretion in adults aged ≥18 years. We excluded trials with non-isocaloric comparisons and trials providing dietary advice or supplements rather than meals. Studies were reviewed and data extracted independently in duplicate. Among 6,124 abstracts, 102 trials, including 239 diet arms and 4,220 adults, met eligibility requirements. Using multiple-treatment meta-regression, we estimated dose-response effects of isocaloric replacements between SFA, MUFA, PUFA, and carbohydrate, adjusted for protein, trans fat, and dietary fibre. Replacing 5% energy from carbohydrate with SFA had no significant effect on fasting glucose (+0.02 mmol/L, 95% CI = -0.01, +0.04; n trials = 99), but lowered fasting insulin (-1.1 pmol/L; -1.7, -0.5; n = 90). Replacing carbohydrate with MUFA lowered HbA1c (-0.09%; -0.12, -0.05; n = 23), 2 h post-challenge insulin (-20.3 pmol/L; -32.2, -8.4; n = 11), and homeostasis model assessment for insulin resistance (HOMA-IR) (-2.4%; -4.6, -0.3; n = 30). Replacing carbohydrate with PUFA significantly lowered HbA1c (-0.11%; -0.17, -0.05) and fasting insulin (-1.6 pmol/L; -2.8, -0.4). Replacing SFA with PUFA significantly lowered glucose, HbA1c, C-peptide, and HOMA. Based on gold-standard acute insulin response in ten trials, PUFA significantly improved insulin secretion capacity (+0.5 pmol/L/min; 0.2, 0.8) whether replacing

  6. Sources of excessive saturated fat, trans fat and sugar consumption in Brazil: an analysis of the first Brazilian nationwide individual dietary survey.

    Science.gov (United States)

    Pereira, Rosangela A; Duffey, Kiyah J; Sichieri, Rosely; Popkin, Barry M

    2014-01-01

    To examine the patterns of consumption of foods high in solid fats and added sugars (SoFAS) in Brazil. Cross-sectional study; individual dietary intake survey. Food intake was assessed by means of two non-consecutive food records. Foods providing >9·1% of energy from saturated fat, or >1·3% of energy from trans fat, or >13% of energy from added sugars per 100 g were classified as high in SoFAS. Brazilian nationwide survey, 2008-2009. Individuals aged ≥10 years old. Mean daily energy intake was 8037 kJ (1921 kcal), 52% of energy came from SoFAS foods. Contribution of SoFAS foods to total energy intake was higher among women (52%) and adolescents (54%). Participants in rural areas (43%) and in the lowest quartile of per capita family income (43%) reported the smallest contribution of SoFAS foods to total energy intake. SoFAS foods were large contributors to total saturated fat (87%), trans fat (89%), added sugar (98%) and total sugar (96%) consumption. The SoFAS food groups that contributed most to total energy intake were meats and beverages. Top SoFAS foods contributing to saturated fat and trans fat intakes were meats and fats and oils. Most of the added and total sugar in the diet was supplied by SoFAS beverages and sweets and desserts. SoFAS foods play an important role in the Brazilian diet. The study identifies options for improving the Brazilian diet and reducing nutrition-related non-communicable chronic diseases, but also points out some limitations of the nutrient-based criteria.

  7. WAter Saturation Shift Referencing (WASSR) for chemical exchange saturation transfer experiments

    Science.gov (United States)

    Kim, Mina; Gillen, Joseph; Landman, Bennett. A.; Zhou, Jinyuan; van Zijl, Peter C.M.

    2010-01-01

    Chemical exchange saturation transfer (CEST) is a contrast mechanism exploiting exchange-based magnetization transfer (MT) between solute and water protons. CEST effects compete with direct water saturation and conventional MT processes and generally can only be quantified through an asymmetry analysis of the water saturation spectrum (Z-spectrum) with respect to the water frequency, a process that is exquisitely sensitive to magnetic field inhomogeneities. Here, it is shown that direct water saturation imaging allows measurement of the absolute water frequency in each voxel, allowing proper centering of Z-spectra on a voxel-by-voxel basis independent of spatial B0 field variations. Optimal acquisition parameters for this “water saturation shift referencing” or “WASSR” approach were estimated using Monte Carlo simulations and later confirmed experimentally. The optimal ratio of the WASSR sweep width to the linewidth of the direct saturation curve was found to be 3.3–4.0, requiring a sampling of 16–32 points. The frequency error was smaller than 1 Hz at signal to noise ratios of 40 or higher. The WASSR method was applied to study glycogen, where the chemical shift difference between the hydroxyl (OH) protons and bulk water protons at 3T is so small (0.75–1.25 ppm) that the CEST spectrum is inconclusive without proper referencing. PMID:19358232

  8. Water saturation shift referencing (WASSR) for chemical exchange saturation transfer (CEST) experiments.

    Science.gov (United States)

    Kim, Mina; Gillen, Joseph; Landman, Bennett A; Zhou, Jinyuan; van Zijl, Peter C M

    2009-06-01

    Chemical exchange saturation transfer (CEST) is a contrast mechanism that exploits exchange-based magnetization transfer (MT) between solute and water protons. CEST effects compete with direct water saturation and conventional MT processes, and generally can only be quantified through an asymmetry analysis of the water saturation spectrum (Z-spectrum) with respect to the water frequency, a process that is exquisitely sensitive to magnetic field inhomogeneities. Here it is shown that direct water saturation imaging allows measurement of the absolute water frequency in each voxel, allowing proper centering of Z-spectra on a voxel-by-voxel basis independently of spatial B(0) field variations. Optimal acquisition parameters for this "water saturation shift referencing" (WASSR) approach were estimated using Monte Carlo simulations and later confirmed experimentally. The optimal ratio of the WASSR sweep width to the linewidth of the direct saturation curve was found to be 3.3-4.0, requiring a sampling of 16-32 points. The frequency error was smaller than 1 Hz at signal-to-noise ratios of 40 or higher. The WASSR method was applied to study glycogen, where the chemical shift difference between the hydroxyl (OH) protons and bulk water protons at 3T is so small (0.75-1.25 ppm) that the CEST spectrum is inconclusive without proper referencing.
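    A worked sketch of the asymmetry analysis referred to in the two WASSR records above: once each voxel's Z-spectrum is re-centred on the WASSR-measured water frequency, the CEST effect is conventionally quantified as MTRasym(Δω) = [S(−Δω) − S(+Δω)]/S0. The Python sketch below assumes a per-voxel water shift has already been estimated; the toy curve and variable names are illustrative, not taken from the papers.

        # Sketch only: re-centre a Z-spectrum using a WASSR-derived water shift, then take the
        # conventional asymmetry MTRasym(dw) = [S(-dw) - S(+dw)] / S0. Names are illustrative.
        import numpy as np

        def mtr_asym(offsets_ppm, z_signal, s0, water_shift_ppm, eval_offset_ppm):
            corrected = offsets_ppm - water_shift_ppm            # centre the spectrum on water
            s_pos = np.interp(+eval_offset_ppm, corrected, z_signal)
            s_neg = np.interp(-eval_offset_ppm, corrected, z_signal)
            return (s_neg - s_pos) / s0

        offsets = np.linspace(-5, 5, 61)                         # saturation offsets in ppm
        z = 1 - 0.8 * np.exp(-((offsets - 0.2) ** 2) / 0.5)      # toy direct-saturation dip, shifted 0.2 ppm
        print(mtr_asym(offsets, z, s0=1.0, water_shift_ppm=0.2, eval_offset_ppm=1.0))   # ~0 for pure direct saturation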

  9. Comparison of pulseoximetry oxygen saturation and arterial oxygen saturation in open heart intensive care unit

    Directory of Open Access Journals (Sweden)

    Alireza Mahoori

    2013-08-01

    Full Text Available Background: Pulseoximetry is widely used in the critical care setting, currently used to guide therapeutic interventions. Few studies have evaluated the accuracy of SPO2 (pulseoximetry oxygen saturation) in the intensive care unit after cardiac surgery. Our objective was to compare pulseoximetry with arterial oxygen saturation (SaO2) during clinical routine in such patients, and to examine the effect of mild acidosis on this relationship. Methods: In an observational prospective study 80 patients were evaluated in the intensive care unit after cardiac surgery. SPO2 was recorded and compared with SaO2 obtained by blood gas analysis. One or serial arterial blood gas analyses (ABGs) were performed via a radial artery line while a reliable pulseoximeter signal was present. One hundred thirty seven samples were collected and for each blood gas analysis, SaO2 and SPO2 were recorded. Results: O2 saturation as a marker of peripheral perfusion was measured by pulseoximetry (SPO2). The mean difference between arterial oxygen saturation and pulseoximetry oxygen saturation was 0.12%±1.6%. A total of 137 paired readings demonstrated good correlation (r=0.754; P<0.0001) between changes in SPO2 and those in SaO2 in samples with normal hemoglobin. Also in forty seven samples with mild acidosis, paired readings demonstrated good correlation (r=0.799; P<0.0001) and the mean difference between SaO2 and SPO2 was 0.05%±1.5%. Conclusion: Data showed that in patients with stable hemodynamics and good signal quality, changes in pulseoximetry oxygen saturation reliably predict equivalent changes in arterial oxygen saturation. Mild acidosis doesn’t alter the relation between SPO2 and SaO2 to any clinically important extent. In conclusion, the pulse oximeter is useful to monitor oxygen saturation in patients with stable hemodynamics.
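    The agreement statistics quoted in this record (a mean SpO2−SaO2 difference with its SD, plus a Pearson correlation) can be reproduced on any set of paired readings with a few lines; the values below are placeholders, not the study data.

        # Sketch: bias (mean difference +/- SD) and Pearson correlation for paired
        # pulse-oximetry (SpO2) and arterial (SaO2) saturation readings. Data are placeholders.
        import numpy as np

        spo2 = np.array([97.0, 95.0, 99.0, 93.0, 96.0])
        sao2 = np.array([96.8, 95.3, 98.7, 93.4, 96.1])

        diff = spo2 - sao2
        bias, sd = diff.mean(), diff.std(ddof=1)
        r = np.corrcoef(spo2, sao2)[0, 1]
        print(f"bias = {bias:+.2f}% +/- {sd:.2f}%, r = {r:.3f}")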

  10. Cerebral time domain-NIRS: reproducibility analysis, optical properties, hemoglobin species and tissue oxygen saturation in a cohort of adult subjects

    OpenAIRE

    Giacalone, Giacomo; Zanoletti, Marta; Contini, Davide; Rebecca, Re; Spinelli, Lorenzo; Roveri, Luisa; Torricelli, Alessandro

    2017-01-01

    The reproducibility of cerebral time-domain near-infrared spectroscopy (TD-NIRS) has not been investigated so far. Besides, reference intervals of cerebral optical properties, of absolute concentrations of deoxygenated-hemoglobin (HbR), oxygenated-hemoglobin (HbO), total hemoglobin (HbT) and tissue oxygen saturation (StO2) and their variability have not been reported. We have addressed these issues on a sample of 88 adult healthy subjects. TD-NIRS measurements at 690, 785, 830 nm were fitted ...

  11. Saturation current spikes eliminated in saturable core transformers

    Science.gov (United States)

    Schwarz, F. C.

    1971-01-01

    Unsaturating composite magnetic core transformer, consisting of two separate parallel cores designed so impending core saturation causes signal generation, terminates high current spike in converter primary circuit. Simplified waveform, demonstrates transformer effectiveness in eliminating current spikes.

  12. Numerical analysis of the Balitsky-Kovchegov equation with running coupling: dependence of the saturation scale on nuclear size and rapidity

    CERN Document Server

    Albacete, J L; Milhano, J G; Salgado, C A; Wiedemann, Urs Achim

    2005-01-01

    We study the effects of including a running coupling constant in high-density QCD evolution. For fixed coupling constant, QCD evolution preserves the initial dependence of the saturation momentum $Q_s$ on the nuclear size $A$ and results in an exponential dependence on rapidity $Y$, $Q^2_s(Y) = Q^2_s(Y_0) \\exp{[ \\bar\\alpha_s d (Y-Y_0) ]}$. For the running coupling case, we re-derive analytical estimates for the $A$- and $Y$-dependences of the saturation scale and test them numerically. The $A$-dependence of $Q_s$ vanishes $\\propto 1/ \\sqrt{Y}$ for large $A$ and $Y$. The $Y$-dependence is reduced to $Q_s^2(Y) \\propto \\exp{(\\Delta^\\prime\\sqrt{Y+X})}$ where we find numerically $\\Delta^\\prime\\simeq 3.2$, approximately 12% smaller than analytical estimates. In contrast to previous analytical work, we find a marked difference between the anomalous dimension $1-\\gamma$ governing the large transverse momentum behaviour of the gluon distribution for fixed coupling ($\\gamma \\simeq 0.65$) and for running coupling ($\\gam...
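    The two rapidity dependences quoted in this abstract can be compared numerically: fixed coupling gives Q_s^2(Y) = Q_s^2(Y_0) exp[ᾱ_s d (Y−Y_0)], while running coupling gives Q_s^2(Y) ∝ exp(Δ′√(Y+X)) with Δ′ ≈ 3.2. In the Python sketch below only Δ′ comes from the abstract; the remaining parameter values are placeholders.

        # Sketch comparing the fixed- and running-coupling growth of the saturation scale
        # quoted in the abstract. All parameter values other than Delta' ~= 3.2 are placeholders.
        import math

        def qs2_fixed(y, y0=0.0, qs2_0=1.0, alpha_bar=0.2, d=4.9):
            return qs2_0 * math.exp(alpha_bar * d * (y - y0))

        def qs2_running(y, x=2.0, delta_prime=3.2, norm=1.0):
            return norm * math.exp(delta_prime * math.sqrt(y + x))

        for y in (2.0, 4.0, 8.0):
            print(y, round(qs2_fixed(y), 2), round(qs2_running(y), 2))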

  13. Investigation of DNA damage response and apoptotic gene methylation pattern in sporadic breast tumors using high throughput quantitative DNA methylation analysis technology

    Directory of Open Access Journals (Sweden)

    Prakash Neeraj

    2010-11-01

    Full Text Available Abstract Background: Sporadic breast cancer like many other cancers is proposed to be a manifestation of abnormal genetic and epigenetic changes. For the past decade our laboratory has identified genes involved in DNA damage response (DDR), apoptosis and immunosurveillance pathways to influence sporadic breast cancer risk in the north Indian population. Further to enhance our knowledge at the epigenetic level, we performed a DNA methylation study involving 17 gene promoter regions belonging to the DNA damage response (DDR) and death receptor apoptotic pathway in 162 paired normal and cancerous breast tissues from 81 sporadic breast cancer patients, using a high throughput quantitative DNA methylation analysis technology. Results: The study identified five genes with statistically significant difference between normal and tumor tissues. Hypermethylation of DR5 (P = 0.001), DCR1 (P = 0.00001), DCR2 (P = 0.0000000005) and BRCA2 (P = 0.007) and hypomethylation of DR4 (P = 0.011) in sporadic breast tumor tissues suggested a weak/aberrant activation of the DDR/apoptotic pathway in breast tumorigenesis. Negative correlation was observed between methylation status and transcript expression levels for TRAIL, DR4, CASP8, ATM, CHEK2, BRCA1 and BRCA2 CpG sites. Categorization of the gene methylation with respect to the clinicopathological parameters showed an increase in aberrant methylation pattern in advanced tumors. These uncharacteristic methylation patterns corresponded with decreased death receptor apoptosis (P = 0.047) and DNA damage repair potential (P = 0.004) in advanced tumors. The observation of BRCA2 -26 G/A 5'UTR polymorphism concomitant with the presence of methylation in the promoter region was novel and emerged as a strong candidate for susceptibility to sporadic breast tumors. Conclusion: Our study indicates that methylation of DDR-apoptotic gene promoters in sporadic breast cancer is not a random phenomenon. Progressive epigenetic alterations in advancing

  14. Intake of saturated and trans unsaturated fatty acids and risk of all cause mortality, cardiovascular disease, and type 2 diabetes: systematic review and meta-analysis of observational studies.

    Science.gov (United States)

    de Souza, Russell J; Mente, Andrew; Maroleanu, Adriana; Cozma, Adrian I; Ha, Vanessa; Kishibe, Teruko; Uleryk, Elizabeth; Budylowski, Patrick; Schünemann, Holger; Beyene, Joseph; Anand, Sonia S

    2015-08-11

    To systematically review associations between intake of saturated fat and trans unsaturated fat and all cause mortality, cardiovascular disease (CVD) and associated mortality, coronary heart disease (CHD) and associated mortality, ischemic stroke, and type 2 diabetes. Systematic review and meta-analysis. Medline, Embase, Cochrane Central Registry of Controlled Trials, Evidence-Based Medicine Reviews, and CINAHL from inception to 1 May 2015, supplemented by bibliographies of retrieved articles and previous reviews. Observational studies reporting associations of saturated fat and/or trans unsaturated fat (total, industrially manufactured, or from ruminant animals) with all cause mortality, CHD/CVD mortality, total CHD, ischemic stroke, or type 2 diabetes. Two reviewers independently extracted data and assessed study risks of bias. Multivariable relative risks were pooled. Heterogeneity was assessed and quantified. Potential publication bias was assessed and subgroup analyses were undertaken. The GRADE approach was used to evaluate quality of evidence and certainty of conclusions. For saturated fat, three to 12 prospective cohort studies for each association were pooled (five to 17 comparisons with 90,501-339,090 participants). Saturated fat intake was not associated with all cause mortality (relative risk 0.99, 95% confidence interval 0.91 to 1.09), CVD mortality (0.97, 0.84 to 1.12), total CHD (1.06, 0.95 to 1.17), ischemic stroke (1.02, 0.90 to 1.15), or type 2 diabetes (0.95, 0.88 to 1.03). There was no convincing lack of association between saturated fat and CHD mortality (1.15, 0.97 to 1.36; P=0.10). For trans fats, one to six prospective cohort studies for each association were pooled (two to seven comparisons with 12,942-230,135 participants). Total trans fat intake was associated with all cause mortality (1.34, 1.16 to 1.56), CHD mortality (1.28, 1.09 to 1.50), and total CHD (1.21, 1.10 to 1.33) but not ischemic stroke (1.07, 0.88 to 1.28) or type 2 diabetes
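    For orientation, pooled relative risks of the kind reported here are built from study-level estimates weighted by the inverse of their variances. The sketch below shows a generic fixed-effect inverse-variance pooling of log relative risks, not the authors' exact random-effects procedure, and the study inputs are invented.

        # Generic inverse-variance pooling of log relative risks (fixed-effect sketch, not the
        # authors' exact random-effects model). Study inputs are invented for illustration.
        import math

        studies = [  # (relative risk, lower 95% CI, upper 95% CI)
            (1.10, 0.95, 1.27),
            (0.98, 0.85, 1.13),
            (1.05, 0.90, 1.22),
        ]

        log_rr = [math.log(rr) for rr, lo, hi in studies]
        se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in studies]
        w = [1.0 / s ** 2 for s in se]

        pooled = sum(wi * lr for wi, lr in zip(w, log_rr)) / sum(w)
        pooled_se = math.sqrt(1.0 / sum(w))
        print(f"pooled RR = {math.exp(pooled):.2f} "
              f"({math.exp(pooled - 1.96 * pooled_se):.2f} to {math.exp(pooled + 1.96 * pooled_se):.2f})")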

  15. Experimental Characterization Of The Saturating, Near Infrared, Self-amplified Spontaneous Emission Free Electron Laser Analysis Of Radiation Properties And Electron Beam Dynamics

    CERN Document Server

    Murokh, A

    2002-01-01

    In this work, the main results of the VISA experiment (Visible to Infrared SASE Amplifier) are presented and analyzed. The purpose of the experiment was to build a state-of-the-art single-pass self-amplified spontaneous emission (SASE) free electron laser (FEL) based on a high-brightness electron beam, and to characterize its operation, including saturation, in the near-infrared spectral region. The experiment was hosted by the Accelerator Test Facility (ATF) at Brookhaven National Laboratory, a user facility that provides high-brightness relativistic electron beams generated with a photoinjector. During the experiment, SASE FEL performance was studied in two regimes: a long-bunch, lower-gain operation and a short-bunch, high-gain regime. The transition between the two conditions was possible due to a novel bunch compression mechanism, which was discovered in the course of the experiment. This compression allowed the variation of peak current in the electron beam before it was launched into the 4-m VISA...

  16. Group epitope mapping considering relaxation of the ligand (GEM-CRL): Including longitudinal relaxation rates in the analysis of saturation transfer difference (STD) experiments

    Science.gov (United States)

    Kemper, Sebastian; Patel, Mitul K.; Errey, James C.; Davis, Benjamin G.; Jones, Jonathan A.; Claridge, Timothy D. W.

    2010-03-01

    In the application of saturation transfer difference (STD) experiments to the study of protein-ligand interactions, the relaxation of the ligand is one of the major influences on the experimentally observed STD factors, making interpretation of these difficult when attempting to define a group epitope map (GEM). In this paper, we describe a simplification of the relaxation matrix that may be applied under specified experimental conditions, which results in a simplified equation reflecting the directly transferred magnetisation rate from the protein onto the ligand, defined as the summation over the whole protein of the protein-ligand cross-relaxation multiplied by the fractional saturation of the protein protons. In this, the relaxation of the ligand is accounted for implicitly by inclusion of the experimentally determined longitudinal relaxation rates. The conditions under which this "group epitope mapping considering relaxation of the ligand" (GEM-CRL) can be applied were tested on a theoretical model system, which demonstrated only minor deviations from that predicted by the full relaxation matrix calculations (CORCEMA-ST) [7]. Furthermore, CORCEMA-ST calculations of two protein-saccharide complexes (Jacalin and TreR) with known crystal structures were performed and compared with experimental GEM-CRL data. It could be shown that the GEM-CRL methodology is superior to the classical group epitope mapping approach currently used for defining ligand-protein proximities. GEM-CRL is also useful for the interpretation of CORCEMA-ST results, because the transferred magnetisation rate provides an additional parameter for the comparison between measured and calculated values. The independence of this parameter from the above-mentioned factors can thereby enhance the value of CORCEMA-ST calculations.
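    The quantity underlying both the classical GEM approach and the GEM-CRL refinement is the per-proton STD factor, (I_off − I_sat)/I_off. The sketch below computes conventional STD factors, normalises them to the strongest signal (the classical GEM step), and then crudely weights each factor by that proton's measured longitudinal relaxation rate; this weighting is only a rough stand-in for the GEM-CRL expression derived in the paper, and all numbers are invented.

        # Sketch: conventional STD factors normalised to the strongest signal (classical GEM),
        # plus a crude relaxation-weighted variant (STD factor scaled by the proton's R1).
        # This is an illustration, not the exact GEM-CRL expression from the paper.

        def std_factor(i_off: float, i_sat: float) -> float:
            """STD factor (I_off - I_sat) / I_off for one ligand proton."""
            return (i_off - i_sat) / i_off

        protons = {                       # name: (I_off, I_sat, R1 in 1/s) -- invented values
            "H1": (1.00, 0.80, 1.2),
            "H2": (1.00, 0.90, 0.8),
            "H3": (1.00, 0.95, 2.0),
        }

        std = {p: std_factor(i0, isat) for p, (i0, isat, r1) in protons.items()}
        gem = {p: 100 * v / max(std.values()) for p, v in std.items()}          # classical GEM (%)
        weighted = {p: std[p] * r1 for p, (i0, isat, r1) in protons.items()}    # relaxation-weighted
        print(gem)
        print({p: round(v, 3) for p, v in weighted.items()})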

  17. The danish tax on saturated fat

    DEFF Research Database (Denmark)

    Jensen, Jørgen Dejgård; Smed, Sinne

    Denmark introduced a new tax on saturated fat in food products with effect from October 2011. The objective of this paper is to make an effect assessment of this tax for some of the product categories most significantly affected by the new tax, namely fats such as butter, butter-blends, margarine...... on saturated fat in food products has had some effects on the market for the considered products, in that the level of consumption of fats dropped by 10 – 20%. Furthermore, the analysis points at shifts in demand from high-price supermarkets towards low-price discount stores – a shift that seems to have been...... – and broaden – the analysis at a later stage, when data are available for a longer period after the introduction of the fat tax....

  18. High-throughput sequencing and analysis of the gill tissue transcriptome from the deep-sea hydrothermal vent mussel Bathymodiolus azoricus

    Directory of Open Access Journals (Sweden)

    Gomes Paula

    2010-10-01

    Full Text Available Abstract Background Bathymodiolus azoricus is a deep-sea hydrothermal vent mussel found in association with large faunal communities living in chemosynthetic environments at the bottom of the sea floor near the Azores Islands. Investigation of the exceptional physiological reactions that vent mussels have adopted in their habitat, including responses to environmental microbes, remains a difficult challenge for deep-sea biologists. In an attempt to reveal genes potentially involved in the deep-sea mussel innate immunity we carried out a high-throughput sequence analysis of the freshly collected B. azoricus transcriptome using gill tissues as the primary source of immune transcripts, given their strategic role in filtering the surrounding waterborne potentially infectious microorganisms. Additionally, a substantial EST data set was produced, from which a comprehensive collection of genes coding for putative proteins was organized in a dedicated database, "DeepSeaVent", the first deep-sea vent animal transcriptome database based on the 454 pyrosequencing technology. Results A normalized cDNA library from gill tissue was sequenced in a full 454 GS-FLX run, producing 778,996 sequencing reads. Assembly of the high quality reads resulted in 75,407 contigs of which 3,071 were singletons. A total of 39,425 transcripts were conceptually translated into amino-acid sequences of which 22,023 matched known proteins in the NCBI non-redundant protein database, 15,839 revealed conserved protein domains through InterPro functional classification and 9,584 were assigned Gene Ontology terms. Queries conducted within the database enabled the identification of genes putatively involved in immune and inflammatory reactions which had not been previously evidenced in the vent mussel. Their physical counterpart was confirmed by semi-quantitative Reverse-Transcription-Polymerase Chain Reactions (RT-PCR) and their RNA transcription level by quantitative PCR (q

  19. Theory of graphene saturable absorption

    Science.gov (United States)

    Marini, A.; Cox, J. D.; García de Abajo, F. J.

    2017-03-01

    Saturable absorption is a nonperturbative nonlinear optical phenomenon that plays a pivotal role in the generation of ultrafast light pulses. Here we show that this effect emerges in graphene at unprecedentedly low light intensities, thus opening avenues to new nonlinear physics and applications in optical technology. Specifically, we theoretically investigate saturable absorption in extended graphene by developing a semianalytical nonperturbative single-particle approach, describing electron dynamics in the atomically-thin material using the two-dimensional Dirac equation for massless Dirac fermions, which is recast in the form of generalized Bloch equations. By solving the electron dynamics nonperturbatively, we account for both interband and intraband contributions to the intensity-dependent saturated conductivity and conclude that the former dominates regardless of the intrinsic doping state of the material. We obtain results in qualitative agreement with atomistic quantum-mechanical simulations of graphene nanoribbons including electron-electron interactions, finite-size, and higher-band effects. Remarkably, such effects are found to affect mainly the linear absorption, while the predicted saturation intensities are in good quantitative agreement in the limit of extended graphene. Additionally, we find that the modulation depth of saturable absorption in graphene can be electrically manipulated through an externally applied gate voltage. Our results are relevant for the development of graphene-based optoelectronic devices, as well as for applications in mode-locking and random lasers.
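    Outside the nonperturbative Dirac/Bloch-equation treatment used in this work, saturable absorption is often summarised by the phenomenological relation α(I) = α0/(1 + I/I_sat) + α_ns, where α_ns is a non-saturable background and the modulation depth is set by α0. The sketch below evaluates that simpler textbook model; the parameter values (including α0 ≈ 2.3%, the linear absorption of undoped graphene) are placeholders, not results from the paper.

        # Phenomenological saturable-absorber model (textbook form, not the nonperturbative
        # Dirac/Bloch-equation treatment of the paper). Parameter values are placeholders.
        def absorption(intensity, alpha0=0.023, i_sat=1.0, alpha_ns=0.005):
            """alpha(I) = alpha0 / (1 + I/I_sat) + alpha_ns (non-saturable background)."""
            return alpha0 / (1.0 + intensity / i_sat) + alpha_ns

        for i in (0.0, 0.5, 1.0, 5.0, 50.0):          # intensity in units of I_sat
            print(i, round(absorption(i), 4))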

  20. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jinyang; Ji, Xinghu [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); He, Zhike, E-mail: zhkhe@whu.edu.cn [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); Suzhou Institute of Wuhan University, Suzhou 215123 (China)

    2015-08-12

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy was realized by connecting a positive-pressure input device, the sample container and the microfluidic chip through Tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so that the sample was delivered into the microchip from the sample container under positive pressure. This sample-introduction technique is so robust and compatible that it could be integrated with T-junction, flow-focus or valve-assisted droplet microchips. By choosing a PDMS adaptor with the proper dimensions, the microchip could be flexibly equipped with various types of familiar sample containers, which makes sampling more straightforward without tedious sample transfer or loading. Convenient sample changing was easily achieved by moving the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied for quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through the investigation of the quenching efficiency of a ruthenium complex towards QD fluorescence. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, which shows its practicability and potential in high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy of concentration gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • A multiplex DNA assay was successfully carried out in the droplet platform.

  1. Enhancing analysis throughput, sensitivity and specificity in LC/ESI-MS/MS assay of plasma 25-hydroxyvitamin D3 by derivatization with triplex 4-(4-dimethylaminophenyl)-1,2,4-triazoline-3,5-dione (DAPTAD) isotopologues.

    Science.gov (United States)

    Ogawa, Shoujiro; Kittaka, Hiroki; Nakata, Akiho; Komatsu, Kenji; Sugiura, Takahiro; Satoh, Mamoru; Nomura, Fumio; Higashi, Tatsuya

    2017-03-20

    The plasma/serum concentration of 25-hydroxyvitamin D3 [25(OH)D3] is a diagnostic index for vitamin D deficiency/insufficiency, which is associated with a wide range of diseases, such as rickets, cancer and diabetes. We have reported that the derivatization with 4-(4-dimethylaminophenyl)-1,2,4-triazoline-3,5-dione (DAPTAD) works well in the liquid chromatography/electrospray ionization-tandem mass spectrometry (LC/ESI-MS/MS) assay of the serum/plasma 25(OH)D3 for enhancing the sensitivity and the separation from a potent interfering metabolite, 3-epi-25-hydroxyvitamin D3 [3-epi-25(OH)D3]. However, enhancing the analysis throughput remains an issue in the LC/ESI-MS/MS assay of 25(OH)D3. The most obvious restriction of the LC/MS/MS throughput is the chromatographic run time. In this study, we developed an enhanced throughput method for the determination of the plasma 25(OH)D3 by LC/ESI-MS/MS combined with the derivatization using the triplex (2H0-, 2H3- and 2H6-) DAPTAD isotopologues. After separate derivatization with 1 of 3 different isotopologues, the 3 samples were combined and injected together into LC/ESI-MS/MS. Based on the mass differences between the isotopologues, the derivatized 25(OH)D3 in the 3 different samples was quantified within a single run. The developed method tripled the hourly analysis throughput without sacrificing assay performance, i.e., ease of pretreatment of the plasma sample (only deproteinization), limit of quantification (1.0 ng/mL when 5 μL of plasma was used), precision (intra-assay RSD≤5.9% and inter-assay RSD≤5.5%), accuracy (98.7-102.2%), matrix effects, and capability of separating from an interfering metabolite, 3-epi-25(OH)D3. The multiplexing of samples by the isotopologue derivatization was applied to the analysis of plasma samples of healthy subjects and the developed method was proven to have satisfactory applicability. Copyright © 2016 Elsevier B.V. All rights reserved.
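    The "tripled hourly analysis throughput" follows directly from pooling three differently labelled samples into one chromatographic run. A back-of-the-envelope sketch, assuming an illustrative 10 min LC cycle time (the actual cycle time is not stated in the abstract):

        # Back-of-the-envelope throughput gain from n-plex isotopologue labelling: n samples
        # share one LC run. The 10-minute cycle time is an assumption, not from the abstract.
        def samples_per_hour(lc_cycle_min: float, plex: int) -> float:
            return 60.0 / lc_cycle_min * plex

        print(samples_per_hour(10.0, 1))   # singleplex: 6 samples/h
        print(samples_per_hour(10.0, 3))   # triplex DAPTAD: 18 samples/h (3x)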

  2. Saturated Zone In-Situ Testing

    Energy Technology Data Exchange (ETDEWEB)

    P. W. Reimus; M. J. Umari

    2003-12-23

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that have been conducted to test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain. The test interpretations provide estimates of flow and transport parameters that are used in the development of parameter distributions for Total System Performance Assessment (TSPA) calculations. These parameter distributions are documented in the revisions to the SZ flow model report (BSC 2003 [ 162649]), the SZ transport model report (BSC 2003 [ 162419]), the SZ colloid transport report (BSC 2003 [162729]), and the SZ transport model abstraction report (BSC 2003 [1648701]). Specifically, this scientific analysis report provides the following information that contributes to the assessment of the capability of the SZ to serve as a barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvium Testing Complex (ATC), which is located at the southwestern corner of the Nevada Test Site (NTS). The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, and

  3. Cerebral time domain-NIRS: reproducibility analysis, optical properties, hemoglobin species and tissue oxygen saturation in a cohort of adult subjects.

    Science.gov (United States)

    Giacalone, Giacomo; Zanoletti, Marta; Contini, Davide; Re, Rebecca; Spinelli, Lorenzo; Roveri, Luisa; Torricelli, Alessandro

    2017-11-01

    The reproducibility of cerebral time-domain near-infrared spectroscopy (TD-NIRS) has not been investigated so far. Besides, reference intervals of cerebral optical properties, of absolute concentrations of deoxygenated-hemoglobin (HbR), oxygenated-hemoglobin (HbO), total hemoglobin (HbT) and tissue oxygen saturation (StO2) and their variability have not been reported. We have addressed these issues on a sample of 88 adult healthy subjects. TD-NIRS measurements at 690, 785, 830 nm were fitted with the diffusion model for semi-infinite homogenous media. Reproducibility, performed on 3 measurements at 5 minutes intervals, ranges from 1.8 to 6.9% for each of the hemoglobin species. The mean ± SD global values of HbR, HbO, HbT, StO2 are respectively 24 ± 7 μM, 33.3 ± 9.5 μM, 57.4 ± 15.8 μM, 58 ± 4.2%. StO2 displays the narrowest range of variability across brain regions.
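    The derived quantities in this record follow directly from the fitted chromophore concentrations: HbT = HbO + HbR and StO2 = 100 × HbO/HbT. A two-line check against the cohort means quoted above (rounding explains 57.3 vs 57.4 μM):

        # Derived TD-NIRS quantities: total hemoglobin and tissue oxygen saturation,
        # checked against the cohort means quoted in the abstract.
        hbo, hbr = 33.3, 24.0                  # uM, mean values from the abstract
        hbt = hbo + hbr                        # total hemoglobin
        sto2 = 100.0 * hbo / hbt               # tissue oxygen saturation (%)
        print(round(hbt, 1), round(sto2, 1))   # ~57.3 uM, ~58.1 %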

  4. Meta-analysis of field-saturated hydraulic conductivity recovery following wildland fire: Applications for hydrologic model parameterization and resilience assessment

    Science.gov (United States)

    Ebel, Brian A.; Martin, Deborah

    2017-01-01

    Hydrologic recovery after wildfire is critical for restoring the ecosystem services of protection of human lives and infrastructure from hazards and delivery of a water supply of sufficient quality and quantity. Recovery of soil-hydraulic properties, such as field-saturated hydraulic conductivity (Kfs), is a key factor for assessing the duration of watershed-scale flash flood and debris flow risks after wildfire. Despite the crucial role of Kfs in parameterizing numerical hydrologic models to predict the magnitude of postwildfire run-off and erosion, existing quantitative relations to predict Kfs recovery with time since wildfire are lacking. Here, we conduct meta-analyses of 5 datasets from the literature that measure or estimate Kfs with time since wildfire for longer than a 3-year duration. The meta-analyses focus on fitting 2 quantitative relations (linear and non-linear logistic) to explain trends in Kfs temporal recovery. The 2 relations adequately described temporal recovery except for 1 site where macropore flow dominated infiltration and Kfs recovery. This work also suggests that Kfs can have low hydrologic resistance (large postfire changes), and moderate to high hydrologic stability (recovery time relative to disturbance recurrence interval) and resilience (recovery of hydrologic function and provision of ecosystem services). Future Kfs relations could more explicitly incorporate processes such as soil-water repellency, ground cover and soil structure regeneration, macropore recovery, and vegetation regrowth.
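    One of the two recovery relations fitted in this meta-analysis is a non-linear logistic curve in time since wildfire. A hedged sketch of fitting a generic logistic form, Kfs(t) = K_max/(1 + exp(−k(t − t0))), with SciPy on invented data (the actual datasets and parameterisation are given in the paper):

        # Sketch: fit a logistic recovery curve Kfs(t) = K_max / (1 + exp(-k*(t - t0))) to
        # field-saturated hydraulic conductivity vs. time since fire. Data are invented.
        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, k_max, k, t0):
            return k_max / (1.0 + np.exp(-k * (t - t0)))

        years = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
        kfs = np.array([5.0, 8.0, 20.0, 45.0, 60.0, 66.0])     # mm/h, illustrative only

        params, _ = curve_fit(logistic, years, kfs, p0=[70.0, 1.0, 2.5])
        print(dict(zip(["K_max", "k", "t0"], np.round(params, 2))))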

  5. Start-up and saturation in self-amplified spontaneous emission free-electron lasers using a time-independent analysis.

    Science.gov (United States)

    Kumar, Vinit; Krishnagopal, Srinivas

    2002-01-01

    Numerical simulation of self-amplified spontaneous emission (SASE) in free-electron lasers (FELs) is typically performed using time-dependent computer codes, which require large amounts of CPU time and memory. Recently, Yu [Phys. Rev. E 58, 4991 (1998)] has shown that one can even use a time-independent code for this purpose (where the requirement on CPU time and memory is significantly reduced) by modifying it to include multiple phase-space buckets and using a scaling relation between the output power and the number of simulation particles, which is valid only in the linear regime. In this paper, we take a fresh look at the problem and show that incorporating multiple buckets in TDA3D is not needed to simulate the SASE process. We give a new interpretation of time-independent simulations of the SASE process and present detailed justification for using a single-frequency steady-state simulation code for the study of the evolution of shot noise. We further extend the simulation studies to the nonlinear regime by modifying the code TDA3D to take the incoherent input power. We use this technique to study the start-up and saturation of the TTF-II FEL at DESY and discuss the results.

  6. Cerebral time domain-NIRS: reproducibility analysis, optical properties, hemoglobin species and tissue oxygen saturation in a cohort of adult subjects

    Science.gov (United States)

    Giacalone, Giacomo; Zanoletti, Marta; Contini, Davide; Re, Rebecca; Spinelli, Lorenzo; Roveri, Luisa; Torricelli, Alessandro

    2017-01-01

    The reproducibility of cerebral time-domain near-infrared spectroscopy (TD-NIRS) has not been investigated so far. Besides, reference intervals of cerebral optical properties, of absolute concentrations of deoxygenated-hemoglobin (HbR), oxygenated-hemoglobin (HbO), total hemoglobin (HbT) and tissue oxygen saturation (StO2) and their variability have not been reported. We have addressed these issues on a sample of 88 adult healthy subjects. TD-NIRS measurements at 690, 785, 830 nm were fitted with the diffusion model for semi-infinite homogenous media. Reproducibility, performed on 3 measurements at 5 minutes intervals, ranges from 1.8 to 6.9% for each of the hemoglobin species. The mean ± SD global values of HbR, HbO, HbT, StO2 are respectively 24 ± 7 μM, 33.3 ± 9.5 μM, 57.4 ± 15.8 μM, 58 ± 4.2%. StO2 displays the narrowest range of variability across brain regions. PMID:29188096

  7. Nonlinear Gain Saturation in Active Slow Light Photonic Crystal Waveguides

    DEFF Research Database (Denmark)

    Chen, Yaohui; Mørk, Jesper

    2013-01-01

    We present a quantitative three-dimensional analysis of slow-light enhanced traveling wave amplification in an active semiconductor photonic crystal waveguide. The impact of slow-light propagation on the nonlinear gain saturation of the device is investigated.
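    At a much simpler level than the three-dimensional model analysed in these records, the device-level behaviour is often summarised by the homogeneous gain-saturation relation g = g0/(1 + P/P_sat), with the slow-light (group-index) enhancement folded into g0 as an assumption. The sketch below evaluates that reduced model with placeholder parameters; it is not the model of the papers.

        # Minimal sketch of homogeneous gain saturation, g = g0 / (1 + P/P_sat), with a
        # slow-light enhancement folded into g0 as an assumption (not the 3D model of the papers).
        def saturated_gain(power_mw, g0_per_mm=0.1, p_sat_mw=5.0, ng_enhancement=1.0):
            return ng_enhancement * g0_per_mm / (1.0 + power_mw / p_sat_mw)

        for p in (0.1, 1.0, 5.0, 20.0):
            print(p, round(saturated_gain(p, ng_enhancement=4.0), 4))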

  8. Simulation of Nonlinear Gain Saturation in Active Photonic Crystal Waveguides

    DEFF Research Database (Denmark)

    Chen, Yaohui; Mørk, Jesper

    2012-01-01

    In this paper we present a theoretical analysis of slow-light enhanced traveling wave amplification in an active semiconductor photonic crystal waveguide. The impact of group index on nonlinear modal gain saturation is investigated.

  9. Noise and saturation properties of semiconductor quantum dot optical amplifiers

    DEFF Research Database (Denmark)

    Berg, Tommy Winther; Mørk, Jesper

    2002-01-01

    We present a detailed theoretical analysis of quantum dot optical amplifiers. Due to the presence of a reservoir of wetting layer states, the saturation and noise properties differ markedly from bulk or QW amplifiers and may be significantly improved.

  10. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    Science.gov (United States)

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-05

    Data-dependent acquisition (DDA) and data-independent acquisition strategies (DIA) have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well-published, where DDA is typically applied for deep discovery and DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  11. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high dimensional neuroinformatic representations index containing O(E3-E4) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, (ii) integration of image and non-image information for diagnosis and prognosis.

  12. MARINE-EXPRESS: taking advantage of high throughput cloning and expression strategies for the post-genomic analysis of marine organisms

    Directory of Open Access Journals (Sweden)

    Power Déborah

    2010-06-01

    Full Text Available Abstract Background The production of stable and soluble proteins is one of the most important steps prior to structural and functional studies of biological importance. We investigated the parallel production in a medium throughput strategy of genes coding for proteins from various marine organisms, using protocols that involved recombinatorial cloning, protein expression screening and batch purification. This strategy was applied in order to respond to the need for post-genomic validation of