WorldWideScience

Sample records for saturation throughput analysis

  1. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)]

    2007-01-01

    High-throughput screening (HTS) techniques have been applied to many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; finer images can be obtained by forming the imaging probe from capillaries with smaller outer diameters. Applying capillary electrophoresis separates the product from the substrate in the reaction mixture, so the product does not need optical properties different from those of the substrate. UV absorption detection is nearly universal for organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue or in screening multiple immobilized catalysts. A second high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of an enzyme microarray is focused onto the CCD through an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on that spot yields the reaction rate. The same microarray can be reused many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
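
    A note on the kinetic readout: the CCD-based approach above amounts to estimating each spot's reaction rate from the slope of its intensity-versus-time trace. The sketch below is a minimal illustration of that slope estimate on synthetic data; `t`, `intensity` and `frac` are hypothetical names and values, not code from the dissertation.

```python
import numpy as np

def initial_rate(t, intensity, frac=0.2):
    """Estimate a reaction rate from the early, roughly linear part of an
    intensity-vs-time trace recorded for one microarray spot.

    t         : acquisition times (s)
    intensity : background-corrected spot intensities (arbitrary units)
    frac      : fraction of the trace treated as the initial linear regime
    """
    n = max(2, int(len(t) * frac))               # number of early points to fit
    slope, _intercept = np.polyfit(t[:n], intensity[:n], 1)
    return slope                                 # a.u. per second, proportional to rate

# Hypothetical trace: 60 frames, 1 s apart, for one enzyme spot
rng = np.random.default_rng(0)
t = np.arange(60.0)
intensity = 5.0 * (1.0 - np.exp(-t / 40.0)) + rng.normal(0.0, 0.05, t.size)
print(f"initial rate ~ {initial_rate(t, intensity):.3f} a.u./s")
```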

  2. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Sobral Romao, J.I.; Baiao Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for

  3. Throughput Analysis of Large Wireless Networks with Regular Topologies

    Directory of Open Access Journals (Sweden)

    Hong Kezhu

    2007-01-01

    Full Text Available The throughput of large wireless networks with regular topologies is analyzed under two medium-access control schemes: the synchronous array method (SAM) and slotted ALOHA. The regular topologies considered are square, hexagon, and triangle. Both nonfading channels and Rayleigh fading channels are examined. Furthermore, both omnidirectional antennas and directional antennas are considered. Our analysis shows that SAM leads to a much higher network throughput than slotted ALOHA. The network throughput in this paper is measured in either bits-hops per second per Hertz per node or bits-meters per second per Hertz per node. The exact connection between the two measures is shown for each topology. With these two fundamental units, the network throughput shown in this paper can serve as a reliable benchmark for future work on the throughput of large networks.

  4. Throughput Analysis of Large Wireless Networks with Regular Topologies

    Directory of Open Access Journals (Sweden)

    Kezhu Hong

    2007-04-01

    Full Text Available The throughput of large wireless networks with regular topologies is analyzed under two medium-access control schemes: the synchronous array method (SAM) and slotted ALOHA. The regular topologies considered are square, hexagon, and triangle. Both nonfading channels and Rayleigh fading channels are examined. Furthermore, both omnidirectional antennas and directional antennas are considered. Our analysis shows that SAM leads to a much higher network throughput than slotted ALOHA. The network throughput in this paper is measured in either bits-hops per second per Hertz per node or bits-meters per second per Hertz per node. The exact connection between the two measures is shown for each topology. With these two fundamental units, the network throughput shown in this paper can serve as a reliable benchmark for future work on the throughput of large networks.
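
    Records 3 and 4 report throughput in bits-hops/s/Hz/node and bits-meters/s/Hz/node. The sketch below is only a generic baseline illustration: it uses the classical slotted-ALOHA success rate S = G·e^(-G) (not the SAM analysis of the paper) and a hypothetical uniform hop distance `d_hop` to convert between the two units.

```python
import numpy as np

def slotted_aloha_throughput(G):
    """Classical slotted-ALOHA throughput: expected successful transmissions
    per slot when the total offered load is G packets/slot (Poisson traffic)."""
    return G * np.exp(-G)

def bits_hops_to_bits_meters(thr_bits_hops, d_hop):
    """Convert bits-hops/s/Hz/node into bits-meters/s/Hz/node for a regular
    topology in which every hop spans the same distance d_hop (meters)."""
    return thr_bits_hops * d_hop

G = np.linspace(0.1, 3.0, 30)
S = slotted_aloha_throughput(G)
print(f"peak slotted-ALOHA throughput ~ {S.max():.3f} at G ~ {G[S.argmax()]:.2f}")
# Hypothetical example: 0.2 bits-hops/s/Hz/node on a grid with 50 m hops
print(bits_hops_to_bits_meters(0.2, 50.0), "bits-meters/s/Hz/node")
```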

  5. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-01-01

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  6. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research - PhD work on discovery of new allergens - Postdoctoral work on Transcriptional Start Sites a) Tag based technologies allow higher throughput b) CAGE technology to define promoters c) CAGE data analysis to understand Transcription - Wo

  7. Performance Analysis of Non-saturated IEEE 802.11 DCF Networks

    Science.gov (United States)

    Zhai, Linbo; Zhang, Xiaomin; Xie, Gang

    This letter presents a queueing-theoretic model to analyze the performance of non-saturated IEEE 802.11 DCF networks. We use a closed queueing network model and derive an approximate expression for throughput that reveals the relationship between throughput and total offered load under finite traffic load conditions. The accuracy of the model is verified by extensive simulations.
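
    For context on the saturated operating point that such finite-load models approach as the offered load grows, here is a minimal sketch of the classical Bianchi fixed-point model for IEEE 802.11 DCF saturation throughput. It is not the closed queueing-network model of this letter, and the slot/frame durations are illustrative placeholders rather than parameters of a specific PHY.

```python
import numpy as np

def bianchi_saturation_throughput(n, W=32, m=5, payload_bits=8184,
                                  slot=20e-6, Ts=1.2e-3, Tc=1.1e-3):
    """Bianchi-style saturation throughput (bit/s) for n contending stations.

    W, m   : minimum contention window and number of backoff stages
    Ts, Tc : durations of a successful and of a collided transmission (s);
             placeholder values, not taken from a specific standard
    """
    tau = 0.05                                    # transmission probability, initial guess
    for _ in range(2000):                         # damped fixed-point iteration
        p = 1.0 - (1.0 - tau) ** (n - 1)          # conditional collision probability
        tau_new = 2 * (1 - 2 * p) / ((1 - 2 * p) * (W + 1)
                                     + p * W * (1 - (2 * p) ** m))
        tau = 0.5 * tau + 0.5 * tau_new
    Ptr = 1.0 - (1.0 - tau) ** n                  # prob. of at least one transmission per slot
    Ps = n * tau * (1.0 - tau) ** (n - 1) / Ptr   # prob. that a transmission succeeds
    slot_avg = (1 - Ptr) * slot + Ptr * Ps * Ts + Ptr * (1 - Ps) * Tc
    return Ps * Ptr * payload_bits / slot_avg

for n in (5, 10, 20, 50):
    print(f"n={n:2d}: saturation throughput ~ {bianchi_saturation_throughput(n)/1e6:.2f} Mbit/s")
```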

  8. Max-plus algebraic throughput analysis of synchronous dataflow graphs

    NARCIS (Netherlands)

    de Groote, Robert; Kuper, Jan; Broersma, Haitze J.; Smit, Gerardus Johannes Maria

    2012-01-01

    In this paper we present a novel approach to throughput analysis of synchronous dataflow (SDF) graphs. Our approach is based on describing the evolution of actor firing times as a linear time-invariant system in max-plus algebra. Experimental results indicate that our approach is faster than
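
    For the special case of a homogeneous SDF graph, the throughput obtained from such a max-plus formulation is the reciprocal of the maximum cycle mean (the max-plus eigenvalue) of the delay matrix. The sketch below illustrates only that special case; the matrix `A` is a hypothetical 3-actor example (entry A[i, j] = delay from actor j to actor i, -inf where there is no dependency), and the paper's treatment of general SDF graphs is not reproduced.

```python
import numpy as np

NEG = -np.inf  # max-plus "zero": no dependency between the two actors

def maxplus_matmul(A, B):
    """(A (x) B)[i, j] = max_k (A[i, k] + B[k, j])."""
    return np.max(A[:, :, None] + B[None, :, :], axis=1)

def max_cycle_mean(A):
    """Maximum cycle mean of A: best closed-walk weight of length k divided
    by k, maximized over k <= n (equals the max-plus eigenvalue)."""
    n = A.shape[0]
    P, best = A.copy(), -np.inf
    for k in range(1, n + 1):
        if k > 1:
            P = maxplus_matmul(P, A)
        best = max(best, np.max(np.diag(P)) / k)
    return best

# Hypothetical 3-actor homogeneous SDF graph: A[i, j] = delay of edge j -> i
A = np.array([[NEG, 2.0, NEG],
              [3.0, NEG, 1.0],
              [NEG, 4.0, NEG]])
lam = max_cycle_mean(A)
print(f"max-plus eigenvalue = {lam}, throughput = {1.0 / lam:.3f} iterations per time unit")
```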

  9. Stochastic analysis of radionuclide migration in saturated-unsaturated soils

    International Nuclear Information System (INIS)

    Kawanishi, Moto

    1988-01-01

    In Japan, storage of low-level radioactive waste (LLRW) generated by nuclear power plants is scheduled to begin at a centralized facility at the Shimokita site in 1990, and the stored waste could be transferred to land disposal if its safety is positively confirmed. A sound safety assessment method for the land disposal of LLRW is therefore needed. In this study, a stochastic model was constructed to analyze radionuclide migration in saturated-unsaturated soils. The principal results are summarized as follows. 1) We presented a generalized approach to modeling radionuclide migration in saturated-unsaturated soils as an advection-dispersion phenomenon accompanied by radionuclide decay and adsorption/desorption in the soil. 2) Based on this migration model, we developed a stochastic analysis model for radionuclide migration in saturated-unsaturated soils. 3) Comparison of simulated results with exact solutions for a few simple one-dimensional advection-dispersion problems confirmed the validity of the model. 4) Comparison of simulated results with experimental results for radionuclide migration in a one-dimensional unsaturated soil column under rainfall demonstrated good applicability. 5) Because a stochastic model of this kind readily represents the underlying physical phenomena and introduces essentially no numerical dissipation, it should be well suited to analyzing complicated radionuclide migration in saturated-unsaturated soils. (author)
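
    Stochastic treatments of advection-dispersion with decay and linear sorption are commonly implemented as random-walk particle tracking: each particle takes a drift step plus a Gaussian dispersive step, and decay is applied as a mass weight. The 1-D sketch below is a generic illustration of that idea with hypothetical parameter values; it is not the author's model.

```python
import numpy as np

def particle_tracking_1d(n_particles=20000, t_end=100.0, dt=0.1,
                         v=0.05, D=0.01, R=2.0, half_life=30.0, seed=1):
    """Random-walk particle tracking for 1-D advection-dispersion with
    first-order decay and linear equilibrium sorption (retardation factor R).

    v : pore-water velocity (m/d), D : dispersion coefficient (m^2/d),
    half_life : radionuclide half-life (d). Returns particle positions at
    t_end and the surviving mass fraction.
    """
    rng = np.random.default_rng(seed)
    lam = np.log(2.0) / half_life
    x = np.zeros(n_particles)                     # all mass released at x = 0
    for _ in range(int(t_end / dt)):
        x += (v / R) * dt + rng.normal(0.0, np.sqrt(2.0 * (D / R) * dt), n_particles)
    mass_fraction = np.exp(-lam * t_end)          # identical decay weight for every particle
    return x, mass_fraction

x, w = particle_tracking_1d()
print(f"mean plume position {x.mean():.2f} m, spread {x.std():.2f} m, "
      f"remaining mass fraction {w:.3f}")
```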

  10. Microscopic analysis of saturable absorbers: Semiconductor saturable absorber mirrors versus graphene

    Energy Technology Data Exchange (ETDEWEB)

    Hader, J.; Moloney, J. V. [Nonlinear Control Strategies, Inc., 3542 N. Geronimo Ave., Tucson, Arizona 85705 (United States); College of Optical Sciences, University of Arizona, Tucson, Arizona 85721 (United States); Yang, H.-J.; Scheller, M. [College of Optical Sciences, University of Arizona, Tucson, Arizona 85721 (United States); Koch, S. W. [Department of Physics and Materials Sciences Center, Philipps Universität Marburg, Renthof 5, 35032 Marburg (Germany)

    2016-02-07

    Fully microscopic many-body calculations are used to study the influence of strong sub-picosecond pulses on the carrier distributions and corresponding optical response in saturable absorbers used for mode-locking—semiconductor (quantum well) saturable absorber mirrors (SESAMs) and single layer graphene based saturable absorber mirrors (GSAMs). Unlike in GSAMs, the saturation fluence and recovery time in SESAMs show a strong spectral dependence. While the saturation fluence in the SESAM is minimal at the excitonic bandgap, the optimal recovery time and least pulse distortion due to group delay dispersion are found for excitation higher in the first subband. For excitation near the SESAM bandgap, the saturation fluence is about one tenth of that in the GSAM. At energies above the bandgap, the fluences in both systems become similar. A strong dependence of the saturation fluence on the pulse width in both systems is caused by carrier relaxation during the pulse. The recovery time in graphene is found to be about two to four times faster than that in the SESAMs. The occurrence of negative differential transmission in graphene is shown to be caused by dopant related carriers. In SESAMs, a negative differential transmission is found when exciting below the excitonic resonance where excitation induced dephasing leads to an enhancement of the absorption. Comparisons of the simulation data to the experiment show a very good quantitative agreement.

  11. Container Throughput Forecasting Using Dynamic Factor Analysis and ARIMAX Model

    Directory of Open Access Journals (Sweden)

    Marko Intihar

    2017-11-01

    Full Text Available The paper examines the impact of integrating macroeconomic indicators on the accuracy of a container throughput time-series forecasting model. For this purpose, dynamic factor analysis and an AutoRegressive Integrated Moving-Average model with eXogenous inputs (ARIMAX) are used. Both methodologies are integrated into a novel four-stage heuristic procedure. Firstly, dynamic factors are extracted from external macroeconomic indicators influencing the observed throughput. Secondly, a family of ARIMAX models of different orders is generated based on the derived factors. In the third stage, diagnostic and goodness-of-fit testing is applied, covering statistical criteria such as fit performance, information criteria, and parsimony. Finally, the best model is heuristically selected and tested on real data from the Port of Koper. The results show that applying macroeconomic indicators in the forecasting model yields more accurate future throughput forecasts. The model is also used to produce forecasts for the next four years, which indicate more oscillatory behaviour in the period 2018-2020. Hence, care must be taken with any major investment decisions initiated by the port management. The proposed model is believed to be a useful reinforcement of the existing forecasting module in the observed port.
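
    A minimal sketch of the factor-plus-ARIMAX idea is given below under strong simplifications: the "dynamic factor" is stood in for by the first principal component of standardized macroeconomic indicators, the data are synthetic placeholders, and only the statsmodels SARIMAX interface with exogenous inputs is shown, not the paper's four-stage procedure.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)

# Synthetic placeholders: 40 quarters of throughput and 3 macro indicators
n = 40
macro = rng.normal(size=(n, 3)).cumsum(axis=0)
throughput = 100.0 + 0.8 * macro[:, 0] + rng.normal(scale=2.0, size=n)

# Crude stand-in for dynamic factor extraction: first principal component
z = (macro - macro.mean(axis=0)) / macro.std(axis=0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
factor = (z @ vt[0])[:, None]          # one common factor driving the indicators

# ARIMAX: ARIMA(1,1,1) for throughput with the factor as exogenous input
result = SARIMAX(throughput, exog=factor, order=(1, 1, 1)).fit(disp=False)

# Forecast 4 steps ahead; future factor values would normally come from their
# own model, here they are simply held at the last observed value
future_factor = np.full((4, 1), factor[-1, 0])
print(result.forecast(steps=4, exog=future_factor))
```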

  12. Analysis of a Heroin Epidemic Model with Saturated Treatment Function

    Directory of Open Access Journals (Sweden)

    Isaac Mwangi Wangari

    2017-01-01

    Full Text Available A mathematical model is developed that examines how heroin addiction spreads in society. The model is formulated to take into account the treatment of heroin users by incorporating a realistic functional form that “saturates”, representing the limited availability of treatment. Bifurcation analysis reveals that the model has an intrinsic backward bifurcation whenever the saturation parameter is larger than a fixed threshold. We are particularly interested in studying the model’s global stability. In the absence of backward bifurcations, Lyapunov functions can often be found and used to prove global stability. However, in the presence of backward bifurcations, such Lyapunov functions may not exist or may be difficult to construct. We make use of the geometric approach to global stability to derive a condition that ensures that the system is globally asymptotically stable. Numerical simulations are also presented to give a more complete representation of the model dynamics. Sensitivity analysis performed by Latin hypercube sampling (LHS) suggests that the effective contact rate in the population, the relapse rate of heroin users undergoing treatment, and the extent of saturation of heroin users are the mechanisms fuelling the proliferation of the heroin epidemic.

  13. Analysis of an SEIR Epidemic Model with Saturated Incidence and Saturated Treatment Function

    Directory of Open Access Journals (Sweden)

    Jinhong Zhang

    2014-01-01

    Full Text Available The dynamics of an SEIR epidemic model with a saturated incidence rate and a saturated treatment function are explored in this paper. The basic reproduction number that determines disease extinction and disease survival is given. Threshold conditions for the existence of all kinds of equilibrium points are obtained. Sufficient conditions are established for the existence of backward bifurcation. The local asymptotic stability of equilibria is verified by analyzing the eigenvalues and using the Routh-Hurwitz criterion. We also discuss the global asymptotic stability of the endemic equilibrium by the autonomous convergence theorem. The study indicates that we should improve the efficiency and enlarge the capacity of treatment to control the spread of the disease. Numerical simulations are presented to support and complement the theoretical findings.
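
    A minimal numerical sketch of an SEIR system with a saturated incidence term beta*S*I/(1+alpha*I) and a saturated treatment term r*I/(1+k*I) is given below. These functional forms are common choices consistent with the abstract, but the exact system and all parameter values here are illustrative placeholders, not those of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters (fractions of a normalized population)
beta, alpha = 0.5, 0.1              # transmission rate, incidence saturation
sigma, gamma, mu = 0.2, 0.1, 0.01   # incubation, recovery, birth/death rates
r, k = 0.3, 0.5                     # treatment rate and treatment saturation

def seir(t, y):
    S, E, I, R = y
    incidence = beta * S * I / (1.0 + alpha * I)   # saturated incidence
    treatment = r * I / (1.0 + k * I)              # saturated treatment
    dS = mu * (1.0 - S) - incidence
    dE = incidence - (sigma + mu) * E
    dI = sigma * E - (gamma + mu) * I - treatment
    dR = gamma * I + treatment - mu * R
    return [dS, dE, dI, dR]

sol = solve_ivp(seir, (0.0, 400.0), [0.95, 0.02, 0.03, 0.0],
                t_eval=np.linspace(0.0, 400.0, 401))
print(f"infectious fraction at t = 400: {sol.y[2, -1]:.4f}")
```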

  14. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
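
    The core of foci quantitation of this kind is typically a smooth-threshold-label step followed by per-focus intensity integration. The sketch below shows that generic step on a synthetic image using scipy.ndimage; it is not the FociQuant code, and the background/threshold rule is a placeholder.

```python
import numpy as np
from scipy import ndimage

def quantify_foci(image, n_sigma=5.0):
    """Segment bright foci and return their count and integrated intensities.

    Foci are connected pixels of the smoothed image brighter than the
    background mean + n_sigma * background std (a placeholder rule).
    """
    smooth = ndimage.gaussian_filter(image, sigma=1.0)
    background = smooth < np.percentile(smooth, 90)
    thresh = smooth[background].mean() + n_sigma * smooth[background].std()
    labels, n_foci = ndimage.label(smooth > thresh)
    intensities = ndimage.sum(image, labels, index=np.arange(1, n_foci + 1))
    return n_foci, intensities

# Synthetic test image: noisy background plus two Gaussian-shaped foci
rng = np.random.default_rng(2)
img = rng.normal(100.0, 5.0, size=(128, 128))
yy, xx = np.mgrid[0:128, 0:128]
for cy, cx, amp in [(40, 40, 300.0), (90, 70, 200.0)]:
    img += amp * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2.0 * 2.0 ** 2))

count, sums = quantify_foci(img)
print(count, "foci detected; integrated intensities:", np.round(sums, 1))
```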

  15. High-Throughput Analysis and Automation for Glycomics Studies.

    Science.gov (United States)

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also expanding interest in the use of glycomics, for example in Genome Wide Association Studies, to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges with each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.

  16. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing

  17. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  18. Analysis of CBRP for UDP and TCP Traffic-Classes to measure throughput in MANETs

    Directory of Open Access Journals (Sweden)

    Hardeep Singh Rayait

    2013-01-01

    Full Text Available In this paper, we analyse the throughput of both TCP and UDP traffic classes for the cluster-based routing protocol (CBRP) in mobile ad hoc networks. CBRP uses a clustering structure to improve throughput, decrease average end-to-end delay and improve the average packet delivery ratio. We simulate the routing protocol with nodes running the IEEE 802.11 MAC to analyse throughput for both UDP and TCP traffic classes. The application-layer protocol used for UDP is CBR and for TCP is FTP.

  19. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis.

    Science.gov (United States)

    Du, Yushen; Wu, Nicholas C; Jiang, Lin; Zhang, Tianhao; Gong, Danyang; Shu, Sara; Wu, Ting-Ting; Sun, Ren

    2016-11-01

    Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available. To fully comprehend the diverse functions of a protein, it is essential to understand the functionality of individual residues. Current methods are highly dependent on evolutionary sequence conservation, which is

  20. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis

    Directory of Open Access Journals (Sweden)

    Yushen Du

    2016-11-01

    Full Text Available Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available.
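
    Fitness profiling of this kind usually reduces each mutant to a relative fitness score computed from its read-count frequency before and after selection, normalized to wild type. The sketch below shows that standard log-enrichment calculation on hypothetical counts; it is not the authors' pipeline, and the pseudocount is a simple placeholder for their error handling.

```python
import numpy as np

def relative_fitness(input_counts, selected_counts, wt_input, wt_selected,
                     pseudocount=1.0):
    """Log2 enrichment of each mutant relative to wild type across selection.

    input_counts / selected_counts: read counts per mutant before and after
    selection (e.g., one round of viral replication).
    """
    mut_ratio = (np.asarray(selected_counts, float) + pseudocount) / \
                (np.asarray(input_counts, float) + pseudocount)
    wt_ratio = (wt_selected + pseudocount) / (wt_input + pseudocount)
    return np.log2(mut_ratio / wt_ratio)

# Hypothetical counts for four mutants plus wild type
fitness = relative_fitness(input_counts=[500, 480, 510, 490],
                           selected_counts=[520, 60, 950, 5],
                           wt_input=10000, wt_selected=10500)
print(np.round(fitness, 2))  # ~0: neutral, negative: deleterious, positive: beneficial
```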

  1. VAM2D: Variably saturated analysis model in two dimensions

    International Nuclear Information System (INIS)

    Huyakorn, P.S.; Kool, J.B.; Wu, Y.S.

    1991-10-01

    This report documents a two-dimensional finite element model, VAM2D, developed to simulate water flow and solute transport in variably saturated porous media. Both flow and transport simulation can be handled concurrently or sequentially. The formulation of the governing equations and the numerical procedures used in the code are presented. The flow equation is approximated using the Galerkin finite element method. Nonlinear soil moisture characteristics and atmospheric boundary conditions (e.g., infiltration, evaporation and seepage face) are treated using Picard and Newton-Raphson iterations. Hysteresis effects and anisotropy in the unsaturated hydraulic conductivity can be taken into account if needed. The contaminant transport simulation can account for advection, hydrodynamic dispersion, linear equilibrium sorption, and first-order degradation. Transport of a single component or a multi-component decay chain can be handled. The transport equation is approximated using an upstream weighted residual method. Several test problems are presented to verify the code and demonstrate its utility. These problems range from simple one-dimensional to complex two-dimensional and axisymmetric problems. This document has been produced as a user's manual. It contains detailed information on the code structure along with instructions for input data preparation and sample input and printed output for selected test problems. Also included are instructions for job setup and restarting procedures. 44 refs., 54 figs., 24 tabs
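
    Codes of this kind rely on constitutive relations linking pressure head, water content and unsaturated hydraulic conductivity. The sketch below implements the widely used van Genuchten-Mualem relations as a generic illustration; VAM2D's own options are not reproduced, and the parameter values are placeholders loosely typical of a loam.

```python
import numpy as np

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content versus pressure head h (m); h < 0 is unsaturated."""
    m = 1.0 - 1.0 / n
    h = np.asarray(h, dtype=float)
    se = np.where(h < 0.0, (1.0 + (alpha * np.abs(h)) ** n) ** (-m), 1.0)
    return theta_r + (theta_s - theta_r) * se

def mualem_conductivity(h, Ks, theta_r, theta_s, alpha, n):
    """Unsaturated hydraulic conductivity from the Mualem model."""
    m = 1.0 - 1.0 / n
    theta = van_genuchten_theta(h, theta_r, theta_s, alpha, n)
    se = (theta - theta_r) / (theta_s - theta_r)
    return Ks * np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

pars = dict(theta_r=0.078, theta_s=0.43, alpha=3.6, n=1.56)  # alpha in 1/m (placeholder loam)
heads = np.array([0.0, -0.1, -1.0, -10.0])                   # pressure heads (m)
print(van_genuchten_theta(heads, **pars))
print(mualem_conductivity(heads, Ks=2.9e-6, **pars))         # Ks in m/s (placeholder)
```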

  2. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high-content analysis of high-throughput screens, in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  3. Saturated and unsaturated stability analysis of slope subjected to rainfall infiltration

    OpenAIRE

    Gofar Nurly; Rahardjo Harianto

    2017-01-01

    This paper presents results of saturated and unsaturated stability analyses of typical residual slopes subjected to rainfall infiltration corresponding to a 50-year rainfall return period. The slope angles considered were 45° and 70°. The saturated stability analyses were carried out for the original and critical ground water levels commonly considered by practicing engineers. The analyses were conducted using the limit equilibrium method. Unsaturated stability analyses used a combination of coupled stress–...

  4. Analysis of ensembles of moderately saturated interstellar lines

    International Nuclear Information System (INIS)

    Jenkins, E.B.

    1986-01-01

    It is shown that the combined equivalent widths for a large population of Gaussian-like interstellar line components, each with different central optical depths tau(0) and velocity dispersions b, exhibit a curve of growth (COG) which closely mimics that of a single, pure Gaussian distribution in velocity. Two parametric distribution functions for the line populations are considered: a bivariate Gaussian for tau(0) and b, and a power-law distribution for tau(0) combined with a Gaussian dispersion for b. First, COGs for populations having an extremely large number of nonoverlapping components are derived, and the implications are shown by focusing on the doublet-ratio analysis for a pair of lines whose f-values differ by a factor of two. The consequences of having, instead of an almost infinite number of lines, a relatively small collection of components added together for each member of a doublet are then examined. The theory of how the equivalent widths grow for populations of overlapping Gaussian profiles is developed. Examples of the composite COG analysis applied to existing collections of high-resolution interstellar line data are presented. 39 references
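
    The equivalent width of one Gaussian component, and of a blend of non-overlapping components, follows directly from W = ∫(1 − exp(−τ(v))) dv with τ(v) = τ0·exp(−(v/b)²). The sketch below illustrates the composite curve-of-growth idea for a hypothetical population with log-normal τ0 and Gaussian b; the distributions and parameters are placeholders, not those analyzed in the paper.

```python
import numpy as np

V = np.linspace(-300.0, 300.0, 4001)   # velocity grid (km/s)

def equivalent_width(tau0, b):
    """Equivalent width (km/s) of one Gaussian optical-depth component."""
    tau = tau0 * np.exp(-(V / b) ** 2)
    return np.sum(1.0 - np.exp(-tau)) * (V[1] - V[0])

def composite_width(tau0s, bs):
    """Total equivalent width of many non-overlapping components."""
    return sum(equivalent_width(t, b) for t, b in zip(tau0s, bs))

rng = np.random.default_rng(3)
tau0s = rng.lognormal(mean=0.0, sigma=1.0, size=200)      # central optical depths
bs = np.clip(rng.normal(8.0, 2.0, size=200), 2.0, None)   # velocity dispersions (km/s)

# Doublet ratio: the stronger member of a doublet with twice the f-value has
# every tau0 doubled; the ratio of summed widths probes the degree of saturation.
W_weak = composite_width(tau0s, bs)
W_strong = composite_width(2.0 * tau0s, bs)
print(f"doublet ratio W_strong / W_weak = {W_strong / W_weak:.3f} (2.0 if unsaturated)")
```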

  5. High or low oxygen saturation and severe retinopathy of prematurity: a meta-analysis.

    Science.gov (United States)

    Chen, Minghua L; Guo, Lei; Smith, Lois E H; Dammann, Christiane E L; Dammann, Olaf

    2010-06-01

    Low oxygen saturation appears to decrease the risk of severe retinopathy of prematurity (ROP) in preterm newborns when administered during the first few weeks after birth. High oxygen saturation seems to reduce the risk at later postmenstrual ages (PMAs). However, previous clinical studies are not conclusive individually. Our objective was to perform a systematic review and meta-analysis of the association between the incidence of severe ROP in premature infants and high or low target oxygen saturation measured by pulse oximetry. Studies were identified through PubMed and Embase literature searches through May 2009 using the terms "retinopathy of prematurity and oxygen" or "retinopathy of prematurity and oxygen therapy." We selected 10 publications addressing the association between severe ROP and target oxygen saturation measured by pulse oximetry. Using a random-effects model we calculated the summary effect estimate. We visually inspected funnel plots to examine possible publication bias. Low oxygen saturation (70%-96%) in the first several postnatal weeks was associated with a reduced risk of severe ROP (risk ratio [RR]: 0.48 [95% confidence interval (CI): 0.31-0.75]). High oxygen saturation (94%-99%) at ≥32 weeks' PMA was associated with a decreased risk of progression to severe ROP (RR: 0.54 [95% CI: 0.35-0.82]). A large randomized clinical trial with long-term developmental follow-up is warranted to confirm this meta-analytic result.
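
    Random-effects pooling of risk ratios of this kind is commonly done with the DerSimonian-Laird estimator on the log scale. The sketch below implements that standard calculation on hypothetical per-study risk ratios and confidence intervals (the individual study data are not given in this record).

```python
import numpy as np

def dersimonian_laird_rr(rr, ci_low, ci_high):
    """Pool per-study risk ratios with a DerSimonian-Laird random-effects model.

    rr, ci_low, ci_high: per-study risk ratios and their 95% CI bounds.
    Returns the pooled RR and its 95% CI.
    """
    y = np.log(np.asarray(rr, dtype=float))
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE of log-RR from the CI
    w = 1.0 / se ** 2                                     # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)                    # heterogeneity statistic
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (se ** 2 + tau2)                       # random-effects weights
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se_re = np.sqrt(1.0 / np.sum(w_star))
    return np.exp(y_re), np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)

# Hypothetical studies (illustration only, not the studies of this meta-analysis)
pooled, lo, hi = dersimonian_laird_rr(rr=[0.45, 0.62, 0.38, 0.71],
                                      ci_low=[0.25, 0.40, 0.18, 0.45],
                                      ci_high=[0.81, 0.96, 0.80, 1.12])
print(f"pooled RR = {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```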

  6. Throughput and Delay Analysis of HARQ with Code Combining over Double Rayleigh Fading Channels

    KAUST Repository

    Chelli, Ali

    2018-01-15

    This paper proposes the use of hybrid automatic repeat request (HARQ) with code combining (HARQ-CC) to offer reliable communications over double Rayleigh channels. The double Rayleigh fading channel is of particular interest to vehicle-to-vehicle communication systems as well as amplify-and-forward relaying and keyhole channels. This work studies the performance of HARQ-CC over double Rayleigh channels from an information theoretic perspective. Analytical approximations are derived for the ε-outage capacity, the average number of transmissions, and the throughput of HARQ-CC. Moreover, we evaluate the delay experienced by Poisson arriving packets for HARQ-CC. We provide analytical expressions for the average waiting time, the packet sojourn time, the average consumed power, and the energy efficiency. In our investigation, we take into account the impact of imperfect feedback on different performance metrics. Additionally, we explore the tradeoff between energy efficiency and the throughput. The proposed scheme is shown to maintain the outage probability below a specified threshold ε which ensures the link reliability. Meanwhile, HARQ-CC adapts implicitly the transmission rate to the channel conditions such that the throughput is maximized. Our results demonstrate that HARQ-CC allows improving the achievable communication rate compared to fixed time diversity schemes. To maximize the throughput of HARQ-CC, the rate per HARQ round should be less than the rate required to meet the outage constraint. Our investigation of the performance of HARQ-CC over Rayleigh and double Rayleigh channels shows that double Rayleigh channels have a higher severity of fading and result in a larger degradation of the throughput. Our analysis reveals that HARQ with incremental redundancy (HARQ-IR) achieves a larger throughput compared to HARQ-CC, while HARQ-CC is simpler to implement, has a lower decoding
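
    The long-term throughput of truncated HARQ is usually evaluated with a renewal-reward argument: delivered information bits divided by channel uses. The Monte Carlo sketch below estimates that quantity for Chase combining over a double-Rayleigh channel (per-round gain |h1|²·|h2|², accumulated SNR across rounds). The rate, SNR and retransmission cap are hypothetical values, and the paper's closed-form approximations are not reproduced.

```python
import numpy as np

def harq_cc_throughput(snr_db=10.0, rate=3.0, max_rounds=4,
                       n_packets=20000, seed=7):
    """Monte Carlo throughput (bit/s/Hz) of truncated HARQ with Chase
    combining over a double-Rayleigh fading channel."""
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    delivered_bits, channel_uses = 0.0, 0
    for _ in range(n_packets):
        acc_snr = 0.0
        for _ in range(max_rounds):
            h1 = rng.normal(size=2) / np.sqrt(2.0)      # CN(0,1) as two real parts
            h2 = rng.normal(size=2) / np.sqrt(2.0)
            gain = (h1 @ h1) * (h2 @ h2)                # |h1|^2 * |h2|^2
            acc_snr += snr * gain                       # Chase combining: SNRs add up
            channel_uses += 1
            if np.log2(1.0 + acc_snr) >= rate:          # decodable: stop retransmitting
                delivered_bits += rate
                break
    return delivered_bits / channel_uses

print(f"HARQ-CC throughput ~ {harq_cc_throughput():.2f} bit/s/Hz")
```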

  7. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.

  8. Saturation analysis

    International Nuclear Information System (INIS)

    1974-01-01

    The invention comprises a radioimmunoassay kit for steroid determination. Selenium-75 is used as the labelling element. The chemical preparation methods for various selenium-labelled keto-steroids and their derivatives, such as hydrocortisone, testosterone, corticosterone, estriol, and other steroid hormones, as well as cardiac glycosides, are described. Analytical examples are presented.

  9. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.

  10. Urban Saturated Power Load Analysis Based on a Novel Combined Forecasting Model

    Directory of Open Access Journals (Sweden)

    Huiru Zhao

    2015-03-01

    Full Text Available Analysis of urban saturated power loads is helpful for coordinating urban power grid construction and economic and social development. There are two different kinds of forecasting models: the logistic curve model focuses on the growth law of the data itself, while the multi-dimensional forecasting model considers several influencing factors as the input variables. To improve forecasting performance, a novel combined forecasting model for saturated power load analysis was proposed in this paper, which combined the above two models. Meanwhile, the weights of these two models in the combined forecasting model were optimized by employing a fruit fly optimization algorithm. Using Hubei Province as an example, the effectiveness of the proposed combined forecasting model was verified, demonstrating a higher forecasting accuracy. The analysis result shows that the power load of Hubei Province will reach saturation in 2039, and the annual maximum power load will reach about 78,630 MW. The results obtained from this proposed hybrid urban saturated power load analysis model can serve as a reference for the sustainable development of urban power grids, regional economies, and society at large.
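
    The logistic-curve half of such a combined model treats load growth as approaching a saturation level L. The sketch below fits that curve with scipy on synthetic placeholder data; it does not use Hubei Province figures and does not reproduce the paper's weight-optimized combination with the multi-dimensional model.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """Saturation (logistic) growth curve: the load approaches L as t grows."""
    return L / (1.0 + np.exp(-k * (t - t0)))

# Synthetic annual maximum-load history (placeholder numbers, in MW)
years = np.arange(2000, 2022)
load = logistic(years, L=80000.0, k=0.3, t0=2008.0) \
       + np.random.default_rng(4).normal(0.0, 500.0, years.size)

popt, _ = curve_fit(logistic, years, load, p0=(100000.0, 0.2, 2010.0))
print(f"estimated saturation load ~ {popt[0]:,.0f} MW")
print(f"extrapolated load in 2039: {logistic(2039.0, *popt):,.0f} MW")
```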

  11. Frequency domain performance analysis of marginally stable LTI systems with saturation

    NARCIS (Netherlands)

    Berg, van den R.A.; Pogromski, A.Y.; Rooda, J.E.; Leonov, G.; Nijmeijer, H.; Pogromsky, A.; Fradkov, A.

    2009-01-01

    In this paper we discuss the frequency domain performance analysis of a marginally stable linear time-invariant (LTI) system with saturation in the feedback loop. We present two methods, both based on the notion of convergent systems, that allow the performance of this type of system to be evaluated in

  12. Use of azeotropic distillation for isotopic analysis of deuterium in soil water and saturate saline solution

    International Nuclear Information System (INIS)

    Santos, Antonio Vieira dos.

    1995-05-01

    The azeotropic distillation technique was adapted to extract soil water and saturated saline solution, similar to sea water, for the isotopic determination of deuterium (D). A soil test was used to determine the precision and the nature of the methodology for extracting soil water for stable isotope analysis, using azeotropic distillation and comparing it with the traditional methodology of heating under vacuum. This methodology has proved very useful for several kinds of soil and saturated saline solutions. The apparatus does not have a memory effect, and the chemical reagents do not affect the isotopic composition of the soil water. (author). 43 refs., 10 figs., 12 tabs

  13. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity, improved protocols for PTM enrichment and recently established pipelines for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis...

  14. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  15. High Throughput Analysis of Breast Cancer Specimens on the Grid

    OpenAIRE

    Yang, Lin; Chen, Wenjin; Meer, Peter; Salaru, Gratian; Feldman, Michael D.; Foran, David J.

    2007-01-01

    Breast cancer accounts for about 30% of all cancers and 15% of all cancer deaths in women in the United States. Advances in computer-assisted diagnosis (CAD) hold promise for the early detection and staging of disease progression. In this paper we introduce a Grid-enabled CAD to perform automatic analysis of imaged histopathology breast tissue specimens. More than 100,000 digitized samples (1200 × 1200 pixels) have already been processed on the Grid. We have analyzed results for 3744 breast tissue ...

  16. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. It produces large amounts of data, typically presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering may be improved by merging previous data with data obtained from public databases, resulting in an accurate list of proteins which meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers with any level of programming experience to benefit from efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  17. Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis

    Science.gov (United States)

    Mohamed Ismael, Hawa; Vandyck, George Kobina

    The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe puts it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput through the Doraleh Container Port in Djibouti by time series analysis. A selection of univariate forecasting models has been used, namely the Triple Exponential Smoothing Model, the Grey Model and the Linear Regression Model. By utilizing the above three models and their combination, the forecast of container throughput through the Doraleh port was realized. A comparison of the forecasting results of the three models and of the combination forecast was then undertaken, based on the commonly used evaluation criteria Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression Model was the best prediction method for forecasting container throughput, since its forecast error was the smallest. Based on the regression model, a ten-year forecast for container throughput at DCT has been made.
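
    Model comparison by MAD and MAPE, together with a simple combination forecast, can be sketched as follows; the throughput series and the three model forecasts are synthetic placeholders rather than Doraleh data.

```python
import numpy as np

def mad(actual, forecast):
    """Mean Absolute Deviation of a forecast."""
    return np.mean(np.abs(np.asarray(actual, float) - np.asarray(forecast, float)))

def mape(actual, forecast):
    """Mean Absolute Percentage Error of a forecast (in %)."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Placeholder annual container throughput (thousand TEU) and model forecasts
actual = np.array([410.0, 452.0, 489.0, 530.0, 571.0])
forecasts = {
    "exponential smoothing": np.array([400.0, 448.0, 480.0, 525.0, 580.0]),
    "grey model":            np.array([420.0, 441.0, 470.0, 515.0, 560.0]),
    "linear regression":     np.array([408.0, 450.0, 492.0, 534.0, 576.0]),
}
for name, f in forecasts.items():
    print(f"{name:22s} MAD={mad(actual, f):5.1f}  MAPE={mape(actual, f):5.2f}%")

# Equal-weight combination of the three models
combo = np.mean(list(forecasts.values()), axis=0)
print(f"{'combination':22s} MAD={mad(actual, combo):5.1f}  MAPE={mape(actual, combo):5.2f}%")
```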

  18. Movement of Fuel Ashore: Storage, Capacity, Throughput, and Distribution Analysis

    Science.gov (United States)

    2015-12-01

    in MPEM, as the basis for analysis. In keeping with the spirit of EF21 and seabasing concepts, this approach assumes that all other combat...

  19. Regulatory pathway analysis by high-throughput in situ hybridization.

    Directory of Open Access Journals (Sweden)

    Axel Visel

    2007-10-01

    Full Text Available Automated in situ hybridization enables the construction of comprehensive atlases of gene expression patterns in mammals. Such atlases can become Web-searchable digital expression maps of individual genes and thus offer an entryway to elucidate genetic interactions and signaling pathways. Towards this end, an atlas housing approximately 1,000 spatial gene expression patterns of the midgestation mouse embryo was generated. Patterns were textually annotated using a controlled vocabulary comprising >90 anatomical features. Hierarchical clustering of annotations was carried out using distance scores calculated from the similarity between pairs of patterns across all anatomical structures. This process ordered hundreds of complex expression patterns into a matrix that reflects the embryonic architecture and the relatedness of patterns of expression. Clustering yielded 12 distinct groups of expression patterns. Because of the similarity of expression patterns within a group, members of each group may be components of regulatory cascades. We focused on the group containing Pax6, an evolutionary conserved transcriptional master mediator of development. Seventeen of the 82 genes in this group showed a change of expression in the developing neocortex of Pax6-deficient embryos. Electromobility shift assays were used to test for the presence of Pax6-paired domain binding sites. This led to the identification of 12 genes not previously known as potential targets of Pax6 regulation. These findings suggest that cluster analysis of annotated gene expression patterns obtained by automated in situ hybridization is a novel approach for identifying components of signaling cascades.

  20. Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules.

    Directory of Open Access Journals (Sweden)

    Antonino Ingargiola

    Full Text Available We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions.

  1. High-throughput phenotyping allows for QTL analysis of defense, symbiosis and development-related traits

    DEFF Research Database (Denmark)

    Hansen, Nina Eberhardtsen

    high-throughput phenotyping of whole plants. Additionally, a system for automated confocal microscopy, aiming at automated detection of infection thread formation as well as detection of lateral root and nodule primordia, is being developed. The objective was to use both systems in genome-wide association studies and mutant … the analysis. Additional phenotyping of defense mutants revealed that MLO, which confers susceptibility towards Blumeria graminis in barley, is also a prime candidate for an S. trifoliorum susceptibility gene in Lotus.

  2. Targeted DNA Methylation Analysis by High Throughput Sequencing in Porcine Peri-attachment Embryos

    OpenAIRE

    MORRILL, Benson H.; COX, Lindsay; WARD, Anika; HEYWOOD, Sierra; PRATHER, Randall S.; ISOM, S. Clay

    2013-01-01

    The purpose of this experiment was to implement and evaluate the effectiveness of a next-generation sequencing-based method for DNA methylation analysis in porcine embryonic samples. Fourteen discrete genomic regions were amplified by PCR using bisulfite-converted genomic DNA derived from day 14 in vivo-derived (IVV) and parthenogenetic (PA) porcine embryos as template DNA. The resulting PCR products were subjected to high-throughput sequencing using the Illumina Genome Analyzer IIx plat...

  3. Saturated and unsaturated stability analysis of slope subjected to rainfall infiltration

    Directory of Open Access Journals (Sweden)

    Gofar Nurly

    2017-01-01

    Full Text Available This paper presents results of saturated and unsaturated stability analyses of typical residual slopes subjected to rainfall infiltration corresponding to a 50-year rainfall return period. The slope angles considered were 45° and 70°. The saturated stability analyses were carried out for the original and critical ground water levels commonly considered by practicing engineers. The analyses were conducted using the limit equilibrium method. The unsaturated stability analyses used a combination of coupled stress–pore-water pressure analysis to evaluate the effect of rainfall infiltration on deformation and transient pore-water pressure on slope stability. Slope stability analyses were performed at several times during and after rainfall infiltration. Results show that the critical condition for slopes made of sandy material occurred at the end of rainfall, while for clayey material it occurred some time after the rainfall ceased. Unsaturated stability analysis of sandy soil gives a higher factor of safety because the soil never reaches saturation. Transient analysis using the unsaturated soil concept can predict the more critical condition of delayed failure in slopes made of clayey soil.

  4. Throughput and delay analysis of IEEE 802.15.6-based CSMA/CA protocol.

    Science.gov (United States)

    Ullah, Sana; Chen, Min; Kwak, Kyung Sup

    2012-12-01

    The IEEE 802.15.6 is a new communication standard on Wireless Body Area Networks (WBANs) that focuses on a variety of medical, Consumer Electronics (CE) and entertainment applications. In this paper, the throughput and delay performance of the IEEE 802.15.6 is presented. Numerical formulas are derived to determine the maximum throughput and minimum delay limits of the IEEE 802.15.6 for an ideal channel with no transmission errors. These limits are derived for different frequency bands and data rates. Our analysis is validated by extensive simulations using a custom C++ simulator. Based on analytical and simulation results, useful conclusions are derived for network provisioning and packet size optimization for different applications.
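
    Ideal-channel limits of this kind follow from charging each payload with the fixed per-frame overhead. The sketch below shows that generic calculation only; the header size, ACK, inter-frame spacing, CSMA/CA time and data rate are placeholders to be replaced with the IEEE 802.15.6 values for the band and rate of interest (they are not taken from the standard here).

```python
def ideal_channel_limits(payload_bytes, data_rate_bps=1.0e6,
                         overhead_bytes=15, t_ack=40e-6,
                         t_ifs=75e-6, t_csma=125e-6):
    """Maximum throughput and minimum delay over an error-free channel.

    Each frame carries payload_bytes of MAC payload plus overhead_bytes of
    header/FCS, and pays fixed times for the ACK, inter-frame spacings and
    CSMA/CA access. All values here are illustrative placeholders.
    """
    t_frame = 8.0 * (payload_bytes + overhead_bytes) / data_rate_bps
    t_cycle = t_frame + t_ack + 2.0 * t_ifs + t_csma   # one complete delivery cycle
    max_throughput = 8.0 * payload_bytes / t_cycle     # bit/s
    min_delay = t_cycle                                # s per packet
    return max_throughput, min_delay

for payload in (50, 100, 255):
    thr, delay = ideal_channel_limits(payload)
    print(f"payload {payload:3d} B: max throughput {thr/1e3:6.1f} kbit/s, "
          f"min delay {delay*1e3:.2f} ms")
```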

  5. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot-detection tool. The spot-detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot-detection data are made available to facilitate implementation of the method.

  6. High-throughput metagenomic technologies for complex microbial community analysis: open and closed formats.

    Science.gov (United States)

    Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa

    2015-01-27

    Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions. Copyright © 2015 Zhou et al.

  7. Meta-Analysis of High-Throughput Datasets Reveals Cellular Responses Following Hemorrhagic Fever Virus Infection

    Directory of Open Access Journals (Sweden)

    Gavin C. Bowick

    2011-05-01

    Full Text Available The continuing use of high-throughput assays to investigate cellular responses to infection is providing a large repository of information. Due to the large number of differentially expressed transcripts, often running into the thousands, the majority of these data have not been thoroughly investigated. Advances in techniques for the downstream analysis of high-throughput datasets are providing new methods for generating additional hypotheses for further investigation. The large number of experimental observations, combined with databases that correlate particular genes and proteins with canonical pathways, functions and diseases, allows for the bioinformatic exploration of functional networks that may be implicated in replication or pathogenesis. Herein, we provide an example of how analysis of published high-throughput datasets of cellular responses to hemorrhagic fever virus infection can generate additional functional data. We describe enrichment of genes involved in metabolism, post-translational modification and cardiac damage; potential roles for specific transcription factors; and a conserved involvement of a pathway based around cyclooxygenase-2. We believe that these types of analyses can provide virologists with additional hypotheses for continued investigation.

  8. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  9. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  10. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce
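
    The underlying idea can be illustrated with a toy experiment: apply an arcsinh (hyperbolic arcsine) transform with different cofactors and score each by how consistently a crudely "gated" population lands in the same place across samples. The scoring rule and the KMeans gating below are stand-ins for illustration only, not the flowCore or maximum-likelihood machinery described in the paper; all numbers are invented.

```python
import numpy as np
from sklearn.cluster import KMeans

def population_location_spread(samples, cofactor, n_pops=2):
    """Lower return value = population centres more consistent across samples."""
    centers = []
    for x in samples:                          # x: 1-D fluorescence values for one sample
        y = np.arcsinh(x / cofactor)           # arcsinh transform with one tunable cofactor
        km = KMeans(n_clusters=n_pops, n_init=10, random_state=0).fit(y.reshape(-1, 1))
        centers.append(np.sort(km.cluster_centers_.ravel()))
    return np.array(centers).std(axis=0).mean()

# Five simulated samples, each with a dim and a bright population.
rng = np.random.default_rng(0)
samples = [np.concatenate([rng.normal(200, 50, 5000), rng.normal(5000, 1500, 5000)])
           for _ in range(5)]
for cf in (5, 150, 1000):
    print(cf, round(population_location_spread(samples, cf), 4))
```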

  11. Optimizing transformations for automated, high throughput analysis of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Weng Andrew

    2010-11-01

    Full Text Available Abstract Background In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter

  12. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high-throughput instrumentation to analyze huge sets of clinical samples quantitatively and reproducibly in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be more accurate with rapid processing time. The present study focuses on establishing a high-throughput, automated online system for proteolytic digestion and desalting of proteins from biological samples in a quantitative, qualitative and reproducible manner. The study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validates the approach on a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The results show that the online system, capable of handling samples in a 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear and more linear increase in peptide quantities with increasing concentration compared to the off-line method. Hence, we suggest that inclusion of this online system in proteomic pipelines will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  13. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequences analysis.

  14. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.

  15. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    Science.gov (United States)

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
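
    As a rough illustration of the differential-expression step only: a feature-by-feature two-group test with Benjamini-Hochberg correction. Real pipelines of the kind described above (e.g. limma, DESeq2, edgeR) use moderated statistics or count models; the data and function names below are invented for the sketch.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

def differential_features(expr, group_a, alpha=0.05):
    """expr: (n_features, n_samples) matrix of log-scale values;
    group_a: boolean array marking the samples of group A."""
    a, b = expr[:, group_a], expr[:, ~group_a]
    pvals = stats.ttest_ind(a, b, axis=1).pvalue
    reject, qvals, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return np.flatnonzero(reject), qvals

rng = np.random.default_rng(1)
expr = rng.normal(8, 1, size=(1000, 12))
expr[:50, :6] += 2                      # spike in 50 truly differential features
hits, q = differential_features(expr, np.arange(12) < 6)
print(len(hits), "features pass FDR < 0.05")
```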

  16. Performance Analysis of IEEE 802.11 DCF and IEEE 802.11e EDCA in Non-saturation Condition

    Science.gov (United States)

    Kim, Tae Ok; Kim, Kyung Jae; Choi, Bong Dae

    We analyze the MAC performance of the IEEE 802.11 DCF and 802.11e EDCA in the non-saturation condition, where a device sometimes has no packets to transmit. We assume that a flow is not generated while the previous flow is in service and that the number of packets in a flow is geometrically distributed. In this paper, we take into account a feature of the non-saturation condition in the standards: the possibility that the first packet arriving at an idle station is transmitted without a preceding backoff procedure. Our approach is to model the stochastic behavior of one station as a discrete-time Markov chain. We obtain four performance measures: normalized channel throughput, average packet HoL (head-of-line) delay, expected time to complete transmission of a flow, and packet loss probability. Our results can be used for admission control to find the optimal number of stations under constraints on these measures.
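
    For context, the classic Bianchi fixed-point model for the fully saturated DCF (every station always has a frame queued) is the baseline that non-saturation analyses such as this one extend. The sketch below implements that saturated baseline, not the paper's Markov chain; all timing constants are illustrative assumptions.

```python
from scipy.optimize import brentq

def dcf_saturation_throughput(n, W=32, m=5, T_payload=7000.0,
                              T_s=7800.0, T_c=7500.0, sigma=50.0):
    """n stations; W = minimum contention window; m = backoff stages.
    T_payload, T_s (successful slot), T_c (collision slot) and sigma (empty
    slot) are in microseconds. Returns normalized channel throughput."""
    def tau(p):  # transmission probability given conditional collision probability p
        return 2 * (1 - 2 * p) / ((1 - 2 * p) * (W + 1) + p * W * (1 - (2 * p) ** m))

    # Solve the fixed point p = 1 - (1 - tau(p))^(n-1)
    p = brentq(lambda p: p - (1 - (1 - tau(p)) ** (n - 1)), 1e-9, 1 - 1e-9)
    t = tau(p)
    p_tr = 1 - (1 - t) ** n                      # some station transmits in a slot
    p_s = n * t * (1 - t) ** (n - 1) / p_tr      # ...and the transmission succeeds
    slot = (1 - p_tr) * sigma + p_tr * p_s * T_s + p_tr * (1 - p_s) * T_c
    return p_tr * p_s * T_payload / slot         # payload time per average slot

for n in (5, 10, 20, 50):
    print(n, round(dcf_saturation_throughput(n), 3))
```

    As expected from the model, the normalized throughput decreases slowly as the number of contending stations grows, because more channel time is lost to collisions.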

  17. Seismic response analysis of the deep saturated soil deposits in Shanghai

    Science.gov (United States)

    Huang, Yu; Ye, Weimin; Chen, Zhuchang

    2009-01-01

    The quaternary deposits in Shanghai are horizontal soil layers of thickness up to about 280 m in the urban area, with an annual groundwater table between 0.5 and 0.7 m from the surface. The characteristics of deep saturated deposits may have important influences upon the seismic response of the ground in Shanghai. In this paper, based on the Biot theory for porous media, the water-saturated soil deposits are modeled as a two-phase porous system consisting of solid and fluid phases. A nonlinear constitutive model for predicting the seismic response of the ground is developed to describe the dynamic characteristics of the deep saturated soil deposits in Shanghai. Subsequently, the seismic response of a typical site with 280 m deep soil layers, which is subjected to four base excitations (El Centro, Taft, Sunan, and Tangshan earthquakes), is analyzed in terms of an effective stress-based finite element method with the proposed constitutive model. Special emphasis is given to the computed results of accelerations, excess pore-water pressures, and settlements during the seismic excitations. It has been found that the analysis can capture fundamental aspects of the ground response and produce preliminary results for seismic assessment.

  18. WormScan: a technique for high-throughput phenotypic analysis of Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mark D Mathew

    Full Text Available BACKGROUND: There are four main phenotypes that are assessed in whole organism studies of Caenorhabditis elegans; mortality, movement, fecundity and size. Procedures have been developed that focus on the digital analysis of some, but not all of these phenotypes and may be limited by expense and limited throughput. We have developed WormScan, an automated image acquisition system that allows quantitative analysis of each of these four phenotypes on standard NGM plates seeded with E. coli. This system is very easy to implement and has the capacity to be used in high-throughput analysis. METHODOLOGY/PRINCIPAL FINDINGS: Our system employs a readily available consumer grade flatbed scanner. The method uses light stimulus from the scanner rather than physical stimulus to induce movement. With two sequential scans it is possible to quantify the induced phototactic response. To demonstrate the utility of the method, we measured the phenotypic response of C. elegans to phosphine gas exposure. We found that stimulation of movement by the light of the scanner was equivalent to physical stimulation for the determination of mortality. WormScan also provided a quantitative assessment of health for the survivors. Habituation from light stimulation of continuous scans was similar to habituation caused by physical stimulus. CONCLUSIONS/SIGNIFICANCE: There are existing systems for the automated phenotypic data collection of C. elegans. The specific advantages of our method over existing systems are high-throughput assessment of a greater range of phenotypic endpoints including determination of mortality and quantification of the mobility of survivors. Our system is also inexpensive and very easy to implement. Even though we have focused on demonstrating the usefulness of WormScan in toxicology, it can be used in a wide range of additional C. elegans studies including lifespan determination, development, pathology and behavior. Moreover, we have even adapted the

  19. Improvements and impacts of GRCh38 human reference on high throughput sequencing data analysis.

    Science.gov (United States)

    Guo, Yan; Dai, Yulin; Yu, Hui; Zhao, Shilin; Samuels, David C; Shyr, Yu

    2017-03-01

    Analyses of high throughput sequencing data start with alignment against a reference genome, which is the foundation for all re-sequencing data analyses. Each new release of the human reference genome has been augmented with improved accuracy and completeness. It is presumed that the latest release of the human reference genome, GRCh38, will contribute more to high throughput sequencing data analysis by providing greater accuracy. But the amount of improvement has not yet been quantified. We conducted a study to compare the genomic analysis results between the GRCh38 reference and its predecessor GRCh37. Through analyses of alignment, single nucleotide polymorphisms, small insertions/deletions, copy number and structural variants, we show that GRCh38 offers overall more accurate analysis of human sequencing data. More importantly, GRCh38 produced fewer false positive structural variants. In conclusion, GRCh38 is an improvement over GRCh37 not only from the genome assembly aspect, but it also yields more reliable genomic analysis results. Copyright © 2017. Published by Elsevier Inc.

  20. Quantitative analysis of treatment process time and throughput capacity for spot scanning proton therapy

    International Nuclear Information System (INIS)

    Suzuki, Kazumichi; Sahoo, Narayan; Zhang, Xiaodong; Poenisch, Falk; Mackin, Dennis S.; Liu, Amy Y.; Wu, Richard; Zhu, X. Ronald; Gillin, Michael T.; Palmer, Matthew B.; Frank, Steven J.; Lee, Andrew K.

    2016-01-01

    Purpose: To determine the patient throughput and the overall efficiency of the spot scanning system by analyzing treatment time, equipment availability, and maximum daily capacity for the current spot scanning port at Proton Therapy Center Houston and to assess the daily throughput capacity for a hypothetical spot scanning proton therapy center. Methods: At their proton therapy center, the authors have been recording in an electronic medical record system all treatment data, including disease site, number of fields, number of fractions, delivered dose, energy, range, number of spots, and number of layers for every treatment field. The authors analyzed delivery system downtimes that had been recorded for every equipment failure and associated incidents. These data were used to evaluate the patient census, patient distribution as a function of the number of fields and total target volume, and equipment clinical availability. The duration of each treatment session from patient walk-in to patient walk-out of the spot scanning treatment room was measured for 64 patients with head and neck, central nervous system, thoracic, and genitourinary cancers. The authors retrieved data for total target volume and the numbers of layers and spots for all fields from treatment plans for a total of 271 patients (including the above 64 patients). A sensitivity analysis of daily throughput capacity was performed by varying seven parameters in a throughput capacity model. Results: The mean monthly equipment clinical availability for the spot scanning port in April 2012–March 2015 was 98.5%. Approximately 1500 patients had received spot scanning proton therapy as of March 2015. The major disease sites treated in September 2012–August 2014 were the genitourinary system (34%), head and neck (30%), central nervous system (21%), and thorax (14%), with other sites accounting for the remaining 1%. Spot scanning beam delivery time increased with total target volume and accounted for
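
    The flavor of such a throughput capacity model can be shown with a deliberately simplified sketch. The 98.5% availability figure comes from the abstract; the operating hours, number of rooms and mean walk-in-to-walk-out time are placeholders, and the one-line formula is not the authors' seven-parameter model.

```python
def daily_capacity(operating_hours=16.0, availability=0.985,
                   mean_session_min=24.0, rooms=1):
    """Toy estimate: patients treatable per day in a spot scanning room."""
    usable_minutes = operating_hours * 60.0 * availability * rooms
    return usable_minutes / mean_session_min

print(f"{daily_capacity():.0f} patients/day")   # ~39 with the placeholder values
```

    A sensitivity analysis in this spirit simply re-evaluates the capacity while varying one parameter at a time (for example session time or availability) over a plausible range.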

  1. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Full Text Available Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have been developed as well. The focus of this mini review will be on recent advancements in the analysis of metabolomics data, especially those utilizing Gaussian graphical models and independent component analysis.
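
    The Gaussian graphical model idea mentioned above boils down to estimating partial correlations from the inverse covariance (precision) matrix of metabolite concentrations. The raw pseudo-inverse below is only illustrative; in practice regularized estimators such as the graphical lasso or shrinkage covariance are used, and the simulated data are invented.

```python
import numpy as np

def partial_correlations(X):
    """X: (n_samples, n_metabolites) data matrix; returns the partial correlation matrix."""
    prec = np.linalg.pinv(np.cov(X, rowvar=False))   # precision matrix (unregularized)
    d = np.sqrt(np.diag(prec))
    pcor = -prec / np.outer(d, d)                    # pcor_ij = -prec_ij / sqrt(prec_ii * prec_jj)
    np.fill_diagonal(pcor, 1.0)
    return pcor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
X[:, 1] += 0.8 * X[:, 0]            # direct association between metabolites 0 and 1
print(round(partial_correlations(X)[0, 1], 2))
```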

  2. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

    The human genome is dynamic in structure, complicating researchers' attempts at fully understanding it. Time-series "Fluorescent in situ Hybridization" (FISH) imaging has increased our ability to observe genome structure, but due to cell-type and experimental variability these data are often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. HTSstation: a web application and open-access libraries for high-throughput sequencing data analysis.

    Science.gov (United States)

    David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.

  4. Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model

    International Nuclear Information System (INIS)

    Tucci, P.

    2001-01-01

    This Analysis/Model Report (AMR) documents an updated analysis of water-level data performed to provide the saturated-zone, site-scale flow and transport model (CRWMS M and O 2000) with the configuration of the potentiometric surface, target water-level data, and hydraulic gradients for model calibration. The previous analysis was presented in ANL-NBS-HS-000034, Rev 00 ICN 01, Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model (USGS 2001). This analysis is designed to use updated water-level data as the basis for estimating water-level altitudes and the potentiometric surface in the SZ site-scale flow and transport model domain. The objectives of this revision are to develop computer files containing (1) water-level data within the model area (DTN: GS010908312332.002), (2) a table of known vertical head differences (DTN: GS010908312332.003), and (3) a potentiometric-surface map (DTN: GS010608312332.001) using an alternate concept from that presented in ANL-NBS-HS-000034, Rev 00 ICN 01 for the area north of Yucca Mountain. The updated water-level data include data obtained from the Nye County Early Warning Drilling Program (EWDP) and data from borehole USW WT-24. In addition to being utilized by the SZ site-scale flow and transport model, the water-level data and potentiometric-surface map contained within this report will be available to other government agencies and water users for ground-water management purposes. The potentiometric surface defines an upper boundary of the site-scale flow model, as well as provides information useful to estimation of the magnitude and direction of lateral ground-water flow within the flow system. Therefore, the analysis documented in this revision is important to SZ flow and transport calculations in support of total system performance assessment.

  5. Theoretical analysis of saturation and limit cycles in short pulse FEL oscillators

    Energy Technology Data Exchange (ETDEWEB)

    Piovella, N.; Chaix, P.; Jaroszynski, D. [Commissariat à l'Energie Atomique, Bruyères-le-Châtel (France)] [and others]

    1995-12-31

    We derive a model for the nonlinear evolution of a short-pulse oscillator from low signal up to saturation in the small-gain regime. This system is controlled by only two independent parameters: cavity detuning and losses. Using a closure relation, this model reduces to a closed set of 5 nonlinear partial differential equations for the EM field and moments of the electron distribution. An analysis of the linearised system allows the eigenmodes characterising the small-signal regime to be defined and calculated. An arbitrary solution of the complete nonlinear system can then be expanded in terms of these eigenmodes. This allows various observed nonlinear behaviours to be interpreted, including steady-state saturation, limit cycles, and transition to chaos. The single-mode approximation reduces to a Landau-Ginzburg equation. It yields gain, nonlinear frequency shift, and efficiency as functions of cavity detuning and cavity losses. A generalisation to two modes provides a simple description of the limit-cycle behaviour as a competition between these two modes. An analysis of the transitions to more complex dynamics is also given. Finally, the analytical results are compared to the experimental data from the FELIX experiment.

  6. High-throughput gender identification of penguin species using melting curve analysis.

    Science.gov (United States)

    Tseng, Chao-Neng; Chang, Yung-Ting; Chiu, Hui-Tzu; Chou, Yii-Cheng; Huang, Hurng-Wern; Cheng, Chien-Chung; Liao, Ming-Hui; Chang, Hsueh-Wei

    2014-04-03

    Most species of penguins are sexually monomorphic, and therefore it is difficult to visually identify their genders for monitoring population stability in terms of sex ratio analysis. In this study, we evaluated the suitability of melting curve analysis (MCA) for high-throughput gender identification of penguins. A preliminary test indicated that the Griffiths's P2/P8 primers were not suitable for MCA analysis. Based on sequence alignment of the Chromo-Helicase-DNA binding protein (CHD)-W and CHD-Z genes from four species of penguins (Pygoscelis papua, Aptenodytes patagonicus, Spheniscus magellanicus, and Eudyptes chrysocome), we redesigned forward primers for the CHD-W/CHD-Z-common region (PGU-ZW2) and the CHD-W-specific region (PGU-W2) to be used in combination with the reverse Griffiths's P2 primer. When tested with P. papua samples, PCR using the P2/PGU-ZW2 and P2/PGU-W2 primer sets generated two amplicons of 148 and 356 bp, respectively, which were easily resolved in 1.5% agarose gels. MCA analysis indicated that the melting temperature (Tm) values for the P2/PGU-ZW2 and P2/PGU-W2 amplicons of P. papua samples were 79.75°C-80.5°C and 81.0°C-81.5°C, respectively. Females displayed both ZW-common and W-specific Tm peaks, whereas males were positive only for the ZW-common peak. Taken together, our redesigned primers coupled with MCA analysis allow precise, high-throughput gender identification for P. papua, and potentially for other penguin species such as A. patagonicus, S. magellanicus, and E. chrysocome as well.
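
    The decision rule implied by those Tm windows is simple enough to sketch. The windows below are the P. papua values quoted in the abstract; peak detection from the melting curves (e.g. from -dF/dT) is assumed to have been done upstream, and the function name is hypothetical.

```python
def call_sex(tm_peaks, zw_window=(79.75, 80.5), w_window=(81.0, 81.5)):
    """Classify a sample from its list of melting-temperature peaks (in deg C)."""
    has_zw = any(zw_window[0] <= t <= zw_window[1] for t in tm_peaks)
    has_w = any(w_window[0] <= t <= w_window[1] for t in tm_peaks)
    if has_zw and has_w:
        return "female"            # both ZW-common and W-specific products
    if has_zw:
        return "male"              # ZW-common product only
    return "undetermined"          # assay failure or off-target product

print(call_sex([80.1, 81.2]), call_sex([80.0]))   # -> female male
```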

  7. Comparison of scale analysis and numerical simulation for saturated zone convective mixing processes

    International Nuclear Information System (INIS)

    Oldenburg, C.M.

    1998-01-01

    Scale analysis can be used to predict a variety of quantities arising from natural systems where processes are described by partial differential equations. For example, scale analysis can be applied to estimate the effectiveness of convective mixing on the dilution of contaminants in groundwater. Scale analysis involves substituting simple quotients for partial derivatives and identifying and equating the dominant terms in an order-of-magnitude sense. For free convection due to sidewall heating of saturated porous media, scale analysis shows that vertical convective velocity in the thermal boundary layer region is proportional to the Rayleigh number, horizontal convective velocity is proportional to the square root of the Rayleigh number, and thermal boundary layer thickness is proportional to the inverse square root of the Rayleigh number. These scale analysis estimates are corroborated by numerical simulations of an idealized system. A scale analysis estimate of mixing time for a tracer mixing by hydrodynamic dispersion in a convection cell also agrees well with numerical simulation for two different Rayleigh numbers. Scale analysis for the heating-from-below scenario produces estimates of maximum velocity one-half as large as the sidewall case. At small values of the Rayleigh number, this estimate is confirmed by numerical simulation. For larger Rayleigh numbers, simulation results suggest maximum velocities are similar to the sidewall heating scenario. In general, agreement between scale analysis estimates and numerical simulation results serves to validate the method of scale analysis. The application is to radioactive waste repositories.
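
    Written out, the scalings quoted in the abstract take the form below. The prefactors and the porous-medium Rayleigh number definition are the standard textbook choices and are assumptions on our part, not quantities taken from the paper.

```latex
% Ra for a saturated porous layer of height H, permeability K, imposed
% temperature difference \Delta T, fluid viscosity \mu and effective thermal
% diffusivity \kappa_{\mathrm{eff}} (standard form, assumed here):
\[
  Ra \;=\; \frac{\rho_0\, g\, \beta\, \Delta T\, K\, H}{\mu\, \kappa_{\mathrm{eff}}},
  \qquad
  v_z \;\sim\; \frac{\kappa_{\mathrm{eff}}}{H}\, Ra,
  \qquad
  v_x \;\sim\; \frac{\kappa_{\mathrm{eff}}}{H}\, Ra^{1/2},
  \qquad
  \delta_T \;\sim\; H\, Ra^{-1/2}.
\]
% For heating from below, the abstract's estimate of the maximum velocity is
% one-half of the sidewall value.
```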

  8. A Perturbation Analysis of Harmonics Generation from Saturated Elements in Power Systems

    Science.gov (United States)

    Kumano, Teruhisa

    Nonlinear phenomena such as saturation of magnetic flux have considerable effects in power system analysis. It is reported that a failure in a real 500kV system triggered islanding operation, where the resultant even harmonics caused malfunctions in protective relays. It is also reported that the major origin of this wave distortion is nothing but unidirectional magnetization of the transformer iron core. Time simulation is widely used today to analyze this type of phenomenon, but it has basically two shortcomings. One is that the time simulation takes too much computing time in the vicinity of inflection points in the saturation characteristic curve, because an iterative procedure such as N-R (Newton-Raphson) must be used and such methods tend to be caught in ill-conditioned numerical hunting. The other is that such simulation methods sometimes do not help intuitive understanding of the studied phenomenon, because the whole set of nonlinear equations is treated in matrix form and not properly divided into understandable parts as is done in linear systems. This paper proposes a new computation scheme based on the so-called perturbation method. Magnetic saturation in the iron cores of a generator and a transformer is taken into account. The proposed method has a special feature that addresses the first shortcoming of the N-R based time simulation stated above: no iterative process is used to reduce the equation residual; instead, a perturbation series is used, which makes the method free from the ill-conditioning problem. Users only have to calculate the perturbation terms one by one until they reach the necessary accuracy. In the numerical example treated in the present paper, the first-order perturbation achieves reasonably high accuracy, which means very fast computation. In the numerical study, three nonlinear elements are considered. The calculated results are almost identical to those of the conventional Newton-Raphson based time simulation, which shows the validity of the method.

  9. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes—neural connectivity maps of the brain—using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems—reads to parallel disk arrays and writes to solid-state storage—to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992

  10. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    Science.gov (United States)

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, thus the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.
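
    The shrinkage idea itself can be illustrated with a plain normal-normal toy model in which the prior is informed by historical data rather than only by the current dataset. This is only a sketch of the general concept, not the adaptiveHM method; all numbers are simulated.

```python
import numpy as np

def shrink(observed, observed_var, prior_mean, prior_var):
    """Posterior mean of a normal-normal model, applied feature by feature."""
    w = prior_var / (prior_var + observed_var)       # weight on the observed value
    return w * observed + (1 - w) * prior_mean

rng = np.random.default_rng(0)
truth = rng.normal(0, 1, 500)
obs = truth + rng.normal(0, 2, 500)                  # few replicates -> noisy per-feature estimates
hist_prior_mean, hist_prior_var = 0.0, 1.0           # e.g. summaries taken from historical studies
post = shrink(obs, observed_var=4.0, prior_mean=hist_prior_mean, prior_var=hist_prior_var)
print(np.mean((obs - truth) ** 2), np.mean((post - truth) ** 2))   # shrinkage lowers the MSE
```

    Over-correction corresponds to the case where the prior (here the historical summaries) is a poor match for some features, so that the weighted average pulls their estimates too far toward the prior mean.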

  11. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience.

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes - neural connectivity maps of the brain - using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems - reads to parallel disk arrays and writes to solid-state storage - to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.

  12. Simultaneous analysis of saturated and unsaturated fatty acids present in pequi fruits by capillary electrophoresis

    Directory of Open Access Journals (Sweden)

    Patrícia M. de Castro Barra

    2013-01-01

    Full Text Available In the current study, an alternative method has been proposed for simultaneous analysis of palmitic, stearic, oleic, linoleic, and linolenic acids by capillary zone electrophoresis (CZE) using indirect detection. The background electrolyte (BGE) used for the analysis of these fatty acids (FAs) consisted of 15.0 mmol L−1 NaH2PO4/Na2HPO4 at pH 6.86, 4.0 mmol L−1 SDBS, 8.3 mmol L−1 Brij 35, 45% v/v acetonitrile (ACN), and 2.1% n-octanol. The quantification of FAs was performed using a response-factor approach, which provided a high analytical throughput for the real sample. The CZE method, which was applied successfully to the analysis of pequi pulp, has advantages such as short analysis time, absence of lipid-fraction extraction and derivatization steps, and no significant difference, within the 95% confidence intervals, in FA quantification results compared to the official gas chromatography method (AOCS Ce 1h-05).
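
    A hedged sketch of the response-factor idea: a relative response factor (RRF) determined in a calibration run links time-corrected peak areas to concentration via an internal standard. The time-correction convention, variable names and numbers below are assumptions for illustration, not the exact procedure of the cited method.

```python
def fa_concentration(area, t_mig, istd_area, istd_t_mig, istd_conc, rrf):
    """RRF is defined as (A_FA / C_FA) / (A_ISTD / C_ISTD) from a calibration run;
    migration-time correction (A / t_mig) is a common CE convention."""
    corrected_ratio = (area / t_mig) / (istd_area / istd_t_mig)
    return istd_conc * corrected_ratio / rrf

# e.g. an oleic acid peak against an internal standard at 2.0 mmol/L (numbers invented):
print(f"{fa_concentration(5400, 7.9, 4200, 6.5, 2.0, 1.15):.2f} mmol/L")
```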

  13. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    Science.gov (United States)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
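
    The "skimming" step described above can be pictured with a short Spark job that reduces raw monitoring records to the small slice needed for statistical analysis. The paths, field names and threshold are invented for illustration; only the generic DataFrame API calls are assumed.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("metrics-skim").getOrCreate()

# Hypothetical layout of low-relevance raw monitoring data on HDFS.
raw = spark.read.json("hdfs:///monitoring/metrics/2017/*/*.json.gz")

# Keep only the small, relevant subset (example predicate; names are invented).
skim = (raw.filter((F.col("service") == "storage") & (F.col("latency_ms") > 500))
           .select("timestamp", "host", "metric", "latency_ms"))

skim.write.mode("overwrite").parquet("hdfs:///user/analysis/slow_requests")
```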

  14. Data reduction for a high-throughput neutron activation analysis system

    International Nuclear Information System (INIS)

    Bowman, W.W.

    1979-01-01

    To analyze samples collected as part of a geochemical survey for the National Uranium Resource Evaluation program, Savannah River Laboratory has installed a high-throughput neutron activation analysis system. As part of that system, computer programs have been developed to reduce raw data to elemental concentrations in two steps. Program RAGS reduces gamma-ray spectra to lists of photopeak energies, peak areas, and statistical errors. Program RICHES determines the elemental concentrations from photopeak and delayed-neutron data, detector efficiencies, analysis parameters (neutron flux and activation, decay, and counting times), and spectrometric and cross-section data from libraries. Both programs have been streamlined for on-line operation with a minicomputer, each requiring approx. 64 kbytes of core. 3 tables

  15. HDAT: web-based high-throughput screening data analysis tools

    International Nuclear Information System (INIS)

    Liu, Rong; Hassan, Taimur; Rallo, Robert; Cohen, Yoram

    2013-01-01

    The increasing utilization of high-throughput screening (HTS) in toxicity studies of engineered nano-materials (ENMs) requires tools for rapid and reliable processing and analyses of large HTS datasets. In order to meet this need, a web-based platform for HTS data analyses tools (HDAT) was developed that provides statistical methods suitable for ENM toxicity data. As a publicly available computational nanoinformatics infrastructure, HDAT provides different plate normalization methods, various HTS summarization statistics, self-organizing map (SOM)-based clustering analysis, and visualization of raw and processed data using both heat map and SOM. HDAT has been successfully used in a number of HTS studies of ENM toxicity, thereby enabling analysis of toxicity mechanisms and development of structure–activity relationships for ENM toxicity. The online approach afforded by HDAT should encourage standardization of and future advances in HTS as well as facilitate convenient inter-laboratory comparisons of HTS datasets. (paper)

  16. Analysis of saturation effects on the operation of magnetic-controlled switcher type FCL

    Directory of Open Access Journals (Sweden)

    Faramarz Faghihi

    2009-12-01

    Full Text Available With the extensive growth of electrical power systems, suppression of fault currents is an important subject for guaranteeing system security. Superconducting fault current limiters (SFCLs) have been expected to be a possible type of power apparatus for reducing fault currents in the power system. The results show that in the normal state the FCL has no obvious effect on the power system; in the fault state, the current-limiting inductance, connected with the bias current, is inserted into the fault circuit to limit the fault current. By regulating the bias current, the FCL voltage loss in the normal state and the fault current can be adjusted to a prescribed level. This kind of SFCL uses the nonlinear permeability of the magnetic core to create a sufficient impedance, and the transient performance considering magnetic saturation is analyzed with the Preisach model. The Preisach model, which intrinsically captures the nonlinear properties, is used as the numerical method for the analysis of saturation effects. It is able to represent both isotropic and non-isotropic behaviour. The main idea is to compute the magnetization vector in two steps independently, amplitude and phase. The described model yields results in qualitative agreement with the experimental results.

  17. Saturation analysis studies of corticosteroid levels in normal Greek subjects and in subjects with haemolytic diseases

    International Nuclear Information System (INIS)

    Vyzantiadis, A.

    1975-07-01

    Between 1970 and 1974, a saturation analysis for cortisol in plasma and free cortisol in urine, and a radioimmunoassay method for aldosterone in plasma and urine, were developed. In order to permit a comparative evaluation, it was necessary to study corticosteroids, diurnal rhythm and the probable effect of a siesta on this rhythm both in normal subjects and in patients suffering from hemic diseases, in particular from sickle-cell anemia. Saturation assay for cortisol, using serum from pregnant women as the source of transcortin, and radioimmunoassay for aldosterone were the basic methods used. Serum cortisol was estimated twice a day (8-9 a.m. and 5-6 p.m.). Cortisol and aldosterone were also estimated in serum and in urine before and after adrenal stimulation with ACTH. No significant influence of a siesta on the diurnal rhythm of cortisol was observed, nor did the levels of serum cortisol or the diurnal rhythm appear affected in congenital hemolytic anemias, following adrenal stimulation. The report lists experimental results briefly and refers to a paper in which these are published in more detail.

  18. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with high-sample-throughput rapid analysis.

  19. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Hakala, Jacqueline Alexandra [National Energy Technology Lab. (NETL), Morgantown, WV (United States)

    2016-11-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with high-sample-throughput rapid analysis.

  20. In-field High Throughput Phenotyping and Cotton Plant Growth Analysis Using LiDAR.

    Science.gov (United States)

    Sun, Shangpeng; Li, Changying; Paterson, Andrew H; Jiang, Yu; Xu, Rui; Robertson, Jon S; Snider, John L; Chee, Peng W

    2018-01-01

    Plant breeding programs and a wide range of plant science applications would greatly benefit from the development of in-field high-throughput phenotyping technologies. In this study, a terrestrial LiDAR-based high-throughput phenotyping system was developed. A 2D LiDAR was applied to scan plants from overhead in the field, and an RTK-GPS was used to provide spatial coordinates. Precise 3D models of scanned plants were reconstructed based on the LiDAR and RTK-GPS data. The ground plane of the 3D model was separated by the RANSAC algorithm, and a Euclidean clustering algorithm was applied to remove noise generated by weeds. After that, clean 3D surface models of cotton plants were obtained, from which three plot-level morphologic traits, including canopy height, projected canopy area, and plant volume, were derived. Canopy heights ranging from the 85th percentile to the maximum height were computed based on the histogram of the z coordinate for all measured points; projected canopy area was derived by projecting all points onto a ground plane; and a trapezoidal-rule-based algorithm was proposed to estimate plant volume. Results of validation experiments showed good agreement between LiDAR measurements and manual measurements for maximum canopy height, projected canopy area, and plant volume, with R²-values of 0.97, 0.97, and 0.98, respectively. The developed system was used to scan the whole field repeatedly over the period from 43 to 109 days after planting. Growth trends and growth-rate curves for all three derived morphologic traits were established over the monitoring period for each cultivar. Overall, four different cultivars showed similar growth trends and growth-rate patterns. Each cultivar continued to grow until ~88 days after planting, and from then on varied little. However, the actual values were cultivar specific. Correlation analysis between morphologic traits and final yield was conducted over the monitoring period. When considering each cultivar individually
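
    The three plot-level traits can be sketched from a cleaned, ground-removed point cloud (an N x 3 array of x, y, z in metres with z = 0 at the ground). The 5 cm grid resolution and the use of numpy.trapz for the trapezoidal-rule volume are implementation assumptions, not the paper's code.

```python
import numpy as np

def occupied_area(x, y, cell):
    """Area of grid cells in the x-y plane occupied by at least one point."""
    cells = np.unique(np.floor(np.column_stack([x, y]) / cell).astype(int), axis=0)
    return len(cells) * cell ** 2

def canopy_traits(points, cell=0.05, n_levels=50):
    x, y, z = points.T
    height_p85, height_max = np.percentile(z, 85), z.max()   # 85th-percentile and maximum canopy height
    area = occupied_area(x, y, cell)                          # projected canopy area
    levels = np.linspace(0.0, height_max, n_levels)           # slice the canopy by height
    slice_areas = [occupied_area(x[z >= h], y[z >= h], cell) for h in levels]
    volume = np.trapz(slice_areas, levels)                    # trapezoidal-rule integration
    return height_p85, height_max, area, volume
```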

  1. High-throughput analysis of amino acids in plant materials by single quadrupole mass spectrometry

    DEFF Research Database (Denmark)

    Dahl-Lassen, Rasmus; van Hecke, Jan Julien Josef; Jørgensen, Henning

    2018-01-01

    that it is very time consuming, with typical chromatographic run times of 70 min or more. Results: We have here developed a high-throughput method for analysis of amino acid profiles in plant materials. The method combines classical protein hydrolysis and derivatization with fast separation by UHPLC and detection by a single quadrupole (QDa) mass spectrometer. The chromatographic run time is reduced to 10 min, and the precision, accuracy and sensitivity of the method are in line with other recent methods utilizing advanced and more expensive mass spectrometers. The sensitivity of the method is at least a factor 10 ... reducing the overall analytical costs compared to methods based on more advanced mass spectrometers.

  2. GUItars: a GUI tool for analysis of high-throughput RNA interference screening data.

    Directory of Open Access Journals (Sweden)

    Asli N Goktug

    Full Text Available High-throughput RNA interference (RNAi) screening has become a widely used approach to elucidating gene functions. However, analysis and annotation of large data sets generated from these screens has been a challenge for researchers without a programming background. Over the years, numerous data analysis methods were produced for plate quality control and hit selection and implemented by a few open-access software packages. Recently, strictly standardized mean difference (SSMD) has become a widely used method for RNAi screening analysis mainly due to its better control of false negative and false positive rates and its ability to quantify RNAi effects with a statistical basis. We have developed GUItars to enable researchers without a programming background to use SSMD as both a plate quality and a hit selection metric to analyze large data sets. The software is accompanied by an intuitive graphical user interface for easy and rapid analysis workflow. SSMD analysis methods have been provided to the users along with traditionally-used z-score, normalized percent activity, and t-test methods for hit selection. GUItars is capable of analyzing large-scale data sets from screens with or without replicates. The software is designed to automatically generate and save numerous graphical outputs known to be among the most informative high-throughput data visualization tools capturing plate-wise and screen-wise performances. Graphical outputs are also written in HTML format for easy access, and a comprehensive summary of screening results is written into tab-delimited output files. With GUItars, we demonstrated robust SSMD-based analysis workflow on a 3840-gene small interfering RNA (siRNA) library and identified 200 siRNAs that increased and 150 siRNAs that decreased the assay activities with moderate to stronger effects. GUItars enables rapid analysis and illustration of data from large- or small-scale RNAi screens using SSMD and other traditional analysis
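
    The SSMD statistic itself is compact enough to sketch: the difference of means divided by the square root of the summed variances of the two well groups. This is the generic definition, not code taken from GUItars; the data and the effect-size threshold in the comment are illustrative conventions.

```python
import numpy as np

def ssmd(sample, negative_control):
    """Strictly standardized mean difference between two sets of well readings."""
    s = np.asarray(sample, dtype=float)
    c = np.asarray(negative_control, dtype=float)
    return (s.mean() - c.mean()) / np.sqrt(s.var(ddof=1) + c.var(ddof=1))

neg = np.random.default_rng(0).normal(100, 10, 32)    # negative-control wells on a plate
hit = np.random.default_rng(1).normal(60, 12, 3)      # triplicate siRNA wells
print(round(ssmd(hit, neg), 2))   # |SSMD| >= 2 is often read as a fairly strong effect
```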

  3. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Directory of Open Access Journals (Sweden)

    Soichi Inagaki

    Full Text Available Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the needs for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.

  4. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Science.gov (United States)

    Inagaki, Soichi; Henry, Isabelle M; Lieberman, Meric C; Comai, Luca

    2015-01-01

    Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.
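
    The "simple bioinformatic tool" mentioned above is described only in outline; the sketch below illustrates the underlying idea, assuming reads have been mapped to a combined reference in which the T-DNA/vector sequence is a separate contig (hypothetically named "T-DNA" here) and that plain SAM text is available. It is not the authors' script.

      import sys

      TDNA_CONTIG = "T-DNA"   # hypothetical name of the vector contig in the combined reference

      def junction_pairs(sam_path):
          """Yield (read_name, genomic_contig, position) for read pairs in which one mate
          maps to a genomic contig while its mate maps to the T-DNA contig."""
          with open(sam_path) as sam:
              for line in sam:
                  if line.startswith("@"):          # skip SAM header lines
                      continue
                  f = line.rstrip("\n").split("\t")
                  qname, flag, rname, pos, rnext = f[0], int(f[1]), f[2], int(f[3]), f[6]
                  if flag & 4 or flag & 8:          # this read or its mate is unmapped
                      continue
                  mate_ref = rname if rnext == "=" else rnext
                  if rname != TDNA_CONTIG and mate_ref == TDNA_CONTIG:
                      yield qname, rname, pos

      if __name__ == "__main__":
          for name, contig, pos in junction_pairs(sys.argv[1]):
              print(name, contig, pos, sep="\t")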

  5. Commentary: Roles for Pathologists in a High-throughput Image Analysis Team.

    Science.gov (United States)

    Aeffner, Famke; Wilson, Kristin; Bolon, Brad; Kanaly, Suzanne; Mahrt, Charles R; Rudmann, Dan; Charles, Elaine; Young, G David

    2016-08-01

    Historically, pathologists perform manual evaluation of H&E- or immunohistochemically-stained slides, which can be subjective, inconsistent, and, at best, semiquantitative. As the complexity of staining and the demand for increased precision of manual evaluation grow, the pathologist's assessment will increasingly include automated analyses (i.e., "digital pathology") to improve the accuracy, efficiency, and speed of diagnosis and hypothesis testing, and digital pathology is becoming an important biomedical research and diagnostic tool. This commentary introduces the many roles for pathologists in designing and conducting high-throughput digital image analysis. Pathology review is central to the entire course of a digital pathology study, including experimental design, sample quality verification, specimen annotation, analytical algorithm development, and report preparation. The pathologist performs these roles by reviewing work undertaken by technicians and scientists with training and expertise in image analysis instruments and software. These roles require regular, face-to-face interactions between team members and the lead pathologist. Traditional pathology training is suitable preparation for entry-level participation on image analysis teams. The future of pathology is exciting: the expanding use of digital image analysis is set to broaden pathology roles in research and drug development, with new career opportunities for pathologists. © 2016 by The Author(s).

  6. Integrative Analysis of High-throughput Cancer Studies with Contrasted Penalization

    Science.gov (United States)

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Shia, BenChang; Ma, Shuangge

    2015-01-01

    In cancer studies with high-throughput genetic and genomic measurements, integrative analysis provides a way to effectively pool and analyze heterogeneous raw data from multiple independent studies and outperforms “classic” meta-analysis and single-dataset analysis. When marker selection is of interest, the genetic basis of multiple datasets can be described using the homogeneity model or the heterogeneity model. In this study, we consider marker selection under the heterogeneity model, which includes the homogeneity model as a special case and can be more flexible. Penalization methods have been developed in the literature for marker selection. This study advances beyond the published approaches by introducing contrast penalties, which can accommodate the within- and across-dataset structures of covariates/regression coefficients and, by doing so, further improve marker selection performance. Specifically, we develop a penalization method that accommodates the across-dataset structures by smoothing over regression coefficients. An effective iterative algorithm, which calls an inner coordinate descent iteration, is developed. Simulation shows that the proposed method outperforms the benchmark with more accurate marker identification. The analysis of breast cancer and lung cancer prognosis studies with gene expression measurements shows that the proposed method identifies genes different from those using the benchmark and has better prediction performance. PMID:24395534

  7. Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model

    International Nuclear Information System (INIS)

    K. Rehfeldt

    2004-01-01

    This report is an updated analysis of water-level data performed to provide the "Saturated Zone Site-Scale Flow Model" (BSC 2004 [DIRS 170037]) (referred to as the saturated zone (SZ) site-scale flow model or site-scale SZ flow model in this report) with the configuration of the potentiometric surface, target water-level data, and hydraulic gradients for calibration of groundwater flow models. This report also contains an expanded discussion of uncertainty in the potentiometric-surface map. The analysis of the potentiometric data presented in Revision 00 of this report (USGS 2001 [DIRS 154625]) provides the configuration of the potentiometric surface, target heads, and hydraulic gradients for the calibration of the SZ site-scale flow model (BSC 2004 [DIRS 170037]). Revision 01 of this report (USGS 2004 [DIRS 168473]) used updated water-level data for selected wells through the year 2000 as the basis for estimating water-level altitudes and the potentiometric surface in the SZ site-scale flow and transport model domain based on an alternative interpretation of perched water conditions. That revision developed computer files containing: Water-level data within the model area (DTN: GS010908312332.002); A table of known vertical head differences (DTN: GS010908312332.003); and A potentiometric-surface map (DTN: GS010608312332.001) using an alternative concept from that presented by USGS (2001 [DIRS 154625]) for the area north of Yucca Mountain. The updated water-level data presented in USGS (2004 [DIRS 168473]) include data obtained from the Nye County Early Warning Drilling Program (EWDP) Phases I and II and data from Borehole USW WT-24. This document is based on Revision 01 (USGS 2004 [DIRS 168473]) and expands the discussion of uncertainty in the potentiometric-surface map. This uncertainty assessment includes an analysis of the impact of more recent water-level data and the impact of adding data from the EWDP Phases III and IV wells. In addition to being utilized ...

  8. Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Rehfeldt

    2004-10-08

    This report is an updated analysis of water-level data performed to provide the "Saturated Zone Site-Scale Flow Model" (BSC 2004 [DIRS 170037]) (referred to as the saturated zone (SZ) site-scale flow model or site-scale SZ flow model in this report) with the configuration of the potentiometric surface, target water-level data, and hydraulic gradients for calibration of groundwater flow models. This report also contains an expanded discussion of uncertainty in the potentiometric-surface map. The analysis of the potentiometric data presented in Revision 00 of this report (USGS 2001 [DIRS 154625]) provides the configuration of the potentiometric surface, target heads, and hydraulic gradients for the calibration of the SZ site-scale flow model (BSC 2004 [DIRS 170037]). Revision 01 of this report (USGS 2004 [DIRS 168473]) used updated water-level data for selected wells through the year 2000 as the basis for estimating water-level altitudes and the potentiometric surface in the SZ site-scale flow and transport model domain based on an alternative interpretation of perched water conditions. That revision developed computer files containing: Water-level data within the model area (DTN: GS010908312332.002); A table of known vertical head differences (DTN: GS010908312332.003); and A potentiometric-surface map (DTN: GS010608312332.001) using an alternative concept from that presented by USGS (2001 [DIRS 154625]) for the area north of Yucca Mountain. The updated water-level data presented in USGS (2004 [DIRS 168473]) include data obtained from the Nye County Early Warning Drilling Program (EWDP) Phases I and II and data from Borehole USW WT-24. This document is based on Revision 01 (USGS 2004 [DIRS 168473]) and expands the discussion of uncertainty in the potentiometric-surface map. This uncertainty assessment includes an analysis of the impact of more recent water-level data and the impact of adding data from the EWDP Phases III and IV wells. ...

  9. An Analysis and Design for Nonlinear Quadratic Systems Subject to Nested Saturation

    Directory of Open Access Journals (Sweden)

    Minsong Zhang

    2013-01-01

    Full Text Available This paper considers the stability problem for nonlinear quadratic systems with nested saturation on the input. The treatment of nested saturation proposed here makes use of a well-established linear differential control tool. The new conclusions encompass the existing results on this issue and are less conservative than before. A simulation example illustrates the effectiveness of the established methodologies.
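
    For readers unfamiliar with the term, a generic illustration (not the paper's specific system) of a quadratic system driven by a nested saturation input, where sat(.) denotes the componentwise standard saturation:

      \[
        \dot{x} = A x + \Phi(x) + B\,\mathrm{sat}\bigl(F_{1}x + \mathrm{sat}(F_{2}x)\bigr),
        \qquad
        \mathrm{sat}(u)_i = \operatorname{sign}(u_i)\,\min\{\,|u_i|,\ \bar{u}_i\,\},
      \]
      where the components of \(\Phi(x)\) are quadratic forms \(x^{\mathsf{T}} Q_i x\) collecting the quadratic terms of the state, and \(\bar{u}_i\) are the actuator limits.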

  10. Torque Analysis With Saturation Effects for Non-Salient Single-Phase Permanent-Magnet Machines

    DEFF Research Database (Denmark)

    Lu, Kaiyuan; Ritchie, Ewen

    2011-01-01

    The effects of saturation on torque production for non-salient, single-phase, permanent-magnet machines are studied in this paper. An analytical torque equation is proposed to predict the instantaneous torque with saturation effects. Compared to the existing methods, it is computationally faster ... and is compared with finite-element results and experimental results obtained on a prototype single-phase permanent-magnet machine.

  11. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    Science.gov (United States)

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable even though it requires minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers, between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 +/- 6.0% between clades and of 21.0 +/- 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.

  12. High-Throughput Quantitative Proteomic Analysis of Dengue Virus Type 2 Infected A549 Cells

    Science.gov (United States)

    Chiu, Han-Chen; Hannemann, Holger; Heesom, Kate J.; Matthews, David A.; Davidson, Andrew D.

    2014-01-01

    Disease caused by dengue virus is a global health concern with up to 390 million individuals infected annually worldwide. There are no vaccines or antiviral compounds available to either prevent or treat dengue disease which may be fatal. To increase our understanding of the interaction of dengue virus with the host cell, we analyzed changes in the proteome of human A549 cells in response to dengue virus type 2 infection using stable isotope labelling in cell culture (SILAC) in combination with high-throughput mass spectrometry (MS). Mock and infected A549 cells were fractionated into nuclear and cytoplasmic extracts before analysis to identify proteins that redistribute between cellular compartments during infection and reduce the complexity of the analysis. We identified and quantified 3098 and 2115 proteins in the cytoplasmic and nuclear fractions respectively. Proteins that showed a significant alteration in amount during infection were examined using gene enrichment, pathway and network analysis tools. The analyses revealed that dengue virus infection modulated the amounts of proteins involved in the interferon and unfolded protein responses, lipid metabolism and the cell cycle. The SILAC-MS results were validated for a select number of proteins over a time course of infection by Western blotting and immunofluorescence microscopy. Our study demonstrates for the first time the power of SILAC-MS for identifying and quantifying novel changes in cellular protein amounts in response to dengue virus infection. PMID:24671231

  13. High-throughput quantitative proteomic analysis of dengue virus type 2 infected A549 cells.

    Directory of Open Access Journals (Sweden)

    Han-Chen Chiu

    Full Text Available Disease caused by dengue virus is a global health concern with up to 390 million individuals infected annually worldwide. There are no vaccines or antiviral compounds available to either prevent or treat dengue disease which may be fatal. To increase our understanding of the interaction of dengue virus with the host cell, we analyzed changes in the proteome of human A549 cells in response to dengue virus type 2 infection using stable isotope labelling in cell culture (SILAC) in combination with high-throughput mass spectrometry (MS). Mock and infected A549 cells were fractionated into nuclear and cytoplasmic extracts before analysis to identify proteins that redistribute between cellular compartments during infection and reduce the complexity of the analysis. We identified and quantified 3098 and 2115 proteins in the cytoplasmic and nuclear fractions respectively. Proteins that showed a significant alteration in amount during infection were examined using gene enrichment, pathway and network analysis tools. The analyses revealed that dengue virus infection modulated the amounts of proteins involved in the interferon and unfolded protein responses, lipid metabolism and the cell cycle. The SILAC-MS results were validated for a select number of proteins over a time course of infection by Western blotting and immunofluorescence microscopy. Our study demonstrates for the first time the power of SILAC-MS for identifying and quantifying novel changes in cellular protein amounts in response to dengue virus infection.

  14. Gene Expression Analysis of Escherichia Coli Grown in Miniaturized Bioreactor Platforms for High-Throughput Analysis of Growth and genomic Data

    DEFF Research Database (Denmark)

    Boccazzi, P.; Zanzotto, A.; Szita, Nicolas

    2005-01-01

    Combining high-throughput growth physiology and global gene expression data analysis is of significant value for integrating metabolism and genomics. We compared global gene expression using 500 ng of total RNA from Escherichia coli cultures grown in rich or defined minimal media in a miniaturized ... cultures using just 500 ng of total RNA indicate that high-throughput integration of growth physiology and genomics will be possible with novel biochemical platforms and improved detection technologies.

  15. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    Science.gov (United States)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput make QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
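
    A sketch of the downstream analysis described above (embedding the extracted biophysical phenotypes with t-SNE and predicting cell-cycle phase with a linear discriminant classifier). The feature matrix and labels are random placeholders, and LDA is used as a stand-in for the MANOVA-based discriminant analysis.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.manifold import TSNE
      from sklearn.model_selection import cross_val_score
      from sklearn.preprocessing import StandardScaler

      # Hypothetical single-cell phenotype matrix (cells x 24 features from amplitude/phase images)
      # with DNA-content-derived phase labels (0 = G1, 1 = S, 2 = G2/M), purely illustrative.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(3000, 24))
      y = rng.integers(0, 3, size=3000)

      Xs = StandardScaler().fit_transform(X)

      # 2-D embedding used to visualize cell-cycle progression (the t-SNE step).
      embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(Xs)

      # Label-free phase prediction with a linear discriminant classifier.
      acc = cross_val_score(LinearDiscriminantAnalysis(), Xs, y, cv=5)
      print("embedding shape:", embedding.shape, "CV accuracy:", acc.mean().round(2))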

  16. A high throughput mass spectrometry screening analysis based on two-dimensional carbon microfiber fractionation system.

    Science.gov (United States)

    Ma, Biao; Zou, Yilin; Xie, Xuan; Zhao, Jinhua; Piao, Xiangfan; Piao, Jingyi; Yao, Zhongping; Quinto, Maurizio; Wang, Gang; Li, Donghao

    2017-06-09

    A novel high-throughput, solvent-saving and versatile integrated two-dimensional microscale carbon fiber/active carbon fiber system (2DμCFs) that allows a simple and rapid separation of compounds into low-polar, medium-polar and high-polar fractions, has been coupled with ambient ionization-mass spectrometry (ESI-Q-TOF-MS and ESI-QqQ-MS) for screening and quantitative analyses of real samples. 2DμCFs led to a substantial interference reduction and minimization of ionization suppression effects, thus increasing the sensitivity and the screening capabilities of the subsequent MS analysis. The method has been applied to the analysis of Schisandra chinensis extracts, obtaining with a single injection a simultaneous determination of 33 compounds presenting different polarities, such as organic acids, lignans, and flavonoids, in less than 7 min, at low pressures and using small solvent amounts. The method was also validated using 10 model compounds, giving limits of detection (LODs) ranging from 0.3 to 30 ng mL-1, satisfactory recoveries (from 75.8 to 93.2%) and reproducibilities (relative standard deviations, RSDs, from 1.40 to 8.06%). Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Throughput Analysis on 3-Dimensional Underwater Acoustic Network with One-Hop Mobile Relay

    Science.gov (United States)

    Zhong, Xuefeng; Fan, Jiasheng; Guan, Quansheng; Ji, Fei; Yu, Hua

    2018-01-01

    Underwater acoustic communication network (UACN) has been considered as an essential infrastructure for ocean exploitation. Performance analysis of UACN is important in underwater acoustic network deployment and management. In this paper, we analyze the network throughput of three-dimensional randomly deployed transmitter–receiver pairs. Due to the long delay of acoustic channels, complicated networking protocols with heavy signaling overhead may not be appropriate. In this paper, we consider only one-hop or two-hop transmission, to save the signaling cost. That is, we assume the transmitter sends the data packet to the receiver by one-hop direct transmission, or by two-hop transmission via mobile relays. We derive the closed-form formulation of packet delivery rate with respect to the transmission delay and the number of transmitter–receiver pairs. The correctness of the derivation results is verified by computer simulations. Our analysis indicates how to obtain a precise tradeoff between the delay constraint and the network capacity. PMID:29337911

  18. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data.

    Science.gov (United States)

    Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo

    2011-12-15

    High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or broad signals, CLIP-Seq and RNA-Seq. We prove the effectiveness of Pyicos to select for significant signals and show that its accuracy is comparable and sometimes superior to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS datatypes through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Open-source software, with tutorials and protocol files, is available at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy. Contact: eduardo.eyras@upf.edu. Supplementary data are available at Bioinformatics online.

  19. High-throughput peptide mass fingerprinting and protein macroarray analysis using chemical printing strategies

    International Nuclear Information System (INIS)

    Sloane, A.J.; Duff, J.L.; Hopwood, F.G.; Wilson, N.L.; Smith, P.E.; Hill, C.J.; Packer, N.H.; Williams, K.L.; Gooley, A.A.; Cole, R.A.; Cooley, P.W.; Wallace, D.B.

    2001-01-01

    We describe a 'chemical printer' that uses piezoelectric pulsing for rapid and accurate microdispensing of picolitre volumes of fluid for proteomic analysis of 'protein macroarrays'. Unlike positive transfer and pin transfer systems, our printer dispenses fluid in a non-contact process that ensures that the fluid source cannot be contaminated by substrate during a printing event. We demonstrate automated delivery of enzyme and matrix solutions for on-membrane protein digestion and subsequent peptide mass fingerprinting (pmf) analysis directly from the membrane surface using matrix-assisted laser-desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS). This approach bypasses the more commonly used multi-step procedures, thereby permitting a more rapid procedure for protein identification. We also highlight the advantage of printing different chemistries onto an individual protein spot for multiple microscale analyses. This ability is particularly useful when detailed characterisation of rare and valuable sample is required. Using a combination of PNGase F and trypsin we have mapped sites of N-glycosylation using on-membrane digestion strategies. We also demonstrate the ability to print multiple serum samples in a micro-ELISA format and rapidly screen a protein macroarray of human blood plasma for pathogen-derived antigens. We anticipate that the 'chemical printer' will be a major component of proteomic platforms for high-throughput protein identification and characterisation with widespread applications in biomedical and diagnostic discovery

  20. Throughput Analysis on 3-Dimensional Underwater Acoustic Network with One-Hop Mobile Relay.

    Science.gov (United States)

    Zhong, Xuefeng; Chen, Fangjiong; Fan, Jiasheng; Guan, Quansheng; Ji, Fei; Yu, Hua

    2018-01-16

    Underwater acoustic communication network (UACN) has been considered as an essential infrastructure for ocean exploitation. Performance analysis of UACN is important in underwater acoustic network deployment and management. In this paper, we analyze the network throughput of three-dimensional randomly deployed transmitter-receiver pairs. Due to the long delay of acoustic channels, complicated networking protocols with heavy signaling overhead may not be appropriate. In this paper, we consider only one-hop or two-hop transmission, to save the signaling cost. That is, we assume the transmitter sends the data packet to the receiver by one-hop direct transmission, or by two-hop transmission via mobile relays. We derive the closed-form formulation of packet delivery rate with respect to the transmission delay and the number of transmitter-receiver pairs. The correctness of the derivation results is verified by computer simulations. Our analysis indicates how to obtain a precise tradeoff between the delay constraint and the network capacity.
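
    The closed-form expressions themselves are not reproduced in the abstract; the Monte Carlo sketch below only illustrates the quantity being analyzed, under simplifying assumptions (uniform deployment in a cube, a nominal 1500 m/s sound speed, a fixed single-hop range instead of an interference model, and a packet counted as delivered if its propagation delay meets an assumed deadline either directly or via the best relay).

      import numpy as np

      rng = np.random.default_rng(0)
      SOUND_SPEED = 1500.0    # m/s, nominal underwater sound speed
      SIDE = 5000.0           # m, edge of the cubic deployment region (assumption)
      RANGE = 2000.0          # m, assumed maximum single-hop transmission range
      DEADLINE = 4.0          # s, assumed delivery deadline

      def delivery_rate(n_pairs, n_relays=20, trials=2000):
          """Fraction of packets whose end-to-end propagation delay meets the deadline,
          using direct transmission or the best single mobile relay within range."""
          delivered = 0
          for _ in range(trials):
              tx = rng.uniform(0, SIDE, (n_pairs, 3))
              rx = rng.uniform(0, SIDE, (n_pairs, 3))
              relays = rng.uniform(0, SIDE, (n_relays, 3))
              d_direct = np.linalg.norm(tx - rx, axis=1)
              d1 = np.linalg.norm(tx[:, None, :] - relays[None], axis=2)
              d2 = np.linalg.norm(relays[None] - rx[:, None, :], axis=2)
              two_hop = np.where((d1 <= RANGE) & (d2 <= RANGE), d1 + d2, np.inf).min(axis=1)
              best = np.where(d_direct <= RANGE, d_direct, two_hop)
              delivered += int((best / SOUND_SPEED <= DEADLINE).sum())
          return delivered / (trials * n_pairs)

      for n in (5, 20, 50):
          print(n, "pairs ->", round(delivery_rate(n), 3))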

  1. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analysis ...

  2. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection

    DEFF Research Database (Denmark)

    Ernstsen, Christina Lundgaard; Login, Frédéric H.; Jensen, Helene Halkjær

    2017-01-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies ...

  3. Design and Performance Analysis of Multi-tier Heterogeneous Network through Coverage, Throughput and Energy Efficiency

    Directory of Open Access Journals (Sweden)

    A. Shabbir,

    2017-12-01

    Full Text Available The unprecedented acceleration in the wireless industry strongly compels wireless operators to increase their data network throughput, capacity and coverage on an emergent basis. In upcoming 5G heterogeneous networks, the inclusion of low power nodes (LPNs) such as pico cells and femto cells to increase network throughput, capacity and coverage is gaining momentum. Adding LPNs on such a massive scale will eventually make a network densely populated with base stations (BSs). The dense deployment of BSs will lead to high operating expenditure (Op-Ex), high capital expenditure (Cap-Ex) and, most importantly, high energy consumption in future generation networks. Recognizing these network issues, this research work investigates the data throughput and energy efficiency of a 5G multi-tier heterogeneous network. The network is modeled using tools from stochastic geometry. Monte Carlo results confirmed that rational deployment of LPNs can contribute towards increased throughput along with better energy efficiency of the overall network.
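
    The abstract does not give the model parameters; the sketch below is a compact Monte Carlo evaluation of the kind described, assuming Poisson-deployed macro and pico tiers, strongest-received-power association, a simple path-loss-plus-Rayleigh-fading SINR, and purely illustrative power and density figures.

      import numpy as np

      rng = np.random.default_rng(2)
      SIDE = 2000.0               # m, square simulation window (assumption)
      ALPHA = 3.5                 # path-loss exponent (assumption)
      BW, NOISE = 10e6, 1e-13     # bandwidth (Hz) and noise power (W), illustrative
      TIERS = {                   # density (BS/m^2), tx power (W), operating power (W) - illustrative
          "macro": (1e-6, 40.0, 800.0),
          "pico":  (8e-6,  1.0,  15.0),
      }

      def simulate(n_users=2000, sinr_thresh_db=0.0):
          pos, ptx, p_operating = [], [], 0.0
          for density, p_tx, p_op in TIERS.values():
              n = rng.poisson(density * SIDE**2)          # Poisson point process per tier
              pos.append(rng.uniform(0, SIDE, (n, 2)))
              ptx.append(np.full(n, p_tx))
              p_operating += n * p_op
          pos, ptx = np.vstack(pos), np.concatenate(ptx)

          users = rng.uniform(0, SIDE, (n_users, 2))
          d = np.linalg.norm(users[:, None, :] - pos[None], axis=2)
          fading = rng.exponential(1.0, d.shape)          # Rayleigh power fading
          rx = ptx[None] * fading * np.maximum(d, 1.0) ** -ALPHA
          serving = rx.argmax(axis=1)                     # max-received-power association
          sig = rx[np.arange(n_users), serving]
          sinr = sig / (rx.sum(axis=1) - sig + NOISE)

          coverage = (10 * np.log10(sinr) > sinr_thresh_db).mean()
          rate_per_user = BW * np.log2(1 + sinr).mean()   # Shannon rate, bit/s
          energy_eff = n_users * rate_per_user / p_operating   # bit/s per Watt (crude)
          return coverage, rate_per_user, energy_eff

      print(simulate())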

  4. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    Full Text Available BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for the high-throughput gene-expression analysis, without the limitations of sample space and reagent used. However, non-commercial and user-friendly software for the management and analysis of these data is not available. RESULTS: The recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is a freely available software that permits the automated analysis and visualization of high-throughput qPCR. A detailed manual and a demo-experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
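
    A minimal sketch (not the DAG Expression code) of the standard-curve-based relative quantification it automates; the Cq values, dilution series and sample names are hypothetical.

      import numpy as np

      def standard_curve(log10_dilutions, cq_values):
          """Fit Cq = slope*log10(quantity) + intercept; return (slope, intercept)."""
          slope, intercept = np.polyfit(log10_dilutions, cq_values, 1)
          return slope, intercept

      def quantity(cq, slope, intercept):
          """Convert a Cq value back to a relative quantity via the standard curve."""
          return 10 ** ((cq - intercept) / slope)

      # Hypothetical 5-point, 10-fold dilution series for a target and a reference gene.
      dil = np.log10([1, 0.1, 0.01, 0.001, 0.0001])
      target_curve = standard_curve(dil, [18.1, 21.5, 24.9, 28.2, 31.6])
      ref_curve = standard_curve(dil, [15.3, 18.7, 22.0, 25.4, 28.8])

      # Sample normalization: target quantity divided by reference-gene quantity.
      samples = {"sample_1": (22.4, 19.1), "sample_2": (25.0, 19.3)}   # (target Cq, reference Cq)
      for name, (cq_t, cq_r) in samples.items():
          rel = quantity(cq_t, *target_curve) / quantity(cq_r, *ref_curve)
          print(name, round(rel, 3))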

  5. Research on combination forecast of port cargo throughput based on time series and causality analysis

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2013-03-01

    Full Text Available Purpose: The purpose of this paper is to develop a combined model, composed of a grey-forecast model and a logistic-growth-curve model, to improve the accuracy of cargo-throughput forecasts for ports. The authors also use data from an existing port to verify the validity of the combined model. Design/methodology/approach: A literature review is undertaken to find appropriate forecast models of port cargo throughput. After researching the related forecast models, the authors select the individual models that merit further study. Finally, the authors combine two individual models (the grey-forecast model and the logistic-growth-curve model) into one combined model to forecast port cargo throughput, and apply the model to a physical port in China to test its validity. Findings: Tested against the observed cargo-throughput data of the physical port, the results show that the combined model can achieve relatively high forecast accuracy when little information is available. Furthermore, the forecasts made by the combined model are more accurate than those of either individual model. Research limitations/implications: The study provides a new combined forecast model of cargo throughput that improves forecast accuracy with relatively little information. The limitation of the model is that it requires the port's cargo throughput to follow an S-shaped growth trend. Practical implications: The model is not limited by external conditions such as geography or culture. It was used to predict the cargo throughput of one real port in China in 2015, providing instructive guidance for the port's development. Originality/value: This is one of the few studies to improve the accuracy of cargo-throughput forecasts using little information.
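
    The paper's exact weighting scheme is not given in the abstract; the sketch below combines a GM(1,1) grey forecast with a least-squares logistic growth curve under an assumed equal-weight combination, on an invented throughput series.

      import numpy as np
      from scipy.optimize import curve_fit

      def gm11_forecast(x0, horizon):
          """Classic GM(1,1) grey model: fit on the accumulated series, return fitted + forecast values."""
          x0 = np.asarray(x0, float)
          x1 = np.cumsum(x0)
          z = 0.5 * (x1[1:] + x1[:-1])                     # background (mean-generated) values
          B = np.column_stack([-z, np.ones_like(z)])
          a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
          k = np.arange(len(x0) + horizon)
          x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
          return np.concatenate([[x0[0]], np.diff(x1_hat)])

      def logistic(t, K, r, t0):
          return K / (1.0 + np.exp(-r * (t - t0)))

      # Hypothetical annual cargo throughput (million tonnes) with an S-shaped trend.
      y = np.array([52, 58, 66, 75, 86, 97, 108, 118, 126, 132], float)
      t = np.arange(len(y), dtype=float)
      horizon = 3

      grey = gm11_forecast(y, horizon)
      params, _ = curve_fit(logistic, t, y, p0=[150.0, 0.5, 5.0], maxfev=10000)
      logi = logistic(np.arange(len(y) + horizon, dtype=float), *params)

      combined = 0.5 * grey + 0.5 * logi                   # assumed equal-weight combination
      print("forecast for next", horizon, "years:", np.round(combined[-horizon:], 1))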

  6. Saturated Switching Systems

    CERN Document Server

    Benzaouia, Abdellah

    2012-01-01

    Saturated Switching Systems treats the problem of actuator saturation, inherent in all dynamical systems, by using two approaches: positive invariance, in which the controller is designed to work within a region of non-saturating linear behaviour; and the saturation technique, which allows saturation but guarantees asymptotic stability. The results obtained are extended from the linear systems in which they were first developed to switching systems with uncertainties, 2D switching systems, switching systems with Markovian jumping and switching systems of the Takagi-Sugeno type. The text represents a thoroughly referenced distillation of results obtained in this field during the last decade. The selected tool for analysis and design of stabilizing controllers is based on multiple Lyapunov functions and linear matrix inequalities. All the results are illustrated with numerical examples and figures, many of them modelled using MATLAB®. Saturated Switching Systems will be of interest to academic researchers in control ...

  7. Fractal analysis of fracture increasing spontaneous imbibition in porous media with gas-saturated

    KAUST Repository

    Cai, Jianchao; Sun, Shuyu

    2013-01-01

    Spontaneous imbibition (SI) of wetting liquid into matrix blocks due to capillary pressure is regarded as an important recovery mechanism in low permeability fractured reservoir. In this paper, an analytical model is proposed for characterizing SI horizontally from a single plane fracture into gas-saturated matrix blocks. The presented model is based on the fractal character of pores in porous matrix, with gravity force included in the entire imbibition process. The accumulated mass of wetting liquid imbibed into matrix blocks is related to a number of factors such as contact area, pore fractal dimension, tortuosity, maximum pore size, porosity, liquid density and viscosity, surface tension, contact angle, as well as height and tilt angle of the fracture. The mechanism of fracture-enhanced SI is analyzed accordingly. Because of the effect of fracture, the gravity force is positive to imbibition process. Additionally, the farther away from the fracture top of the pore, the more influential the hydrostatic pressure is upon the imbibition action. The presented fractal analysis of horizontal spontaneous imbibition from a single fracture could also shed light on the scaling study of the mass transfer function between matrix and fracture system of fractured reservoirs. © 2013 World Scientific Publishing Company.

  8. Fractal analysis of fracture increasing spontaneous imbibition in porous media with gas-saturated

    KAUST Repository

    Cai, Jianchao

    2013-08-01

    Spontaneous imbibition (SI) of wetting liquid into matrix blocks due to capillary pressure is regarded as an important recovery mechanism in low permeability fractured reservoir. In this paper, an analytical model is proposed for characterizing SI horizontally from a single plane fracture into gas-saturated matrix blocks. The presented model is based on the fractal character of pores in porous matrix, with gravity force included in the entire imbibition process. The accumulated mass of wetting liquid imbibed into matrix blocks is related to a number of factors such as contact area, pore fractal dimension, tortuosity, maximum pore size, porosity, liquid density and viscosity, surface tension, contact angle, as well as height and tilt angle of the fracture. The mechanism of fracture-enhanced SI is analyzed accordingly. Because of the effect of fracture, the gravity force is positive to imbibition process. Additionally, the farther away from the fracture top of the pore, the more influential the hydrostatic pressure is upon the imbibition action. The presented fractal analysis of horizontal spontaneous imbibition from a single fracture could also shed light on the scaling study of the mass transfer function between matrix and fracture system of fractured reservoirs. © 2013 World Scientific Publishing Company.

  9. High-throughput genetic analysis in a cohort of patients with Ocular Developmental Anomalies

    Directory of Open Access Journals (Sweden)

    Suganya Kandeeban

    2017-10-01

    Full Text Available Anophthalmia and microphthalmia (A/M) are developmental ocular malformations in which the eye fails to form or is smaller than normal, with both genetic and environmental etiology. Microphthalmia is often associated with additional ocular anomalies, most commonly coloboma or cataract [1, 2]. A/M has a combined incidence of 1-3.2 cases per 10,000 live births in Caucasians [3, 4]. The spectrum of genetic abnormalities (chromosomal and molecular) associated with these ocular developmental defects is being investigated in the current study. A detailed pedigree analysis and ophthalmic examination have been documented for the enrolled patients, followed by blood collection and DNA extraction. The strategies for genetic analysis included chromosomal analysis by conventional and array-based (Affymetrix CytoScan HD array) methods, targeted re-sequencing of the candidate genes, and whole exome sequencing (WES) on the Illumina HiSeq 2500. WES was done in families excluded for mutations in candidate genes. Twenty-four samples (microphthalmia (M)-5, anophthalmia (A)-7, coloboma-2, M&A-1, microphthalmia and coloboma/other ocular features-9) were initially analyzed using conventional Giemsa-Trypsin-Giemsa banding, of which 4 samples revealed gross chromosomal aberrations (deletions in 3q26.3-28, 11p13 (N=2) and 11q23 regions). Targeted re-sequencing of candidate genes showed mutations in CHX10, PAX6, FOXE3, ABCB6 and SHH genes in 6 samples. High-throughput array-based chromosomal analysis revealed aberrations in 4 samples (17q21 dup (n=2), 8p11 del (n=2)). Overall, genetic alterations in known candidate genes are seen in 50% of the study subjects. Whole exome sequencing was performed in samples that were excluded for mutations in candidate genes, and the results are discussed.

  10. GiA Roots: software for the high throughput analysis of plant root system architecture

    Science.gov (United States)

    2012-01-01

    Background Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. Results We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically for the high-throughput analysis of root system images. GiA Roots includes user-assisted algorithms to distinguish root from background and a fully automated pipeline that extracts dozens of root system phenotypes. Quantitative information on each phenotype, along with intermediate steps for full reproducibility, is returned to the end-user for downstream analysis. GiA Roots has a GUI front end and a command-line interface for interweaving the software into large-scale workflows. GiA Roots can also be extended to estimate novel phenotypes specified by the end-user. Conclusions We demonstrate the use of GiA Roots on a set of 2393 images of rice roots representing 12 genotypes from the species Oryza sativa. We validate trait measurements against prior analyses of this image set that demonstrated that RSA traits are likely heritable and associated with genotypic differences. Moreover, we demonstrate that GiA Roots is extensible and an end-user can add functionality so that GiA Roots can estimate novel RSA traits. In summary, we show that the software can function as an efficient tool as part of a workflow to move from large numbers of root images to downstream analysis. PMID:22834569

  11. MultiSense: A Multimodal Sensor Tool Enabling the High-Throughput Analysis of Respiration.

    Science.gov (United States)

    Keil, Peter; Liebsch, Gregor; Borisjuk, Ljudmilla; Rolletschek, Hardy

    2017-01-01

    The high-throughput analysis of respiratory activity has become an important component of many biological investigations. Here, a technological platform, denoted the "MultiSense tool," is described. The tool enables the parallel monitoring of respiration in 100 samples over an extended time period, by dynamically tracking the concentrations of oxygen (O2) and/or carbon dioxide (CO2) and/or pH within an airtight vial. Its flexible design supports the quantification of respiration based on either oxygen consumption or carbon dioxide release, thereby allowing for the determination of the physiologically significant respiratory quotient (the ratio between the quantities of CO2 released and the O2 consumed). It requires an LED light source to be mounted above the sample, together with a CCD camera system, adjusted to enable the capture of analyte-specific wavelengths, and fluorescent sensor spots inserted into the sample vial. Here, a demonstration is given of the use of the MultiSense tool to quantify respiration in imbibing plant seeds, for which an appropriate step-by-step protocol is provided. The technology can be easily adapted for a wide range of applications, including the monitoring of gas exchange in any kind of liquid culture system (algae, embryo and tissue culture, cell suspensions, microbial cultures).
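
    A small sketch of the respiratory-quotient calculation the tool supports, assuming the sensor read-outs have already been converted to head-space gas amounts (µmol) over time; the numbers are invented.

      import numpy as np

      def respiration_rates(time_h, o2_umol, co2_umol):
          """Linear rates of O2 consumption and CO2 release (umol/h) from head-space time series."""
          o2_rate = -np.polyfit(time_h, o2_umol, 1)[0]     # O2 decreases, so negate the slope
          co2_rate = np.polyfit(time_h, co2_umol, 1)[0]
          return o2_rate, co2_rate

      # Hypothetical head-space gas amounts for one seed vial over 12 hours.
      t = np.arange(0, 13, 2.0)
      o2 = 9.3 - 0.35 * t + np.random.default_rng(3).normal(0, 0.05, t.size)
      co2 = 0.4 + 0.30 * t + np.random.default_rng(4).normal(0, 0.05, t.size)

      o2_rate, co2_rate = respiration_rates(t, o2, co2)
      print(f"O2 uptake: {o2_rate:.2f} umol/h, CO2 release: {co2_rate:.2f} umol/h, RQ = {co2_rate / o2_rate:.2f}")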

  12. Throughput analysis of the IEEE 802.4 token bus standard under heavy load

    Science.gov (United States)

    Pang, Joseph; Tobagi, Fouad

    1987-01-01

    It has become clear in the last few years that there is a trend towards integrated digital services. Parallel to the development of public Integrated Services Digital Network (ISDN) is service integration in the local area (e.g., a campus, a building, an aircraft). The types of services to be integrated depend very much on the specific local environment. However, applications tend to generate data traffic belonging to one of two classes. According to IEEE 802.4 terminology, the first major class of traffic is termed synchronous, such as packetized voice and data generated from other applications with real-time constraints, and the second class is called asynchronous which includes most computer data traffic such as file transfer or facsimile. The IEEE 802.4 token bus protocol which was designed to support both synchronous and asynchronous traffic is examined. The protocol is basically a timer-controlled token bus access scheme. By a suitable choice of the design parameters, it can be shown that access delay is bounded for synchronous traffic. As well, the bandwidth allocated to asynchronous traffic can be controlled. A throughput analysis of the protocol under heavy load with constant channel occupation of synchronous traffic and constant token-passing times is presented.

  13. Galaxy Workflows for Web-based Bioinformatics Analysis of Aptamer High-throughput Sequencing Data

    Directory of Open Access Journals (Sweden)

    William H Thiel

    2016-01-01

    Full Text Available Development of RNA and DNA aptamers for diagnostic and therapeutic applications is a rapidly growing field. Aptamers are identified through iterative rounds of selection in a process termed SELEX (Systematic Evolution of Ligands by EXponential enrichment). High-throughput sequencing (HTS) revolutionized the modern SELEX process by identifying millions of aptamer sequences across multiple rounds of aptamer selection. However, these vast aptamer HTS datasets necessitated bioinformatics techniques. Herein, we describe a semiautomated approach to analyze aptamer HTS datasets using the Galaxy Project, a web-based open source collection of bioinformatics tools that were originally developed to analyze genome, exome, and transcriptome HTS data. Using a series of Workflows created in the Galaxy webserver, we demonstrate efficient processing of aptamer HTS data and compilation of a database of unique aptamer sequences. Additional Workflows were created to characterize the abundance and persistence of aptamer sequences within a selection and to filter sequences based on these parameters. A key advantage of this approach is that the online nature of the Galaxy webserver and its graphical interface allow for the analysis of HTS data without the need to compile code or install multiple programs.
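
    A minimal sketch (outside Galaxy) of the abundance and persistence bookkeeping the Workflows perform, assuming one list of already-trimmed aptamer sequences per selection round; the sequences and the persistence filter are invented.

      from collections import Counter

      # Hypothetical sequence lists from three SELEX rounds.
      rounds = {
          "round3": ["ACGTACGT", "GGGTTTAA", "ACGTACGT", "CCCCGGGG"],
          "round5": ["ACGTACGT", "ACGTACGT", "GGGTTTAA", "ACGTACGT"],
          "round7": ["ACGTACGT", "ACGTACGT", "ACGTACGT", "TTTTAAAA"],
      }

      counts = {r: Counter(seqs) for r, seqs in rounds.items()}
      unique = set().union(*counts.values())

      # Abundance = per-round frequency; persistence = number of rounds in which a sequence appears.
      for seq in sorted(unique):
          freqs = [counts[r][seq] / sum(counts[r].values()) for r in rounds]
          persistence = sum(counts[r][seq] > 0 for r in rounds)
          if persistence >= 2:              # assumed filter: keep sequences seen in at least 2 rounds
              print(seq, [round(f, 2) for f in freqs], "persistence:", persistence)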

  14. A Data Analysis Pipeline Accounting for Artifacts in Tox21 Quantitative High-Throughput Screening Assays.

    Science.gov (United States)

    Hsieh, Jui-Hua; Sedykh, Alexander; Huang, Ruili; Xia, Menghang; Tice, Raymond R

    2015-08-01

    A main goal of the U.S. Tox21 program is to profile a 10K-compound library for activity against a panel of stress-related and nuclear receptor signaling pathway assays using a quantitative high-throughput screening (qHTS) approach. However, assay artifacts, including nonreproducible signals and assay interference (e.g., autofluorescence), complicate compound activity interpretation. To address these issues, we have developed a data analysis pipeline that includes an updated signal noise-filtering/curation protocol and an assay interference flagging system. To better characterize various types of signals, we adopted a weighted version of the area under the curve (wAUC) to quantify the amount of activity across the tested concentration range in combination with the assay-dependent point-of-departure (POD) concentration. Based on the 32 Tox21 qHTS assays analyzed, we demonstrate that signal profiling using wAUC affords the best reproducibility (Pearson's r = 0.91) in comparison with the POD (0.82) only or the AC50 (i.e., half-maximal activity concentration; 0.81). Among the activity artifacts characterized, cytotoxicity is the major confounding factor; on average, about 8% of Tox21 compounds are affected, whereas autofluorescence affects less than 0.5%. To facilitate data evaluation, we implemented two graphical user interface applications, allowing users to rapidly evaluate the in vitro activity of Tox21 compounds. © 2015 Society for Laboratory Automation and Screening.

  15. Digital Biomass Accumulation Using High-Throughput Plant Phenotype Data Analysis.

    Science.gov (United States)

    Rahaman, Md Matiur; Ahsan, Md Asif; Gillani, Zeeshan; Chen, Ming

    2017-09-01

    Biomass is an important phenotypic trait in functional ecology and growth analysis. The typical methods for measuring biomass are destructive, and they require numerous individuals to be cultivated for repeated measurements. With the advent of image-based high-throughput plant phenotyping facilities, non-destructive biomass measuring methods have attempted to overcome this problem. Thus, the estimation of plant biomass of individual plants from their digital images is becoming more important. In this paper, we propose an approach to biomass estimation based on image derived phenotypic traits. Several image-based biomass studies state that the estimation of plant biomass is only a linear function of the projected plant area in images. However, we modeled the plant volume as a function of plant area, plant compactness, and plant age to generalize the linear biomass model. The obtained results confirm the proposed model and can explain most of the observed variance during image-derived biomass estimation. Moreover, a small difference was observed between actual and estimated digital biomass, which indicates that our proposed approach can be used to estimate digital biomass accurately.
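
    A sketch of the modeling step described above: comparing an area-only linear model with one that also uses compactness and age, on invented phenotype data (the coefficients and ranges are placeholders, not values from the study).

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(5)
      n = 200
      area = rng.uniform(50, 500, n)            # projected plant area from images, invented units
      compactness = rng.uniform(0.2, 0.9, n)    # e.g. area / convex-hull area, invented
      age = rng.uniform(5, 40, n)               # days after sowing, invented
      volume = 0.8 * area + 120 * compactness + 2.5 * age + rng.normal(0, 20, n)  # synthetic "biomass"

      X_area = area.reshape(-1, 1)
      X_full = np.column_stack([area, compactness, age])

      r2_area = LinearRegression().fit(X_area, volume).score(X_area, volume)
      r2_full = LinearRegression().fit(X_full, volume).score(X_full, volume)
      print(f"area-only R^2 = {r2_area:.3f}, area+compactness+age R^2 = {r2_full:.3f}")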

  16. Combinatorial approach toward high-throughput analysis of direct methanol fuel cells.

    Science.gov (United States)

    Jiang, Rongzhong; Rong, Charles; Chu, Deryn

    2005-01-01

    A 40-member array of direct methanol fuel cells (with stationary fuel and convective air supplies) was generated by electrically connecting the fuel cells in series. High-throughput analysis of these fuel cells was realized by fast screening of voltages between the two terminals of a fuel cell at constant current discharge. A large number of voltage-current curves (200) were obtained by screening the voltages through multiple small-current steps. Gaussian distribution was used to statistically analyze the large number of experimental data. The standard deviation (σ) of voltages of these fuel cells increased linearly with discharge current. The voltage-current curves at various fuel concentrations were simulated with an empirical equation of voltage versus current and a linear equation of σ versus current. The simulated voltage-current curves fitted the experimental data well. With increasing methanol concentration from 0.5 to 4.0 M, the Tafel slope of the voltage-current curves (at σ = 0.0) changed from 28 to 91 mV dec-1, the cell resistance from 2.91 to 0.18 Ω, and the power output from 3 to 18 mW cm-2.
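
    A sketch of the curve fitting implied by the reported Tafel slopes and cell resistances, using an assumed semi-empirical polarization equation V = E0 − b·log10(i) − R·i; the sample data points are invented, not taken from the study.

      import numpy as np
      from scipy.optimize import curve_fit

      def polarization(i, e0, b, r):
          """Assumed fuel-cell polarization model: open-circuit-like term minus
          Tafel (activation) loss minus ohmic loss (i in mA/cm^2, b in V/decade)."""
          return e0 - b * np.log10(i) - r * i

      # Invented voltage-current data for one methanol concentration.
      i = np.array([2, 5, 10, 20, 40, 60, 80, 100], float)        # mA/cm^2
      v = np.array([0.48, 0.45, 0.42, 0.39, 0.35, 0.32, 0.29, 0.26])

      (e0, b, r), _ = curve_fit(polarization, i, v, p0=[0.5, 0.05, 0.001])
      print(f"Tafel slope = {1000*b:.0f} mV/dec, area-normalized resistance = {1000*r:.2f} Ohm cm^2")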

  17. Cyber-T web server: differential analysis of high-throughput data.

    Science.gov (United States)

    Kayala, Matthew A; Baldi, Pierre

    2012-07-01

    The Bayesian regularization method for high-throughput differential analysis, described in Baldi and Long (A Bayesian framework for the analysis of microarray expression data: regularized t-test and statistical inferences of gene changes. Bioinformatics 2001;17:509-519) and implemented in the Cyber-T web server, is one of the most widely validated. Cyber-T implements a t-test using a Bayesian framework to compute a regularized variance of the measurements associated with each probe under each condition. This regularized estimate is derived by flexibly combining the empirical measurements with a prior, or background, derived from pooling measurements associated with probes in the same neighborhood. This approach flexibly addresses problems associated with low replication levels and technology biases, not only for DNA microarrays, but also for other technologies, such as protein arrays, quantitative mass spectrometry and next-generation sequencing (RNA-seq). Here we present an update to the Cyber-T web server, incorporating several useful new additions and improvements. Several preprocessing data normalization options, including logarithmic and variance stabilizing normalization (VSN) transforms, are included. To augment two-sample t-tests, a one-way analysis of variance is implemented. Several methods for multiple tests correction, including standard frequentist methods and a probabilistic mixture model treatment, are available. Diagnostic plots allow visual assessment of the results. The web server provides comprehensive documentation and example data sets. The Cyber-T web server, with R source code and data sets, is publicly available at http://cybert.ics.uci.edu/.
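
    A compact sketch of the variance-regularization idea behind Cyber-T (not the server's own code): the empirical variance of each probe is shrunk toward a background variance pooled from probes in the same intensity neighborhood, weighted by an assumed pseudo-count of prior observations.

      import numpy as np
      from scipy import stats

      def regularized_ttest(x, y, bg_var_x, bg_var_y, v0=10):
          """Two-sample t-test with regularized variances.
          bg_var_*: background variance pooled from neighboring probes (assumed given).
          v0: pseudo-count controlling the weight of the prior (assumed value)."""
          nx, ny = len(x), len(y)
          var_x = (v0 * bg_var_x + (nx - 1) * np.var(x, ddof=1)) / (v0 + nx - 2)
          var_y = (v0 * bg_var_y + (ny - 1) * np.var(y, ddof=1)) / (v0 + ny - 2)
          se = np.sqrt(var_x / nx + var_y / ny)
          t = (np.mean(x) - np.mean(y)) / se
          df = nx + ny - 2 + 2 * v0          # effective degrees of freedom including prior counts (assumption)
          return t, 2 * stats.t.sf(abs(t), df)

      # Three replicates per condition for one probe, plus invented neighborhood background variances.
      control = [7.9, 8.1, 8.0]
      treated = [8.9, 9.2, 8.8]
      print(regularized_ttest(control, treated, bg_var_x=0.04, bg_var_y=0.05))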

  18. WormSizer: high-throughput analysis of nematode size and shape.

    Directory of Open Access Journals (Sweden)

    Brad T Moore

    Full Text Available The fundamental phenotypes of growth rate, size and morphology are the result of complex interactions between genotype and environment. We developed a high-throughput software application, WormSizer, which computes size and shape of nematodes from brightfield images. Existing methods for estimating volume either coarsely model the nematode as a cylinder or assume the worm shape or opacity is invariant. Our estimate is more robust to changes in morphology or optical density as it only assumes radial symmetry. This open source software is written as a plugin for the well-known image-processing framework Fiji/ImageJ. It may therefore be extended easily. We evaluated the technical performance of this framework, and we used it to analyze growth and shape of several canonical Caenorhabditis elegans mutants in a developmental time series. We confirm quantitatively that a Dumpy (Dpy) mutant is short and fat and that a Long (Lon) mutant is long and thin. We show that daf-2 insulin-like receptor mutants are larger than wild-type upon hatching but grow slow, and WormSizer can distinguish dauer larvae from normal larvae. We also show that a Small (Sma) mutant is actually smaller than wild-type at all stages of larval development. WormSizer works with Uncoordinated (Unc) and Roller (Rol) mutants as well, indicating that it can be used with mutants despite behavioral phenotypes. We used our complete data set to perform a power analysis, giving users a sense of how many images are needed to detect different effect sizes. Our analysis confirms and extends on existing phenotypic characterization of well-characterized mutants, demonstrating the utility and robustness of WormSizer.
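
    A tiny sketch of the radial-symmetry volume estimate described above: the worm is treated as a stack of circular cross-sections along its midline, so volume is the sum of pi * r^2 over midline segments. The half-width profile is invented, not measured data.

      import numpy as np

      def worm_volume(half_widths, segment_lengths):
          """Volume of a radially symmetric body: sum of pi * r_i^2 * ds_i along the midline."""
          r = np.asarray(half_widths, float)
          ds = np.asarray(segment_lengths, float)
          return float(np.sum(np.pi * r**2 * ds))

      # Invented half-width profile (um) sampled at 50 equally spaced midline points of a larva.
      s = np.linspace(0, 1, 50)
      half_width = 25.0 * np.sin(np.pi * s) ** 0.5        # tapered at head and tail
      length_um = 1000.0
      print(f"estimated volume: {worm_volume(half_width, np.full(50, length_um / 50)):.0f} um^3")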

  19. Ontology-based meta-analysis of global collections of high-throughput public data.

    Directory of Open Access Journals (Sweden)

    Ilya Kupershmidt

    2010-09-01

    Full Text Available The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.

  20. ZebraZoom: an automated program for high-throughput behavioral analysis and categorization

    Science.gov (United States)

    Mirat, Olivier; Sternberg, Jenna R.; Severi, Kristen E.; Wyart, Claire

    2013-01-01

    The zebrafish larva stands out as an emergent model organism for translational studies involving gene or drug screening thanks to its size, genetics, and permeability. At the larval stage, locomotion occurs in short episodes punctuated by periods of rest. Although phenotyping behavior is a key component of large-scale screens, it has not yet been automated in this model system. We developed ZebraZoom, a program to automatically track larvae and identify maneuvers for many animals performing discrete movements. Our program detects each episodic movement and extracts large-scale statistics on motor patterns to produce a quantification of the locomotor repertoire. We used ZebraZoom to identify motor defects induced by a glycinergic receptor antagonist. The analysis of the blind mutant atoh7 revealed small locomotor defects associated with the mutation. Using multiclass supervised machine learning, ZebraZoom categorized all episodes of movement for each larva into one of three possible maneuvers: slow forward swim, routine turn, and escape. ZebraZoom reached 91% accuracy for categorization of stereotypical maneuvers that four independent experimenters unanimously identified. For all maneuvers in the data set, ZebraZoom agreed with four experimenters in 73.2–82.5% of cases. We modeled the series of maneuvers performed by larvae as Markov chains and observed that larvae often repeated the same maneuvers within a group. When analyzing subsequent maneuvers performed by different larvae, we found that larva–larva interactions occurred as series of escapes. Overall, ZebraZoom reached the level of precision found in manual analysis but accomplished tasks in a high-throughput format necessary for large screens. PMID:23781175
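
    A sketch of the Markov-chain step described above, estimating a transition matrix over the three maneuver classes from per-larva sequences; the sequences are invented, and the classification step that produces them is omitted.

      import numpy as np

      MANEUVERS = ["slow_forward", "routine_turn", "escape"]
      IDX = {m: i for i, m in enumerate(MANEUVERS)}

      def transition_matrix(sequences):
          """Row-normalized counts of maneuver-to-maneuver transitions pooled over larvae."""
          counts = np.zeros((len(MANEUVERS), len(MANEUVERS)))
          for seq in sequences:
              for a, b in zip(seq[:-1], seq[1:]):
                  counts[IDX[a], IDX[b]] += 1
          return counts / counts.sum(axis=1, keepdims=True)

      # Invented maneuver sequences for three larvae.
      larvae = [
          ["slow_forward", "slow_forward", "routine_turn", "slow_forward", "escape", "escape"],
          ["routine_turn", "routine_turn", "slow_forward", "slow_forward", "slow_forward"],
          ["escape", "escape", "slow_forward", "routine_turn", "routine_turn"],
      ]
      print(np.round(transition_matrix(larvae), 2))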

  1. Ontology-based meta-analysis of global collections of high-throughput public data.

    Science.gov (United States)

    Kupershmidt, Ilya; Su, Qiaojuan Jane; Grewal, Anoop; Sundaresh, Suman; Halperin, Inbal; Flynn, James; Shekar, Mamatha; Wang, Helen; Park, Jenny; Cui, Wenwu; Wall, Gregory D; Wisotzkey, Robert; Alag, Satnam; Akhtari, Saeid; Ronaghi, Mostafa

    2010-09-29

    The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.
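
    As a rough illustration of the rank-based enrichment and meta-analysis strategy described above (not the platform's actual statistics), the sketch below scores a gene set in each of several ranked datasets with a Mann-Whitney-style z statistic and combines the per-dataset scores with Stouffer's method. All gene names and expression values are synthetic.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
genes = [f"g{i}" for i in range(2000)]
gene_set = set(genes[:40])                         # hypothetical query gene set

def enrichment_z(scores, gene_set, genes):
    """Mann-Whitney-style z: positive when the set ranks near the top."""
    ranks = np.argsort(np.argsort(-scores)) + 1    # rank 1 = most up-regulated
    in_set = np.array([g in gene_set for g in genes])
    n1, n2 = in_set.sum(), (~in_set).sum()
    u = ranks[in_set].sum() - n1 * (n1 + 1) / 2    # Mann-Whitney U for the set
    mu = n1 * n2 / 2
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return (mu - u) / sigma

z_scores = []
for _ in range(8):                                 # eight synthetic "datasets"
    scores = rng.normal(size=len(genes))
    scores[:40] += 0.8                             # plant a consistent signal
    z_scores.append(enrichment_z(scores, gene_set, genes))

z_meta = sum(z_scores) / sqrt(len(z_scores))       # Stouffer's combined z
p_meta = 0.5 * erfc(z_meta / sqrt(2))              # one-sided p-value
print(f"Combined z = {z_meta:.2f}, one-sided p = {p_meta:.2e}")
```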

  2. Seismic analysis for shroud facility in-pile tube and saturated temperature capsules

    International Nuclear Information System (INIS)

    Iimura, Koichi; Yamaura, Takayuki; Ogawa, Mitsuhiro

    2009-07-01

    At the Oarai Research and Development Center, Japan Atomic Energy Agency (JAEA), the plan for repairing and refurbishing the Japan Materials Testing Reactor (JMTR) has progressed with the aim of restarting JMTR operation in fiscal year 2011. As part of the effective use of JMTR, neutron irradiation tests of LWR fuels and materials have been planned in order to study their soundness. Using the Oarai Shroud Facility (OSF-1) and the Fuel Irradiation Facility with the He-3 gas control system for power ramping tests using Boiling Water Capsules (BOCA Irradiation Facility), irradiation tests with power ramping will be carried out to study the soundness of fuel under LWR transient conditions. OSF-1 is a shroud-type irradiation facility that can insert and eject capsules during reactor operation, and is composed of an 'In-pile Tube', a 'Cooling system' and a 'Capsule exchange system'. The BOCA Irradiation Facility simulates the irradiation environment of an LWR, and is composed of a 'Boiling water Capsule', a 'Capsule control system' and a 'Power control system by He-3'. Using Saturated Temperature Capsules and the water environment control system, material irradiation tests under LWR water chemistry conditions will be carried out to clarify the mechanism of IASCC. These facilities are currently in service at JMTR. However, a detailed design for their renewal or remodeling was carried out based on new design conditions so as to correspond to the irradiation test plan after the restart of JMTR operation. In the seismic analysis of this detailed design, the equipment classification and operating state of each component were arranged in accordance with the 'Japanese technical standards of the structure on nuclear facility for test research' and the 'Technical guidelines for seismic design of nuclear power plants' currently in force, and then stress calculation and evaluation were carried out with the FEM piping analysis code 'SAP' and the structural analysis code 'ABAQUS'. Regarding the stress due to the seismic force, it was proven

  3. elegantRingAnalysis An Interface for High-Throughput Analysis of Storage Ring Lattices Using elegant

    CERN Document Server

    Borland, Michael

    2005-01-01

    The code elegant is widely used for the simulation of linacs that serve as drivers for free-electron lasers. Less well known is that elegant is also a very capable code for the simulation of storage rings. In this paper, we show a newly-developed graphical user interface that allows the user to easily take advantage of these capabilities. The interface is designed for use on a Linux cluster, providing very high throughput. It can also be used on a single computer. Among the features it gives access to are basic calculations (Twiss parameters, radiation integrals), phase-space tracking, nonlinear dispersion, dynamic aperture (on- and off-momentum), frequency map analysis, and collective effects (IBS, bunch lengthening). Using a cluster, it is easy to get highly detailed dynamic aperture and frequency map results in a surprisingly short time.

  4. Gold-coated polydimethylsiloxane microwells for high-throughput electrochemiluminescence analysis of intracellular glucose at single cells.

    Science.gov (United States)

    Xia, Juan; Zhou, Junyu; Zhang, Ronggui; Jiang, Dechen; Jiang, Depeng

    2018-06-04

    In this communication, a gold-coated polydimethylsiloxane (PDMS) chip with cell-sized microwells was prepared through a stamping and spraying process and applied directly to high-throughput electrochemiluminescence (ECL) analysis of intracellular glucose at single cells. Compared with the previous multi-step fabrication of photoresist-based microwells on the electrode, the preparation process is simple and offers a fresh electrode surface for higher luminescence intensity. Higher luminescence intensity was recorded from cell-retained microwells than from the planar regions between the microwells, and this intensity was correlated with the content of intracellular glucose. The successful monitoring of intracellular glucose at single cells using this PDMS chip provides an alternative strategy for high-throughput single-cell analysis.

  5. Fine grained compositional analysis of Port Everglades Inlet microbiome using high throughput DNA sequencing.

    Science.gov (United States)

    O'Connell, Lauren; Gao, Song; McCorquodale, Donald; Fleisher, Jay; Lopez, Jose V

    2018-01-01

    Similar to natural rivers, manmade inlets connect inland runoff to the ocean. Port Everglades Inlet (PEI) is a busy cargo and cruise ship port in South Florida, which can act as a source of pollution to surrounding beaches and offshore coral reefs. Understanding the composition and fluctuations of bacterioplankton communities ("microbiomes") in major port inlets is important due to potential impacts on surrounding environments. We hypothesized seasonal microbial fluctuations, which we profiled by high-throughput 16S rRNA amplicon sequencing and analysis. Surface water samples were collected every week for one year. A total of four samples per month, two from each sampling location, were used for statistical analysis, creating a higher sampling frequency and finer sampling scale than previous inlet microbiome studies. We observed significant differences in community alpha diversity between months and seasons. Analysis of composition of microbiomes (ANCOM) tests were run in QIIME 2 at genus-level taxonomic classification to determine which genera were differentially abundant between seasons and months. Beta diversity results yielded significant differences in PEI community composition with regard to month, season, water temperature, and salinity. Analysis of potentially pathogenic genera showed the presence of Staphylococcus and Streptococcus. However, statistical analysis indicated that these organisms were not present in significantly high abundances throughout the year or between seasons. Significant differences in alpha diversity were observed when comparing microbial communities with respect to time. This observation stems from the high community evenness and low community richness in August. This indicates that only a few organisms dominated the community during this month. August had lower than average rainfall levels for a wet season, which may have contributed to less runoff, and fewer bacterial groups introduced into the port surface waters. Bacterioplankton beta

  6. Fine grained compositional analysis of Port Everglades Inlet microbiome using high throughput DNA sequencing

    Directory of Open Access Journals (Sweden)

    Lauren O’Connell

    2018-05-01

    Full Text Available Background Similar to natural rivers, manmade inlets connect inland runoff to the ocean. Port Everglades Inlet (PEI) is a busy cargo and cruise ship port in South Florida, which can act as a source of pollution to surrounding beaches and offshore coral reefs. Understanding the composition and fluctuations of bacterioplankton communities (“microbiomes”) in major port inlets is important due to potential impacts on surrounding environments. We hypothesized seasonal microbial fluctuations, which we profiled by high-throughput 16S rRNA amplicon sequencing and analysis. Methods & Results Surface water samples were collected every week for one year. A total of four samples per month, two from each sampling location, were used for statistical analysis, creating a higher sampling frequency and finer sampling scale than previous inlet microbiome studies. We observed significant differences in community alpha diversity between months and seasons. Analysis of composition of microbiomes (ANCOM) tests were run in QIIME 2 at genus-level taxonomic classification to determine which genera were differentially abundant between seasons and months. Beta diversity results yielded significant differences in PEI community composition with regard to month, season, water temperature, and salinity. Analysis of potentially pathogenic genera showed the presence of Staphylococcus and Streptococcus. However, statistical analysis indicated that these organisms were not present in significantly high abundances throughout the year or between seasons. Discussion Significant differences in alpha diversity were observed when comparing microbial communities with respect to time. This observation stems from the high community evenness and low community richness in August. This indicates that only a few organisms dominated the community during this month. August had lower than average rainfall levels for a wet season, which may have contributed to less runoff, and fewer bacterial groups

  7. High-Throughput Fabrication of Nanocone Substrates through Polymer Injection Moulding For SERS Analysis in Microfluidic Systems

    DEFF Research Database (Denmark)

    Viehrig, Marlitt; Matteucci, Marco; Thilsted, Anil H.

    analysis. Metal-capped silicon nanopillars, fabricated through a maskless ion etch, are state-of-the-art for on-chip SERS substrates. A dense cluster of high aspect ratio polymer nanocones was achieved by using high-throughput polymer injection moulding over a large area, replicating a silicon nanopillar...... structure. Gold-capped polymer nanocones display similar SERS sensitivity to silicon nanopillars, while being easily integrable into microfluidic chips....

  8. Throughput performance analysis of multirate, multiclass S-ALOHA OFFH-CDMA packet networks

    DEFF Research Database (Denmark)

    Raddo, Thiago R.; Sanches, Anderson L.; Borges, Ben Hur V

    2015-01-01

    In this paper, we propose a new throughput expression for multirate, multiclass slotted-ALOHA optical fast frequency hopping code-division multiple-access (OFFH-CDMA) packet networks, considering a Poisson distribution for packet composite arrivals. We analyze the packet throughput performance...... of a three-class OFFH-CDMA network, where multirate transmissions are achieved via manipulation of the users' code parameters. It is shown that users transmitting at low rates considerably degrade the performance of high-rate users. Finally, we perform a validation procedure to demonstrate......
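
    For orientation, the classical single-class slotted-ALOHA throughput under Poisson arrivals is S = G·e^(-G), which peaks at 1/e packets per slot when the offered load G equals one packet per slot. The sketch below evaluates only this textbook baseline; the multirate, multiclass OFFH-CDMA expression proposed in the paper additionally accounts for code parameters and multiple-access interference.

```python
import numpy as np

# Classic single-class slotted ALOHA: a slot is successful only when exactly one
# packet is offered in it, so S = G * exp(-G) for Poisson offered load G.
G = np.linspace(0.0, 5.0, 501)          # offered load (packets per slot)
S = G * np.exp(-G)                      # expected successful packets per slot

best = int(np.argmax(S))
print(f"Peak throughput S = {S[best]:.3f} packets/slot at offered load G = {G[best]:.2f}")
# Expected output: S_max = 1/e ~= 0.368 at G = 1.00
```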

  9. High-throughput analysis of endogenous fruit glycosyl hydrolases using a novel chromogenic hydrogel substrate assay

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Lausen, Thomas Frederik

    2017-01-01

    A broad range of enzyme activities can be found in a wide range of different fruits and fruiting bodies but there is a lack of methods where many samples can be handled in a high-throughput and efficient manner. In particular, plant polysaccharide degrading enzymes – glycosyl hydrolases (GHs) play...... led to a more profound understanding of the importance of GH activity and regulation, current methods for determining glycosyl hydrolase activity are lacking in throughput and fail to keep up with data output from transcriptome research. Here we present the use of a versatile, easy...

  10. Genetic high throughput screening in Retinitis Pigmentosa based on high resolution melting (HRM) analysis.

    Science.gov (United States)

    Anasagasti, Ander; Barandika, Olatz; Irigoyen, Cristina; Benitez, Bruno A; Cooper, Breanna; Cruchaga, Carlos; López de Munain, Adolfo; Ruiz-Ederra, Javier

    2013-11-01

    Retinitis Pigmentosa (RP) involves a group of genetically determined retinal diseases caused by a large number of mutations that result in rod photoreceptor cell death followed by gradual death of cone cells. Most cases of RP are monogenic, with more than 80 associated genes identified so far. The high number of genes and variants involved in RP, among other factors, is making the molecular characterization of RP a real challenge for many patients. Although HRM has been used for the analysis of isolated variants or single RP genes, to the best of our knowledge this is the first study that uses HRM analysis for a high-throughput screening of several RP genes. Our main goal was to test the suitability of HRM analysis as a genetic screening technique in RP, and to compare its performance with two of the most widely used NGS platforms, Illumina and PGM-Ion Torrent technologies. RP patients (n = 96) were clinically diagnosed at the Ophthalmology Department of Donostia University Hospital, Spain. We analyzed a total of 16 RP genes that meet the following inclusion criteria: 1) size: genes with transcripts of less than 4 kb; 2) number of exons: genes with up to 22 exons; and 3) prevalence: genes reported to account for at least 0.4% of total RP cases worldwide. For comparison purposes, the RHO gene was also sequenced with Illumina (GAII; Illumina), Ion semiconductor technology (PGM; Life Technologies) and Sanger sequencing (ABI 3130xl platform; Applied Biosystems). Detected variants were confirmed in all cases by Sanger sequencing and tested for co-segregation in the families of affected probands. We identified a total of 65 genetic variants, 15 of which (23%) were novel, in 49 out of 96 patients. Among them, 14 (4 novel) are probable disease-causing genetic variants in 7 RP genes, affecting 15 patients. Our HRM analysis-based study proved to be a cost-effective and rapid method that provides an accurate identification of genetic RP variants. This approach is effective for

  11. Micro-scaled high-throughput digestion of plant tissue samples for multi-elemental analysis

    Directory of Open Access Journals (Sweden)

    Husted Søren

    2009-09-01

    Full Text Available Abstract Background Quantitative multi-elemental analysis by inductively coupled plasma (ICP) spectrometry depends on a complete digestion of solid samples. However, fast and thorough sample digestion is a challenging analytical task which constitutes a bottleneck in modern multi-elemental analysis. Additional obstacles may be that sample quantities are limited and elemental concentrations low. In such cases, digestion in small volumes with minimum dilution and contamination is required in order to obtain high accuracy data. Results We have developed a micro-scaled microwave digestion procedure and optimized it for accurate elemental profiling of plant materials (1-20 mg dry weight). A commercially available 64-position rotor with 5 ml disposable glass vials, originally designed for microwave-based parallel organic synthesis, was used as a platform for the digestion. The novel micro-scaled method was successfully validated by the use of various certified reference materials (CRMs) with matrices rich in starch, lipid or protein. When the micro-scaled digestion procedure was applied to single rice grains or small batches of Arabidopsis seeds (1 mg, corresponding to approximately 50 seeds), the obtained elemental profiles closely matched those obtained by conventional analysis using digestion in large volume vessels. Accumulated elemental contents derived from separate analyses of rice grain fractions (aleurone, embryo and endosperm) closely matched the total content obtained by analysis of the whole rice grain. Conclusion A high-throughput micro-scaled method has been developed which enables digestion of small quantities of plant samples for subsequent elemental profiling by ICP-spectrometry. The method constitutes a valuable tool for screening of mutants and transformants. In addition, the method facilitates studies of the distribution of essential trace elements between and within plant organs which is relevant for, e.g., breeding programmes aiming at

  12. Optimizing MRI Logistics: Prospective Analysis of Performance, Efficiency, and Patient Throughput.

    Science.gov (United States)

    Beker, Kevin; Garces-Descovich, Alejandro; Mangosing, Jason; Cabral-Goncalves, Ines; Hallett, Donna; Mortele, Koenraad J

    2017-10-01

    The objective of this study is to optimize MRI logistics through evaluation of MRI workflow and analysis of performance, efficiency, and patient throughput in a tertiary care academic center. For 2 weeks, workflow data from two outpatient MRI scanners were prospectively collected and stratified by value added to the process (i.e., value-added time, business value-added time, or non-value-added time). Two separate time cycles were measured: the actual MRI process cycle as well as the complete length of patient stay in the department. In addition, the impact and frequency of delays across all observations were measured. A total of 305 MRI examinations were evaluated, including body (34.1%), neurologic (28.9%), musculoskeletal (21.0%), and breast examinations (16.1%). The MRI process cycle lasted a mean of 50.97 ± 24.4 (SD) minutes per examination; the mean non-value-added time was 13.21 ± 18.77 minutes (25.87% of the total process cycle time). The mean length-of-stay cycle was 83.51 ± 33.63 minutes; the mean non-value-added time was 24.33 ± 24.84 minutes (29.14% of the total patient stay). The delay with the highest frequency (5.57%) was IV or port placement, which had a mean delay of 22.82 minutes. The delay with the greatest impact on time was MRI arthrography for which joint injection of contrast medium was necessary but was not accounted for in the schedule (mean delay, 42.2 minutes; frequency, 1.64%). Of 305 patients, 34 (11.15%) did not arrive at or before their scheduled time. Non-value-added time represents approximately one-third of the total MRI process cycle and patient length of stay. Identifying specific delays may expedite the application of targeted improvement strategies, potentially increasing revenue, efficiency, and overall patient satisfaction.

  13. Development of high-throughput analysis system using highly-functional organic polymer monoliths

    International Nuclear Information System (INIS)

    Umemura, Tomonari; Kojima, Norihisa; Ueki, Yuji

    2008-01-01

    The growing demand for high-throughput analysis in the current competitive life sciences and industries has promoted the development of high-speed HPLC techniques and tools. As one such tool, monolithic columns have attracted increasing attention and interest in the last decade due to the low flow-resistance and excellent mass transfer, allowing for rapid separations and reactions at high flow rates with minimal loss of column efficiency. Monolithic materials are classified into two main groups: silica- and organic polymer-based monoliths, each with their own advantages and disadvantages. Organic polymer monoliths have several distinct advantages in life-science research, including wide pH stability, less irreversible adsorption, facile preparation and modification. Thus, we have so far tried to develop organic polymer monoliths for various chemical operations, such as separation, extraction, preconcentration, and reaction. In the present paper, recent progress in the development of organic polymer monoliths is discussed. In particular, the procedure for the preparation of methacrylate-based monoliths with various functional groups is described, where the influence of different compositional and processing parameters on the monolithic structure is also addressed. Furthermore, the performance of the produced monoliths is demonstrated through the results for (1) rapid separations of alkylbenzenes at high flow rates, (2) flow-through enzymatic digestion of cytochrome c on a trypsin-immobilized monolithic column, and (3) separation of the tryptic digest on a reversed-phase monolithic column. The flexibility and versatility of organic polymer monoliths will be beneficial for further enhancing analytical performance, and will open the way for new applications and opportunities both in scientific and industrial research. (author)

  14. Identification of microRNAs from Eugenia uniflora by high-throughput sequencing and bioinformatics analysis.

    Science.gov (United States)

    Guzman, Frank; Almerão, Mauricio P; Körbes, Ana P; Loss-Morais, Guilherme; Margis, Rogerio

    2012-01-01

    microRNAs or miRNAs are small non-coding regulatory RNAs that play important functions in the regulation of gene expression at the post-transcriptional level by targeting mRNAs for degradation or inhibiting protein translation. Eugenia uniflora is a plant native to tropical America with pharmacological and ecological importance, and there have been no previous studies concerning its gene expression and regulation. To date, no miRNAs have been reported in Myrtaceae species. Small RNA and RNA-seq libraries were constructed to identify miRNAs and pre-miRNAs in Eugenia uniflora. Solexa technology was used to perform high throughput sequencing of the library, and the data obtained were analyzed using bioinformatics tools. From 14,489,131 small RNA clean reads, we obtained 1,852,722 mature miRNA sequences representing 45 conserved families that have been identified in other plant species. Further analysis using contigs assembled from RNA-seq allowed the prediction of secondary structures of 25 known and 17 novel pre-miRNAs. The expression of twenty-seven identified miRNAs was also validated using RT-PCR assays. Potential targets were predicted for the most abundant mature miRNAs in the identified pre-miRNAs based on sequence homology. This study is the first large scale identification of miRNAs and their potential targets from a species of the Myrtaceae family without genomic sequence resources. Our study provides more information about the evolutionary conservation of the regulatory network of miRNAs in plants and highlights species-specific miRNAs.

  15. High-throughput simultaneous analysis of RNA, protein, and lipid biomarkers in heterogeneous tissue samples.

    Science.gov (United States)

    Reiser, Vladimír; Smith, Ryan C; Xue, Jiyan; Kurtz, Marc M; Liu, Rong; Legrand, Cheryl; He, Xuanmin; Yu, Xiang; Wong, Peggy; Hinchcliffe, John S; Tanen, Michael R; Lazar, Gloria; Zieba, Renata; Ichetovkin, Marina; Chen, Zhu; O'Neill, Edward A; Tanaka, Wesley K; Marton, Matthew J; Liao, Jason; Morris, Mark; Hailman, Eric; Tokiwa, George Y; Plump, Andrew S

    2011-11-01

    With expanding biomarker discovery efforts and increasing costs of drug development, it is critical to maximize the value of mass-limited clinical samples. The main limitation of available methods is the inability to isolate and analyze, from a single sample, molecules requiring incompatible extraction methods. Thus, we developed a novel semiautomated method for tissue processing and tissue milling and division (TMAD). We used a SilverHawk atherectomy catheter to collect atherosclerotic plaques from patients requiring peripheral atherectomy. Tissue preservation by flash freezing was compared with immersion in RNAlater®, and tissue grinding by traditional mortar and pestle was compared with TMAD. Comparators were protein, RNA, and lipid yield and quality. Reproducibility of analyte yield from aliquots of the same tissue sample processed by TMAD was also measured. The quantity and quality of biomarkers extracted from tissue prepared by TMAD was at least as good as that extracted from tissue stored and prepared by traditional means. TMAD enabled parallel analysis of gene expression (quantitative reverse-transcription PCR, microarray), protein composition (ELISA), and lipid content (biochemical assay) from as little as 20 mg of tissue. The mean correlation was r = 0.97 in molecular composition (RNA, protein, or lipid) between aliquots of individual samples generated by TMAD. We also demonstrated that it is feasible to use TMAD in a large-scale clinical study setting. The TMAD methodology described here enables semiautomated, high-throughput sampling of small amounts of heterogeneous tissue specimens by multiple analytical techniques with generally improved quality of recovered biomolecules.

  16. Live imaging of muscles in Drosophila metamorphosis: Towards high-throughput gene identification and function analysis.

    Science.gov (United States)

    Puah, Wee Choo; Wasser, Martin

    2016-03-01

    Time-lapse microscopy in developmental biology is an emerging tool for functional genomics. Phenotypic effects of gene perturbations can be studied non-invasively at multiple time points in chronological order. During metamorphosis of Drosophila melanogaster, time-lapse microscopy using fluorescent reporters allows visualization of alternative fates of larval muscles, which are a model for the study of genes related to muscle wasting. While doomed muscles enter hormone-induced programmed cell death, a smaller population of persistent muscles survives to adulthood and undergoes morphological remodeling that involves atrophy in early, and hypertrophy in late pupation. We developed a method that combines in vivo imaging, targeted gene perturbation and image analysis to identify and characterize genes involved in muscle development. Macrozoom microscopy helps to screen for interesting muscle phenotypes, while confocal microscopy in multiple locations over 4-5 days produces time-lapse images that are used to quantify changes in cell morphology. Performing a similar investigation using fixed pupal tissues would be too time-consuming and therefore impractical. We describe three applications of our pipeline. First, we show how quantitative microscopy can track and measure morphological changes of muscle throughout metamorphosis and analyze genes involved in atrophy. Second, our assay can help to identify genes that either promote or prevent histolysis of abdominal muscles. Third, we apply our approach to test new fluorescent proteins as live markers for muscle development. We describe mKO2 tagged Cysteine proteinase 1 (Cp1) and Troponin-I (TnI) as examples of proteins showing developmental changes in subcellular localization. Finally, we discuss strategies to improve throughput of our pipeline to permit genome-wide screens in the future.

  17. A novel quantitative approach for eliminating sample-to-sample variation using a hue saturation value analysis program.

    Science.gov (United States)

    Yabusaki, Katsumi; Faits, Tyler; McMullen, Eri; Figueiredo, Jose Luiz; Aikawa, Masanori; Aikawa, Elena

    2014-01-01

    As computing technology and image analysis techniques have advanced, the practice of histology has grown from a purely qualitative method to one that is highly quantified. Current image analysis software is imprecise and prone to wide variation due to common artifacts and histological limitations. In order to minimize the impact of these artifacts, a more robust method for quantitative image analysis is required. Here we present a novel image analysis software, based on the hue saturation value color space, to be applied to a wide variety of histological stains and tissue types. By using hue, saturation, and value variables instead of the more common red, green, and blue variables, our software offers some distinct advantages over other commercially available programs. We tested the program by analyzing several common histological stains, performed on tissue sections that ranged from 4 µm to 10 µm in thickness, using both a red green blue color space and a hue saturation value color space. We demonstrated that our new software is a simple method for quantitative analysis of histological sections, which is highly robust to variations in section thickness, sectioning artifacts, and stain quality, eliminating sample-to-sample variation.
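
    The practical advantage of the hue-saturation-value representation is that a stain can be selected by a hue band plus a minimum saturation, which is largely insensitive to overall brightness and hence to section thickness. The sketch below quantifies the stained-area fraction of a synthetic image in this way; the hue band, thresholds, and colors are illustrative assumptions, not the published program's settings.

```python
import numpy as np

def rgb_to_hsv(img):
    """Vectorised RGB -> HSV for an array with channels in [0, 1] (colorsys convention)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    maxc, minc = img.max(axis=-1), img.min(axis=-1)
    delta = maxc - minc
    v = maxc
    s = np.where(maxc > 0, delta / np.where(maxc > 0, maxc, 1.0), 0.0)
    safe = np.where(delta > 0, delta, 1.0)
    rc, gc, bc = (maxc - r) / safe, (maxc - g) / safe, (maxc - b) / safe
    h = np.where(maxc == r, bc - gc, np.where(maxc == g, 2.0 + rc - bc, 4.0 + gc - rc))
    h = np.where(delta > 0, (h / 6.0) % 1.0, 0.0)
    return h, s, v

# Synthetic "stained section": reddish-purple pixels scattered on a pale background.
rng = np.random.default_rng(0)
img = np.full((100, 100, 3), 0.9)
stained = rng.random((100, 100)) < 0.3
img[stained] = [0.6, 0.1, 0.5]

h, s, v = rgb_to_hsv(img)
# Pixels whose hue falls in an (assumed) magenta band with sufficient saturation.
positive = (np.abs(h - 0.85) < 0.1) & (s > 0.3)
print(f"Stained area fraction: {positive.mean():.2%}")
```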

  18. Transient performances analysis of wind turbine system with induction generator including flux saturation and skin effect

    DEFF Research Database (Denmark)

    Li, H.; Zhao, B.; Han, L.

    2010-01-01

    In order to correctly analyze the effect of different induction generator models on the transient performance of large wind power generation, wind turbine driven squirrel cage induction generator (SCIG) models taking into account both main and leakage flux saturation and skin effect were...

  19. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  20. Acquisition and analysis of throughput rates for an operational department-wide PACS

    Science.gov (United States)

    Stewart, Brent K.; Taira, Ricky K.; Dwyer, Samuel J., III; Huang, H. K.

    1992-07-01

    The accurate prediction of image throughput is a critical issue in planning for and acquisition of any successful Picture Archiving and Communication System (PACS). Bottlenecks or design flaws can render an expensive PACS implementation useless. This manuscript presents a method for accurately predicting and measuring image throughput of a PACS design. To create the simulation model of the planned or implemented PACS, it must first be decomposed into principal tasks. We have decomposed the entire PACS image management chain into eight subsystems. These subsystems include network transfers over three different networks (Ethernet, FDDI and UltraNet) and five software programs and/or queues: (1) transfer of image data from the imaging modality computer to the image acquisition/reformatting computer; (2) reformatting the image data into a standard image format; (3) transferring the image data from the acquisition/reformatting computer to the image archive computer; (4) updating a relational database management system over the network; (5) image processing-- rotation and optimal gray-scale lookup table calculation; (6) request that the image be archived; (7) image transfer from the image archive computer to a designated image display workstation; and (8) update the local database on the image display station, separate the image header from the image data and store the image data on a parallel disk array. Through development of an event logging facility and implementation of a network management package we have acquired throughput data for each subsystem in the PACS chain. In addition, from our PACS relational database management system, we have distilled the traffic generation patterns (temporal, file size and destination) of our imaging modality devices. This data has been input into a simulation modeling package (Block Oriented Network Simulator-- BONeS) to estimate the characteristics of the modeled PACS, e.g., the throughput rates and delay time. This simulation
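
    The sort of end-to-end delay estimate such a simulation produces can be sketched with a simple tandem-queue model: each study passes through the eight subsystems in order and waits whenever a stage is busy. The stage service times, arrival rate, and distributions below are illustrative assumptions, not measurements from the described PACS or parameters of the BONeS model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed mean service times (minutes) for eight sequential PACS subsystems.
stage_means = [1.5, 0.5, 2.0, 0.1, 0.8, 0.2, 2.5, 1.0]
n_studies = 2000
mean_interarrival = 6.0                   # minutes between image acquisitions

arrivals = np.cumsum(rng.exponential(mean_interarrival, n_studies))
stage_free = np.zeros(len(stage_means))   # time at which each stage next becomes idle
delays = np.zeros(n_studies)

for i, t_arrival in enumerate(arrivals):
    t = t_arrival
    for k, mean in enumerate(stage_means):
        start = max(t, stage_free[k])     # queue if the stage is still busy
        t = start + rng.exponential(mean)
        stage_free[k] = t
    delays[i] = t - t_arrival             # acquisition-to-display delay

print(f"Mean delay            : {delays.mean():5.1f} min")
print(f"95th percentile delay : {np.percentile(delays, 95):5.1f} min")
```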

  1. web cellHTS2: A web-application for the analysis of high-throughput screening data

    Directory of Open Access Journals (Sweden)

    Boutros Michael

    2010-04-01

    Full Text Available Abstract Background The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. Results The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. Conclusions The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
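
    A core step in this kind of pipeline is per-plate normalization and scoring before hits are ranked. The sketch below applies a robust (median/MAD) z-score to one synthetic 384-well plate and flags candidate hits; it is a generic illustration of the approach, not a reimplementation of the cellHTS2 package or of its report generation.

```python
import numpy as np

rng = np.random.default_rng(1)
plate = rng.normal(loc=1000.0, scale=120.0, size=(16, 24))   # raw 384-well readout
plate[4, 7] = 250.0                                          # a strong synthetic "hit"

def robust_z(plate):
    """Median/MAD z-scores: less sensitive to outliers than mean/sd scaling."""
    med = np.median(plate)
    mad = np.median(np.abs(plate - med)) * 1.4826   # consistent with sd for normal data
    return (plate - med) / mad

z = robust_z(plate)
hits = np.argwhere(z < -3)                          # wells with a strong signal drop
print("Candidate hit wells (row, col):", [(int(r), int(c)) for r, c in hits])
```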

  2. An Analysis of Saturated Film Boiling Heat Transfer from a Vertical Slab with Horizontal Bottom Surface

    OpenAIRE

    茂地, 徹; 山田, たかし

    1997-01-01

    The film boiling heat transfer from a vertical slab with horizontal bottom surface to saturated liquids was analyzed theoretically. Bromley's solution for the vertical surface was modified to accommodate the continuity of the vapor mass flow rate around the lower corner of the vertical slab. The thickness of the vapor film covering the vertical surface of the slab was increased owing to the inflow of vapor generated under the horizontal bottom surface and resulted in a decrease in the heat tr...

  3. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may be extended also by the calculation of saturation activities from k0 and Q0 factors, and f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)
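
    The saturation activity used here is the activity a nuclide would reach after infinitely long irradiation; a measured activity is converted to it by correcting for the finite irradiation time and for decay before counting. The sketch below applies that standard correction; the half-life, times, and measured activity are illustrative values, not data from the paper.

```python
import numpy as np

def saturation_activity(a_measured, half_life_s, t_irr_s, t_decay_s):
    """A_sat = A_measured / ((1 - exp(-lambda*t_irr)) * exp(-lambda*t_decay))."""
    lam = np.log(2.0) / half_life_s
    buildup = 1.0 - np.exp(-lam * t_irr_s)   # activity build-up during irradiation
    decay = np.exp(-lam * t_decay_s)         # decay between irradiation and counting
    return a_measured / (buildup * decay)

# Example: a short-lived activation product, 5 min irradiation, 2 min decay (assumed).
a_sat = saturation_activity(a_measured=1.2e4,   # Bq at the start of counting
                            half_life_s=150.0,
                            t_irr_s=300.0,
                            t_decay_s=120.0)
print(f"Saturation activity ~ {a_sat:.3e} Bq")
# The mass fraction then follows by comparing this (per unit sample mass) with the
# tabulated specific saturation activity for the same irradiation position.
```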

  4. Fairness analysis of throughput and delay in WLAN environments with channel diversities

    Directory of Open Access Journals (Sweden)

    Fang Shih-Hau

    2011-01-01

    Full Text Available Abstract The article investigates fairness in terms of throughput and packet delays among users with diverse channel conditions due to mobility and fading effects in IEEE 802.11 WLAN (wireless local area network) environments. From our analytical results, it is shown that 802.11 CSMA/CA can provide fairness among hosts with identical link qualities regardless of whether equal or different data rates are applied. Our analytical results further demonstrate that the presence of diverse channel conditions can cause significant unfairness in both throughput and packet delays even with a link adaptation mechanism, since the MCSs (modulation and coding schemes) available are limited. The simulation results validate the accuracy of our analytical model.
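
    For context on the analytical treatment of 802.11 CSMA/CA referred to here (and on the saturation-throughput theme of this collection), the sketch below solves the classic Bianchi-style fixed point for DCF saturation throughput with identical stations. It is a textbook baseline rather than the authors' model of heterogeneous channels and link adaptation, and all protocol timing parameters are illustrative assumptions.

```python
def saturation_throughput(n, W=32, m=5, payload_bits=8184,
                          slot=9e-6, sifs=16e-6, difs=34e-6,
                          rate=54e6, phy_overhead=20e-6, ack_time=44e-6):
    """Bianchi-style saturation throughput (bit/s) for n identical stations."""
    # Fixed point: tau = f(p), p = 1 - (1 - tau)^(n-1), solved by damped iteration.
    tau = 0.1
    for _ in range(1000):
        p = 1.0 - (1.0 - tau) ** (n - 1)
        tau_new = (2.0 * (1.0 - 2.0 * p)) / (
            (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m))
        tau = 0.5 * tau + 0.5 * tau_new               # damping for stability
    p_tr = 1.0 - (1.0 - tau) ** n                     # P(at least one transmission)
    p_s = n * tau * (1.0 - tau) ** (n - 1) / p_tr     # P(transmission succeeds)
    t_payload = payload_bits / rate
    t_s = phy_overhead + t_payload + sifs + ack_time + difs   # successful slot
    t_c = phy_overhead + t_payload + difs                     # collision slot
    denom = (1.0 - p_tr) * slot + p_tr * p_s * t_s + p_tr * (1.0 - p_s) * t_c
    return p_tr * p_s * payload_bits / denom

for n in (5, 10, 20, 50):
    print(f"n={n:3d}  saturation throughput ~ {saturation_throughput(n)/1e6:.2f} Mbit/s")
```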

  5. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Directory of Open Access Journals (Sweden)

    Craig A Gedye

    Full Text Available Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are however currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage-specific or stem cell

  6. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Science.gov (United States)

    Gedye, Craig A; Hussain, Ali; Paterson, Joshua; Smrke, Alannah; Saini, Harleen; Sirskyj, Danylo; Pereira, Keira; Lobo, Nazleen; Stewart, Jocelyn; Go, Christopher; Ho, Jenny; Medrano, Mauricio; Hyatt, Elzbieta; Yuan, Julie; Lauriault, Stevan; Meyer, Mona; Kondratyev, Maria; van den Beucken, Twan; Jewett, Michael; Dirks, Peter; Guidos, Cynthia J; Danska, Jayne; Wang, Jean; Wouters, Bradly; Neel, Benjamin; Rottapel, Robert; Ailles, Laurie E

    2014-01-01

    Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are however currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell markers.

  7. A sensitivity analysis on seismic tomography data with respect to CO2 saturation of a CO2 geological sequestration field

    Science.gov (United States)

    Park, Chanho; Nguyen, Phung K. T.; Nam, Myung Jin; Kim, Jongwook

    2013-04-01

    Monitoring CO2 migration and storage in geological formations is important not only for the stability of geological CO2 sequestration but also for efficient management of CO2 injection. In particular, geophysical methods allow in situ observation of CO2 to assess potential leakage, to improve reservoir description, and to monitor the development of geologic discontinuities (e.g., faults, cracks, and joints). Geophysical monitoring can be based on wireline logging or surface surveys, corresponding to well-scale monitoring (high resolution, narrow area of investigation) or basin-scale monitoring (low resolution, wide area of investigation). Crosswell tomography, in turn, provides reservoir-scale monitoring that bridges the resolution gap between well logs and surface measurements. This study focuses on reservoir-scale monitoring based on crosswell seismic tomography, aiming to describe details of the reservoir structure and to monitor migration of the reservoir fluids (water and CO2). For this purpose, we first carry out a sensitivity analysis of crosswell seismic tomography data with respect to CO2 saturation. Rock Physics Models (RPMs) are constructed by calculating the density and the P- and S-wave velocities of a virtual CO2 injection reservoir. Because the seismic velocity of the reservoir changes appreciably with CO2 saturation only below about 20% saturation, and is insensitive to further changes above 20%, the sensitivity analysis concentrates on CO2 saturations below 20%. For precise simulation of seismic tomography responses for the constructed RPMs, we developed a time-domain 2D elastic modeling code based on the finite difference method with a staggered grid, employing a convolutional perfectly matched layer boundary condition. We further compare the sensitivities of seismic tomography and surface measurements for the RPMs to analyze resolution
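
    The reported insensitivity of seismic velocity above roughly 20% CO2 saturation can be reproduced with a standard rock-physics calculation: a uniform (Wood/Reuss) mixture of brine and CO2 collapses the pore-fluid bulk modulus at small gas saturations, and Gassmann substitution maps this onto the saturated rock. The moduli, densities, and porosity below are generic sandstone/brine/CO2 assumptions, not the paper's RPM parameters.

```python
import numpy as np

# Illustrative properties (Pa, kg/m^3) for a clean sandstone with brine and CO2.
k_mineral = 36e9           # quartz bulk modulus
k_dry, g_dry = 8e9, 9e9    # dry-rock bulk and shear moduli
phi = 0.25                 # porosity
rho_mineral, rho_brine, rho_co2 = 2650.0, 1030.0, 700.0
k_brine, k_co2 = 2.4e9, 0.08e9

def vp(s_co2):
    """P-wave velocity of the saturated rock for a given CO2 saturation."""
    # Wood's equation: Reuss average of the pore-fluid bulk moduli.
    k_fluid = 1.0 / (s_co2 / k_co2 + (1.0 - s_co2) / k_brine)
    # Gassmann equation for the saturated bulk modulus.
    k_sat = k_dry + (1.0 - k_dry / k_mineral) ** 2 / (
        phi / k_fluid + (1.0 - phi) / k_mineral - k_dry / k_mineral ** 2)
    rho = (1.0 - phi) * rho_mineral + phi * (s_co2 * rho_co2 + (1.0 - s_co2) * rho_brine)
    return np.sqrt((k_sat + 4.0 * g_dry / 3.0) / rho)

for s in (0.0, 0.05, 0.1, 0.2, 0.4, 0.8):
    print(f"S_CO2 = {s:4.2f}  ->  Vp = {vp(s):7.1f} m/s")
# Vp drops steeply up to ~10-20% CO2 saturation and is nearly flat beyond that.
```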

  8. Thermo-economic analysis of recuperated Maisotsenko bottoming cycle using triplex air saturator: Comparative analyses

    International Nuclear Information System (INIS)

    Saghafifar, Mohammad; Omar, Amr; Erfanmoghaddam, Sepehr; Gadalla, Mohamed

    2017-01-01

    Highlights: • Proposing recuperated Maisotsenko bottoming cycle (RMBC) as a new combined cycle. • Introducing triplex air saturator for waste heat recovery application. • Conducting thermodynamic optimization to maximize RMBC thermal efficiency. • Conducting thermo-economic optimization to minimize RMBC cost of electricity. - Abstract: A recently recommended approach to combined cycle power plants is to employ another gas turbine cycle for waste heat recovery as an air bottoming cycle (ABC). Several studies have been conducted to improve the ABC's thermodynamic performance using common power augmentation methods such as steam/water injection. In particular, it has been proposed to employ the Maisotsenko gas turbine cycle as a bottoming cycle, i.e. the Maisotsenko bottoming cycle (MBC). Due to the promising performance of the MBC configuration, a recuperated MBC (RMBC) configuration is investigated here by introducing the triplex air saturator. In this arrangement, the air saturator consists of three sections: the first section is an indirect evaporative cooler, while the other two sections are responsible for heat recovery from the topping- and bottoming-cycle turbine exhausts. In this paper, thermodynamic and thermo-economic analyses are carried out to study the main merits and demerits of the RMBC against the MBC configuration. Thermodynamic optimization results indicate that the maximum achievable efficiencies for MBC and RMBC incorporation in a simple gas turbine power plant are 39.40% and 44.73%, respectively. Finally, thermo-economic optimization shows that the optimum levelized costs of electricity for the MBC and RMBC power plants are 62.922 US$/MWh and 58.154 US$/MWh, respectively.

  9. Numerical modeling for saturated-zone groundwater travel time analysis at Yucca Mountain

    International Nuclear Information System (INIS)

    Arnold, B.W.; Barr, G.E.

    1996-01-01

    A three-dimensional, site-scale numerical model of groundwater flow in the saturated zone at Yucca Mountain was constructed and linked to particle tracking simulations to produce an estimate of the distribution of groundwater travel times from the potential repository to the boundary of the accessible environment. This effort and associated modeling of groundwater travel times in the unsaturated zone were undertaken to aid in the evaluation of compliance of the site with 10CFR960. These regulations stipulate that pre-waste-emplacement groundwater travel time to the accessible environment shall exceed 1,000 years along any path of likely and significant radionuclide travel
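
    In its simplest form, the travel-time distribution mentioned above is obtained by tracking many particles advectively through the flow field and recording their arrival times at the compliance boundary. The sketch below uses a one-dimensional, purely advective caricature in which path-averaged Darcy fluxes are drawn from a lognormal distribution; every parameter value is an illustrative assumption unrelated to the Yucca Mountain model.

```python
import numpy as np

rng = np.random.default_rng(7)

n_particles = 10_000
path_length = 5000.0                     # metres to the accessible environment (assumed)
porosity = 0.15                          # effective (flow) porosity (assumed)

# Lognormally distributed path-averaged Darcy flux along each particle path [m/yr].
darcy_flux = rng.lognormal(mean=np.log(0.5), sigma=0.6, size=n_particles)
seepage_velocity = darcy_flux / porosity            # linear pore velocity [m/yr]
travel_time = path_length / seepage_velocity        # purely advective travel time [yr]

print(f"Median travel time : {np.median(travel_time):8.0f} yr")
print(f"5th percentile     : {np.percentile(travel_time, 5):8.0f} yr")
print(f"Fraction < 1000 yr : {(travel_time < 1000).mean():.1%}")
```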

  10. DAVID Knowledgebase: a gene-centered database integrating heterogeneous gene annotation resources to facilitate high-throughput gene functional analysis

    Directory of Open Access Journals (Sweden)

    Baseler Michael W

    2007-11-01

    Full Text Available Abstract Background Due to the complex and distributed nature of biological research, our current biological knowledge is spread over many redundant annotation databases maintained by many independent groups. Analysts usually need to visit many of these bioinformatics databases in order to integrate comprehensive annotation information for their genes, which becomes one of the bottlenecks, particularly for the analytic task associated with a large gene list. Thus, a highly centralized and ready-to-use gene-annotation knowledgebase is in demand for high throughput gene functional analysis. Description The DAVID Knowledgebase is built around the DAVID Gene Concept, a single-linkage method to agglomerate tens of millions of gene/protein identifiers from a variety of public genomic resources into DAVID gene clusters. The grouping of such identifiers improves the cross-reference capability, particularly across NCBI and UniProt systems, enabling more than 40 publicly available functional annotation sources to be comprehensively integrated and centralized by the DAVID gene clusters. The simple, pair-wise, text format files which make up the DAVID Knowledgebase are freely downloadable for various data analysis uses. In addition, a well organized web interface allows users to query different types of heterogeneous annotations in a high-throughput manner. Conclusion The DAVID Knowledgebase is designed to facilitate high throughput gene functional analysis. For a given gene list, it not only provides the quick accessibility to a wide range of heterogeneous annotation data in a centralized location, but also enriches the level of biological information for an individual gene. Moreover, the entire DAVID Knowledgebase is freely downloadable or searchable at http://david.abcc.ncifcrf.gov/knowledgebase/.

  11. A high-throughput screening system for barley/powdery mildew interactions based on automated analysis of light micrographs.

    Science.gov (United States)

    Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo

    2008-01-23

    To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.

  12. Investigating Functional Extension of Optical Coherence Tomography for Spectroscopic Analysis of Blood Oxygen Saturation

    Science.gov (United States)

    Chen, Siyu

    Over the past two decades, optical coherence tomography (OCT) has been successfully applied to various fields of biomedical research and clinical studies, including cardiology, urology, dermatology, dentistry, oncology, and most successfully, ophthalmology. This dissertation seeks to extend current OCT practice, which is still largely morphology-based, into a new dimension: functional analysis of metabolic activities in vivo. More specifically, the investigation is focused on retrieving blood oxygen saturation (sO2) using intrinsic hemoglobin optical absorption contrast. Most mammalian cells rely on aerobic respiration to support cellular function, which means they consume oxygen to create adenosine triphosphate (ATP). The metabolic rate of oxygen (MRO2), a key hemodynamic parameter, characterizes how much oxygen is consumed during a given period of time, reflecting the metabolic activity of the target tissue. For example, retinal neurons are highly active and rely almost entirely on the moment-to-moment oxygen supply from the retinal circulation. Thus, variation in MRO2 reveals the instantaneous activity of these neurons, shedding light on physiological and pathophysiological changes of cellular function. Ultimately, measuring MRO2 can potentially provide a biomarker for early-stage disease diagnosis and serve as a benchmark for evaluating the effectiveness of medical intervention during disease management. Essential for calculating MRO2, blood sO2 measurement using spectroscopic OCT analysis has been attempted as early as 2003. OCT is intrinsically sensitive to the blood optical absorption spectrum due to its wide-band illumination and detection scheme relying on back-scattered photons. However, accurate retrieval of blood sO2 using conventional near-infrared (NIR) OCT systems in vivo has remained challenging. It was not until the development of OCT systems using visible light illumination (vis-OCT) that accurate measurement of blood sO2 was reported in live
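
    The basic spectroscopic step behind OCT oximetry is a least-squares decomposition of the wavelength-dependent blood attenuation into oxy- and deoxy-hemoglobin contributions, with sO2 given by the oxygenated fraction. The sketch below demonstrates that fit on synthetic spectra; the extinction curves are crude stand-ins rather than tabulated hemoglobin values, and a real analysis would also include an OCT-specific scattering/attenuation model.

```python
import numpy as np

rng = np.random.default_rng(3)
wavelengths = np.linspace(520, 600, 41)                      # nm (visible band)

# Synthetic extinction coefficients (arbitrary units) for oxy- and deoxy-hemoglobin.
eps_hbo2 = (1.0 + 0.8 * np.exp(-((wavelengths - 542) / 10) ** 2)
                + 0.8 * np.exp(-((wavelengths - 577) / 10) ** 2))
eps_hb = 1.2 + 0.9 * np.exp(-((wavelengths - 556) / 14) ** 2)

def simulate_attenuation(so2, total_hb=1.0, noise=0.02):
    """Noisy absorption spectrum of blood with the given oxygen saturation."""
    mu = total_hb * (so2 * eps_hbo2 + (1.0 - so2) * eps_hb)
    return mu + rng.normal(0.0, noise, size=mu.shape)

def fit_so2(measured):
    """Least-squares fit measured ~ a*eps_hbo2 + b*eps_hb, then sO2 = a/(a+b)."""
    A = np.column_stack([eps_hbo2, eps_hb])
    (a, b), *_ = np.linalg.lstsq(A, measured, rcond=None)
    return a / (a + b)

true_so2 = 0.95
print(f"True sO2 = {true_so2:.2f}, fitted sO2 = {fit_so2(simulate_attenuation(true_so2)):.2f}")
```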

  13. Analysis of nitrogen saturation potential in Rocky Mountain tundra and forest: implications for aquatic systems

    Science.gov (United States)

    Baron, Jill S.; Ojima, Dennis S.; Holland, Elisabeth A.; Parton, William J.

    1994-01-01

    We employed grass and forest versions of the CENTURY model under a range of N deposition values (0.02–1.60 g N m⁻² y⁻¹) to explore the possibility that high observed lake and stream N was due to terrestrial N saturation of alpine tundra and subalpine forest in Loch Vale Watershed, Rocky Mountain National Park, Colorado. Model results suggest that N is limiting to subalpine forest productivity, but that excess leachate from alpine tundra is sufficient to account for the current observed stream N. Tundra leachate, combined with N leached from exposed rock surfaces, produces high N loads in aquatic ecosystems above treeline in the Colorado Front Range. A combination of terrestrial leaching, large N inputs from snowmelt, high watershed gradients, rapid hydrologic flushing and lake turnover times, and possibly other nutrient limitations of aquatic organisms constrains high elevation lakes and streams from assimilating even small increases in atmospheric N. CENTURY model simulations further suggest that, while increased N deposition will worsen the situation, nitrogen saturation is an ongoing phenomenon.

  14. Analysis of a microscale 'Saturation Phase-change Internal Carnot Engine'

    Energy Technology Data Exchange (ETDEWEB)

    Lurie, Eli [School of Mechanical Engineering, Tel Aviv University, Tel Aviv 69978 (Israel); Kribus, Abraham, E-mail: kribus@eng.tau.ac.i [School of Mechanical Engineering, Tel Aviv University, Tel Aviv 69978 (Israel)

    2010-06-15

    A micro heat engine, based on a cavity filled with a stationary working fluid under liquid-vapor saturation conditions and encapsulated by two membranes, is described and analyzed. This engine design is easy to produce using MEMS technologies and is operated with external heating and cooling. The motion of the membranes is controlled such that the internal pressure and temperature are constant during the heat addition and removal processes, and thus the fluid executes a true internal Carnot cycle. A model of this Saturation Phase-change Internal Carnot Engine (SPICE) was developed including thermodynamic, mechanical and heat transfer aspects. The efficiency and maximum power of the engine are derived. The maximum power point is fixed in a three-parameter space, and operation at this point leads to maximum power density that scales with the inverse square of the engine dimension. Inclusion of the finite heat capacity of the engine wall leads to a strong dependence of performance on engine frequency, and the existence of an optimal frequency. Effects of transient reverse heat flow, and 'parasitic heat' that does not participate in the thermodynamic cycle are observed.

  15. Analysis of a microscale 'Saturation Phase-change Internal Carnot Engine'

    International Nuclear Information System (INIS)

    Lurie, Eli; Kribus, Abraham

    2010-01-01

    A micro heat engine, based on a cavity filled with a stationary working fluid under liquid-vapor saturation conditions and encapsulated by two membranes, is described and analyzed. This engine design is easy to produce using MEMS technologies and is operated with external heating and cooling. The motion of the membranes is controlled such that the internal pressure and temperature are constant during the heat addition and removal processes, and thus the fluid executes a true internal Carnot cycle. A model of this Saturation Phase-change Internal Carnot Engine (SPICE) was developed including thermodynamic, mechanical and heat transfer aspects. The efficiency and maximum power of the engine are derived. The maximum power point is fixed in a three-parameter space, and operation at this point leads to maximum power density that scales with the inverse square of the engine dimension. Inclusion of the finite heat capacity of the engine wall leads to a strong dependence of performance on engine frequency, and the existence of an optimal frequency. Effects of transient reverse heat flow, and 'parasitic heat' that does not participate in the thermodynamic cycle are observed.

  16. Quantitative digital image analysis of chromogenic assays for high throughput screening of alpha-amylase mutant libraries.

    Science.gov (United States)

    Shankar, Manoharan; Priyadharshini, Ramachandran; Gunasekaran, Paramasamy

    2009-08-01

    An image analysis-based method for high-throughput screening of an alpha-amylase mutant library using chromogenic assays was developed. Assays were performed in microplates, and high-resolution images of the assay plates were read using the Virtual Microplate Reader (VMR) script to quantify the concentration of the chromogen. This method is fast and sensitive in quantifying 0.025-0.3 mg starch/ml as well as 0.05-0.75 mg glucose/ml. It was also an effective screening method for improved alpha-amylase activity, with a coefficient of variation of 18%.
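
    As a rough illustration of the quantification step described above (and not the authors' Virtual Microplate Reader script), the sketch below averages pixel intensity over each well of a plate image and converts it to concentration with a linear standard curve; the well geometry and calibration constants are hypothetical.

```python
# Illustrative sketch only (not the authors' VMR script): quantify a chromogenic
# assay by averaging pixel intensity in each well region and converting to
# concentration with a linear standard curve. Well geometry and calibration
# constants below are hypothetical.
import numpy as np

def well_mean_intensity(image: np.ndarray, row: int, col: int, well_px: int = 50) -> float:
    """Mean grey value of the square patch covering well (row, col)."""
    patch = image[row * well_px:(row + 1) * well_px,
                  col * well_px:(col + 1) * well_px]
    return float(patch.mean())

def intensity_to_concentration(intensity: float, slope: float, intercept: float) -> float:
    """Linear standard curve fitted beforehand from wells of known concentration."""
    return slope * intensity + intercept

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    plate_image = rng.integers(0, 255, size=(8 * 50, 12 * 50)).astype(float)  # fake 96-well image
    i = well_mean_intensity(plate_image, row=0, col=0)
    print(intensity_to_concentration(i, slope=-0.002, intercept=0.4))  # mg glucose/ml (hypothetical)
```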

  17. Detection of Static Eccentricity Fault in Saturated Induction Motors by Air-Gap Magnetic Flux Signature Analysis Using Finite Element Method

    Directory of Open Access Journals (Sweden)

    N. Halem

    2013-06-01

    Full Text Available Unfortunately, motor current signature analysis (MCSA) cannot detect small degrees of purely static eccentricity (SE) defects, while air-gap magnetic flux signature analysis (FSA) is applied successfully. The simulation results are obtained using the time-stepping finite element (TSFE) method. In order to show the impact of magnetic saturation upon the diagnosis of the SE fault, the analysis is carried out for saturated induction motors. The index signatures of the static eccentricity fault around the fundamental and the PSHs are detected successfully for the saturated motor.

  18. DETECTION OF STATIC ECCENTRICITY FAULT IN SATURATED INDUCTION MOTORS BY AIR-GAP MAGNETIC FLUX SIGNATURE ANALYSIS USING FINITE ELEMENT METHOD

    Directory of Open Access Journals (Sweden)

    N. Halem

    2013-06-01

    Full Text Available Unfortunately, motor current signature analysis (MCSA) cannot detect small degrees of purely static eccentricity (SE) defects, while air-gap magnetic flux signature analysis (FSA) is applied successfully. The simulation results are obtained using the time-stepping finite element (TSFE) method. In order to show the impact of magnetic saturation upon the diagnosis of the SE fault, the analysis is carried out for saturated induction motors. The index signatures of the static eccentricity fault around the fundamental and the PSHs are detected successfully for the saturated motor.

  19. DETECTION OF STATIC ECCENTRICITY FAULT IN SATURATED INDUCTION MOTORS BY AIR-GAP MAGNETIC FLUX SIGNATURE ANALYSIS USING FINITE ELEMENT METHOD

    Directory of Open Access Journals (Sweden)

    N. Halem

    2015-07-01

    Full Text Available Unfortunately, motor current signature analysis (MCSA) cannot detect small degrees of purely static eccentricity (SE) defects, while air-gap magnetic flux signature analysis (FSA) is applied successfully. The simulation results are obtained using the time-stepping finite element (TSFE) method. In order to show the impact of magnetic saturation upon the diagnosis of the SE fault, the analysis is carried out for saturated induction motors. The index signatures of the static eccentricity fault around the fundamental and the PSHs are detected successfully for the saturated motor.

  20. High-throughput metabolic state analysis: The missing link in integrated functional genomics of yeasts

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Moxley, Joel. F; Åkesson, Mats Fredrik

    2005-01-01

    …that achieve comparable throughput, effort and cost compared with DNA arrays. Our sample workup method enables simultaneous metabolite measurements throughout central carbon metabolism and amino acid biosynthesis, using a standard GC-MS platform that was optimized for this purpose. As an implementation proof-of-concept, we assayed metabolite levels in two yeast strains and two different environmental conditions in the context of metabolic pathway reconstruction. We demonstrate that these differential metabolite level data distinguish among sample types, such as typical metabolic fingerprinting or footprinting…

  1. Worst-case Throughput Analysis for Parametric Rate and Parametric Actor Execution Time Scenario-Aware Dataflow Graphs

    Directory of Open Access Journals (Sweden)

    Mladen Skelin

    2014-03-01

    Full Text Available Scenario-aware dataflow (SADF) is a prominent tool for modeling and analysis of dynamic embedded dataflow applications. In SADF the application is represented as a finite collection of synchronous dataflow (SDF) graphs, each of which represents one possible application behaviour or scenario. A finite state machine (FSM) specifies the possible orders of scenario occurrences. The SADF model renders the tightest possible performance guarantees, but is limited by its finiteness. This means that from a practical point of view, it can only handle dynamic dataflow applications that are characterized by a reasonably sized set of possible behaviours or scenarios. In this paper we remove this limitation for a class of SADF graphs by means of SADF model parametrization in terms of graph port rates and actor execution times. First, we formally define the semantics of the model relevant for throughput analysis based on (max,+) linear system theory and (max,+) automata. Second, by generalizing some of the existing results, we give the algorithms for worst-case throughput analysis of parametric rate and parametric actor execution time acyclic SADF graphs with a fully connected, possibly infinite state transition system. Third, we demonstrate our approach on a few realistic applications from the digital signal processing (DSP) domain mapped onto an embedded multi-processor architecture.
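
    The parametric SADF analysis in the paper is based on (max,+) algebra; the sketch below only illustrates the non-parametric core idea for a single scenario: the throughput of an SDF graph is the reciprocal of the maximum cycle mean of its (max,+) state matrix, estimated here by power iteration. The matrix entries are hypothetical execution-time sums, and this is not the paper's algorithm.

```python
# Minimal (max,+) sketch, not the paper's parametric SADF algorithm: for a single
# scenario with (max,+) state matrix A, throughput ~ 1 / lambda, where lambda is
# the maximum cycle mean (the (max,+) eigenvalue), estimated by iterating
# x(k+1) = A (x) x(k) in (max,+) arithmetic. The matrix below is hypothetical.
import numpy as np

NEG_INF = -np.inf

def maxplus_matvec(A: np.ndarray, x: np.ndarray) -> np.ndarray:
    """(max,+) matrix-vector product: y_i = max_j (A_ij + x_j)."""
    return np.max(A + x[np.newaxis, :], axis=1)

def estimate_cycle_mean(A: np.ndarray, iterations: int = 200) -> float:
    """Estimate the (max,+) eigenvalue (maximum cycle mean) by power iteration."""
    x = np.zeros(A.shape[0])
    for _ in range(iterations):
        x = maxplus_matvec(A, x)
    return float(np.max(x) / iterations)  # growth rate per iteration

if __name__ == "__main__":
    A = np.array([[2.0, NEG_INF, 3.0],
                  [4.0, 1.0, NEG_INF],
                  [NEG_INF, 2.0, 2.0]])  # hypothetical completion-time matrix
    lam = estimate_cycle_mean(A)
    print(f"max cycle mean ~ {lam:.3f}, throughput ~ {1.0 / lam:.3f} iterations per time unit")
```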

  2. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    Science.gov (United States)

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
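
    A minimal sketch of the segmentation idea described above (superpixels classified by a Random Forest), not the authors' pipeline: it uses SLIC superpixels, mean-colour features, and crude greenness-based pseudo-labels purely so the example runs end to end; a real system would train on hand-annotated superpixels.

```python
# Minimal sketch (not the authors' pipeline): SLIC superpixels described by their
# mean colour and classified with a Random Forest. Pseudo-labels from a crude
# greenness split are used only so the example runs end to end.
import numpy as np
from skimage import data
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

image = data.astronaut().astype(float) / 255.0          # stand-in RGB image
segments = slic(image, n_segments=300, compactness=10)  # superpixel label map

ids = np.unique(segments)
features = np.array([image[segments == s].mean(axis=0) for s in ids])  # mean RGB per superpixel

# Crude pseudo-labels: the greener half of the superpixels plays the role of "plant".
greenness = features[:, 1] - features[:, [0, 2]].mean(axis=1)
labels = (greenness > np.median(greenness)).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(features, labels)
mask = np.isin(segments, ids[clf.predict(features) == 1])  # back-project to a pixel mask
print(f"segmented 'plant' area: {mask.mean():.1%} of pixels")
```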

  3. Estimating saturated hydraulic conductivity and air permeability from soil physical properties using state-space analysis

    DEFF Research Database (Denmark)

    Poulsen, Tjalfe; Møldrup, Per; Nielsen, Don

    2003-01-01

    Estimates of soil hydraulic conductivity (K) and air permeability (k(a)) at given soil-water potentials are often used as reference points in constitutive models for K and k(a) as functions of moisture content and are, therefore, a prerequisite for predicting migration of water, air, and dissolved and gaseous chemicals in the vadose zone. In this study, three modeling approaches were used to identify the dependence of saturated hydraulic conductivity (K-S) and air permeability at -100 cm H2O soil-water potential (k(a100)) on soil physical properties in undisturbed soil: (i) multiple regression, (ii) ARIMA (autoregressive integrated moving average) modeling, and (iii) state-space modeling. In addition to actual soil property values, ARIMA and state-space models account for effects of spatial correlation in soil properties. Measured data along two 70-m-long transects at a 20-year-old constructed…

  4. Quantitative Analysis of L-Edge White Line Intensities: The Influence of Saturation and Transverse Coherence

    International Nuclear Information System (INIS)

    Hahlin, A.

    2001-01-01

    We have performed x-ray absorption spectroscopy at the Fe, Ni, and Co L2,3 edges of in situ grown thin magnetic films. We compare electron yield measurements performed at SSRL and BESSY-I. Differences in the L2,3 white line intensities are found for all three elements, comparing data from the two facilities. We propose a correlation between spectral intensities and the degree of spatial coherence of the exciting radiation. The electron yield saturation effects are stronger for light with a higher degree of spatial coherence. Therefore the observed, coherence related, intensity variations are due to an increase in the absorption coefficient, and not to secondary channel related effects

  5. Quantitative analysis of L-edge white line intensities: the influence of saturation and transverse coherence.

    Science.gov (United States)

    Hahlin, A; Karis, O; Brena, B; Dunn, J H; Arvantis, D

    2001-03-01

    We have performed x-ray absorption spectroscopy at the Fe, Ni, and Co L2,3 edges of in situ grown thin magnetic films. We compare electron yield measurements performed at SSRL and BESSY-I. Differences in the L2,3 white line intensities are found for all three elements, comparing data from the two facilities. We propose a correlation between spectral intensities and the degree of spatial coherence of the exciting radiation. The electron yield saturation effects are stronger for light with a higher degree of spatial coherence. Therefore the observed, coherence related, intensity variations are due to an increase in the absorption coefficient, and not to secondary channel related effects.

  6. Calibration of a neutron log in partially saturated media. Part II. Error analysis

    International Nuclear Information System (INIS)

    Hearst, J.R.; Kasameyer, P.W.; Dreiling, L.A.

    1981-01-01

    Four sources of error (uncertainty) are studied in the water content obtained from neutron logs calibrated in partially saturated media for holes up to 3 m. For this calibration a special facility was built, and an algorithm for a commercial epithermal neutron log was developed that obtains water content from count rate, bulk density, and the gap between the neutron sonde and the borehole wall. The algorithm contained errors due to the calibration and lack of fit, while the field measurements included uncertainties in the count rate (caused by statistics and a short time constant), gap, and density. There can also be inhomogeneity in the material surrounding the borehole. Under normal field conditions the hole-size-corrected water content obtained from such neutron logs can have an uncertainty as large as 15% of its value
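
    As a generic illustration of how such independent error sources can be combined (not the paper's calibration analysis), the sketch below propagates count-rate, gap, density and calibration uncertainties in quadrature; the sensitivities and magnitudes are hypothetical placeholders.

```python
# Illustrative sketch only: combining independent error sources on a derived
# water-content value in quadrature. The sensitivities and input uncertainties
# below are hypothetical placeholders, not values from the calibration study.
import math

def combined_uncertainty(contributions):
    """Root-sum-square of individual 1-sigma contributions."""
    return math.sqrt(sum(c ** 2 for c in contributions))

if __name__ == "__main__":
    # Each entry: (source, sensitivity d(water)/d(input), input 1-sigma uncertainty)
    sources = [
        ("count rate", 0.02, 3.0),   # water-content units per % count-rate change
        ("gap",        0.5,  0.1),   # per cm of sonde-to-wall gap
        ("density",    0.8,  0.05),  # per g/cm3
        ("calibration / lack of fit", 1.0, 0.01),
    ]
    contributions = [sens * sigma for _, sens, sigma in sources]
    print(f"combined 1-sigma water-content uncertainty: {combined_uncertainty(contributions):.3f}")
```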

  7. Low-frequency asymptotic analysis of seismic reflection from a fluid-saturated medium

    Energy Technology Data Exchange (ETDEWEB)

    Silin, D.B.; Korneev, V.A.; Goloshubin, G.M.; Patzek, T.W.

    2004-04-14

    Reflection of a seismic wave from a plane interface between two elastic media does not depend on the frequency. If one of the media is poroelastic and fluid-saturated, then the reflection becomes frequency-dependent. This paper presents a low-frequency asymptotic formula for the reflection of a seismic plane p-wave from a fluid-saturated porous medium. The obtained asymptotic scaling of the frequency-dependent component of the reflection coefficient shows that it is asymptotically proportional to the square root of the product of the reservoir fluid mobility and the frequency of the signal. The dependence of this scaling on the dynamic Darcy's law relaxation time is investigated as well. Derivation of the main equations of the theory of poroelasticity from the dynamic filtration theory reveals that this relaxation time is proportional to Biot's tortuosity parameter.
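
    In symbols, the scaling stated in the abstract can be written schematically as below, where R_0 is the frequency-independent (elastic) part, C a constant and κ/η the reservoir fluid mobility; the exact coefficients are derived in the paper, not here.

```latex
% Schematic restatement of the low-frequency scaling described above.
R(\omega) \;\approx\; R_{0} \;+\; C\,\sqrt{\frac{\kappa}{\eta}\,\omega},
\qquad \frac{\kappa}{\eta} = \text{reservoir fluid mobility (permeability over viscosity)} .
```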

  8. Saturation of THz detection in InGaAs-based HEMTs: a numerical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mahi, A. [Centre Universitaire Nour Bachir, B.P. 900, 32000 El Bayadh (Algeria); Palermo, C., E-mail: christophe.palermo@umontpellier.fr [University of Montpellier, IES, UMR 5214, 34000 Montpellier (France); CNRS, IES, UMR 5214, 34000 Montpellier (France); Marinchio, H. [University of Montpellier, IES, UMR 5214, 34000 Montpellier (France); CNRS, IES, UMR 5214, 34000 Montpellier (France); Belgachi, A. [University of Bechar, Bechar 08000 (Algeria); Varani, L. [University of Montpellier, IES, UMR 5214, 34000 Montpellier (France); CNRS, IES, UMR 5214, 34000 Montpellier (France)

    2016-11-01

    By numerical simulations, we investigate the large-signal photoresponse of InGaAs high electron mobility transistors subjected to THz radiation. The pseudo-2D hydrodynamic model used considers electron density and velocity conservation equations. A third equation is solved in order to describe average energy conservation or to keep the energy constantly equal to its thermal equilibrium value. In both cases, the calculated photoresponse increases with the incoming power density for the smallest power densities. For higher values, a saturation of the photoresponse is observed, in agreement with experimental results, only when energy conservation is accounted for. This allows the limitation of the transistor detection features to be related to the electron heating phenomenon.

  9. Stability and bifurcation analysis of an SIR epidemic model with logistic growth and saturated treatment

    International Nuclear Information System (INIS)

    Li, Jinhui; Teng, Zhidong; Wang, Guangqing; Zhang, Long; Hu, Cheng

    2017-01-01

    In this paper, we introduce saturated treatment and a logistic growth rate into an SIR epidemic model with bilinear incidence. The treatment function is assumed to be a continuously differentiable function which describes the effect of delayed treatment when medical resources are limited and the number of infected individuals is large enough. Sufficient conditions for the existence and local stability of the disease-free and positive equilibria are established, and the existence of stable limit cycles is also obtained. Moreover, by using bifurcation theory, it is shown that the model exhibits backward bifurcation, Hopf bifurcation and Bogdanov–Takens bifurcations. Finally, numerical examples are given to illustrate the theoretical results and reveal some additional interesting phenomena, including double stable periodic solutions and stable limit cycles.
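
    A minimal numerical sketch of such a model is given below; the logistic growth, bilinear incidence and saturated (Holling type II style) treatment term, as well as all parameter values, are illustrative assumptions rather than the paper's exact formulation.

```python
# Minimal numerical sketch (functional forms and parameters are illustrative
# assumptions, not necessarily those of the paper): an SIR model with logistic
# growth of susceptibles, bilinear incidence beta*S*I, and a saturated treatment
# term c*I/(1 + alpha*I) that levels off when I is large.
from scipy.integrate import solve_ivp

r, K = 0.5, 100.0          # logistic growth rate and carrying capacity
beta, mu, gamma = 0.01, 0.05, 0.1
c, alpha = 0.8, 0.2        # treatment capacity and saturation parameter

def sir_saturated_treatment(t, y):
    S, I, R_ = y
    treatment = c * I / (1.0 + alpha * I)
    dS = r * S * (1.0 - S / K) - beta * S * I
    dI = beta * S * I - (mu + gamma) * I - treatment
    dR = gamma * I + treatment - mu * R_
    return [dS, dI, dR]

sol = solve_ivp(sir_saturated_treatment, (0.0, 200.0), [90.0, 5.0, 0.0], max_step=0.5)
S_end, I_end, R_end = sol.y[:, -1]
print(f"t = 200: S = {S_end:.1f}, I = {I_end:.1f}, R = {R_end:.1f}")
```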

  10. High throughput proteomic analysis of the secretome in an explant model of articular cartilage inflammation

    Science.gov (United States)

    Clutterbuck, Abigail L.; Smith, Julia R.; Allaway, David; Harris, Pat; Liddell, Susan; Mobasheri, Ali

    2011-01-01

    This study employed a targeted high-throughput proteomic approach to identify the major proteins present in the secretome of articular cartilage. Explants from equine metacarpophalangeal joints were incubated alone or with interleukin-1beta (IL-1β, 10 ng/ml), with or without carprofen, a non-steroidal anti-inflammatory drug, for six days. After tryptic digestion of culture medium supernatants, resulting peptides were separated by HPLC and detected in a Bruker amaZon ion trap instrument. The five most abundant peptides in each MS scan were fragmented and the fragmentation patterns compared to mammalian entries in the Swiss-Prot database, using the Mascot search engine. Tryptic peptides originating from aggrecan core protein, cartilage oligomeric matrix protein (COMP), fibronectin, fibromodulin, thrombospondin-1 (TSP-1), clusterin (CLU), cartilage intermediate layer protein-1 (CILP-1), chondroadherin (CHAD) and matrix metalloproteinases MMP-1 and MMP-3 were detected. Quantitative western blotting confirmed the presence of CILP-1, CLU, MMP-1, MMP-3 and TSP-1. Treatment with IL-1β increased MMP-1, MMP-3 and TSP-1 and decreased the CLU precursor but did not affect CILP-1 and CLU levels. Many of the proteins identified have well-established extracellular matrix functions and are involved in early repair/stress responses in cartilage. This high throughput approach may be used to study the changes that occur in the early stages of osteoarthritis. PMID:21354348

  11. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    International Nuclear Information System (INIS)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert; Gagnon, David; Gjoerup, Ole; Archambault, Jacques; Bullock, Peter A.

    2014-01-01

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF-1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitor of JCV DNA replication

  12. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    Directory of Open Access Journals (Sweden)

    Rok Gaber

    2013-11-01

    Full Text Available To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity.
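
    The sketch below illustrates the ratiometric readout in the simplest possible terms (synthetic event data, hypothetical gate); it is not the authors' analysis code: each cell's FRET/donor intensity ratio is computed, and cells whose ratio falls below a threshold are scored as protease-active, because cleavage of the sensor lowers FRET.

```python
# Illustrative sketch (synthetic data, hypothetical threshold): ratiometric
# analysis of a FRET protease sensor in flow cytometry. Each event (cell) has a
# donor (mCerulean) and a FRET/acceptor (mCitrine) intensity; protease cleavage
# lowers the FRET/donor ratio.
import numpy as np

rng = np.random.default_rng(1)
n_cells = 10_000
donor = rng.lognormal(mean=7.0, sigma=0.3, size=n_cells)
# Simulate ~30% of cells with active protease (reduced FRET signal).
active = rng.random(n_cells) < 0.3
fret = donor * np.where(active, 0.4, 1.2) * rng.lognormal(0.0, 0.1, n_cells)

ratio = fret / donor                      # per-cell ratiometric readout
threshold = 0.8                           # hypothetical gate between the two populations
fraction_active = np.mean(ratio < threshold)
print(f"fraction of cells scored protease-active: {fraction_active:.1%}")
```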

  13. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    Science.gov (United States)

    Gaber, Rok; Majerle, Andreja; Jerala, Roman; Benčina, Mojca

    2013-01-01

    To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs a ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity. PMID:24287545

  14. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States); Gagnon, David [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Gjoerup, Ole [Molecular Oncology Research Institute, Tufts Medical Center, Boston, MA 02111 (United States); Archambault, Jacques [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Bullock, Peter A., E-mail: Peter.Bullock@tufts.edu [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States)

    2014-11-15

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF-1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitor of JCV DNA replication.

  15. Time-motion analysis of factors affecting patient throughput in an MR imaging center

    International Nuclear Information System (INIS)

    O'Donohue, J.; Enzmann, D.R.

    1986-01-01

    The high cost of MR imaging makes efficient use essential. In an effort to increase patient throughput, attention has been focused on shortening the imaging time through reductions in matrix size and number of excitations, and through the use of newer "fast imaging" techniques. Less attention has been given to other time-consuming aspects not directly related to imaging time. The authors undertook a time-motion study using a daily log of minute-by-minute activities associated with an MR imaging examination. The times required for the following components of the examination were measured: total study time, examination set-up time, intrastudy physician "image review" time, and interstudy patient turnover time. The time lost to claustrophobic reactions, patients' failure to appear for scheduled examinations, unanticipated patient care (sedation, reassurance), and equipment malfunction was also analyzed. Actual imaging time accounted for a relatively small proportion (42%) of total study time. Other factors such as intrastudy image review time (15%), interstudy patient turnover time (11%), and time lost due to claustrophobic reactions, patients' failure to appear for scheduled examinations, and equipment malfunction contributed significantly to the total study time. Simple solutions to these problems can contribute greatly to increasing patient throughput
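
    A back-of-the-envelope calculation using the reported time shares shows why the non-imaging components matter; the 50% reduction factor below is a hypothetical what-if, not a result of the study.

```python
# Back-of-the-envelope sketch using the reported time shares: imaging 42% of total
# study time, image review 15%, turnover 11%, other overhead the remainder. How much
# does throughput rise if the non-imaging components are trimmed? The 50% reduction
# factor is a hypothetical what-if.
shares = {"imaging": 0.42, "image review": 0.15, "turnover": 0.11, "other": 0.32}

def relative_throughput(reduction_of_non_imaging: float) -> float:
    """Patients per hour relative to baseline when non-imaging time is cut by the given fraction."""
    new_total = shares["imaging"] + (1.0 - shares["imaging"]) * (1.0 - reduction_of_non_imaging)
    return 1.0 / new_total

print(f"cut non-imaging time by 50% -> throughput x{relative_throughput(0.5):.2f}")
```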

  16. Bulk segregant analysis by high-throughput sequencing reveals a novel xylose utilization gene from Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Jared W Wenger

    2010-05-01

    Full Text Available Fermentation of xylose is a fundamental requirement for the efficient production of ethanol from lignocellulosic biomass sources. Although they aggressively ferment hexoses, it has long been thought that native Saccharomyces cerevisiae strains cannot grow fermentatively or non-fermentatively on xylose. Population surveys have uncovered a few naturally occurring strains that are weakly xylose-positive, and some S. cerevisiae have been genetically engineered to ferment xylose, but no strain, either natural or engineered, has yet been reported to ferment xylose as efficiently as glucose. Here, we used a medium-throughput screen to identify Saccharomyces strains that can increase in optical density when xylose is presented as the sole carbon source. We identified 38 strains that have this xylose utilization phenotype, including strains of S. cerevisiae, other sensu stricto members, and hybrids between them. All the S. cerevisiae xylose-utilizing strains we identified are wine yeasts, and for those that could produce meiotic progeny, the xylose phenotype segregates as a single gene trait. We mapped this gene by Bulk Segregant Analysis (BSA) using tiling microarrays and high-throughput sequencing. The gene is a putative xylitol dehydrogenase, which we name XDH1, and is located in the subtelomeric region of the right end of chromosome XV in a region not present in the S288c reference genome. We further characterized the xylose phenotype by performing gene expression microarrays and by genetically dissecting the endogenous Saccharomyces xylose pathway. We have demonstrated that natural S. cerevisiae yeasts are capable of utilizing xylose as the sole carbon source, characterized the genetic basis for this trait as well as the endogenous xylose utilization pathway, and demonstrated the feasibility of BSA using high-throughput sequencing.

  17. The simple fool's guide to population genomics via RNA-Seq: An introduction to high-throughput sequencing data analysis

    DEFF Research Database (Denmark)

    De Wit, P.; Pespeni, M.H.; Ladner, J.T.

    2012-01-01

    …'The Simple Fool's Guide to Population Genomics via RNA-seq' (SFG), a document intended to serve as an easy-to-follow protocol, walking a user through one example of high-throughput sequencing data analysis of nonmodel organisms. It is by no means an exhaustive protocol, but rather serves as an introduction to the bioinformatic methods used in population genomics, enabling a user to gain familiarity with basic analysis steps. The SFG consists of two parts. This document summarizes the steps needed and lays out the basic themes for each and a simple approach to follow. The second document is the full SFG, publicly available at http://sfg.stanford.edu, that includes detailed protocols for data processing and analysis, along with a repository of custom-made scripts and sample files. Steps included in the SFG range from tissue collection to de novo assembly, blast annotation, alignment, gene expression, functional enrichment, SNP detection, principal components…

  18. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging

    Directory of Open Access Journals (Sweden)

    Piyush Pandey

    2017-08-01

    Full Text Available Image-based high-throughput plant phenotyping in greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract reflectance spectrum from each plant and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best followed by P, K, and S. The micronutrients group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient groups. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2). This study suggested…
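
    A minimal sketch of the modelling step described above (partial least squares regression of reflectance spectra onto a measured trait) is shown below with synthetic spectra; the wavelength grid, component count and train/test split are placeholders, not the study's settings.

```python
# Minimal sketch of the modelling step described above: partial least squares
# regression of reflectance spectra onto a measured chemical trait. Spectra and
# trait values are synthetic; all settings are placeholders, not the study's.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n_plants, n_bands = 120, 400                                     # e.g. bands spanning 550-1700 nm
spectra = rng.normal(size=(n_plants, n_bands)).cumsum(axis=1)    # smooth-ish fake spectra
water_content = spectra[:, 250] * 0.05 + rng.normal(0.0, 0.2, n_plants)  # fake trait

X_tr, X_te, y_tr, y_te = train_test_split(spectra, water_content, random_state=0)
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

r2 = r2_score(y_te, pred)
rpd = y_te.std() / np.sqrt(np.mean((y_te - pred) ** 2))   # Ratio of Performance to Deviation
print(f"R2 = {r2:.2f}, RPD = {rpd:.2f}")
```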

  19. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer

    2017-07-31

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used by the central node for data transmission to any remote node in case of the failure of any one of FSO links. We develop a cross-layer Markov chain model to study the throughput from central node to a tagged remote node. Numerical examples are presented to compare the performance of the proposed P2MP hybrid FSO/RF network with that of a P2MP FSO-only network and show that the P2MP Hybrid FSO/RF network achieves considerable performance improvement over the P2MP FSO-only network.
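
    The paper develops a cross-layer Markov chain model; the sketch below is a much simpler steady-state approximation, assuming each FSO link is independently up with probability p and a single shared RF backup serves one failed link chosen uniformly, with illustrative rates.

```python
# Simplified sketch, not the paper's cross-layer Markov chain model: each of N FSO
# links is independently 'up' with steady-state probability p_fso; one shared RF
# backup serves a single failed link at a time, chosen uniformly among the failed
# ones. Expected throughput of a tagged remote node mixes the FSO and RF rates.
# Rates and probabilities are illustrative.
from math import comb

def tagged_node_throughput(n_links: int, p_fso: float, r_fso: float, r_rf: float) -> float:
    # Contribution while the tagged FSO link is up.
    throughput = p_fso * r_fso
    # While the tagged link is down, the backup picks it with probability 1/(k+1),
    # where k is the number of *other* failed links (binomially distributed).
    p_backup = sum(
        comb(n_links - 1, k) * (1 - p_fso) ** k * p_fso ** (n_links - 1 - k) / (k + 1)
        for k in range(n_links)
    )
    throughput += (1 - p_fso) * p_backup * r_rf
    return throughput

if __name__ == "__main__":
    print(f"{tagged_node_throughput(n_links=8, p_fso=0.95, r_fso=1.0, r_rf=0.1):.3f} (normalized)")
```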

  20. 3D-SURFER: software for high-throughput protein surface comparison and analysis.

    Science.gov (United States)

    La, David; Esquivel-Rodríguez, Juan; Venkatraman, Vishwesh; Li, Bin; Sael, Lee; Ueng, Stephen; Ahrendt, Steven; Kihara, Daisuke

    2009-11-01

    We present 3D-SURFER, a web-based tool designed to facilitate high-throughput comparison and characterization of proteins based on their surface shape. As each protein is effectively represented by a vector of 3D Zernike descriptors, comparison times for a query protein against the entire PDB take, on an average, only a couple of seconds. The web interface has been designed to be as interactive as possible with displays showing animated protein rotations, CATH codes and structural alignments using the CE program. In addition, geometrically interesting local features of the protein surface, such as pockets that often correspond to ligand binding sites as well as protrusions and flat regions can also be identified and visualized. 3D-SURFER is a web application that can be freely accessed from: http://dragon.bio.purdue.edu/3d-surfer dkihara@purdue.edu Supplementary data are available at Bioinformatics online.

  1. Cancer panomics: computational methods and infrastructure for integrative analysis of cancer high-throughput "omics" data

    DEFF Research Database (Denmark)

    Brunak, Søren; De La Vega, Francisco M.; Rätsch, Gunnar

    2014-01-01

    Targeted cancer treatment is becoming the goal of newly developed oncology medicines and has already shown promise in some spectacular cases such as the case of BRAF kinase inhibitors in BRAF-mutant (e.g. V600E) melanoma. These developments are driven by the advent of high-throughput sequencing......, which continues to drop in cost, and that has enabled the sequencing of the genome, transcriptome, and epigenome of the tumors of a large number of cancer patients in order to discover the molecular aberrations that drive the oncogenesis of several types of cancer. Applying these technologies...... in the clinic promises to transform cancer treatment by identifying therapeutic vulnerabilities of each patient's tumor. These approaches will need to address the panomics of cancer--the integration of the complex combination of patient-specific characteristics that drive the development of each person's tumor...

  2. High-throughput liquid chromatography for drug analysis in biological fluids: investigation of extraction column life.

    Science.gov (United States)

    Zeng, Wei; Fisher, Alison L; Musson, Donald G; Wang, Amy Qiu

    2004-07-05

    A novel method was developed and assessed to extend the lifetime of extraction columns of high-throughput liquid chromatography (HTLC) for bioanalysis of human plasma samples. In this method, a 15% acetic acid solution and 90% THF were respectively used as mobile phases to clean up the proteins in human plasma samples and residual lipids from the extraction and analytical columns. The 15% acetic acid solution weakens the interactions between proteins and the stationary phase of the extraction column and increases the protein solubility in the mobile phase. The 90% THF mobile phase prevents the accumulation of lipids and thus reduces the potential damage on the columns. Using this novel method, the extraction column lifetime has been extended to about 2000 direct plasma injections, and this is the first time that high concentration acetic acid and THF are used in HTLC for on-line cleanup and extraction column lifetime extension.

  3. Link Analysis of High Throughput Spacecraft Communication Systems for Future Science Missions

    Science.gov (United States)

    Simons, Rainee N.

    2015-01-01

    NASA's plan to launch several spacecraft into low Earth orbit (LEO) to support science missions in the next ten years and beyond requires downlink throughput on the order of several terabits per day. The ability to handle such a large volume of data far exceeds the capabilities of current systems. This paper proposes two solutions: first, a high data rate link between the LEO spacecraft and the ground via relay satellites in geostationary orbit (GEO); second, a high data rate direct-to-ground link from LEO. The paper then presents results from computer simulations carried out for both types of links, taking into consideration spacecraft transmitter frequency, EIRP, and waveform; elevation-angle-dependent path loss through Earth's atmosphere; and ground station receiver G/T.
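
    A generic link-budget sketch in the spirit of the analysis described above (not NASA's simulation) is given below: free-space path loss plus an assumed atmospheric loss, C/N0 from EIRP and ground-station G/T, and the data rate supportable at a required Eb/N0; all numbers are illustrative.

```python
# Hedged, generic link-budget sketch (not NASA's actual simulation). All values
# below (EIRP, G/T, slant range, frequency, losses, required Eb/N0) are illustrative.
import math

BOLTZMANN_DBW = -228.6  # 10*log10(k), dBW/K/Hz

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / 3.0e8)

def supportable_rate_bps(eirp_dbw: float, gt_dbk: float, distance_m: float,
                         freq_hz: float, atm_loss_db: float, ebn0_req_db: float,
                         margin_db: float = 3.0) -> float:
    cn0_dbhz = (eirp_dbw + gt_dbk - free_space_path_loss_db(distance_m, freq_hz)
                - atm_loss_db - BOLTZMANN_DBW)
    rate_dbhz = cn0_dbhz - ebn0_req_db - margin_db
    return 10.0 ** (rate_dbhz / 10.0)

if __name__ == "__main__":
    # Hypothetical Ka-band LEO direct-to-ground pass at 1500 km slant range.
    rate = supportable_rate_bps(eirp_dbw=20.0, gt_dbk=35.0, distance_m=1.5e6,
                                freq_hz=26e9, atm_loss_db=4.0, ebn0_req_db=2.0)
    print(f"supportable data rate ~ {rate / 1e9:.2f} Gbit/s")
```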

  4. The use of FTA cards for preserving unfixed cytological material for high-throughput molecular analysis.

    Science.gov (United States)

    Saieg, Mauro Ajaj; Geddie, William R; Boerner, Scott L; Liu, Ni; Tsao, Ming; Zhang, Tong; Kamel-Reid, Suzanne; da Cunha Santos, Gilda

    2012-06-25

    Novel high-throughput molecular technologies have made the collection and storage of cells and small tissue specimens a critical issue. The FTA card provides an alternative to cryopreservation for biobanking fresh unfixed cells. The current study compared the quality and integrity of the DNA obtained from 2 types of FTA cards (Classic and Elute) using 2 different extraction protocols ("Classic" and "Elute") and assessed the feasibility of performing multiplex mutational screening using fine-needle aspiration (FNA) biopsy samples. Residual material from 42 FNA biopsies was collected in the cards (21 Classic and 21 Elute cards). DNA was extracted using the Classic protocol for Classic cards and both protocols for Elute cards. Polymerase chain reaction for p53 (1.5 kilobase) and CARD11 (500 base pair) was performed to assess DNA integrity. Successful p53 amplification was achieved in 95.2% of the samples from the Classic cards and in 80.9% of the samples from the Elute cards using the Classic protocol and 28.5% using the Elute protocol (P = .001). All samples (both cards) could be amplified for CARD11. There was no significant difference in the DNA concentration or 260/280 purity ratio when the 2 types of cards were compared. Five samples were also successfully analyzed by multiplex MassARRAY spectrometry, with a mutation in KRAS found in 1 case. High molecular weight DNA was extracted from the cards in sufficient amounts and quality to perform high-throughput multiplex mutation assays. The results of the current study also suggest that FTA Classic cards preserve better DNA integrity for molecular applications compared with the FTA Elute cards. Copyright © 2012 American Cancer Society.

  5. Investigating the Factors Affecting the Zahedan Aquifer Hydrogeochemistry Using Factor Analysis, Saturation Indices and Composite Diagram Methods

    Directory of Open Access Journals (Sweden)

    J. Dowlati

    2014-12-01

    Full Text Available The Zahedan aquifer is located in the northern part of the Zahedan watershed. Evaluating the quality of its groundwater resources is essential because they provide part of the drinking, agricultural and industrial water of the city. In order to monitor groundwater quality, assess the controlling processes and determine the sources of cations and anions in the groundwater, 26 wells were sampled and water quality parameters were measured. The results of the analysis showed that almost all of the samples were very saline, with electrical conductivity varying from 1,359 to 12,620 μS cm−1. In the Zahedan aquifer, sodium was the predominant cation and chloride and sulfate the predominant anions, and sodium-chloride (Na-Cl) and sodium-sulfate (Na-SO4) were the dominant groundwater types. Factor analysis of the sample results indicates that two factors, natural and human, controlled about 83.30% and 74.37% of the quality variations of the groundwater in October and February, respectively. The first and major factor, related to the natural processes of ion exchange and dissolution, had a correlation with positive loadings of EC, Ca2+, Mg2+, Na+, Cl-, K+ and SO42-, and controlled 65.25% of the quality variations of the groundwater in October and 58.82% in February. The second factor, related to Ca2+ and NO3-, accounted for 18.05% of the quality variations in October and 15.56% in February and, given the urban development and limited agricultural development over the aquifer, is attributed to human activities. For the samples collected in October, the saturation indices of calcite, gypsum and dolomite showed saturated conditions; in February, calcite and dolomite showed saturated conditions for more than 60% and 90% of samples, respectively, while the gypsum index revealed under-saturated conditions for almost all samples. The under-saturated condition of the Zahedan groundwater aquifer results from insufficient residence time of the water in the aquifer to dissolve the minerals
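
    For reference, the saturation index used above is SI = log10(IAP/Ksp), with SI near zero at saturation and negative values indicating under-saturation; the sketch below applies it to gypsum with hypothetical ion activities and a rounded literature solubility product (real studies use activity corrections and speciation codes).

```python
# Sketch of the saturation-index concept used above: SI = log10(IAP / Ksp),
# SI ~ 0 meaning saturation, SI < 0 under-saturation, SI > 0 over-saturation.
# Ion activities below are hypothetical; the gypsum Ksp is a rounded literature value.
import math

def saturation_index(ion_activity_product: float, ksp: float) -> float:
    return math.log10(ion_activity_product / ksp)

if __name__ == "__main__":
    # Gypsum: CaSO4.2H2O <-> Ca2+ + SO4^2- (+ 2 H2O); IAP ~ a(Ca2+) * a(SO4^2-)
    a_ca, a_so4 = 3e-3, 5e-3          # hypothetical activities
    ksp_gypsum = 10 ** (-4.58)        # approximate value at 25 C
    si = saturation_index(a_ca * a_so4, ksp_gypsum)
    print(f"SI(gypsum) = {si:.2f} -> {'under-saturated' if si < 0 else 'saturated/over-saturated'}")
```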

  6. An RNA-Based Fluorescent Biosensor for High-Throughput Analysis of the cGAS-cGAMP-STING Pathway.

    Science.gov (United States)

    Bose, Debojit; Su, Yichi; Marcus, Assaf; Raulet, David H; Hammond, Ming C

    2016-12-22

    In mammalian cells, the second messenger (2'-5',3'-5') cyclic guanosine monophosphate-adenosine monophosphate (2',3'-cGAMP), is produced by the cytosolic DNA sensor cGAMP synthase (cGAS), and subsequently bound by the stimulator of interferon genes (STING) to trigger interferon response. Thus, the cGAS-cGAMP-STING pathway plays a critical role in pathogen detection, as well as pathophysiological conditions including cancer and autoimmune disorders. However, studying and targeting this immune signaling pathway has been challenging due to the absence of tools for high-throughput analysis. We have engineered an RNA-based fluorescent biosensor that responds to 2',3'-cGAMP. The resulting "mix-and-go" cGAS activity assay shows excellent statistical reliability as a high-throughput screening (HTS) assay and distinguishes between direct and indirect cGAS inhibitors. Furthermore, the biosensor enables quantitation of 2',3'-cGAMP in mammalian cell lysates. We envision this biosensor-based assay as a resource to study the cGAS-cGAMP-STING pathway in the context of infectious diseases, cancer immunotherapy, and autoimmune diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. High-throughput quantitative biochemical characterization of algal biomass by NIR spectroscopy; multiple linear regression and multivariate linear regression analysis.

    Science.gov (United States)

    Laurens, L M L; Wolfrum, E J

    2013-12-18

    One of the challenges associated with microalgal biomass characterization and the comparison of microalgal strains and conversion processes is the rapid determination of the composition of algae. We have developed and applied a high-throughput screening technology based on near-infrared (NIR) spectroscopy for the rapid and accurate determination of algal biomass composition. We show that NIR spectroscopy can accurately predict the full composition using multivariate linear regression analysis of varying lipid, protein, and carbohydrate content of algal biomass samples from three strains. We also demonstrate a high quality of predictions of an independent validation set. A high-throughput 96-well configuration for spectroscopy gives equally good prediction relative to a ring-cup configuration, and thus, spectra can be obtained from as little as 10-20 mg of material. We found that lipids exhibit a dominant, distinct, and unique fingerprint in the NIR spectrum that allows for the use of single and multiple linear regression of respective wavelengths for the prediction of the biomass lipid content. This is not the case for carbohydrate and protein content, and thus, the use of multivariate statistical modeling approaches remains necessary.

  8. Bacterial Pathogens and Community Composition in Advanced Sewage Treatment Systems Revealed by Metagenomics Analysis Based on High-Throughput Sequencing

    Science.gov (United States)

    Lu, Xin; Zhang, Xu-Xiang; Wang, Zhu; Huang, Kailong; Wang, Yuan; Liang, Weigang; Tan, Yunfei; Liu, Bo; Tang, Junying

    2015-01-01

    This study used 454 pyrosequencing, Illumina high-throughput sequencing and metagenomic analysis to investigate bacterial pathogens and their potential virulence in a sewage treatment plant (STP) applying both conventional and advanced treatment processes. Pyrosequencing and Illumina sequencing consistently demonstrated that Arcobacter genus occupied over 43.42% of total abundance of potential pathogens in the STP. At species level, potential pathogens Arcobacter butzleri, Aeromonas hydrophila and Klebsiella pneumonia dominated in raw sewage, which was also confirmed by quantitative real time PCR. Illumina sequencing also revealed prevalence of various types of pathogenicity islands and virulence proteins in the STP. Most of the potential pathogens and virulence factors were eliminated in the STP, and the removal efficiency mainly depended on oxidation ditch. Compared with sand filtration, magnetic resin seemed to have higher removals in most of the potential pathogens and virulence factors. However, presence of the residual A. butzleri in the final effluent still deserves more concerns. The findings indicate that sewage acts as an important source of environmental pathogens, but STPs can effectively control their spread in the environment. Joint use of the high-throughput sequencing technologies is considered a reliable method for deep and comprehensive overview of environmental bacterial virulence. PMID:25938416

  9. Intelligent Approach for Analysis of Respiratory Signals and Oxygen Saturation in the Sleep Apnea/Hypopnea Syndrome

    Science.gov (United States)

    Moret-Bonillo, Vicente; Alvarez-Estévez, Diego; Fernández-Leal, Angel; Hernández-Pereira, Elena

    2014-01-01

    This work deals with the development of an intelligent approach for clinical decision making in the diagnosis of the Sleep Apnea/Hypopnea Syndrome, SAHS, from the analysis of respiratory signals and oxygen saturation in arterial blood, SaO2. In order to accomplish the task the proposed approach makes use of different artificial intelligence techniques and reasoning processes being able to deal with imprecise data. These reasoning processes are based on fuzzy logic and on temporal analysis of the information. The developed approach also takes into account the possibility of artifacts in the monitored signals. Detection and characterization of signal artifacts allows detection of false positives. Identification of relevant diagnostic patterns and temporal correlation of events is performed through the implementation of temporal constraints. PMID:25035712

  10. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging.

    Science.gov (United States)

    Pandey, Piyush; Ge, Yufeng; Stoerger, Vincent; Schnable, James C

    2017-01-01

    Image-based high-throughput plant phenotyping in greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract reflectance spectrum from each plant and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best followed by P, K, and S. The micronutrients group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient groups. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2). … plant chemical traits. Future…

  11. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging

    Science.gov (United States)

    Pandey, Piyush; Ge, Yufeng; Stoerger, Vincent; Schnable, James C.

    2017-01-01

    Image-based high-throughput plant phenotyping in greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract reflectance spectrum from each plant and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best followed by P, K, and S. The micronutrients group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient groups. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2). … designing experiments to vary plant nutrients…

  12. Fast neutron (14 MeV) attenuation analysis in saturated core samples and its application in well logging

    International Nuclear Information System (INIS)

    Amin Attarzadeh; Mohammad Kamal Ghassem Al Askari; Tagy Bayat

    2009-01-01

    To introduce the application of nuclear logging, it is appropriate to provide a motivation for the use of nuclear measurement techniques in well logging. Important aspects of the geological sciences are, for instance, the grain and porosity structure and the porosity volume of rocks, as well as the transport properties of a fluid in the porous medium. Nuclear measurements are, as a rule, non-intrusive: a measurement does not destroy the sample, and it does not interfere with the process to be measured. Also, non-intrusive measurements are often much faster than the radiation methods, and can also be applied in field measurements. A common type of nuclear measurement employs neutron irradiation, which is a powerful technique for geophysical analysis. In this research we illustrate the details of this technique and its applications to well logging and the oil industry. Experiments have been performed to investigate the possibility of using neutron attenuation measurements to determine the water and oil content of rock samples. A beam of 14 MeV neutrons produced by a 150 kV neutron generator was attenuated by different samples and subsequently detected with NE102 plastic scintillators (fast counters). Each sample was saturated with water and oil. The difference in neutron attenuation between dry and wet samples was compared with the fluid content determined by mass balance of the sample. In this experiment we were able to determine 3% humidity in a standard sample model (SiO2) and estimate porosity in geological samples saturated with different fluids. (Author)
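
    A Beer-Lambert style sketch of the attenuation measurement (not the authors' calibration) is given below: transmitted counts fall exponentially with the attenuating thickness, so comparing dry and saturated counts yields a water-equivalent thickness and hence a volumetric water content; the removal cross-section and count rates are assumed values.

```python
# Beer-Lambert style sketch of the attenuation measurement (not the authors'
# calibration): transmitted counts fall as I = I0 * exp(-Sigma * x). Comparing
# counts through a dry and a saturated sample isolates the fluid contribution.
# The macroscopic removal cross-section below is an assumed illustrative value.
import math

SIGMA_WATER = 0.10  # cm^-1, assumed fast-neutron removal cross-section of water

def water_equivalent_thickness_cm(counts_dry: float, counts_saturated: float) -> float:
    """Extra attenuating water thickness implied by the drop in transmitted counts."""
    return math.log(counts_dry / counts_saturated) / SIGMA_WATER

def volumetric_water_content(counts_dry: float, counts_saturated: float,
                             sample_thickness_cm: float) -> float:
    return water_equivalent_thickness_cm(counts_dry, counts_saturated) / sample_thickness_cm

if __name__ == "__main__":
    # Hypothetical count rates through a 10 cm core, dry vs. water-saturated.
    theta = volumetric_water_content(counts_dry=1.00e5, counts_saturated=8.2e4,
                                     sample_thickness_cm=10.0)
    print(f"estimated volumetric water content ~ {theta:.2%}")
```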

  13. Comparison of Atmospheric Pressure Chemical Ionization and Field Ionization Mass Spectrometry for the Analysis of Large Saturated Hydrocarbons.

    Science.gov (United States)

    Jin, Chunfen; Viidanoja, Jyrki; Li, Mingzhe; Zhang, Yuyang; Ikonen, Elias; Root, Andrew; Romanczyk, Mark; Manheim, Jeremy; Dziekonski, Eric; Kenttämaa, Hilkka I

    2016-11-01

    Direct infusion atmospheric pressure chemical ionization mass spectrometry (APCI-MS) was compared to field ionization mass spectrometry (FI-MS) for the determination of hydrocarbon class distributions in lubricant base oils. When positive ion mode APCI with oxygen as the ion source gas was employed to ionize saturated hydrocarbon model compounds (M) in hexane, only stable [M - H]+ ions were produced. Ion-molecule reaction studies performed in a linear quadrupole ion trap suggested that fragment ions of ionized hexane can ionize saturated hydrocarbons via hydride abstraction with minimal fragmentation. Hence, APCI-MS shows potential as an alternative to FI-MS in lubricant base oil analysis. Indeed, the APCI-MS method gave similar average molecular weights and hydrocarbon class distributions as FI-MS for three lubricant base oils. However, the reproducibility of the APCI-MS method was found to be substantially better than that of FI-MS. The paraffinic content determined using the APCI-MS and FI-MS methods for the base oils was similar. The average number of carbons in paraffinic chains followed the same increasing trend from low-viscosity to high-viscosity base oils for the two methods.

  14. Analysis of high-throughput biological data using their rank values.

    Science.gov (United States)

    Dembélé, Doulaye

    2018-01-01

    High-throughput biological technologies are routinely used to generate gene expression profiling or cytogenetics data. To achieve high performance, methods available in the literature have become more specialized and often require high computational resources. Here, we propose a new versatile method based on data-ordering rank values. We use linear algebra and the Perron-Frobenius theorem, and also extend a method presented earlier for searching differentially expressed genes to the detection of recurrent copy number aberrations. A result derived from the proposed method is a one-sample Student's t-test based on rank values. The proposed method is, to our knowledge, the only one that applies to both gene expression profiling and cytogenetics data sets. This new method is fast, deterministic, and requires a low computational load. Probabilities are associated with genes to allow a statistically significant subset selection in the data set. Stability scores are also introduced as quality parameters. The performance and comparative analyses were carried out using real data sets. The proposed method can be accessed through an R package available from the CRAN (Comprehensive R Archive Network) website: https://cran.r-project.org/web/packages/fcros.
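
    The sketch below is only a simplified illustration in the spirit of the rank-value idea (it is not the fcros package's algorithm): values are ranked within each sample, and a one-sample Student's t-test per gene checks whether its mean rank deviates from the expected mid-rank; the data are synthetic.

```python
# Simplified sketch in the spirit of the rank-value idea described above (not the
# fcros package's algorithm): rank expression values within each sample, then test
# per gene whether its mean rank across samples deviates from the expected mid-rank
# using a one-sample Student's t-test. Data are synthetic.
import numpy as np
from scipy.stats import rankdata, ttest_1samp

rng = np.random.default_rng(7)
n_genes, n_samples = 1000, 12
expression = rng.normal(size=(n_genes, n_samples))
expression[:20] += 2.0                      # synthetic "changed" genes

# Rank within each sample (column), scaled to (0, 1].
ranks = np.apply_along_axis(rankdata, 0, expression) / n_genes
expected_mid_rank = (n_genes + 1) / (2 * n_genes)

t_stat, p_values = ttest_1samp(ranks, popmean=expected_mid_rank, axis=1)
print(f"genes with p < 0.01: {(p_values < 0.01).sum()}")
```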

  15. Analysis of Active Methylotrophic Communities: When DNA-SIP Meets High-Throughput Technologies.

    Science.gov (United States)

    Taubert, Martin; Grob, Carolina; Howat, Alexandra M; Burns, Oliver J; Chen, Yin; Neufeld, Josh D; Murrell, J Colin

    2016-01-01

    Methylotrophs are microorganisms ubiquitous in the environment that can metabolize one-carbon (C1) compounds as carbon and/or energy sources. The activity of these prokaryotes impacts biogeochemical cycles within their respective habitats and can determine whether these habitats act as sources or sinks of C1 compounds. Due to the high importance of C1 compounds, not only in biogeochemical cycles, but also for climatic processes, it is vital to understand the contributions of these microorganisms to carbon cycling in different environments. One of the most challenging questions when investigating methylotrophs, but also in environmental microbiology in general, is which species contribute to the environmental processes of interest, or "who does what, where and when?" Metabolic labeling with C1 compounds substituted with (13)C, a technique called stable isotope probing, is a key method to trace carbon fluxes within methylotrophic communities. The incorporation of (13)C into the biomass of active methylotrophs leads to an increase in the molecular mass of their biomolecules. For DNA-based stable isotope probing (DNA-SIP), labeled and unlabeled DNA is separated by isopycnic ultracentrifugation. The ability to specifically analyze DNA of active methylotrophs from a complex background community by high-throughput sequencing techniques, i.e. targeted metagenomics, is the hallmark strength of DNA-SIP for elucidating ecosystem functioning, and a protocol is detailed in this chapter.

  16. Perchlorate reduction by hydrogen autotrophic bacteria and microbial community analysis using high-throughput sequencing.

    Science.gov (United States)

    Wan, Dongjin; Liu, Yongde; Niu, Zhenhua; Xiao, Shuhu; Li, Daorong

    2016-02-01

    Hydrogen autotrophic reduction of perchlorate have advantages of high removal efficiency and harmless to drinking water. But so far the reported information about the microbial community structure was comparatively limited, changes in the biodiversity and the dominant bacteria during acclimation process required detailed study. In this study, perchlorate-reducing hydrogen autotrophic bacteria were acclimated by hydrogen aeration from activated sludge. For the first time, high-throughput sequencing was applied to analyze changes in biodiversity and the dominant bacteria during acclimation process. The Michaelis-Menten model described the perchlorate reduction kinetics well. Model parameters q(max) and K(s) were 2.521-3.245 (mg ClO4(-)/gVSS h) and 5.44-8.23 (mg/l), respectively. Microbial perchlorate reduction occurred across at pH range 5.0-11.0; removal was highest at pH 9.0. The enriched mixed bacteria could use perchlorate, nitrate and sulfate as electron accepter, and the sequence of preference was: NO3(-) > ClO4(-) > SO4(2-). Compared to the feed culture, biodiversity decreased greatly during acclimation process, the microbial community structure gradually stabilized after 9 acclimation cycles. The Thauera genus related to Rhodocyclales was the dominated perchlorate reducing bacteria (PRB) in the mixed culture.

  17. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    Science.gov (United States)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert; Gagnon, David; Gjoerup, Ole; Archambault, Jacques; Bullock, Peter A.

    2015-01-01

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. PMID:25155200

  18. High-throughput DNA methylation analysis in anorexia nervosa confirms TNXB hypermethylation.

    Science.gov (United States)

    Kesselmeier, Miriam; Pütter, Carolin; Volckmar, Anna-Lena; Baurecht, Hansjörg; Grallert, Harald; Illig, Thomas; Ismail, Khadeeja; Ollikainen, Miina; Silén, Yasmina; Keski-Rahkonen, Anna; Bulik, Cynthia M; Collier, David A; Zeggini, Eleftheria; Hebebrand, Johannes; Scherag, André; Hinney, Anke

    2018-04-01

    Patients with anorexia nervosa (AN) are ideally suited to identify differentially methylated genes in response to starvation. We examined high-throughput DNA methylation derived from whole blood of 47 females with AN, 47 lean females without AN and 100 population-based females to compare AN with both controls. To account for different cell type compositions, we applied two reference-free methods (FastLMM-EWASher, RefFreeEWAS) and searched for consensus CpG sites identified by both methods. We used a validation sample of five monozygotic AN-discordant twin pairs. Fifty-one consensus sites were identified in AN vs. lean and 81 in AN vs. population-based comparisons. These sites have not been reported in AN methylation analyses, but for the latter comparison 54/81 sites showed directionally consistent differential methylation effects in the AN-discordant twins. For a single nucleotide polymorphism rs923768 in CSGALNACT1 a nearby site was nominally associated with AN. At the gene level, we confirmed hypermethylated sites at TNXB. We found support for a locus at NR1H3 in the AN vs. lean control comparison, but the methylation direction was opposite to the one previously reported. We confirm genes like TNXB previously described to comprise differentially methylated sites, and highlight further sites that might be specifically involved in AN starvation processes.

  19. High-throughput STR analysis for DNA database using direct PCR.

    Science.gov (United States)

    Sim, Jeong Eun; Park, Su Jeong; Lee, Han Chul; Kim, Se-Yong; Kim, Jong Yeol; Lee, Seung Hwan

    2013-07-01

    Since the Korean criminal DNA database was launched in 2010, we have focused on establishing an automated DNA database profiling system that analyzes short tandem repeat loci in a high-throughput and cost-effective manner. We established a DNA database profiling system without DNA purification using a direct PCR buffer system. The quality of direct PCR procedures was compared with that of conventional PCR system under their respective optimized conditions. The results revealed not only perfect concordance but also an excellent PCR success rate, good electropherogram quality, and an optimal intra/inter-loci peak height ratio. In particular, the proportion of DNA extraction required due to direct PCR failure could be minimized to <3%. In conclusion, the newly developed direct PCR system can be adopted for automated DNA database profiling systems to replace or supplement conventional PCR system in a time- and cost-saving manner. © 2013 American Academy of Forensic Sciences Published 2013. This article is a U.S. Government work and is in the public domain in the U.S.A.

  20. High-throughput molecular analysis in lung cancer: insights into biology and potential clinical applications.

    Science.gov (United States)

    Ocak, S; Sos, M L; Thomas, R K; Massion, P P

    2009-08-01

    During the last decade, high-throughput technologies including genomic, epigenomic, transcriptomic and proteomic have been applied to further our understanding of the molecular pathogenesis of this heterogeneous disease, and to develop strategies that aim to improve the management of patients with lung cancer. Ultimately, these approaches should lead to sensitive, specific and noninvasive methods for early diagnosis, and facilitate the prediction of response to therapy and outcome, as well as the identification of potential novel therapeutic targets. Genomic studies were the first to move this field forward by providing novel insights into the molecular biology of lung cancer and by generating candidate biomarkers of disease progression. Lung carcinogenesis is driven by genetic and epigenetic alterations that cause aberrant gene function; however, the challenge remains to pinpoint the key regulatory control mechanisms and to distinguish driver from passenger alterations that may have a small but additive effect on cancer development. Epigenetic regulation by DNA methylation and histone modifications modulate chromatin structure and, in turn, either activate or silence gene expression. Proteomic approaches critically complement these molecular studies, as the phenotype of a cancer cell is determined by proteins and cannot be predicted by genomics or transcriptomics alone. The present article focuses on the technological platforms available and some proposed clinical applications. We illustrate herein how the "-omics" have revolutionised our approach to lung cancer biology and hold promise for personalised management of lung cancer.

  1. High-throughput analysis for preparation, processing and analysis of TiO2 coatings on steel by chemical solution deposition

    International Nuclear Information System (INIS)

    Cuadrado Gil, Marcos; Van Driessche, Isabel; Van Gils, Sake; Lommens, Petra; Castelein, Pieter; De Buysser, Klaartje

    2012-01-01

    Highlights: ► High-throughput preparation of TiO 2 aqueous precursors. ► Analysis of stability and surface tension. ► Deposition of TiO 2 coatings. - Abstract: A high-throughput preparation, processing and analysis of titania coatings prepared by chemical solution deposition from water-based precursors at low temperature (≈250 °C) on two different types of steel substrates (Aluzinc® and bright annealed) is presented. The use of the high-throughput equipment allows fast preparation of multiple samples saving time, energy and material; and helps to test the scalability of the process. The process itself includes the use of IR curing for aqueous ceramic precursors and possibilities of using UV irradiation before the final sintering step. The IR curing method permits a much faster curing step compared to normal high temperature treatments in traditional convection devices (i.e., tube furnaces). The formulations, also prepared by high-throughput equipment, are found to be stable in the operational pH range of the substrates (6.5–8.5). Titanium alkoxides itself lack stability in pure water-based environments, but the presence of the different organic complexing agents prevents it from hydrolysis and precipitation reactions. The wetting interaction between the substrates and the various formulations is studied by the determination of the surface free energy of the substrates and the polar and dispersive components of the surface tension of the solutions. The mild temperature program used for preparation of the coatings however does not lead to the formation of pure crystalline material, necessary for the desired photocatalytic and super-hydrophilic behavior of these coatings. Nevertheless, some activity can be reported for these amorphous coatings by monitoring the discoloration of methylene blue in water under UV irradiation.

  2. Pathway Processor 2.0: a web resource for pathway-based analysis of high-throughput data.

    Science.gov (United States)

    Beltrame, Luca; Bianco, Luca; Fontana, Paolo; Cavalieri, Duccio

    2013-07-15

    Pathway Processor 2.0 is a web application designed to analyze high-throughput datasets, including but not limited to microarray and next-generation sequencing, using a pathway centric logic. In addition to well-established methods such as the Fisher's test and impact analysis, Pathway Processor 2.0 offers innovative methods that convert gene expression into pathway expression, leading to the identification of differentially regulated pathways in a dataset of choice. Pathway Processor 2.0 is available as a web service at http://compbiotoolbox.fmach.it/pathwayProcessor/. Sample datasets to test the functionality can be used directly from the application. duccio.cavalieri@fmach.it Supplementary data are available at Bioinformatics online.

  3. Validation of a Microscale Extraction and High Throughput UHPLC-QTOF-MS Analysis Method for Huperzine A in Huperzia

    Science.gov (United States)

    Cuthbertson, Daniel; Piljac-Žegarac, Jasenka; Lange, Bernd Markus

    2011-01-01

    Herein we report on an improved method for the microscale extraction of huperzine A (HupA), an acetylcholinesterase-inhibiting alkaloid, from as little as 3 mg of tissue homogenate from the clubmoss Huperzia squarrosa (G. Forst.) Trevis with 99.95 % recovery. We also validated a novel UHPLC-QTOF-MS method for the high-throughput analysis of H. squarrosa extracts in only 6 min, which, in combination with the very low limit of detection (20 pg on column) and the wide linear range for quantification (20 to 10,000 pg on column), allow for a highly efficient screening of extracts containing varying amounts of HupA. Utilization of this methodology has the potential to conserve valuable plant resources. PMID:22275140

  4. Theoretical analysis of quantum dot amplifiers with high saturation power and low noise figure

    DEFF Research Database (Denmark)

    Berg, Tommy Winther; Mørk, Jesper

    2002-01-01

    Semiconductor quantum dot amplifiers are predicted to exhibit superior characteristics such as high gain, and output power and low noise. The analysis provides criteria and design guidelines for the realization of high quality amplifiers.......Semiconductor quantum dot amplifiers are predicted to exhibit superior characteristics such as high gain, and output power and low noise. The analysis provides criteria and design guidelines for the realization of high quality amplifiers....

  5. High-throughput transcriptome analysis of barley (Hordeum vulgare) exposed to excessive boron.

    Science.gov (United States)

    Tombuloglu, Guzin; Tombuloglu, Huseyin; Sakcali, M Serdal; Unver, Turgay

    2015-02-15

    Boron (B) is an essential micronutrient for optimum plant growth. However, above certain threshold B is toxic and causes yield loss in agricultural lands. While a number of studies were conducted to understand B tolerance mechanism, a transcriptome-wide approach for B tolerant barley is performed here for the first time. A high-throughput RNA-Seq (cDNA) sequencing technology (Illumina) was used with barley (Hordeum vulgare), yielding 208 million clean reads. In total, 256,874 unigenes were generated and assigned to known peptide databases: Gene Ontology (GO) (99,043), Swiss-Prot (38,266), Clusters of Orthologous Groups (COG) (26,250), and the Kyoto Encyclopedia of Genes and Genomes (KEGG) (36,860), as determined by BLASTx search. According to the digital gene expression (DGE) analyses, 16% and 17% of the transcripts were found to be differentially regulated in root and leaf tissues, respectively. Most of them were involved in cell wall, stress response, membrane, protein kinase and transporter mechanisms. Some of the genes detected as highly expressed in root tissue are phospholipases, predicted divalent heavy-metal cation transporters, formin-like proteins and calmodulin/Ca(2+)-binding proteins. In addition, chitin-binding lectin precursor, ubiquitin carboxyl-terminal hydrolase, and serine/threonine-protein kinase AFC2 genes were indicated to be highly regulated in leaf tissue upon excess B treatment. Some pathways, such as the Ca(2+)-calmodulin system, are activated in response to B toxicity. The differential regulation of 10 transcripts was confirmed by qRT-PCR, revealing the tissue-specific responses against B toxicity and their putative function in B-tolerance mechanisms. Copyright © 2014. Published by Elsevier B.V.

  6. High-throughput mutational analysis of TOR1A in primary dystonia

    Directory of Open Access Journals (Sweden)

    Truong Daniel D

    2009-03-01

    Full Text Available Abstract Background Although the c.904_906delGAG mutation in Exon 5 of TOR1A typically manifests as early-onset generalized dystonia, DYT1 dystonia is genetically and clinically heterogeneous. Recently, another Exon 5 mutation (c.863G>A has been associated with early-onset generalized dystonia and some ΔGAG mutation carriers present with late-onset focal dystonia. The aim of this study was to identify TOR1A Exon 5 mutations in a large cohort of subjects with mainly non-generalized primary dystonia. Methods High resolution melting (HRM was used to examine the entire TOR1A Exon 5 coding sequence in 1014 subjects with primary dystonia (422 spasmodic dysphonia, 285 cervical dystonia, 67 blepharospasm, 41 writer's cramp, 16 oromandibular dystonia, 38 other primary focal dystonia, 112 segmental dystonia, 16 multifocal dystonia, and 17 generalized dystonia and 250 controls (150 neurologically normal and 100 with other movement disorders. Diagnostic sensitivity and specificity were evaluated in an additional 8 subjects with known ΔGAG DYT1 dystonia and 88 subjects with ΔGAG-negative dystonia. Results HRM of TOR1A Exon 5 showed high (100% diagnostic sensitivity and specificity. HRM was rapid and economical. HRM reliably differentiated the TOR1A ΔGAG and c.863G>A mutations. Melting curves were normal in 250/250 controls and 1012/1014 subjects with primary dystonia. The two subjects with shifted melting curves were found to harbor the classic ΔGAG deletion: 1 a non-Jewish Caucasian female with childhood-onset multifocal dystonia and 2 an Ashkenazi Jewish female with adolescent-onset spasmodic dysphonia. Conclusion First, HRM is an inexpensive, diagnostically sensitive and specific, high-throughput method for mutation discovery. Second, Exon 5 mutations in TOR1A are rarely associated with non-generalized primary dystonia.

  7. Analysis of high-throughput sequencing and annotation strategies for phage genomes.

    Directory of Open Access Journals (Sweden)

    Matthew R Henn

    Full Text Available BACKGROUND: Bacterial viruses (phages play a critical role in shaping microbial populations as they influence both host mortality and horizontal gene transfer. As such, they have a significant impact on local and global ecosystem function and human health. Despite their importance, little is known about the genomic diversity harbored in phages, as methods to capture complete phage genomes have been hampered by the lack of knowledge about the target genomes, and difficulties in generating sufficient quantities of genomic DNA for sequencing. Of the approximately 550 phage genomes currently available in the public domain, fewer than 5% are marine phage. METHODOLOGY/PRINCIPAL FINDINGS: To advance the study of phage biology through comparative genomic approaches we used marine cyanophage as a model system. We compared DNA preparation methodologies (DNA extraction directly from either phage lysates or CsCl purified phage particles, and sequencing strategies that utilize either Sanger sequencing of a linker amplification shotgun library (LASL or of a whole genome shotgun library (WGSL, or 454 pyrosequencing methods. We demonstrate that genomic DNA sample preparation directly from a phage lysate, combined with 454 pyrosequencing, is best suited for phage genome sequencing at scale, as this method is capable of capturing complete continuous genomes with high accuracy. In addition, we describe an automated annotation informatics pipeline that delivers high-quality annotation and yields few false positives and negatives in ORF calling. CONCLUSIONS/SIGNIFICANCE: These DNA preparation, sequencing and annotation strategies enable a high-throughput approach to the burgeoning field of phage genomics.

  8. Transcriptomic analysis of Petunia hybrida in response to salt stress using high throughput RNA sequencing.

    Directory of Open Access Journals (Sweden)

    Gonzalo H Villarino

    Full Text Available Salinity and drought stress are the primary cause of crop losses worldwide. In sodic saline soils sodium chloride (NaCl disrupts normal plant growth and development. The complex interactions of plant systems with abiotic stress have made RNA sequencing a more holistic and appealing approach to study transcriptome level responses in a single cell and/or tissue. In this work, we determined the Petunia transcriptome response to NaCl stress by sequencing leaf samples and assembling 196 million Illumina reads with Trinity software. Using our reference transcriptome we identified more than 7,000 genes that were differentially expressed within 24 h of acute NaCl stress. The proposed transcriptome can also be used as an excellent tool for biological and bioinformatics in the absence of an available Petunia genome and it is available at the SOL Genomics Network (SGN http://solgenomics.net. Genes related to regulation of reactive oxygen species, transport, and signal transductions as well as novel and undescribed transcripts were among those differentially expressed in response to salt stress. The candidate genes identified in this study can be applied as markers for breeding or to genetically engineer plants to enhance salt tolerance. Gene Ontology analyses indicated that most of the NaCl damage happened at 24 h inducing genotoxicity, affecting transport and organelles due to the high concentration of Na+ ions. Finally, we report a modification to the library preparation protocol whereby cDNA samples were bar-coded with non-HPLC purified primers, without affecting the quality and quantity of the RNA-seq data. The methodological improvement presented here could substantially reduce the cost of sample preparation for future high-throughput RNA sequencing experiments.

  9. Transcriptomic analysis of Petunia hybrida in response to salt stress using high throughput RNA sequencing.

    Science.gov (United States)

    Villarino, Gonzalo H; Bombarely, Aureliano; Giovannoni, James J; Scanlon, Michael J; Mattson, Neil S

    2014-01-01

    Salinity and drought stress are the primary cause of crop losses worldwide. In sodic saline soils sodium chloride (NaCl) disrupts normal plant growth and development. The complex interactions of plant systems with abiotic stress have made RNA sequencing a more holistic and appealing approach to study transcriptome level responses in a single cell and/or tissue. In this work, we determined the Petunia transcriptome response to NaCl stress by sequencing leaf samples and assembling 196 million Illumina reads with Trinity software. Using our reference transcriptome we identified more than 7,000 genes that were differentially expressed within 24 h of acute NaCl stress. The proposed transcriptome can also be used as an excellent tool for biological and bioinformatics in the absence of an available Petunia genome and it is available at the SOL Genomics Network (SGN) http://solgenomics.net. Genes related to regulation of reactive oxygen species, transport, and signal transductions as well as novel and undescribed transcripts were among those differentially expressed in response to salt stress. The candidate genes identified in this study can be applied as markers for breeding or to genetically engineer plants to enhance salt tolerance. Gene Ontology analyses indicated that most of the NaCl damage happened at 24 h inducing genotoxicity, affecting transport and organelles due to the high concentration of Na+ ions. Finally, we report a modification to the library preparation protocol whereby cDNA samples were bar-coded with non-HPLC purified primers, without affecting the quality and quantity of the RNA-seq data. The methodological improvement presented here could substantially reduce the cost of sample preparation for future high-throughput RNA sequencing experiments.

  10. Stability Analysis and Internal Heating Effect on Oscillatory Convection in a Viscoelastic Fluid Saturated Porous Medium Under Gravity Modulation

    Science.gov (United States)

    Bhadauria, B. S.; Singh, M. K.; Singh, A.; Singh, B. K.; Kiran, P.

    2016-12-01

    In this paper, we investigate the combined effect of internal heating and time periodic gravity modulation in a viscoelastic fluid saturated porous medium by reducing the problem into a complex non-autonomous Ginzgburg-Landau equation. Weak nonlinear stability analysis has been performed by using power series expansion in terms of the amplitude of gravity modulation, which is assumed to be small. The Nusselt number is obtained in terms of the amplitude for oscillatory mode of convection. The influence of viscoelastic parameters on heat transfer has been discussed. Gravity modulation is found to have a destabilizing effect at low frequencies and a stabilizing effect at high frequencies. Finally, it is found that overstability advances the onset of convection, more with internal heating. The conditions for which the complex Ginzgburg-Landau equation undergoes Hopf bifurcation and the amplitude equation undergoes supercritical pitchfork bifurcation are studied.

  11. Stability Analysis and Internal Heating Effect on Oscillatory Convection in a Viscoelastic Fluid Saturated Porous Medium Under Gravity Modulation

    Directory of Open Access Journals (Sweden)

    Bhadauria B.S.

    2016-12-01

    Full Text Available In this paper, we investigate the combined effect of internal heating and time periodic gravity modulation in a viscoelastic fluid saturated porous medium by reducing the problem into a complex non-autonomous Ginzgburg-Landau equation. Weak nonlinear stability analysis has been performed by using power series expansion in terms of the amplitude of gravity modulation, which is assumed to be small. The Nusselt number is obtained in terms of the amplitude for oscillatory mode of convection. The influence of viscoelastic parameters on heat transfer has been discussed. Gravity modulation is found to have a destabilizing effect at low frequencies and a stabilizing effect at high frequencies. Finally, it is found that overstability advances the onset of convection, more with internal heating. The conditions for which the complex Ginzgburg-Landau equation undergoes Hopf bifurcation and the amplitude equation undergoes supercritical pitchfork bifurcation are studied.

  12. Magnetic saturation in semi-analytical harmonic modeling for electric machine analysis

    NARCIS (Netherlands)

    Sprangers, R.L.J.; Paulides, J.J.H.; Gysen, B.L.J.; Lomonova, E.

    2016-01-01

    A semi-analytical method based on the harmonic modeling (HM) technique is presented for the analysis of the magneto-static field distribution in the slotted structure of rotating electric machines. In contrast to the existing literature, the proposed model does not require the assumption of infinite

  13. High-throughput SHAPE analysis reveals structures in HIV-1 genomic RNA strongly conserved across distinct biological states.

    Directory of Open Access Journals (Sweden)

    Kevin A Wilkinson

    2008-04-01

    Full Text Available Replication and pathogenesis of the human immunodeficiency virus (HIV is tightly linked to the structure of its RNA genome, but genome structure in infectious virions is poorly understood. We invent high-throughput SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension technology, which uses many of the same tools as DNA sequencing, to quantify RNA backbone flexibility at single-nucleotide resolution and from which robust structural information can be immediately derived. We analyze the structure of HIV-1 genomic RNA in four biologically instructive states, including the authentic viral genome inside native particles. Remarkably, given the large number of plausible local structures, the first 10% of the HIV-1 genome exists in a single, predominant conformation in all four states. We also discover that noncoding regions functioning in a regulatory role have significantly lower (p-value < 0.0001 SHAPE reactivities, and hence more structure, than do viral coding regions that function as the template for protein synthesis. By directly monitoring protein binding inside virions, we identify the RNA recognition motif for the viral nucleocapsid protein. Seven structurally homologous binding sites occur in a well-defined domain in the genome, consistent with a role in directing specific packaging of genomic RNA into nascent virions. In addition, we identify two distinct motifs that are targets for the duplex destabilizing activity of this same protein. The nucleocapsid protein destabilizes local HIV-1 RNA structure in ways likely to facilitate initial movement both of the retroviral reverse transcriptase from its tRNA primer and of the ribosome in coding regions. Each of the three nucleocapsid interaction motifs falls in a specific genome domain, indicating that local protein interactions can be organized by the long-range architecture of an RNA. High-throughput SHAPE reveals a comprehensive view of HIV-1 RNA genome structure, and further

  14. Metagenomic analysis and functional characterization of the biogas microbiome using high throughput shotgun sequencing and a novel binning strategy.

    Science.gov (United States)

    Campanaro, Stefano; Treu, Laura; Kougias, Panagiotis G; De Francisci, Davide; Valle, Giorgio; Angelidaki, Irini

    2016-01-01

    Biogas production is an economically attractive technology that has gained momentum worldwide over the past years. Biogas is produced by a biologically mediated process, widely known as "anaerobic digestion." This process is performed by a specialized and complex microbial community, in which different members have distinct roles in the establishment of a collective organization. Deciphering the complex microbial community engaged in this process is interesting both for unraveling the network of bacterial interactions and for applicability potential to the derived knowledge. In this study, we dissect the bioma involved in anaerobic digestion by means of high throughput Illumina sequencing (~51 gigabases of sequence data), disclosing nearly one million genes and extracting 106 microbial genomes by a novel strategy combining two binning processes. Microbial phylogeny and putative taxonomy performed using >400 proteins revealed that the biogas community is a trove of new species. A new approach based on functional properties as per network representation was developed to assign roles to the microbial species. The organization of the anaerobic digestion microbiome is resembled by a funnel concept, in which the microbial consortium presents a progressive functional specialization while reaching the final step of the process (i.e., methanogenesis). Key microbial genomes encoding enzymes involved in specific metabolic pathways, such as carbohydrates utilization, fatty acids degradation, amino acids fermentation, and syntrophic acetate oxidation, were identified. Additionally, the analysis identified a new uncultured archaeon that was putatively related to Methanomassiliicoccales but surprisingly having a methylotrophic methanogenic pathway. This study is a pioneer research on the phylogenetic and functional characterization of the microbial community populating biogas reactors. By applying for the first time high-throughput sequencing and a novel binning strategy, the

  15. High throughput LC-MS/MS method for the simultaneous analysis of multiple vitamin D analytes in serum.

    Science.gov (United States)

    Jenkinson, Carl; Taylor, Angela E; Hassan-Smith, Zaki K; Adams, John S; Stewart, Paul M; Hewison, Martin; Keevil, Brian G

    2016-03-01

    Recent studies suggest that vitamin D-deficiency is linked to increased risk of common human health problems. To define vitamin D 'status' most routine analytical methods quantify one particular vitamin D metabolite, 25-hydroxyvitamin D3 (25OHD3). However, vitamin D is characterized by complex metabolic pathways, and simultaneous measurement of multiple vitamin D metabolites may provide a more accurate interpretation of vitamin D status. To address this we developed a high-throughput liquid chromatography-tandem mass spectrometry (LC-MS/MS) method to analyse multiple vitamin D analytes, with particular emphasis on the separation of epimer metabolites. A supportive liquid-liquid extraction (SLE) and LC-MS/MS method was developed to quantify 10 vitamin D metabolites as well as separation of an interfering 7α-hydroxy-4-cholesten-3-one (7αC4) isobar (precursor of bile acid), and validated by analysis of human serum samples. In a cohort of 116 healthy subjects, circulating concentrations of 25-hydroxyvitamin D3 (25OHD3), 3-epi-25-hydroxyvitamin D3 (3-epi-25OHD3), 24,25-dihydroxyvitamin D3 (24R,25(OH)2D3), 1,25-dihydroxyvitamin D3 (1α,25(OH)2D3), and 25-hydroxyvitamin D2 (25OHD2) were quantifiable using 220μL of serum, with 25OHD3 and 24R,25(OH)2D3 showing significant seasonal variations. This high-throughput LC-MS/MS method provides a novel strategy for assessing the impact of vitamin D on human health and disease. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Dynamical analysis of cigarette smoking model with a saturated incidence rate

    Science.gov (United States)

    Zeb, Anwar; Bano, Ayesha; Alzahrani, Ebraheem; Zaman, Gul

    2018-04-01

    In this paper, we consider a delayed smoking model in which the potential smokers are assumed to satisfy the logistic equation. We discuss the dynamical behavior of our proposed model in the form of Delayed Differential Equations (DDEs) and show conditions for asymptotic stability of the model in steady state. We also discuss the Hopf bifurcation analysis of considered model. Finally, we use the nonstandard finite difference (NSFD) scheme to show the results graphically with help of MATLAB.

  17. Misconceptions in Reporting Oxygen Saturation

    NARCIS (Netherlands)

    Toffaletti, John; Zijlstra, Willem G.

    2007-01-01

    BACKGROUND: We describe some misconceptions that have become common practice in reporting blood gas and cooximetry results. In 1980, oxygen saturation was incorrectly redefined in a report of a new instrument for analysis of hemoglobin (Hb) derivatives. Oxygen saturation (sO(2)) was redefined as the

  18. Energy-Efficiency Analysis of a Distributed Queuing Medium Access Control Protocol for Biomedical Wireless Sensor Networks in Saturation Conditions

    Directory of Open Access Journals (Sweden)

    Christos Verikoukis

    2011-01-01

    Full Text Available The aging population and the high quality of life expectations in our society lead to the need of more efficient and affordable healthcare solutions. For this reason, this paper aims for the optimization of Medium Access Control (MAC protocols for biomedical wireless sensor networks or wireless Body Sensor Networks (BSNs. The hereby presented schemes always have in mind the efficient management of channel resources and the overall minimization of sensors’ energy consumption in order to prolong sensors’ battery life. The fact that the IEEE 802.15.4 MAC does not fully satisfy BSN requirements highlights the need for the design of new scalable MAC solutions, which guarantee low-power consumption to the maximum number of body sensors in high density areas (i.e., in saturation conditions. In order to emphasize IEEE 802.15.4 MAC limitations, this article presents a detailed overview of this de facto standard for Wireless Sensor Networks (WSNs, which serves as a link for the introduction and initial description of our here proposed Distributed Queuing (DQ MAC protocol for BSN scenarios. Within this framework, an extensive DQ MAC energy-consumption analysis in saturation conditions is presented to be able to evaluate its performance in relation to IEEE 802.5.4 MAC in highly dense BSNs. The obtained results show that the proposed scheme outperforms IEEE 802.15.4 MAC in average energy consumption per information bit, thus providing a better overall performance that scales appropriately to BSNs under high traffic conditions. These benefits are obtained by eliminating back-off periods and collisions in data packet transmissions, while minimizing the control overhead.

  19. Poincaré analysis of an overnight arterial oxygen saturation signal applied to the diagnosis of sleep apnea hypopnea syndrome

    International Nuclear Information System (INIS)

    Morillo, Daniel S; Rojas, Juan L; Crespo, Luis F; León, Antonio; Gross, Nicole

    2009-01-01

    The analysis of oxygen desaturations is a basic variable in polysomnographic studies for the diagnosis of sleep apnea. Several algorithms operating in the time domain already exist for sleep apnea detection via pulse oximetry, but in a disadvantageous way—they achieve either a high sensitivity or a high specificity. The aim of this study was to assess whether an alternative analysis of arterial oxygen saturation (SaO 2 ) signals from overnight pulse oximetry could yield essential information on the diagnosis of sleep apnea hypopnea syndrome (SAHS). SaO 2 signals from 117 subjects were analyzed. The population was divided into a learning dataset (70 patients) and a test set (47 patients). The learning set was used for tuning thresholds among the applied Poincaré quantitative descriptors. Results showed that the presence of apnea events in SAHS patients caused an increase in the SD 1 Poincaré parameter. This conclusion was assessed prospectively using the test dataset. 90.9% sensitivity and 84.0% specificity were obtained in the test group. We conclude that Poincaré analysis could be useful in the study of SAHS, contributing to reduce the demand for polysomnographic studies in SAHS screening

  20. High-throughput analysis of ammonia oxidiser community composition via a novel, amoA-based functional gene array.

    Directory of Open Access Journals (Sweden)

    Guy C J Abell

    Full Text Available Advances in microbial ecology research are more often than not limited by the capabilities of available methodologies. Aerobic autotrophic nitrification is one of the most important and well studied microbiological processes in terrestrial and aquatic ecosystems. We have developed and validated a microbial diagnostic microarray based on the ammonia-monooxygenase subunit A (amoA gene, enabling the in-depth analysis of the community structure of bacterial and archaeal ammonia oxidisers. The amoA microarray has been successfully applied to analyse nitrifier diversity in marine, estuarine, soil and wastewater treatment plant environments. The microarray has moderate costs for labour and consumables and enables the analysis of hundreds of environmental DNA or RNA samples per week per person. The array has been thoroughly validated with a range of individual and complex targets (amoA clones and environmental samples, respectively, combined with parallel analysis using traditional sequencing methods. The moderate cost and high throughput of the microarray makes it possible to adequately address broader questions of the ecology of microbial ammonia oxidation requiring high sample numbers and high resolution of the community composition.

  1. A Reference Viral Database (RVDB) To Enhance Bioinformatics Analysis of High-Throughput Sequencing for Novel Virus Detection.

    Science.gov (United States)

    Goodacre, Norman; Aljanahi, Aisha; Nandakumar, Subhiksha; Mikailov, Mike; Khan, Arifa S

    2018-01-01

    Detection of distantly related viruses by high-throughput sequencing (HTS) is bioinformatically challenging because of the lack of a public database containing all viral sequences, without abundant nonviral sequences, which can extend runtime and obscure viral hits. Our reference viral database (RVDB) includes all viral, virus-related, and virus-like nucleotide sequences (excluding bacterial viruses), regardless of length, and with overall reduced cellular sequences. Semantic selection criteria (SEM-I) were used to select viral sequences from GenBank, resulting in a first-generation viral database (VDB). This database was manually and computationally reviewed, resulting in refined, semantic selection criteria (SEM-R), which were applied to a new download of updated GenBank sequences to create a second-generation VDB. Viral entries in the latter were clustered at 98% by CD-HIT-EST to reduce redundancy while retaining high viral sequence diversity. The viral identity of the clustered representative sequences (creps) was confirmed by BLAST searches in NCBI databases and HMMER searches in PFAM and DFAM databases. The resulting RVDB contained a broad representation of viral families, sequence diversity, and a reduced cellular content; it includes full-length and partial sequences and endogenous nonretroviral elements, endogenous retroviruses, and retrotransposons. Testing of RVDBv10.2, with an in-house HTS transcriptomic data set indicated a significantly faster run for virus detection than interrogating the entirety of the NCBI nonredundant nucleotide database, which contains all viral sequences but also nonviral sequences. RVDB is publically available for facilitating HTS analysis, particularly for novel virus detection. It is meant to be updated on a regular basis to include new viral sequences added to GenBank. IMPORTANCE To facilitate bioinformatics analysis of high-throughput sequencing (HTS) data for the detection of both known and novel viruses, we have

  2. Laser desorption mass spectrometry for high-throughput DNA analysis and its applications

    Science.gov (United States)

    Chen, C. H. Winston; Golovlev, Valeri V.; Taranenko, N. I.; Allman, S. L.; Isola, Narayana R.; Potter, N. T.; Matteson, K. J.; Chang, Linus Y.

    1999-05-01

    Laser desorption mass spectrometry (LDMS) has been developed for DNA sequencing, disease diagnosis, and DNA fingerprinting for forensic applications. With LDMS, the speed of DNA analysis can be much faster than conventional gel electrophoresis. No dye or radioactive tagging to DNA segments for detection is needed. LDMS is emerging as a new alternative technology for DNA analysis.

  3. Analysis of high-throughput plant image data with the information system IAP

    Directory of Open Access Journals (Sweden)

    Klukas Christian

    2012-06-01

    Full Text Available This work presents a sophisticated information system, the Integrated Analysis Platform (IAP, an approach supporting large-scale image analysis for different species and imaging systems. In its current form, IAP supports the investigation of Maize, Barley and Arabidopsis plants based on images obtained in different spectra.

  4. NIR and Py-mbms coupled with multivariate data analysis as a high-throughput biomass characterization technique : a review

    Directory of Open Access Journals (Sweden)

    Li eXiao

    2014-08-01

    Full Text Available Optimizing the use of lignocellulosic biomass as the feedstock for renewable energy production is currently being developed globally. Biomass is a complex mixture of cellulose, hemicelluloses, lignins, extractives, and proteins; as well as inorganic salts. Cell wall compositional analysis for biomass characterization is laborious and time consuming. In order to characterize biomass fast and efficiently, several high through-put technologies have been successfully developed. Among them, near infrared spectroscopy (NIR and pyrolysis-molecular beam mass spectrometry (Py-mbms are complementary tools and capable of evaluating a large number of raw or modified biomass in a short period of time. NIR shows vibrations associated with specific chemical structures whereas Py-mbms depicts the full range of fragments from the decomposition of biomass. Both NIR vibrations and Py-mbms peaks are assigned to possible chemical functional groups and molecular structures. They provide complementary information of chemical insight of biomaterials. However, it is challenging to interpret the informative results because of the large amount of overlapping bands or decomposition fragments contained in the spectra. In order to improve the efficiency of data analysis, multivariate analysis tools have been adapted to define the significant correlations among data variables, so that the large number of bands/peaks could be replaced by a small number of reconstructed variables representing original variation. Reconstructed data variables are used for sample comparison (principal component analysis and for building regression models (partial least square regression between biomass chemical structures and properties of interests. In this review, the important biomass chemical structures measured by NIR and Py-mbms are summarized. The advantages and disadvantages of conventional data analysis methods and multivariate data analysis methods are introduced, compared and evaluated

  5. ImmuneDB: a system for the analysis and exploration of high-throughput adaptive immune receptor sequencing data.

    Science.gov (United States)

    Rosenfeld, Aaron M; Meng, Wenzhao; Luning Prak, Eline T; Hershberg, Uri

    2017-01-15

    As high-throughput sequencing of B cells becomes more common, the need for tools to analyze the large quantity of data also increases. This article introduces ImmuneDB, a system for analyzing vast amounts of heavy chain variable region sequences and exploring the resulting data. It can take as input raw FASTA/FASTQ data, identify genes, determine clones, construct lineages, as well as provide information such as selection pressure and mutation analysis. It uses an industry leading database, MySQL, to provide fast analysis and avoid the complexities of using error prone flat-files. ImmuneDB is freely available at http://immunedb.comA demo of the ImmuneDB web interface is available at: http://immunedb.com/demo CONTACT: Uh25@drexel.eduSupplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. PipeCraft: Flexible open-source toolkit for bioinformatics analysis of custom high-throughput amplicon sequencing data.

    Science.gov (United States)

    Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho

    2017-11-01

    High-throughput sequencing methods have become a routine analysis tool in environmental sciences as well as in public and private sector. These methods provide vast amount of data, which need to be analysed in several steps. Although the bioinformatics may be applied using several public tools, many analytical pipelines allow too few options for the optimal analysis for more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from Illumina, Pacific Biosciences, Ion Torrent and Roche 454 sequencing platforms. We described the design and options of PipeCraft and evaluated its performance by analysing the data sets from three different sequencing platforms. We demonstrated that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.

  7. GxGrare: gene-gene interaction analysis method for rare variants from high-throughput sequencing data.

    Science.gov (United States)

    Kwon, Minseok; Leem, Sangseob; Yoon, Joon; Park, Taesung

    2018-03-19

    With the rapid advancement of array-based genotyping techniques, genome-wide association studies (GWAS) have successfully identified common genetic variants associated with common complex diseases. However, it has been shown that only a small proportion of the genetic etiology of complex diseases could be explained by the genetic factors identified from GWAS. This missing heritability could possibly be explained by gene-gene interaction (epistasis) and rare variants. There has been an exponential growth of gene-gene interaction analysis for common variants in terms of methodological developments and practical applications. Also, the recent advancement of high-throughput sequencing technologies makes it possible to conduct rare variant analysis. However, little progress has been made in gene-gene interaction analysis for rare variants. Here, we propose GxGrare which is a new gene-gene interaction method for the rare variants in the framework of the multifactor dimensionality reduction (MDR) analysis. The proposed method consists of three steps; 1) collapsing the rare variants, 2) MDR analysis for the collapsed rare variants, and 3) detect top candidate interaction pairs. GxGrare can be used for the detection of not only gene-gene interactions, but also interactions within a single gene. The proposed method is illustrated with 1080 whole exome sequencing data of the Korean population in order to identify causal gene-gene interaction for rare variants for type 2 diabetes. The proposed GxGrare performs well for gene-gene interaction detection with collapsing of rare variants. GxGrare is available at http://bibs.snu.ac.kr/software/gxgrare which contains simulation data and documentation. Supported operating systems include Linux and OS X.

  8. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yonghua [Iowa State Univ., Ames, IA (United States)

    2000-01-01

    Sample preparation has been one of the major bottlenecks for many high throughput analyses. The purpose of this research was to develop new sample preparation and integration approach for DNA sequencing, PCR based DNA analysis and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks to DNA capillary-array sequencers. protocols for directly sequencing the plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or heating block. Upon heating, the plasmids were released while chromsomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR based DNA analysis, they demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical sample without purification. After PCR reaction using cheek cell, blood or HIV-1 gag DNA, the reaction mixtures was injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection, and the elimination of purification of DNA sample before or after PCR reaction, will make this approach an

  9. CSReport: A New Computational Tool Designed for Automatic Analysis of Class Switch Recombination Junctions Sequenced by High-Throughput Sequencing.

    Science.gov (United States)

    Boyer, François; Boutouil, Hend; Dalloul, Iman; Dalloul, Zeinab; Cook-Moreau, Jeanne; Aldigier, Jean-Claude; Carrion, Claire; Herve, Bastien; Scaon, Erwan; Cogné, Michel; Péron, Sophie

    2017-05-15

    B cells ensure humoral immune responses due to the production of Ag-specific memory B cells and Ab-secreting plasma cells. In secondary lymphoid organs, Ag-driven B cell activation induces terminal maturation and Ig isotype class switch (class switch recombination [CSR]). CSR creates a virtually unique IgH locus in every B cell clone by intrachromosomal recombination between two switch (S) regions upstream of each C region gene. Amount and structural features of CSR junctions reveal valuable information about the CSR mechanism, and analysis of CSR junctions is useful in basic and clinical research studies of B cell functions. To provide an automated tool able to analyze large data sets of CSR junction sequences produced by high-throughput sequencing (HTS), we designed CSReport, a software program dedicated to support analysis of CSR recombination junctions sequenced with a HTS-based protocol (Ion Torrent technology). CSReport was assessed using simulated data sets of CSR junctions and then used for analysis of Sμ-Sα and Sμ-Sγ1 junctions from CH12F3 cells and primary murine B cells, respectively. CSReport identifies junction segment breakpoints on reference sequences and junction structure (blunt-ended junctions or junctions with insertions or microhomology). Besides the ability to analyze unprecedentedly large libraries of junction sequences, CSReport will provide a unified framework for CSR junction studies. Our results show that CSReport is an accurate tool for analysis of sequences from our HTS-based protocol for CSR junctions, thereby facilitating and accelerating their study. Copyright © 2017 by The American Association of Immunologists, Inc.

  10. Introduction of High Throughput Magnetic Resonance T2-Weighted Image Texture Analysis for WHO Grade 2 and 3 Gliomas.

    Science.gov (United States)

    Kinoshita, Manabu; Sakai, Mio; Arita, Hideyuki; Shofuda, Tomoko; Chiba, Yasuyoshi; Kagawa, Naoki; Watanabe, Yoshiyuki; Hashimoto, Naoya; Fujimoto, Yasunori; Yoshimine, Toshiki; Nakanishi, Katsuyuki; Kanemura, Yonehiro

    2016-01-01

    Reports have suggested that tumor textures presented on T2-weighted images correlate with the genetic status of glioma. Therefore, development of an image analyzing framework that is capable of objective and high throughput image texture analysis for large scale image data collection is needed. The current study aimed to address the development of such a framework by introducing two novel parameters for image textures on T2-weighted images, i.e., Shannon entropy and Prewitt filtering. Twenty-two WHO grade 2 and 28 grade 3 glioma patients were collected whose pre-surgical MRI and IDH1 mutation status were available. Heterogeneous lesions showed statistically higher Shannon entropy than homogenous lesions (p = 0.006) and ROC curve analysis proved that Shannon entropy on T2WI was a reliable indicator for discrimination of homogenous and heterogeneous lesions (p = 0.015, AUC = 0.73). Lesions with well-defined borders exhibited statistically higher Edge mean and Edge median values using Prewitt filtering than those with vague lesion borders (p = 0.0003 and p = 0.0005 respectively). ROC curve analysis also proved that both Edge mean and median values were promising indicators for discrimination of lesions with vague and well defined borders and both Edge mean and median values performed in a comparable manner (p = 0.0002, AUC = 0.81 and p image metrics that reflect lesion texture described on T2WI. These two metrics were validated by readings of a neuro-radiologist who was blinded to the results. This observation will facilitate further use of this technique in future large scale image analysis of glioma.

  11. Next-generation phage display: integrating and comparing available molecular tools to enable cost-effective high-throughput analysis.

    Directory of Open Access Journals (Sweden)

    Emmanuel Dias-Neto

    2009-12-01

    Full Text Available Combinatorial phage display has been used in the last 20 years in the identification of protein-ligands and protein-protein interactions, uncovering relevant molecular recognition events. Rate-limiting steps of combinatorial phage display library selection are (i the counting of transducing units and (ii the sequencing of the encoded displayed ligands. Here, we adapted emerging genomic technologies to minimize such challenges.We gained efficiency by applying in tandem real-time PCR for rapid quantification to enable bacteria-free phage display library screening, and added phage DNA next-generation sequencing for large-scale ligand analysis, reporting a fully integrated set of high-throughput quantitative and analytical tools. The approach is far less labor-intensive and allows rigorous quantification; for medical applications, including selections in patients, it also represents an advance for quantitative distribution analysis and ligand identification of hundreds of thousands of targeted particles from patient-derived biopsy or autopsy in a longer timeframe post library administration. Additional advantages over current methods include increased sensitivity, less variability, enhanced linearity, scalability, and accuracy at much lower cost. Sequences obtained by qPhage plus pyrosequencing were similar to a dataset produced from conventional Sanger-sequenced transducing-units (TU, with no biases due to GC content, codon usage, and amino acid or peptide frequency. These tools allow phage display selection and ligand analysis at >1,000-fold faster rate, and reduce costs approximately 250-fold for generating 10(6 ligand sequences.Our analyses demonstrates that whereas this approach correlates with the traditional colony-counting, it is also capable of a much larger sampling, allowing a faster, less expensive, more accurate and consistent analysis of phage enrichment. Overall, qPhage plus pyrosequencing is superior to TU-counting plus Sanger

  12. A new high-throughput LC-MS method for the analysis of complex fructan mixtures

    DEFF Research Database (Denmark)

    Verspreet, Joran; Hansen, Anders Holmgaard; Dornez, Emmie

    2014-01-01

    In this paper, a new liquid chromatography-mass spectrometry (LC-MS) method for the analysis of complex fructan mixtures is presented. In this method, columns with a trifunctional C18 alkyl stationary phase (T3) were used and their performance compared with that of a porous graphitized carbon (PGC...

  13. MSP-HTPrimer: a high-throughput primer design tool to improve assay design for DNA methylation analysis in epigenetics.

    Science.gov (United States)

    Pandey, Ram Vinay; Pulverer, Walter; Kallmeyer, Rainer; Beikircher, Gabriel; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas

    2016-01-01

    Bisulfite (BS) conversion-based and methylation-sensitive restriction enzyme (MSRE)-based PCR methods have been the most commonly used techniques for locus-specific DNA methylation analysis. However, both methods have advantages and limitations. Thus, an integrated approach would be extremely useful for quantifying DNA methylation status with high sensitivity and specificity. Designing specific and optimized primers for target regions is the most critical and challenging step in obtaining adequate DNA methylation results using PCR-based methods. Currently, no integrated, optimized, and high-throughput methylation-specific primer design software methods are available for both BS- and MSRE-based methods. Therefore, an integrated, powerful, and easy-to-use methylation-specific primer design pipeline with a high accuracy and success rate would be very useful. We have developed a new web-based pipeline, called MSP-HTPrimer, to design primer pairs for MSP, BSP, pyrosequencing, COBRA, and MSRE assays on both genomic strands. First, our pipeline converts all target sequences into bisulfite-treated templates for both the forward and reverse strands and designs all possible primer pairs, followed by filtering for single nucleotide polymorphisms (SNPs) and known repeat regions. Next, each primer pair is annotated with the upstream and downstream RefSeq genes, CpG island, and cut sites (for COBRA and MSRE). Finally, MSP-HTPrimer selects specific primers from both strands based on custom and user-defined hierarchical selection criteria. MSP-HTPrimer produces a primer pair summary output table in TXT and HTML format for display and UCSC custom tracks for resulting primer pairs in GTF format. MSP-HTPrimer is an integrated, web-based, and high-throughput pipeline and has no limitation on the number and size of target sequences and designs MSP, BSP, pyrosequencing, COBRA, and MSRE assays. It is the only pipeline that automatically designs primers on both genomic
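
    As background on the bisulfite-template step described above, the sketch below shows a generic in silico bisulfite conversion in Python: non-CpG cytosines are converted to T, CpG cytosines are kept or converted depending on whether the template is assumed methylated, and templates are generated for both the forward strand and its reverse complement. This illustrates the general principle only and is not MSP-HTPrimer's own code.

      from typing import Tuple

      COMPLEMENT = str.maketrans("ACGT", "TGCA")

      def bisulfite_convert(seq: str, methylated_cpg: bool = True) -> str:
          """Bisulfite-convert a sequence: non-CpG cytosines become T; CpG cytosines
          stay C if the template is assumed methylated, otherwise they become T too."""
          seq = seq.upper()
          out = []
          for i, base in enumerate(seq):
              if base == "C":
                  is_cpg = i + 1 < len(seq) and seq[i + 1] == "G"
                  out.append("C" if (is_cpg and methylated_cpg) else "T")
              else:
                  out.append(base)
          return "".join(out)

      def both_strand_templates(seq: str) -> Tuple[str, str]:
          """Bisulfite templates for the forward strand and its reverse complement."""
          rev = seq.upper().translate(COMPLEMENT)[::-1]
          return bisulfite_convert(seq), bisulfite_convert(rev)

      print(both_strand_templates("ACGTTCGACCTA"))   # ('ACGTTCGATTTA', 'TAGGTCGAACGT')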

  14. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles

    Science.gov (United States)

    Zhou, Hong; Malik, Malika Amattullah; Arab, Aarthi; Hill, Matthew Thomas; Shikanov, Ariella

    2015-01-01

    Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for the evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrated the usefulness of a hydrogel based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable for measuring fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased survival rate in
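
    The image step described above (the authors used a custom MATLAB® program) essentially amounts to binarizing a brightfield micrograph and measuring the cleared area around the follicle. A Python analogue is sketched below; the fixed threshold and the synthetic image are assumptions made purely for illustration.

      import numpy as np

      def degradation_fraction(gray, threshold):
          """Fraction of pixels brighter than `threshold`, used as a proxy for the
          cleared (degraded-fibrin) area around an encapsulated follicle."""
          binary = gray > threshold          # binarize the brightfield image
          return float(binary.mean())        # area fraction of the cleared region

      # synthetic micrograph: dark gel background with a bright cleared circle
      rng = np.random.default_rng(1)
      img = 0.2 + 0.02 * rng.standard_normal((128, 128))
      yy, xx = np.mgrid[:128, :128]
      img[(yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2] = 0.8   # cleared zone

      print(f"degraded area fraction: {degradation_fraction(img, threshold=0.5):.3f}")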

  15. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles.

    Directory of Open Access Journals (Sweden)

    Hong Zhou

    Full Text Available Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for the evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrated the usefulness of a hydrogel based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable for measuring fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased

  16. HTSSIP: An R package for analysis of high throughput sequencing data from nucleic acid stable isotope probing (SIP) experiments.

    Directory of Open Access Journals (Sweden)

    Nicholas D Youngblut

    Full Text Available Combining high throughput sequencing with stable isotope probing (HTS-SIP) is a powerful method for mapping in situ metabolic processes to thousands of microbial taxa. However, accurately mapping metabolic processes to taxa is complex and challenging. Multiple HTS-SIP data analysis methods have been developed, including high-resolution stable isotope probing (HR-SIP), multi-window high-resolution stable isotope probing (MW-HR-SIP), quantitative stable isotope probing (qSIP), and ΔBD. Currently, there is no publicly available software designed specifically for analyzing HTS-SIP data. To address this shortfall, we have developed the HTSSIP R package, an open-source, cross-platform toolset for conducting HTS-SIP analyses in a straightforward and easily reproducible manner. The HTSSIP package, along with full documentation and examples, is available from CRAN at https://cran.r-project.org/web/packages/HTSSIP/index.html and Github at https://github.com/buckleylab/HTSSIP.
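
    To make the kind of per-taxon quantity these analyses work with concrete, the sketch below computes a simple ΔBD-style shift for one taxon: the abundance-weighted mean buoyant density in a 13C-labelled gradient minus that in the unlabelled control. The fraction densities and abundances are invented for illustration, and this is only an assumed, simplified reading of ΔBD; the HTSSIP package itself implements HR-SIP, MW-HR-SIP, qSIP and ΔBD in R.

      import numpy as np

      def weighted_mean_density(densities, abundances):
          """Abundance-weighted mean buoyant density of one taxon across gradient fractions."""
          densities = np.asarray(densities, dtype=float)
          abundances = np.asarray(abundances, dtype=float)
          return float(np.sum(densities * abundances) / np.sum(abundances))

      # hypothetical fraction densities (g/ml) and relative abundances of one OTU
      densities = [1.70, 1.71, 1.72, 1.73, 1.74, 1.75]
      control   = [0.30, 0.40, 0.20, 0.07, 0.02, 0.01]   # unlabelled gradient
      labelled  = [0.05, 0.10, 0.20, 0.30, 0.25, 0.10]   # 13C-labelled gradient

      delta_bd = (weighted_mean_density(densities, labelled)
                  - weighted_mean_density(densities, control))
      print(f"delta BD = {delta_bd:.4f} g/ml")   # a positive shift suggests isotope incorporation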

  17. In situ analysis and structural elucidation of sainfoin (Onobrychis viciifolia) tannins for high-throughput germplasm screening.

    Science.gov (United States)

    Gea, An; Stringano, Elisabetta; Brown, Ron H; Mueller-Harvey, Irene

    2011-01-26

    A rapid thiolytic degradation and cleanup procedure was developed for analyzing tannins directly in chlorophyll-containing sainfoin (Onobrychis viciifolia) plants. The technique proved suitable for complex tannin mixtures containing catechin, epicatechin, gallocatechin, and epigallocatechin flavan-3-ol units. The reaction time was standardized at 60 min to minimize the loss of structural information as a result of epimerization and degradation of terminal flavan-3-ol units. The results were evaluated by separate analysis of extractable and unextractable tannins, which accounted for 63.6-113.7% of the in situ plant tannins. It is of note that 70% aqueous acetone extracted tannins with a lower mean degree of polymerization (mDP) than was found for tannins analyzed in situ. Extractable tannins had mDP values that were between 4 and 29 units lower. The method was validated by comparing results from individual and mixed sample sets. The tannin composition of different sainfoin accessions covered a range of mDP values from 16 to 83, procyanidin/prodelphinidin (PC/PD) ratios from 19.2/80.8 to 45.6/54.4, and cis/trans ratios from 74.1/25.9 to 88.0/12.0. This is the first high-throughput screening method that is suitable for analyzing condensed tannin contents and structural composition directly in green plant tissue.
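
    The summary statistics reported above follow directly from the thiolysis unit data: mDP is conventionally the ratio of total flavan-3-ol units (terminal plus extension) to terminal units, and the PC/PD split comes from the proportion of (epi)catechin versus (epi)gallocatechin units. The sketch below shows that arithmetic with hypothetical unit amounts; it is not the authors' processing pipeline.

      def mdp(terminal_units, extension_units):
          """Mean degree of polymerization from thiolysis data:
          total flavan-3-ol units divided by terminal units."""
          return (terminal_units + extension_units) / terminal_units

      def pc_pd_split(catechin_like, gallocatechin_like):
          """Procyanidin/prodelphinidin percentages from (epi)catechin vs
          (epi)gallocatechin unit totals."""
          total = catechin_like + gallocatechin_like
          return 100 * catechin_like / total, 100 * gallocatechin_like / total

      # hypothetical unit amounts (micromoles) from one thiolysis run
      terminal, extension = 1.0, 39.0                    # gives mDP = 40
      catechin_like, gallocatechin_like = 25.0, 75.0

      print("mDP:", mdp(terminal, extension))
      print("PC/PD: %.1f/%.1f" % pc_pd_split(catechin_like, gallocatechin_like))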

  18. High-throughput metagenomic analysis of petroleum-contaminated soil microbiome reveals the versatility in xenobiotic aromatics metabolism.

    Science.gov (United States)

    Bao, Yun-Juan; Xu, Zixiang; Li, Yang; Yao, Zhi; Sun, Jibin; Song, Hui

    2017-06-01

    Petroleum-contaminated soil is one of the most studied soil ecosystems due to its rich hydrocarbon-degrading microbial communities and broad applications in bioremediation. However, our understanding of the genomic properties and functional traits of the soil microbiome is limited. In this study, we used high-throughput metagenomic sequencing to comprehensively study the microbial community from petroleum-contaminated soils near Tianjin Dagang oilfield in eastern China. The analysis reveals that the soil metagenome is characterized by a high level of community diversity and metabolic versatility. The metagenome community is dominated by γ-Proteobacteria and α-Proteobacteria, which are key players in petroleum hydrocarbon degradation. The functional study demonstrates over-represented enzyme groups and pathways involved in degradation of a broad set of xenobiotic aromatic compounds, including toluene, xylene, chlorobenzoate, aminobenzoate, DDT, methylnaphthalene, and bisphenol. A composite metabolic network is proposed for the identified pathways, thus consolidating our identification of the pathways. The overall data demonstrated the great potential of the studied soil microbiome in xenobiotic aromatics degradation. The results not only establish a rich reservoir for novel enzyme discovery but also provide putative applications in bioremediation. Copyright © 2016. Published by Elsevier B.V.

  19. Secure and robust cloud computing for high-throughput forensic microsatellite sequence analysis and databasing.

    Science.gov (United States)

    Bailey, Sarah F; Scheible, Melissa K; Williams, Christopher; Silva, Deborah S B S; Hoggan, Marina; Eichman, Christopher; Faith, Seth A

    2017-11-01

    Next-generation Sequencing (NGS) is a rapidly evolving technology with demonstrated benefits for forensic genetic applications, and the strategies to analyze and manage the massive NGS datasets are currently in development. Here, the computing, data storage, connectivity, and security resources of the Cloud were evaluated as a model for forensic laboratory systems that produce NGS data. A complete front-to-end Cloud system was developed to upload, process, and interpret raw NGS data using a web browser dashboard. The system was extensible, demonstrating analysis capabilities of autosomal and Y-STRs from a variety of NGS instrumentation (Illumina MiniSeq and MiSeq, and Oxford Nanopore MinION). NGS data for STRs were concordant with standard reference materials previously characterized with capillary electrophoresis and Sanger sequencing. The computing power of the Cloud was implemented with on-demand auto-scaling to allow multiple file analysis in tandem. The system was designed to store resulting data in a relational database, amenable to downstream sample interpretations and databasing applications following the most recent guidelines in nomenclature for sequenced alleles. Lastly, a multi-layered Cloud security architecture was tested and showed that industry standards for securing data and computing resources were readily applied to the NGS system without disadvantageous effects for bioinformatic analysis, connectivity or data storage/retrieval. The results of this study demonstrate the feasibility of using Cloud-based systems for secured NGS data analysis, storage, databasing, and multi-user distributed connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    Science.gov (United States)

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  1. Independent component analysis applied to pulse oximetry in the estimation of the arterial oxygen saturation (SpO2) - a comparative study

    DEFF Research Database (Denmark)

    Jensen, Thomas; Duun, Sune Bro; Larsen, Jan

    2009-01-01

    We examine various independent component analysis (ICA) digital signal processing algorithms for estimating the arterial oxygen saturation (SpO2) as measured by a reflective pulse oximeter. The ICA algorithms examined are FastICA, Maximum Likelihood ICA (ICAML), Molgedey and Schuster ICA (ICAMS), and Mean Field ICA (ICAMF). The signal processing includes pre-processing bandpass filtering to eliminate noise, and post-processing by calculating the SpO2. The algorithms are compared to the commercial state-of-the-art algorithm Discrete Saturation Transform (DST) by Masimo Corporation...
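
    As a rough illustration of the processing chain described above, the sketch below separates a shared pulsatile source from two synthetic PPG channels with scikit-learn's FastICA and then estimates SpO2 from the classic ratio-of-ratios. The linear calibration SpO2 ~ 110 - 25R is a commonly quoted textbook approximation, not the DST algorithm or any device-specific curve, and all signals here are synthetic.

      import numpy as np
      from sklearn.decomposition import FastICA

      # synthetic two-channel reflective PPG: shared cardiac pulse plus channel noise
      rng = np.random.default_rng(0)
      t = np.arange(0, 10, 0.01)
      pulse = np.sin(2 * np.pi * 1.2 * t)                     # ~72 bpm cardiac component
      red = 1.00 + 0.02 * pulse + 0.005 * rng.standard_normal(t.size)
      ir = 1.00 + 0.04 * pulse + 0.005 * rng.standard_normal(t.size)

      # ICA separates the shared pulsatile source from the channel noise
      X = np.column_stack([red - red.mean(), ir - ir.mean()])
      sources = FastICA(n_components=2, random_state=0).fit_transform(X)
      corr = [abs(np.corrcoef(s, pulse)[0, 1]) for s in sources.T]
      print("correlation of ICA sources with the cardiac component:", np.round(corr, 2))

      # classic ratio-of-ratios from the AC/DC content of each channel
      def ac_dc_ratio(x):
          return (x.max() - x.min()) / x.mean()

      R = ac_dc_ratio(red) / ac_dc_ratio(ir)
      spo2 = 110 - 25 * R          # illustrative linear calibration, not a device curve
      print(f"R = {R:.2f}, estimated SpO2 ~ {spo2:.1f}%")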

  2. Big data scalability for high throughput processing and analysis of vehicle engineering data

    OpenAIRE

    Lu, Feng

    2017-01-01

    "Sympathy for Data" is a platform that is utilized for Big Data automation analytics. It is based on visual interface and workflow configurations. The main purpose of the platform is to reuse parts of code for structured analysis of vehicle engineering data. However, there are some performance issues on a single machine for processing a large amount of data in Sympathy for Data. There are also disk and CPU IO intensive issues when the data is oversized and the platform need fits comfortably i...

  3. Multichannel microscale system for high throughput preparative separation with comprehensive collection and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Karger, Barry L.; Kotler, Lev; Foret, Frantisek; Minarik, Marek; Kleparnik, Karel

    2003-12-09

    A modular multiple lane or capillary electrophoresis (chromatography) system that permits automated parallel separation and comprehensive collection of all fractions from samples in all lanes or columns, with the option of further on-line automated sample fraction analysis, is disclosed. Preferably, fractions are collected in a multi-well fraction collection unit, or plate (40). The multi-well collection plate (40) is preferably made of a solvent permeable gel, most preferably a hydrophilic, polymeric gel such as agarose or cross-linked polyacrylamide.

  4. Micropathogen Community Analysis in Hyalomma rufipes via High-Throughput Sequencing of Small RNAs

    Science.gov (United States)

    Luo, Jin; Liu, Min-Xuan; Ren, Qiao-Yun; Chen, Ze; Tian, Zhan-Cheng; Hao, Jia-Wei; Wu, Feng; Liu, Xiao-Cui; Luo, Jian-Xun; Yin, Hong; Wang, Hui; Liu, Guang-Yuan

    2017-01-01

    Ticks are important vectors in the transmission of a broad range of micropathogens to vertebrates, including humans. Because of the role of ticks in disease transmission, identifying and characterizing the micropathogen profiles of tick populations have become increasingly important. The objective of this study was to survey the micropathogens of Hyalomma rufipes ticks. Illumina HiSeq2000 technology was utilized to perform deep sequencing of small RNAs (sRNAs) extracted from field-collected H. rufipes ticks in Gansu Province, China. The resultant sRNA library data revealed that the surveyed tick populations produced reads that were homologous to St. Croix River Virus (SCRV) sequences. We also observed many reads that were homologous to microbial and/or pathogenic isolates, including bacteria, protozoa, and fungi. As part of this analysis, a phylogenetic tree was constructed to display the relationships among the homologous sequences that were identified. The study offered a unique opportunity to gain insight into the micropathogens of H. rufipes ticks. The effective control of arthropod vectors in the future will require knowledge of the micropathogen composition of vectors harboring infectious agents. Understanding the ecological factors that regulate vector propagation in association with the prevalence and persistence of micropathogen lineages is also imperative. These interactions may affect the evolution of micropathogen lineages, especially if the micropathogens rely on the vector or host for dispersal. The sRNA deep-sequencing approach used in this analysis provides an intuitive method to survey micropathogen prevalence in ticks and other vector species. PMID:28861401

  5. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    Directory of Open Access Journals (Sweden)

    Malia A. Gehan

    2017-12-01

    Full Text Available Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  6. FIM imaging and FIMtrack: two new tools allowing high-throughput and cost effective locomotion analysis.

    Science.gov (United States)

    Risse, Benjamin; Otto, Nils; Berh, Dimitri; Jiang, Xiaoyi; Klämbt, Christian

    2014-12-24

    The analysis of neuronal network function requires a reliable measurement of behavioral traits. Since the behavior of freely moving animals is variable to a certain degree, many animals have to be analyzed to obtain statistically significant data. This in turn requires computer-assisted, automated quantification of locomotion patterns. To obtain high contrast images of almost translucent and small moving objects, a novel imaging technique based on frustrated total internal reflection called FIM was developed. In this setup, animals are only illuminated with infrared light at the very specific position of contact with the underlying crawling surface. This methodology results in very high contrast images. Subsequently, these high contrast images are processed using established contour tracking algorithms. Based on this, we developed the FIMTrack software, which serves to extract a number of features needed to quantitatively describe a large variety of locomotion characteristics. During the development of this software package, we focused our efforts on an open source architecture allowing the easy addition of further modules. The program operates platform-independently and is accompanied by an intuitive GUI guiding the user through data analysis. All locomotion parameter values are given in the form of CSV files allowing further data analyses. In addition, a Results Viewer integrated into the tracking software provides the opportunity to interactively review and adjust the output, as might be needed during stimulus integration. The power of FIM and FIMTrack is demonstrated by studying the locomotion of Drosophila larvae.
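
    The locomotion parameters such a tracker reports can largely be derived from per-frame centroid positions. The sketch below computes path length, mean speed and net displacement from an (n_frames, 2) trajectory array; the data layout, frame rate and toy trajectory are assumptions for illustration rather than FIMTrack's actual CSV schema.

      import numpy as np

      def locomotion_features(xy, fps):
          """Basic locomotion metrics from an (n_frames, 2) centroid trajectory."""
          xy = np.asarray(xy, dtype=float)
          steps = np.diff(xy, axis=0)                       # per-frame displacement
          step_len = np.hypot(steps[:, 0], steps[:, 1])
          path_length = float(step_len.sum())
          duration = (len(xy) - 1) / fps
          return {
              "path_length": path_length,
              "mean_speed": path_length / duration,
              "net_displacement": float(np.hypot(*(xy[-1] - xy[0]))),
          }

      # toy trajectory: a larva crawling along a sine-shaped path, filmed at 10 frames/s
      t = np.linspace(0, 10, 101)
      trajectory = np.column_stack([t, np.sin(t)])
      print(locomotion_features(trajectory, fps=10.0))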

  7. μTAS (micro total analysis systems) for the high-throughput measurement of nanomaterial solubility

    International Nuclear Information System (INIS)

    Tantra, R; Jarman, J

    2013-01-01

    There is a consensus in the nanoecotoxicology community that better analytical tools, i.e., faster and more accurate ones, are needed for the physicochemical characterisation of nanomaterials in environmentally/biologically relevant media. In this study, we introduce the concept of μTAS (Micro Total Analysis Systems), a term coined to encapsulate the integration of laboratory processes on a single microchip. Our focus here is on the use of a capillary electrophoresis (CE) microchip with conductivity detection and how this may be used for the measurement of dissolution of metal oxide nanomaterials. Our preliminary results clearly show promise in that the device is able to: a) measure ionic zinc in various ecotox media with high selectivity and b) track the dynamic dissolution events of zinc oxide (ZnO) nanomaterial when dispersed in fish medium.

  8. Peptide Pattern Recognition for high-throughput protein sequence analysis and clustering

    DEFF Research Database (Denmark)

    Busk, Peter Kamp

    2017-01-01

    Large collections of divergent protein sequences are tedious to analyze for understanding their phylogenetic or structure-function relationships. Peptide Pattern Recognition is an algorithm that was developed to facilitate this task, but the previous version only allows a limited number of sequences as input. I implemented Peptide Pattern Recognition as multithreaded software designed to handle large numbers of sequences and perform analysis in a reasonable time frame. Benchmarking showed that the new implementation of Peptide Pattern Recognition is twenty times faster than the previous implementation on a small protein collection with 673 MAP kinase sequences. In addition, the new implementation could analyze a large protein collection with 48,570 Glycosyl Transferase family 20 sequences without reaching its upper limit on a desktop computer. Peptide Pattern Recognition

  9. Quantitative neuroanatomy of all Purkinje cells with light sheet microscopy and high-throughput image analysis

    Directory of Open Access Journals (Sweden)

    Ludovico eSilvestri

    2015-05-01

    Full Text Available Characterizing the cytoarchitecture of the mammalian central nervous system on a brain-wide scale is becoming a compelling need in neuroscience. For example, realistic modeling of brain activity requires the definition of quantitative features of large neuronal populations in the whole brain. Quantitative anatomical maps will also be crucial to classify the cytoarchitectonic abnormalities associated with neuronal pathologies in a highly reproducible and reliable manner. In this paper, we apply recent advances in optical microscopy and image analysis to characterize the spatial distribution of Purkinje cells across the whole cerebellum. Light sheet microscopy was used to image with micron-scale resolution a fixed and cleared cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are fluorescently labeled. A fast and scalable algorithm for fully automated cell identification was applied on the image to extract the position of all the fluorescent Purkinje cells. This vectorized representation of the cell population allows a thorough characterization of the complex three-dimensional distribution of the neurons, highlighting the presence of gaps inside the lamellar organization of Purkinje cells, whose density is believed to play a significant role in autism spectrum disorders. Furthermore, clustering analysis of the localized somata permits dividing the whole cerebellum into groups of Purkinje cells with high spatial correlation, suggesting new possibilities for anatomical partitioning. The quantitative approach presented here can be extended to study the distribution of different cell types in many brain regions and across the whole encephalon, providing a robust base for building realistic computational models of the brain, and for unbiased morphological tissue screening in the presence of pathologies and/or drug treatments.
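
    The clustering of localized somata into spatially correlated groups can be illustrated with an off-the-shelf density-based clustering of the 3-D coordinates, as sketched below with scikit-learn's DBSCAN. The algorithm choice, distance parameters and synthetic coordinates are stand-ins for illustration and are not necessarily what the authors used.

      import numpy as np
      from sklearn.cluster import DBSCAN

      # synthetic 3-D soma coordinates (micrometres): two cell sheets plus sparse noise
      rng = np.random.default_rng(0)
      sheet1 = rng.normal(loc=[0, 0, 0], scale=[50, 50, 5], size=(300, 3))
      sheet2 = rng.normal(loc=[200, 0, 100], scale=[50, 50, 5], size=(300, 3))
      noise = rng.uniform(low=-100, high=300, size=(30, 3))
      coords = np.vstack([sheet1, sheet2, noise])

      # group somata whose neighbours lie within ~30 um of each other
      labels = DBSCAN(eps=30, min_samples=10).fit_predict(coords)
      n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
      print(f"{n_clusters} spatial clusters, {int(np.sum(labels == -1))} unassigned cells")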

  10. Designing small universal k-mer hitting sets for improved analysis of high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Yaron Orenstein

    2017-10-01

    Full Text Available With the rapidly increasing volume of deep sequencing data, more efficient algorithms and data structures are needed. Minimizers are a central recent paradigm that has improved various sequence analysis tasks, including hashing for faster read overlap detection, sparse suffix arrays for creating smaller indexes, and Bloom filters for speeding up sequence search. Here, we propose an alternative paradigm that can lead to substantial further improvement in these and other tasks. For integers k and L > k, we say that a set of k-mers is a universal hitting set (UHS) if every possible L-long sequence must contain a k-mer from the set. We develop a heuristic called DOCKS to find a compact UHS, which works in two phases: The first phase is solved optimally, and for the second we propose several efficient heuristics, trading set size for speed and memory. The use of heuristics is motivated by showing the NP-hardness of a closely related problem. We show that DOCKS works well in practice and produces UHSs that are very close to a theoretical lower bound. We present results for various values of k and L and by applying them to real genomes show that UHSs indeed improve over minimizers. In particular, DOCKS uses less than 30% of the 10-mers needed to span the human genome compared to minimizers. The software and computed UHSs are freely available at github.com/Shamir-Lab/DOCKS/ and acgt.cs.tau.ac.il/docks/, respectively.
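
    The UHS definition above can be checked by brute force for very small k and L: enumerate every L-long sequence and verify that each one contains at least one k-mer from the set. The sketch below does exactly that; it is exponential in L, so it only illustrates the definition and is in no way a substitute for the DOCKS heuristic, which works on the de Bruijn graph.

      from itertools import product

      ALPHABET = "ACGT"

      def is_universal_hitting_set(kmers, k, L):
          """True if every L-long sequence over ACGT contains a k-mer from `kmers`.
          Brute force over 4**L sequences -- only feasible for tiny k and L."""
          for seq in product(ALPHABET, repeat=L):
              s = "".join(seq)
              if not any(s[i:i + k] in kmers for i in range(L - k + 1)):
                  return False          # found an L-mer the set misses
          return True

      # the 2-mers starting with A or C miss e.g. "GGGG", so they are not a UHS for L = 4
      partial = {a + b for a in "AC" for b in ALPHABET}
      print(is_universal_hitting_set(partial, k=2, L=4))     # False

      # the full set of all 16 2-mers trivially hits every 4-mer
      full = {a + b for a in ALPHABET for b in ALPHABET}
      print(is_universal_hitting_set(full, k=2, L=4))        # True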

  11. Quantitative proteomic analysis for high-throughput screening of differential glycoproteins in hepatocellular carcinoma serum

    International Nuclear Information System (INIS)

    Gao, Hua-Jun; Chen, Ya-Jing; Zuo, Duo; Xiao, Ming-Ming; Li, Ying; Guo, Hua; Zhang, Ning; Chen, Rui-Bing

    2015-01-01

    Hepatocellular carcinoma (HCC) is a leading cause of cancer-related deaths. Novel serum biomarkers are required to increase the sensitivity and specificity of serum screening for early HCC diagnosis. This study employed a quantitative proteomic strategy to analyze the differential expression of serum glycoproteins between HCC and normal control serum samples. Lectin affinity chromatography (LAC) was used to enrich glycoproteins from the serum samples. Quantitative mass spectrometric analysis combined with stable isotope dimethyl labeling and 2D liquid chromatography (LC) separations were performed to examine the differential levels of the detected proteins between HCC and control serum samples. Western blot was used to analyze the differential expression levels of the three serum proteins. A total of 2,280 protein groups were identified in the serum samples from HCC patients by using the 2D LC-MS/MS method. Up to 36 proteins were up-regulated in the HCC serum, whereas 19 proteins were down-regulated. Three differential glycoproteins, namely, fibrinogen gamma chain (FGG), FOS-like antigen 2 (FOSL2), and α-1,6-mannosylglycoprotein 6-β-N-acetylglucosaminyltransferase B (MGAT5B) were validated by Western blot. All these three proteins were up-regulated in the HCC serum samples. A quantitative glycoproteomic method was established and proven useful to determine potential novel biomarkers for HCC

  12. Genetic analysis and gene mapping of a low stigma exposed mutant gene by high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Xiao Ma

    Full Text Available Rice is one of the main food crops, and several studies have examined the molecular mechanism of the exposure of the rice plant stigma. Improving stigma exposure in the female parents of hybrid combinations can enhance the efficiency of hybrid breeding. In the present study, a mutant plant with low exposed stigma (lesr) was discovered among the descendants of the indica thermo-sensitive sterile line 115S. The exposed stigma (ES) rate of the mutant decreased by 70.64% compared with the wild type variety. An F2 population was established for genetic analysis using the mutant as the female parent and the restorer line 93S as the male parent. The F1 population was normal, while the F2 population showed a clear division into high and low exposed stigma groups; this division was possible only at an ES threshold of 25%. The segregation was in agreement with a 3:1 ratio, which indicated that the mutant trait was controlled by a recessive main-effect QTL locus, provisionally named LESR. Genome-wide comparison of the SNP profiles between the high and low exposed stigma bulks constructed from F2 plants was performed using bulked segregant analysis in combination with high-throughput sequencing technology. The results demonstrated that the candidate locus was located on rice chromosome 10. Following screening of the recombinant rice plants with newly developed molecular markers, the genetic region was narrowed down to 0.25 Mb. This region was flanked by InDel-2 and InDel-2 at the physical location from 13.69 to 13.94 Mb. Within this region, 7 genes showed base differences between the parents. Two genes exhibited differences in the coding region and upstream of the coding region, respectively. Further work will aim to clone the LESR gene, verify its function and characterize the stigma variation.
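
    The 3:1 segregation check mentioned above is a standard chi-square goodness-of-fit test. The sketch below shows it with SciPy; the F2 counts are hypothetical, since the abstract reports only the ratio.

      from scipy.stats import chisquare

      # hypothetical F2 counts: high exposed-stigma vs low exposed-stigma plants
      observed = [152, 48]                        # roughly 3:1 segregation
      total = sum(observed)
      expected = [total * 3 / 4, total * 1 / 4]   # expectation under a single recessive locus

      chi2, p = chisquare(f_obs=observed, f_exp=expected)
      print(f"chi2 = {chi2:.3f}, p = {p:.3f}")    # large p: no evidence against the 3:1 ratio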

  13. BioVLAB-MMIA-NGS: microRNA-mRNA integrated analysis using high-throughput sequencing data.

    Science.gov (United States)

    Chae, Heejoon; Rhee, Sungmin; Nephew, Kenneth P; Kim, Sun

    2015-01-15

    It is now well established that microRNAs (miRNAs) play a critical role in regulating gene expression in a sequence-specific manner, and genome-wide efforts are underway to predict known and novel miRNA targets. However, the integrated miRNA-mRNA analysis remains a major computational challenge, requiring powerful informatics systems and bioinformatics expertise. The objective of this study was to modify our widely recognized Web server for the integrated mRNA-miRNA analysis (MMIA) and its subsequent deployment on the Amazon cloud (BioVLAB-MMIA) to be compatible with high-throughput platforms, including next-generation sequencing (NGS) data (e.g. RNA-seq). We developed a new version called the BioVLAB-MMIA-NGS, deployed on both Amazon cloud and on a high-performance publicly available server called MAHA. By using NGS data and integrating various bioinformatics tools and databases, BioVLAB-MMIA-NGS offers several advantages. First, sequencing data is more accurate than array-based methods for determining miRNA expression levels. Second, potential novel miRNAs can be detected by using various computational methods for characterizing miRNAs. Third, because miRNA-mediated gene regulation is due to hybridization of an miRNA to its target mRNA, sequencing data can be used to identify many-to-many relationship between miRNAs and target genes with high accuracy. http://epigenomics.snu.ac.kr/biovlab_mmia_ngs/. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. Nonlinear mixed effects dose response modeling in high throughput drug screens: application to melanoma cell line analysis.

    Science.gov (United States)

    Ding, Kuan-Fu; Petricoin, Emanuel F; Finlay, Darren; Yin, Hongwei; Hendricks, William P D; Sereduk, Chris; Kiefer, Jeffrey; Sekulic, Aleksandar; LoRusso, Patricia M; Vuori, Kristiina; Trent, Jeffrey M; Schork, Nicholas J

    2018-01-12

    Cancer cell lines are often used in high throughput drug screens (HTS) to explore the relationship between cell line characteristics and responsiveness to different therapies. Many current analysis methods infer relationships by focusing on one aspect of cell line drug-specific dose-response curves (DRCs), the concentration causing 50% inhibition of a phenotypic endpoint (IC50). Such methods may overlook DRC features and do not simultaneously leverage information about drug response patterns across cell lines, potentially increasing false positive and negative rates in drug response associations. We consider the application of two methods, each rooted in nonlinear mixed effects (NLME) models, that test the relationships between estimated cell line DRCs and factors that might mitigate response. Both methods leverage estimation and testing techniques that consider the simultaneous analysis of different cell lines to draw inferences about any one cell line. One of the methods is designed to provide an omnibus test of the differences between cell line DRCs that is not focused on any one aspect of the DRC (such as the IC50 value). We simulated different settings and compared the different methods on the simulated data. We also compared the proposed methods against traditional IC50-based methods using 40 melanoma cell lines whose transcriptomes, proteomes, and, importantly, BRAF and related mutation profiles were available. Ultimately, we find that the NLME-based methods are more robust, powerful, and, for the omnibus test, more flexible than traditional methods. Their application to the melanoma cell lines reveals insights into factors that may be clinically useful.
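
    For contrast with the NLME omnibus approach, the sketch below shows what a conventional IC50-focused analysis extracts from one cell line: a four-parameter logistic (Hill) dose-response curve fitted with SciPy and the IC50 read off the fitted parameters. The viability data are synthetic and the parameter values are illustrative assumptions.

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(dose, bottom, top, ic50, hill):
          """Four-parameter logistic (Hill) dose-response curve."""
          return bottom + (top - bottom) / (1.0 + (dose / ic50) ** hill)

      # synthetic viability data for one cell line (doses in uM)
      doses = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
      rng = np.random.default_rng(0)
      viability = four_pl(doses, bottom=0.1, top=1.0, ic50=0.8, hill=1.5)
      viability = viability + 0.03 * rng.standard_normal(doses.size)

      params, _ = curve_fit(four_pl, doses, viability,
                            p0=[0.05, 1.0, 1.0, 1.0], bounds=(0, np.inf))
      bottom, top, ic50, hill = params
      print(f"estimated IC50 = {ic50:.2f} uM (true 0.80), Hill slope = {hill:.2f}")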

  15. A community resource for high-throughput quantitative RT-PCR analysis of transcription factor gene expression in Medicago truncatula

    Directory of Open Access Journals (Sweden)

    Redman Julia C

    2008-07-01

    Full Text Available Background: Medicago truncatula is a model legume species that is currently the focus of an international genome sequencing effort. Although several different oligonucleotide and cDNA arrays have been produced for genome-wide transcript analysis of this species, intrinsic limitations in the sensitivity of hybridization-based technologies mean that transcripts of genes expressed at low levels cannot be measured accurately with these tools. Amongst such genes are many encoding transcription factors (TFs), which are arguably the most important class of regulatory proteins. Quantitative reverse transcription-polymerase chain reaction (qRT-PCR) is the most sensitive method currently available for transcript quantification, and one that can be scaled up to analyze transcripts of thousands of genes in parallel. Thus, qRT-PCR is an ideal method to tackle the problem of TF transcript quantification in Medicago and other plants. Results: We established a bioinformatics pipeline to identify putative TF genes in Medicago truncatula and to design gene-specific oligonucleotide primers for qRT-PCR analysis of TF transcripts. We validated the efficacy and gene-specificity of over 1000 TF primer pairs and utilized these to identify sets of organ-enhanced TF genes that may play important roles in organ development or differentiation in this species. This community resource will be developed further as more genome sequence becomes available, with the ultimate goal of producing validated, gene-specific primers for all Medicago TF genes. Conclusion: High-throughput qRT-PCR using a 384-well plate format enables rapid, flexible, and sensitive quantification of all predicted Medicago transcription factor mRNAs. This resource has been utilized recently by several groups in Europe, Australia, and the USA, and we expect that it will become the 'gold-standard' for TF transcript profiling in Medicago truncatula.

  16. Region Templates: Data Representation and Management for High-Throughput Image Analysis.

    Science.gov (United States)

    Teodoro, George; Pan, Tony; Kurc, Tahsin; Kong, Jun; Cooper, Lee; Klasky, Scott; Saltz, Joel

    2014-12-01

    We introduce a region template abstraction and framework for the efficient storage, management and processing of common data types in analysis of large datasets of high resolution images on clusters of hybrid computing nodes. The region template abstraction provides a generic container template for common data structures, such as points, arrays, regions, and object sets, within a spatial and temporal bounding box. It allows for different data management strategies and I/O implementations, while providing a homogeneous, unified interface to applications for data storage and retrieval. A region template application is represented as a hierarchical dataflow in which each computing stage may be represented as another dataflow of finer-grain tasks. The execution of the application is coordinated by a runtime system that implements optimizations for hybrid machines, including performance-aware scheduling for maximizing the utilization of computing devices and techniques to reduce the impact of data transfers between CPUs and GPUs. An experimental evaluation on a state-of-the-art hybrid cluster using a microscopy imaging application shows that the abstraction adds negligible overhead (about 3%) and achieves good scalability and high data transfer rates. Optimizations in a high speed disk based storage implementation of the abstraction to support asynchronous data transfers and computation result in an application performance gain of about 1.13×. Finally, a processing rate of 11,730 4K×4K tiles per minute was achieved for the microscopy imaging application on a cluster with 100 nodes (300 GPUs and 1,200 CPU cores). This computation rate enables studies with very large datasets.

  17. High Throughput Petrochronology and Sedimentary Provenance Analysis by Automated Phase Mapping and LAICPMS

    Science.gov (United States)

    Vermeesch, Pieter; Rittner, Martin; Petrou, Ethan; Omma, Jenny; Mattinson, Chris; Garzanti, Eduardo

    2017-11-01

    The first step in most geochronological studies is to extract dateable minerals from the host rock, which is time consuming, removes textural context, and increases the chance for sample cross contamination. We here present a new method to rapidly perform in situ analyses by coupling a fast scanning electron microscope (SEM) with Energy Dispersive X-ray Spectrometer (EDS) to a Laser Ablation Inductively Coupled Plasma Mass Spectrometer (LAICPMS) instrument. Given a polished hand specimen, a petrographic thin section, or a grain mount, Automated Phase Mapping (APM) by SEM/EDS produces chemical and mineralogical maps from which the X-Y coordinates of the datable minerals are extracted. These coordinates are subsequently passed on to the laser ablation system for isotopic analysis. We apply the APM + LAICPMS method to three igneous, metamorphic, and sedimentary case studies. In the first case study, a polished slab of granite from Guernsey was scanned for zircon, producing a 609 ± 8 Ma weighted mean age. The second case study investigates a paragneiss from an ultra high pressure terrane in the north Qaidam terrane (Qinghai, China). One hundred seven small (25 µm) metamorphic zircons were analyzed by LAICPMS to confirm a 419 ± 4 Ma age of peak metamorphism. The third and final case study uses APM + LAICPMS to generate a large provenance data set and trace the provenance of 25 modern sediments from Angola, documenting longshore drift of Orange River sediments over a distance of 1,500 km. These examples demonstrate that APM + LAICPMS is an efficient and cost effective way to improve the quantity and quality of geochronological data.
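
    Weighted mean ages such as the 609 ± 8 Ma result quoted above are normally inverse-variance weighted means over the single-spot dates, often reported together with the MSWD. A minimal version of that calculation is sketched below with made-up spot ages; it is not the authors' data reduction workflow.

      import numpy as np

      def weighted_mean_age(ages, sigmas):
          """Inverse-variance weighted mean, its standard error, and the MSWD."""
          ages = np.asarray(ages, dtype=float)
          w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
          mean = np.sum(w * ages) / np.sum(w)
          stderr = np.sqrt(1.0 / np.sum(w))
          mswd = np.sum(w * (ages - mean) ** 2) / (len(ages) - 1)   # scatter vs stated errors
          return mean, stderr, mswd

      # hypothetical single-spot zircon dates (Ma) and their uncertainties
      ages = [612, 605, 610, 607, 611, 603, 609, 606]
      sigmas = [6, 7, 5, 8, 6, 7, 5, 6]
      mean, err, mswd = weighted_mean_age(ages, sigmas)
      print(f"weighted mean age = {mean:.0f} +/- {err:.0f} Ma (MSWD = {mswd:.2f})")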

  18. High-Throughput Genetic Analysis and Combinatorial Chiral Separations Based on Capillary Electrophoresis

    Energy Technology Data Exchange (ETDEWEB)

    Zhong, Wenwan [Iowa State Univ., Ames, IA (United States)

    2003-01-01

    Capillary electrophoresis (CE) offers many advantages over conventional analytical methods, such as speed, simplicity, high resolution, low cost, and small sample consumption, especially for the separation of enantiomers. However, chiral method development can still be time consuming and tedious. They designed a comprehensive enantioseparation protocol employing neutral and sulfated cyclodextrins as chiral selectors for common basic, neutral, and acidic compounds with a 96-capillary array system. By using only four judiciously chosen separation buffers, successful enantioseparations were achieved for 49 out of 54 test compounds spanning a large variety of pKa values and structures. Therefore, unknown compounds can be screened in this manner to identify optimal enantioselective conditions in just one run. In addition to superior separation efficiency for small molecules, CE is also the most powerful technique for DNA separations. Using the same multiplexed capillary system with UV absorption detection, the sequence of a short DNA template can be acquired without any dye-labels. Two internal standards were utilized to adjust the migration time variations among capillaries, so that the four electropherograms for the A, T, C, G Sanger reactions can be aligned and base calling can be completed with a high level of confidence. The CE separation of DNA can be applied to study differential gene expression as well. Combined with pattern recognition techniques, small variations among electropherograms obtained by the separation of cDNA fragments produced from the total RNA samples of different human tissues can be revealed. These variations reflect the differences in total RNA expression among tissues. Thus, this CE-based approach can serve as an alternative to the DNA array techniques in gene expression analysis.

  19. High-throughput tandem mass spectrometry multiplex analysis for newborn urinary screening of creatine synthesis and transport disorders, Triple H syndrome and OTC deficiency.

    Science.gov (United States)

    Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela

    2014-09-25

    Creatine synthesis and transport disorders, Triple H syndrome and ornithine transcarbamylase deficiency are treatable inborn errors of metabolism. Early screening of patients was found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. A high-throughput methodology was devised for the simultaneous analysis of creatine, guanidinoacetic acid, orotic acid, uracil, creatinine and respective internal standards, using both positive and negative electrospray ionization modes, depending on the compound. The precision and accuracy varied by screening for inherited disorders by biochemical laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. ANALYSIS OF THE SPECIAL FEATURES OF THE THERMAL PROCESS IN AN INDUCTION GENERATOR AT HIGH SATURATION OF THE MAGNETIC SYSTEM

    Directory of Open Access Journals (Sweden)

    V. Chenchevoi

    2017-06-01

    Full Text Available Purpose. Development of a method for assessing the thermal operating modes of an autonomous electrical power system with an induction generator, aiming at improving the reliability of electricity supply and the quality of electric energy. Methodology. Mathematical modeling of the induction generator taking into account magnetic system saturation was used in the research. A thermal model accounting for the temperature rise of the induction generator components in the mode of high saturation was developed. The obtained results were compared with experimental data. Results. The paper solves the problem of improving the mathematical models and methods for determining steel losses in the study of the operating modes of an autonomous uncontrolled induction generator, taking into consideration the properties of the magnetic system in the mode of high saturation. An expression for determining steel losses in the mode of high saturation is obtained; it enables assessment of the induction generator's thermal condition. Originality. An analytical dependence for calculating steel losses in the mode of magnetic system saturation has been obtained for the first time. Practical value. The obtained expression for calculating steel losses can be used to determine the admissible duration of generator operation under overload. This allows winding insulation damage to be avoided while the generator's overload capacity is used in full. As a result, it will reduce possible interruptions of electricity supply due to premature generator cutoff.

  1. Effectiveness of a high-throughput genetic analysis in the identification of responders/non-responders to CYP2D6-metabolized drugs.

    Science.gov (United States)

    Savino, Maria; Seripa, Davide; Gallo, Antonietta P; Garrubba, Maria; D'Onofrio, Grazia; Bizzarro, Alessandra; Paroni, Giulia; Paris, Francesco; Mecocci, Patrizia; Masullo, Carlo; Pilotto, Alberto; Santini, Stefano A

    2011-01-01

    Recent studies investigating the single cytochrome P450 (CYP) 2D6 allele *2A reported an association with the response to drug treatments. More genetic data can be obtained, however, by high-throughput technologies. The aim of this study is the high-throughput analysis of CYP2D6 polymorphisms to evaluate its effectiveness in the identification of patient responders/non-responders to CYP2D6-metabolized drugs. An attempt to compare our results with those previously obtained with the standard analysis of CYP2D6 allele *2A was also made. Sixty blood samples from patients treated with CYP2D6-metabolized drugs, previously genotyped for the allele CYP2D6*2A, were analyzed for the CYP2D6 polymorphisms with the AutoGenomics INFINITI CYP4502D6-I assay on the AutoGenomics INFINITI analyzer. A higher frequency of mutated alleles in responder than in non-responder patients (75.38 % vs 43.48 %; p = 0.015) was observed. Thus, the presence of a mutated allele of CYP2D6 was associated with a response to CYP2D6-metabolized drugs (OR = 4.044 (1.348 - 12.154)). No difference was observed in the distribution of allele *2A (p = 0.320). The high-throughput genetic analysis of CYP2D6 polymorphisms discriminated responders/non-responders better than the standard analysis of the CYP2D6 allele *2A. A high-throughput genetic assay of the CYP2D6 may be useful to identify patients with different clinical responses to CYP2D6-metabolized drugs.

  2. Introduction of High Throughput Magnetic Resonance T2-Weighted Image Texture Analysis for WHO Grade 2 and 3 Gliomas.

    Directory of Open Access Journals (Sweden)

    Manabu Kinoshita

    Full Text Available Reports have suggested that tumor textures presented on T2-weighted images correlate with the genetic status of glioma. Therefore, development of an image analyzing framework that is capable of objective and high throughput image texture analysis for large scale image data collection is needed. The current study aimed to address the development of such a framework by introducing two novel parameters for image textures on T2-weighted images, i.e., Shannon entropy and Prewitt filtering. Twenty-two WHO grade 2 and 28 grade 3 glioma patients were collected whose pre-surgical MRI and IDH1 mutation status were available. Heterogeneous lesions showed statistically higher Shannon entropy than homogenous lesions (p = 0.006) and ROC curve analysis proved that Shannon entropy on T2WI was a reliable indicator for discrimination of homogenous and heterogeneous lesions (p = 0.015, AUC = 0.73). Lesions with well-defined borders exhibited statistically higher Edge mean and Edge median values using Prewitt filtering than those with vague lesion borders (p = 0.0003 and p = 0.0005 respectively). ROC curve analysis also proved that both Edge mean and median values were promising indicators for discrimination of lesions with vague and well defined borders and both Edge mean and median values performed in a comparable manner (p = 0.0002, AUC = 0.81 and p < 0.0001, AUC = 0.83, respectively). Finally, IDH1 wild type gliomas showed statistically lower Shannon entropy on T2WI than IDH1 mutated gliomas (p = 0.007) but no difference was observed between IDH1 wild type and mutated gliomas in Edge median values using Prewitt filtering. The current study introduced two image metrics that reflect lesion texture described on T2WI. These two metrics were validated by readings of a neuro-radiologist who was blinded to the results. This observation will facilitate further use of this technique in future large scale image analysis of glioma.

  3. MG-RAST version 4-lessons learned from a decade of low-budget ultra-high-throughput metagenome analysis.

    Science.gov (United States)

    Meyer, Folker; Bagchi, Saurabh; Chaterji, Somali; Gerlach, Wolfgang; Grama, Ananth; Harrison, Travis; Paczian, Tobias; Trimble, William L; Wilke, Andreas

    2017-09-26

    As technologies change, MG-RAST is adapting. Newly available software is being included to improve accuracy and performance. As a computational service constantly running large volume scientific workflows, MG-RAST is the right location to perform benchmarking and implement algorithmic or platform improvements, in many cases involving trade-offs between specificity, sensitivity and run-time cost. The work in [Glass EM, Dribinsky Y, Yilmaz P, et al. ISME J 2014;8:1-3] is an example; we use existing well-studied data sets as gold standards representing different environments and different technologies to evaluate any changes to the pipeline. Currently, we use well-understood data sets in MG-RAST as platform for benchmarking. The use of artificial data sets for pipeline performance optimization has not added value, as these data sets are not presenting the same challenges as real-world data sets. In addition, the MG-RAST team welcomes suggestions for improvements of the workflow. We are currently working on versions 4.02 and 4.1, both of which contain significant input from the community and our partners that will enable double barcoding, stronger inferences supported by longer-read technologies, and will increase throughput while maintaining sensitivity by using Diamond and SortMeRNA. On the technical platform side, the MG-RAST team intends to support the Common Workflow Language as a standard to specify bioinformatics workflows, both to facilitate development and efficient high-performance implementation of the community's data analysis tasks. Published by Oxford University Press on behalf of Entomological Society of America 2017. This work is written by US Government employees and is in the public domain in the US.

  4. Metabolomic and high-throughput sequencing analysis – modern approach for the assessment of biodeterioration of materials from historic buildings

    Directory of Open Access Journals (Sweden)

    Beata eGutarowska

    2015-09-01

    Full Text Available Preservation of cultural heritage is of paramount importance worldwide. Microbial colonization of construction materials, such as wood, brick, mortar and stone in historic buildings can lead to severe deterioration. The aim of the present study was to give modern insight into the phylogenetic diversity and activated metabolic pathways of microbial communities colonizing historic objects located in the former Auschwitz II-Birkenau concentration and extermination camp in Oświęcim, Poland. For this purpose we combined molecular, microscopic and chemical methods. Selected specimens were examined using Field Emission Scanning Electron Microscopy (FESEM), metabolomic analysis and high-throughput Illumina sequencing. FESEM imaging revealed the presence of complex microbial communities comprising diatoms, fungi and bacteria, mainly cyanobacteria and actinobacteria, on sample surfaces. Microbial diversity of brick specimens appeared higher than that of the wood and was dominated by algae and cyanobacteria, while wood was mainly colonized by fungi. DNA sequences documented the presence of 15 bacterial phyla representing 99 genera including Halomonas, Halorhodospira, Salinisphaera, Salinibacterium, Rubrobacter, Streptomyces, Arthrobacter and 9 fungal classes represented by 113 genera including Cladosporium, Acremonium, Alternaria, Engyodontium, Penicillium, Rhizopus and Aureobasidium. Most of the identified sequences were characteristic of organisms implicated in deterioration of wood and brick. Metabolomic data indicated the activation of numerous metabolic pathways, including those regulating the production of primary and secondary metabolites, for example, metabolites associated with the production of antibiotics, organic acids and deterioration of organic compounds. The study demonstrated that a combination of electron microscopy imaging with metabolomic and genomic techniques allows linking the phylogenetic information and metabolic profiles of

  5. Metabolomic and high-throughput sequencing analysis-modern approach for the assessment of biodeterioration of materials from historic buildings.

    Science.gov (United States)

    Gutarowska, Beata; Celikkol-Aydin, Sukriye; Bonifay, Vincent; Otlewska, Anna; Aydin, Egemen; Oldham, Athenia L; Brauer, Jonathan I; Duncan, Kathleen E; Adamiak, Justyna; Sunner, Jan A; Beech, Iwona B

    2015-01-01

    Preservation of cultural heritage is of paramount importance worldwide. Microbial colonization of construction materials, such as wood, brick, mortar, and stone in historic buildings can lead to severe deterioration. The aim of the present study was to give modern insight into the phylogenetic diversity and activated metabolic pathways of microbial communities colonizing historic objects located in the former Auschwitz II-Birkenau concentration and extermination camp in Oświęcim, Poland. For this purpose we combined molecular, microscopic and chemical methods. Selected specimens were examined using Field Emission Scanning Electron Microscopy (FESEM), metabolomic analysis and high-throughput Illumina sequencing. FESEM imaging revealed the presence of complex microbial communities comprising diatoms, fungi and bacteria, mainly cyanobacteria and actinobacteria, on sample surfaces. Microbial diversity of brick specimens appeared higher than that of the wood and was dominated by algae and cyanobacteria, while wood was mainly colonized by fungi. DNA sequences documented the presence of 15 bacterial phyla representing 99 genera including Halomonas, Halorhodospira, Salinisphaera, Salinibacterium, Rubrobacter, Streptomyces, Arthrobacter and nine fungal classes represented by 113 genera including Cladosporium, Acremonium, Alternaria, Engyodontium, Penicillium, Rhizopus, and Aureobasidium. Most of the identified sequences were characteristic of organisms implicated in deterioration of wood and brick. Metabolomic data indicated the activation of numerous metabolic pathways, including those regulating the production of primary and secondary metabolites, for example, metabolites associated with the production of antibiotics, organic acids and deterioration of organic compounds. The study demonstrated that a combination of electron microscopy imaging with metabolomic and genomic techniques allows linking the phylogenetic information and metabolic profiles of microbial communities

  6. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    Science.gov (United States)

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
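
    As a rough illustration of the guided-PCA idea summarized above, the sketch below compares the variance captured along a batch-guided direction with that along the ordinary first principal component and assesses it by permuting batch labels. It is a simplified reading of the method, not the gPCA R package itself; centering, scaling and the exact test may differ.

    ```python
    # Rough sketch of a guided-PCA (gPCA) style batch-effect check: compare the variance
    # along the batch-guided direction with that along the ordinary first PC, and assess
    # significance by permuting batch labels. Details may differ from the gPCA R package.
    import numpy as np

    def gpca_delta(X, batch):
        X = X - X.mean(axis=0)                                     # center features
        Y = np.eye(batch.max() + 1)[batch]                         # n x b batch indicator matrix
        _, _, Vt_u = np.linalg.svd(X, full_matrices=False)         # unguided PCA of the data
        _, _, Vt_g = np.linalg.svd(Y.T @ X, full_matrices=False)   # PCA guided by the batch matrix
        var_u = np.var(X @ Vt_u[0])                                # variance along first unguided PC
        var_g = np.var(X @ Vt_g[0])                                # variance along first guided PC
        return var_g / var_u

    def gpca_pvalue(X, batch, n_perm=999, seed=0):
        rng = np.random.default_rng(seed)
        observed = gpca_delta(X, batch)
        null = [gpca_delta(X, rng.permutation(batch)) for _ in range(n_perm)]
        return observed, (1 + sum(d >= observed for d in null)) / (n_perm + 1)

    # Toy example: 20 samples, 50 probes, with a small shift added to batch 1.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(20, 50))
    batch = np.array([0] * 10 + [1] * 10)
    X[batch == 1] += 0.8
    print(gpca_pvalue(X, batch, n_perm=199))
    ```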

  7. Intestinal microbiota in healthy U.S. young children and adults--a high throughput microarray analysis.

    Directory of Open Access Journals (Sweden)

    Tamar Ringel-Kulka

    Full Text Available It is generally believed that the infant's microbiota is established during the first 1-2 years of life. However, there are scarce data on its characterization and its comparison to the adult-like microbiota in consecutive years. To characterize and compare the intestinal microbiota in healthy young children (1-4 years) and healthy adults from the North Carolina region in the U.S. using high-throughput bacterial phylogenetic microarray analysis. Detailed characterization and comparison of the intestinal microbiota of healthy children aged 1-4 years old (n = 28) and healthy adults of 21-60 years (n = 23) was carried out using the Human Intestinal Tract Chip (HITChip) phylogenetic microarray targeting the V1 and V6 regions of 16S rRNA and quantitative PCR. The HITChip microarray data indicate that Actinobacteria, Bacilli, Clostridium cluster IV and Bacteroidetes are the predominant phylum-like groups that exhibit differences between young children and adults. The phylum-like group Clostridium cluster XIVa was equally predominant in young children and adults and is thus considered to be established at an early age. The genus-like level shows significant 3.6-fold (higher or lower) differences in the abundance of 26 genera between young children and adults. Young U.S. children have a significant 3.5-fold higher abundance of Bifidobacterium species than the adults from the same location. However, the microbiota of young children is less diverse than that of adults. We show that the establishment of an adult-like intestinal microbiota occurs at a later age than previously reported. Characterizing the microbiota and its development in the early years of life may help identify 'windows of opportunity' for interventional strategies that may promote health and prevent or mitigate disease processes.

  8. Foundations of data-intensive science: Technology and practice for high throughput, widely distributed, data management and analysis systems

    Science.gov (United States)

    Johnston, William; Ernst, M.; Dart, E.; Tierney, B.

    2014-04-01

    Today's large-scale science projects involve world-wide collaborations that depend on moving massive amounts of data from an instrument to potentially thousands of computing and storage systems at hundreds of collaborating institutions to accomplish their science. This is true for ATLAS and CMS at the LHC, and it is true for the climate sciences, Belle-II at the KEK collider, genome sciences, the SKA radio telescope, and ITER, the international fusion energy experiment. DOE's Office of Science has been collecting science discipline and instrument requirements for network-based data management and analysis for more than a decade. As a result, certain key issues are seen across essentially all science disciplines that rely on the network for significant data transfer, even if the data quantities are modest compared to projects like the LHC experiments. These issues are what this talk will address; to wit: 1. Optical signal transport advances enabling 100 Gb/s circuits that span the globe on optical fiber, with each fiber carrying 100 such channels; 2. Network router and switch requirements to support high-speed international data transfer; 3. Data transport (TCP is still the norm) requirements to support high-speed international data transfer (e.g. error-free transmission); 4. Network monitoring and testing techniques and infrastructure to maintain the required error-free operation of the many R&E networks involved in international collaborations; 5. Operating system evolution to support very high-speed network I/O; 6. New network architectures and services in the LAN (campus) and WAN networks to support data-intensive science; 7. Data movement and management techniques and software that can maximize the throughput on the network connections between distributed data handling systems, and; 8. New approaches to widely distributed workflow systems that can support the data movement and analysis required by the science. All of these areas must be addressed to enable large

  9. Comparing the normalization methods for the differential analysis of Illumina high-throughput RNA-Seq data.

    Science.gov (United States)

    Li, Peipei; Piao, Yongjun; Shon, Ho Sun; Ryu, Keun Ho

    2015-10-28

    Recently, rapid improvements in technology and decrease in sequencing costs have made RNA-Seq a widely used technique to quantify gene expression levels. Various normalization approaches have been proposed, owing to the importance of normalization in the analysis of RNA-Seq data. A comparison of recently proposed normalization methods is required to generate suitable guidelines for the selection of the most appropriate approach for future experiments. In this paper, we compared eight non-abundance (RC, UQ, Med, TMM, DESeq, Q, RPKM, and ERPKM) and two abundance estimation normalization methods (RSEM and Sailfish). The experiments were based on real Illumina high-throughput RNA-Seq of 35- and 76-nucleotide sequences produced in the MAQC project and simulation reads. Reads were mapped to the human genome obtained from the UCSC Genome Browser Database. For precise evaluation, we investigated the Spearman correlation between the normalization results from RNA-Seq and MAQC qRT-PCR values for 996 genes. Based on this work, we showed that out of the eight non-abundance estimation normalization methods, RC, UQ, Med, TMM, DESeq, and Q gave similar normalization results for all data sets. For RNA-Seq of a 35-nucleotide sequence, RPKM showed the highest correlation results, but for RNA-Seq of a 76-nucleotide sequence, it showed the lowest correlation among the methods. ERPKM did not improve on RPKM. Between the two abundance estimation normalization methods, for RNA-Seq of a 35-nucleotide sequence, higher correlation was obtained with Sailfish than with RSEM, which was better than not using abundance estimation methods. However, for RNA-Seq of a 76-nucleotide sequence, the results achieved by RSEM were similar to those obtained without applying abundance estimation methods, and were much better than with Sailfish. Furthermore, we found that adding a poly-A tail increased alignment numbers, but did not improve normalization results. Spearman correlation analysis revealed that RC, UQ
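
    A minimal sketch of one of the non-abundance normalizations named above (RPKM) together with the Spearman check against qRT-PCR values; the counts, gene lengths and qRT-PCR numbers are invented for illustration and the MAQC data are not reproduced here.

    ```python
    # Minimal sketch: RPKM normalization of raw counts and a Spearman correlation check
    # against (hypothetical) qRT-PCR values.
    import numpy as np
    from scipy.stats import spearmanr

    def rpkm(counts, gene_lengths_bp):
        """counts: raw read counts per gene; gene_lengths_bp: gene lengths in base pairs."""
        per_million = counts.sum() / 1e6           # library size in millions of reads
        rpm = counts / per_million                 # reads per million
        return rpm / (gene_lengths_bp / 1e3)       # divide by gene length in kilobases

    # Toy data: 5 genes with raw counts, lengths, and invented qRT-PCR values.
    counts = np.array([120, 3400, 560, 80, 9000], dtype=float)
    lengths = np.array([1500, 2200, 900, 4000, 1200], dtype=float)
    qrt_pcr = np.array([0.8, 15.2, 4.1, 0.1, 60.3])

    values = rpkm(counts, lengths)
    rho, p = spearmanr(values, qrt_pcr)
    print(f"RPKM: {np.round(values, 2)}, Spearman rho = {rho:.2f} (p = {p:.3f})")
    ```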

  10. Sensitivity analysis of tracer transport in variably saturated soils at USDA-ARS OPE3 field site

    Science.gov (United States)

    The objective of this study was to assess the effects of uncertainties in hydrologic and geochemical parameters on the results of simulations of the tracer transport in variably saturated soils at the USDA-ARS OPE3 field site. A tracer experiment with a pulse of KCl solution applied to an irrigatio...

  11. Analysis of products of thymine irradiated by 18O8+ ion beam in N2O saturated aqueous solution

    International Nuclear Information System (INIS)

    Cai Xichen; Wei Zengquan; Li Wenjian; Liang Jianping; Li Qiang

    1999-01-01

    Capillary gas chromatography methods, such as GC, GC-MS and GC-FT-IR, are used to analyze the products of thymine irradiated by an 18O8+ ion beam in N2O-saturated aqueous solution. From the GC-MS results the molecular weights of the products can be determined, and from the GC-FT-IR results some information on their molecular structure can be obtained. In this way the products 5,6-dihydrothymine, 5-hydroxy-5-methylhydantoin, 5-hydroxy-6-hydrothymine, 5-hydro-6-hydroxythymine, 5-hydroxymethyluracil, trans-thymine glycol, cis-thymine glycol and dimers are identified without separating them from the samples. Though these products are the same as those of thymine irradiated by γ rays in N2O-saturated aqueous solution, the mechanism of thymine irradiation by a heavy ion beam in aqueous solution differs from that by γ rays. The main products of thymine irradiated by the 18O8+ ion beam in N2O-saturated aqueous solution are hydroxyl adducts at the 5,6 bond of thymine, while the main products of thymine irradiated by γ rays in N2O-saturated aqueous solution are thymine dimers

  12. Estimation of changes in saturation and pressure from 4D seismic AVO and time-shift analysis

    NARCIS (Netherlands)

    Trani, M.; Arts, R.; Leeuwenburgh, O.; Brouwer, J.

    2011-01-01

    A reliable estimate of reservoir pressure and fluid saturation changes from time-lapse seismic data is difficult to obtain. Existing methods generally suffer from leakage between the estimated parameters. We propose a new method using different combinations of time-lapse seismic attributes based on

  13. Analysis of grain growth process in melt spun Fe-B alloys under the initial saturated grain boundary segregation condition

    International Nuclear Information System (INIS)

    Chen, Z.; Liu, F.; Yang, X.Q.; Fan, Y.; Shen, C.J.

    2012-01-01

    Highlights: → We compared pure kinetic, pure thermodynamic and extended thermo-kinetic models. → An initial saturated GB segregation condition of nanoscale Fe-B alloys was determined. → The controlling mechanism was proposed using two characteristic times (t_1 and t_2). - Abstract: The grain growth process in melt-spun low-solid-solubility Fe-B alloys was analyzed under the initial saturated grain boundary (GB) segregation condition. Applying the melt spinning technique, single-phase supersaturated nanograins were prepared. Grain growth behavior of the single-phase supersaturated nanograins was investigated by performing isothermal annealing at 700 °C. Combined with the effect of GB segregation on the initial GB excess amount, the thermo-kinetic model [Chen et al., Acta Mater. 57 (2009) 1466] was extended to describe the initial GB segregation condition of nanoscale Fe-B alloys. By comparing the pure kinetic model, the pure thermodynamic model and the extended thermo-kinetic model, an initial saturated GB segregation condition was determined. The controlling mechanism of grain growth under the initial saturated GB segregation condition was proposed using two characteristic annealing times (t_1 and t_2), comprising a mainly kinetic-controlled process (t ≤ t_1), a transition from kinetic control to thermodynamic control (t_1 < t < t_2) and a purely thermodynamic-controlled process (t ≥ t_2).

  14. Automation in Cytomics: A Modern RDBMS Based Platform for Image Analysis and Management in High-Throughput Screening Experiments

    NARCIS (Netherlands)

    E. Larios (Enrique); Y. Zhang (Ying); K. Yan (Kuan); Z. Di; S. LeDévédec (Sylvia); F.E. Groffen (Fabian); F.J. Verbeek

    2012-01-01

    In cytomics, bookkeeping of the data generated during lab experiments is crucial. The current approach in cytomics is to conduct High-Throughput Screening (HTS) experiments so that cells can be tested under many different experimental conditions. Given the large amount of different

  15. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    Science.gov (United States)

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  16. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin-film SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.

  17. Saturated Zone Colloid Transport

    International Nuclear Information System (INIS)

    H. S. Viswanathan

    2004-01-01

    This scientific analysis provides retardation factors for colloids transporting in the saturated zone (SZ) and the unsaturated zone (UZ). These retardation factors represent the reversible chemical and physical filtration of colloids in the SZ. The value of the colloid retardation factor, R_col, is dependent on several factors, such as colloid size, colloid type, and geochemical conditions (e.g., pH, Eh, and ionic strength). These factors are folded into the distributions of R_col that have been developed from field and experimental data collected under varying geochemical conditions with different colloid types and sizes. Attachment rate constants, k_att, and detachment rate constants, k_det, of colloids to the fracture surface have been measured for the fractured volcanics, and separate R_col uncertainty distributions have been developed for attachment and detachment to clastic material and mineral grains in the alluvium. Radionuclides such as plutonium and americium sorb mostly (90 to 99 percent) irreversibly to colloids (BSC 2004 [DIRS 170025], Section 6.3.3.2). The colloid retardation factors developed in this analysis are needed to simulate the transport of radionuclides that are irreversibly sorbed onto colloids; this transport is discussed in the model report ''Site-Scale Saturated Zone Transport'' (BSC 2004 [DIRS 170036]). Although it is not exclusive to any particular radionuclide release scenario, this scientific analysis especially addresses those scenarios pertaining to evidence from waste-degradation experiments, which indicate that plutonium and americium may be irreversibly attached to colloids for the time scales of interest. A section of this report will also discuss the validity of using microspheres as analogs to colloids in some of the lab and field experiments used to obtain the colloid retardation factors. In addition, a small fraction of colloids travels with the groundwater without any significant retardation. Radionuclides irreversibly

  18. Relationship between intraoperative regional cerebral oxygen saturation trends and cognitive decline after total knee replacement: a post-hoc analysis.

    Science.gov (United States)

    Salazar, Fátima; Doñate, Marta; Boget, Teresa; Bogdanovich, Ana; Basora, Misericordia; Torres, Ferran; Gracia, Isabel; Fàbregas, Neus

    2014-01-01

    Bilateral regional brain oxygen saturation (rSO2) trends, reflecting intraoperative brain oxygen imbalance, could warn of brain dysfunction. Various types of cognitive impairment, such as memory decline, alterations in executive function or subjective complaints, have been described three months after surgery. Our aim was to explore the potential utility of rSO2 values as a warning sign for the development of different types of decline in postoperative psychological function. Observational post-hoc analysis of data for the patient sample (n = 125) of a previously conducted clinical trial in patients over the age of 65 years undergoing total knee replacement under spinal anesthesia. Demographic, hemodynamic and bilateral rSO2 intraoperative values were recorded. An absolute rSO2 value of 20% or >25% below baseline were chosen as relevant cutoffs. Composite function test scores were created from baseline to three months for each patient and adjusted for the mean (SD) score changes for a control group (n = 55). Tests were used to assess visual-motor coordination and executive function (VM-EF) (Wechsler Digit Symbol-Coding and Visual Reproduction, Trail Making Test) and memory (Auditory Verbal Learning, Wechsler Memory Scale); scales were used to assess psychological symptoms. We observed no differences in baseline rSO2 values; rSO2 decreased significantly in all patients during surgery (P < …). Left and right rSO2 values were asymmetric in patients who had memory decline (mean [SD] left-right ratio of 95.03 [8.51] vs 101.29 [6.7] for patients with no changes, P = 0.0012). The mean right-left difference in rSO2 was also significant in these patients (-2.87% [4.73%], lower on the right, P = 0.0034). Detection of a trend to asymmetry in rSO2 values can warn of possible postoperative onset of memory decline. Psychological symptoms and memory decline were common three months after knee replacement in our patients over the age of 65 years.

  19. DETERMINATION OF SATURATION VAPOR PRESSURE OF LOW VOLATILE SUBSTANCES THROUGH THE STUDY OF EVAPORATION RATE BY THERMOGRAVIMETRIC ANALYSIS

    Directory of Open Access Journals (Sweden)

    R. V. Ralys

    2015-11-01

    Full Text Available Subject of Study. Measuring the vapor pressure of low-volatility substances is a complicated problem, both because the direct experiment is difficult to implement and, most significantly, because of issues with the correctness of the analysis and processing of the experimental data. That is why reference substances (with well-studied vapor pressures) are usually required, which drastically reduces the effectiveness of the experimental methods used and narrows their applicability. The paper deals with an approach to describing the evaporation (sublimation) of low-volatility substances based on a molecular-kinetic description that takes diffusive and convective processes into account. The proposed approach relies on experimental thermogravimetric findings over a wide range of temperatures, purge gas flow rates and time. Method. The new approach calculates the vapor pressure from the evaporation rate measured by thermogravimetric analysis as a function of temperature, purge gas flow rate and evaporation time. The basis for the calculation is the diffusion-kinetic description of the evaporation process (mass loss of the substance from the exposed surface). The method is applicable to determining the thermodynamic characteristics of both evaporation (the liquid-vapor equilibrium) and sublimation (the solid-vapor equilibrium). We propose a corresponding experimental procedure and data analysis for finding the saturated vapor pressure of individual substances of low volatility. Main Results. The method has been tested on substances whose thermodynamic characteristics are insufficiently studied but which, despite this, are often used as reference substances (because of the limitations of other data). The vaporization (liquid-vapor) process has been studied for di-n-butyl phthalate C16H22O4 at 323.15-443.15 K, and sublimation for benzoic acid C7H6O2 at 303.15-183.15 K. Both processes have
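
    For orientation only, the sketch below shows the simpler, classical Langmuir route from an isothermal TGA mass-loss rate to a vapor pressure. The diffusion-kinetic description proposed in the paper is more elaborate (it accounts for purge gas flow and diffusion); the numbers and the vaporization coefficient alpha below are assumptions used purely for illustration.

    ```python
    # Illustrative only: the classical Langmuir relation between isothermal mass-loss rate
    # and vapor pressure, p = (1/alpha) * (dm/dt / A) * sqrt(2*pi*R*T / M). The paper's
    # diffusion-kinetic model is more elaborate; alpha is often calibrated with a reference
    # substance, and all numbers here are hypothetical.
    import math

    R = 8.314  # J / (mol K)

    def langmuir_pressure(mass_loss_rate_kg_s, area_m2, molar_mass_kg_mol, T_K, alpha=1.0):
        flux = mass_loss_rate_kg_s / area_m2                  # mass flux from the exposed surface
        return flux * math.sqrt(2.0 * math.pi * R * T_K / molar_mass_kg_mol) / alpha

    # Hypothetical numbers for benzoic acid (M = 0.12212 kg/mol) at 330 K:
    p = langmuir_pressure(mass_loss_rate_kg_s=2.0e-9, area_m2=3.0e-5,
                          molar_mass_kg_mol=0.12212, T_K=330.0, alpha=1.0)
    print(f"estimated vapor pressure ~ {p:.3f} Pa")
    ```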

  20. Complete relaxation and conformational exchange matrix (CORCEMA) analysis of intermolecular saturation transfer effects in reversibly forming ligand-receptor complexes.

    Science.gov (United States)

    Jayalakshmi, V; Krishna, N Rama

    2002-03-01

    Two recent applications of intermolecular NOE (INOE) experiments to biomolecular systems are (i) the saturation transfer difference NMR (STD-NMR) method and (ii) the intermolecular cross-saturation NMR (ICS-NMR) experiment. STD-NMR is a promising tool for rapid screening of a large library of compounds to identify bioactive ligands binding to a target protein. Additionally, it is also useful in mapping the binding epitopes presented by a bioactive ligand to its target protein. In this latter application, the STD-NMR technique is essentially similar to the ICS-NMR experiment, which is used to map protein-protein or protein-nucleic acid contact surfaces in complexes. In this work, we present a complete relaxation and conformational exchange matrix (CORCEMA) theory (H. N. B. Moseley et al., J. Magn. Reson. B 108, 243-261 (1995)) applicable to these two closely related experiments. As in our previous work, we show that when exchange is fast on the relaxation rate scale, a simplified CORCEMA theory can be formulated using a generalized average relaxation rate matrix. Its range of validity is established by comparing its predictions with those of the exact CORCEMA theory, which is valid for all exchange rates. Using some ideal model systems, we have analyzed the factors that influence the ligand proton intensity changes when the resonances from some protons on the receptor protein are saturated. The results show that the intensity changes in the ligand signals in an intermolecular NOE experiment are very much dependent upon: (1) the saturation time, (2) the location of the saturated receptor protons with respect to the ligand protons, (3) the conformation of the ligand-receptor interface, (4) the rotational correlation times for the molecular species, (5) the kinetics of the reversibly forming complex, and (6) the ligand/receptor ratio. As an example of a typical application of the STD-NMR experiment we have also simulated the STD effects for a
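
    The toy calculation below illustrates the saturation-transfer bookkeeping that CORCEMA generalizes: ligand z-magnetization relaxing under a dipolar relaxation matrix while selected receptor protons are held saturated. Chemical exchange between free and bound ligand, the essential ingredient of the full CORCEMA treatment, is omitted, and all rate constants are arbitrary illustrative values.

    ```python
    # Toy saturation-transfer calculation (no chemical exchange, arbitrary rates):
    # ligand z-magnetization evolving under a relaxation matrix while two receptor
    # protons are held saturated (Mz = 0).
    import numpy as np
    from scipy.linalg import expm

    # Relaxation matrix R (s^-1) for [ligand H, receptor H1, receptor H2]:
    # diagonals are auto-relaxation rates, off-diagonals are cross-relaxation rates.
    R = np.array([[ 1.0, -0.3, -0.1],
                  [-0.3,  2.0, -0.8],
                  [-0.1, -0.8,  2.0]])

    M0 = np.ones(3)              # equilibrium z-magnetizations (normalized)
    lig, sat = [0], [1, 2]       # index 0 = ligand proton; 1 and 2 = saturated receptor protons

    R_LL = R[np.ix_(lig, lig)]   # ligand-ligand block
    R_LS = R[np.ix_(lig, sat)]   # ligand-receptor block

    def std_fraction(t_sat):
        """Fractional ligand intensity change after t_sat seconds of receptor saturation."""
        # dm/dt = -R_LL m - R_LS (M_sat - M0_sat), with saturated spins held at M = 0.
        forcing = R_LS @ (-M0[sat])                   # constant drive from the saturated spins
        m_inf = np.linalg.solve(R_LL, -forcing)       # steady-state deviation of the ligand spin
        m_t = m_inf - expm(-R_LL * t_sat) @ m_inf     # transient build-up from m(0) = 0
        return float(-m_t[0] / M0[lig][0])            # positive value = intensity loss (STD effect)

    for t in (0.5, 1.0, 2.0, 4.0):
        print(f"STD effect after {t:.1f} s saturation: {std_fraction(t):.3f}")
    ```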

  1. Palm Oil Consumption Increases LDL Cholesterol Compared with Vegetable Oils Low in Saturated Fat in a Meta-Analysis of Clinical Trials.

    Science.gov (United States)

    Sun, Ye; Neelakantan, Nithya; Wu, Yi; Lote-Oke, Rashmi; Pan, An; van Dam, Rob M

    2015-07-01

    Palm oil contains a high amount of saturated fat compared with most other vegetable oils, but studies have reported inconsistent effects of palm oil on blood lipids. We systematically reviewed the effect of palm oil consumption on blood lipids compared with other cooking oils using data from clinical trials. We searched PubMed and the Cochrane Library for trials of at least 2 wk duration that compared the effects of palm oil consumption with any of the predefined comparison oils: vegetable oils low in saturated fat, trans fat-containing partially hydrogenated vegetable oils, and animal fats. Data were pooled by using random-effects meta-analysis. Palm oil significantly increased LDL cholesterol by 0.24 mmol/L (95% CI: 0.13, 0.35 mmol/L; I² = 83.2%) compared with vegetable oils low in saturated fat. This effect was observed in randomized trials (0.31 mmol/L; 95% CI: 0.20, 0.42 mmol/L) but not in nonrandomized trials (0.03 mmol/L; 95% CI: -0.15, 0.20 mmol/L; P-difference = 0.02). Among randomized trials, only modest heterogeneity in study results remained after considering the test oil dose and the comparison oil type (I² = 27.5%). Palm oil increased HDL cholesterol by 0.02 mmol/L (95% CI: 0.01, 0.04 mmol/L; I² = 49.8%) compared with vegetable oils low in saturated fat and by 0.09 mmol/L (95% CI: 0.06, 0.11 mmol/L; I² = 47.8%) compared with trans fat-containing oils. Palm oil consumption results in higher LDL cholesterol than do vegetable oils low in saturated fat and higher HDL cholesterol than do trans fat-containing oils in humans. The effects of palm oil on blood lipids are as expected on the basis of its high saturated fat content, which supports the reduction in palm oil use by replacement with vegetable oils low in saturated and trans fat. This systematic review was registered with the PROSPERO registry at http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42012002601#.VU3wvSGeDRZ as CRD42012002601. © 2015 American Society for Nutrition.
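
    A short sketch of the random-effects pooling (DerSimonian-Laird) and the I² heterogeneity statistic of the kind reported above; the three study effects and variances are made-up numbers, not the trials included in the review.

    ```python
    # DerSimonian-Laird random-effects pooling with Cochran's Q and I^2.
    # The study effects and variances below are invented for illustration.
    import math

    def random_effects(effects, variances):
        w = [1.0 / v for v in variances]                                   # fixed-effect weights
        fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
        q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))      # Cochran's Q
        df = len(effects) - 1
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)                                      # between-study variance
        w_star = [1.0 / (v + tau2) for v in variances]                     # random-effects weights
        pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
        se = math.sqrt(1.0 / sum(w_star))
        i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0                # I^2 in percent
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

    # Hypothetical LDL-cholesterol differences (mmol/L) and variances from three trials:
    effects = [0.30, 0.18, 0.25]
    variances = [0.004, 0.006, 0.003]
    pooled, ci, i2 = random_effects(effects, variances)
    print(f"pooled = {pooled:.2f} mmol/L, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), I2 = {i2:.0f}%")
    ```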

  2. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    Science.gov (United States)

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  3. A conifer-friendly high-throughput α-cellulose extraction method for δ13C and δ18O stable isotope ratio analysis

    Science.gov (United States)

    Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.

    2012-12-01

    Wood stable isotope ratios (δ13C and δ18O) offer insight into water source and plant water use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and to its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the Southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while reducing labor requirements to 2-3 days per batch of samples.

  4. High-throughput analysis of candidate imprinted genes and allele-specific gene expression in the human term placenta

    Directory of Open Access Journals (Sweden)

    Clark Taane G

    2010-04-01

    Full Text Available Abstract Background Imprinted genes show expression from one parental allele only and are important for development and behaviour. This extreme mode of allelic imbalance has been described for approximately 56 human genes. Imprinting status is often disrupted in cancer and dysmorphic syndromes. More subtle variation of gene expression, that is not parent-of-origin specific, termed 'allele-specific gene expression' (ASE), is more common and may give rise to milder phenotypic differences. Using two allele-specific high-throughput technologies alongside bioinformatics predictions, normal term human placenta was screened to find new imprinted genes and to ascertain the extent of ASE in this tissue. Results Twenty-three family trios of placental cDNA, placental genomic DNA (gDNA) and gDNA from both parents were tested for 130 candidate genes with the Sequenom MassArray system. Six genes were found differentially expressed but none imprinted. The Illumina ASE BeadArray platform was then used to test 1536 SNPs in 932 genes. The array was enriched for the human orthologues of 124 mouse candidate genes from bioinformatics predictions and 10 human candidate imprinted genes from EST database mining. After quality control pruning, a total of 261 informative SNPs (214 genes) remained for analysis. Imprinting with maternal expression was demonstrated for the lymphocyte imprinted gene ZNF331 in human placenta. Two potential differentially methylated regions (DMRs) were found in the vicinity of ZNF331. None of the bioinformatically predicted candidates tested showed imprinting, except for skewed allelic expression in a parent-specific manner observed for PHACTR2, a neighbour of the imprinted PLAGL1 gene. ASE was detected for two or more individuals in 39 candidate genes (18%). Conclusions Both Sequenom and Illumina assays were sensitive enough to study imprinting and strong allelic bias. Previous bioinformatics approaches were not predictive of new imprinted genes

  5. Laboratory analysis of fluid flow and solute transport through a variably saturated fracture embedded in porous tuff

    International Nuclear Information System (INIS)

    Chuang, Y.; Haldeman, W.R.; Rasmussen, T.C.; Evans, D.D.

    1990-02-01

    Laboratory techniques are developed that allow concurrent measurement of unsaturated matrix hydraulic conductivity and fracture transmissivity of fractured rock blocks. Two Apache Leap tuff blocks with natural fractures were removed from near Superior, Arizona, shaped into rectangular prisms, and instrumented in the laboratory. Porous ceramic plates provided solution to block tops at regulated pressures. Infiltration tests were performed on both test blocks. Steady flow testing of the saturated first block provided estimates of matrix hydraulic conductivity and fracture transmissivity. Fifteen centimeters of suction applied to the second block top showed that fracture flow was minimal and matrix hydraulic conductivity was an order of magnitude less than the first block saturated matrix conductivity. Coated-wire ion-selective electrodes monitored aqueous chloride breakthrough concentrations. Minute samples of tracer solution were collected with filter paper. The techniques worked well for studying transport behavior at near-saturated flow conditions and also appear to be promising for unsaturated conditions. Breakthrough curves in the fracture and matrix, and a concentration map of chloride concentrations within the fracture, suggest preferential flow paths in the fracture and substantial diffusion into the matrix. Average travel velocity, dispersion coefficient and longitudinal dispersivity in the fracture are obtained. 67 refs., 54 figs., 23 tabs

  6. Accurate CpG and non-CpG cytosine methylation analysis by high-throughput locus-specific pyrosequencing in plants.

    Science.gov (United States)

    How-Kit, Alexandre; Daunay, Antoine; Mazaleyrat, Nicolas; Busato, Florence; Daviaud, Christian; Teyssier, Emeline; Deleuze, Jean-François; Gallusci, Philippe; Tost, Jörg

    2015-07-01

    Pyrosequencing permits accurate quantification of DNA methylation of specific regions, where the proportion of the C/T polymorphism induced by sodium bisulfite treatment of DNA reflects the DNA methylation level. The commercially available high-throughput locus-specific pyrosequencing instruments allow for the simultaneous analysis of 96 samples, but restrict the DNA methylation analysis to CpG dinucleotide sites, which can be limiting in many biological systems. In contrast to mammals, where DNA methylation occurs nearly exclusively on CpG dinucleotides, plant genomes harbor DNA methylation also in other sequence contexts, including CHG and CHH motifs, which cannot be evaluated by these pyrosequencing instruments due to software limitations. Here, we present a complete pipeline for accurate CpG and non-CpG cytosine methylation analysis at single-base resolution using high-throughput locus-specific pyrosequencing. The devised approach includes the design and validation of PCR amplification on bisulfite-treated DNA and pyrosequencing assays, as well as the quantification of the methylation level at every cytosine from the raw peak intensities of the pyrograms by two newly developed Visual Basic Applications. Our method provides accurate and reproducible results, as exemplified by the cytosine methylation analysis of the promoter regions of two tomato genes (NOR and CNR) encoding transcription regulators of fruit ripening during different stages of fruit development. Our results confirmed a significant and temporally coordinated loss of DNA methylation on specific cytosines during the early stages of fruit development in both promoters, as previously shown by WGBS. The manuscript thus describes the first high-throughput locus-specific DNA methylation analysis in plants using pyrosequencing.
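
    The quantification step described above reduces, per cytosine, to the ratio of the C peak to the sum of the C and T peaks after bisulfite conversion. The sketch below shows that arithmetic; the peak intensities are invented and the two Visual Basic Applications mentioned in the abstract are not reproduced.

    ```python
    # Minimal sketch of the per-cytosine quantification: after bisulfite conversion, the
    # methylation level at a cytosine is C / (C + T) from the peak intensities at that
    # position. Peak values below are hypothetical.

    def methylation_level(c_peak: float, t_peak: float) -> float:
        """Fraction methylated at one cytosine position (CpG, CHG or CHH alike)."""
        total = c_peak + t_peak
        if total == 0:
            raise ValueError("no signal at this position")
        return c_peak / total

    # Hypothetical raw peak intensities for three cytosines in a pyrogram:
    positions = {"CpG_1": (850.0, 150.0), "CHG_1": (120.0, 880.0), "CHH_1": (40.0, 960.0)}
    for name, (c, t) in positions.items():
        print(f"{name}: {100 * methylation_level(c, t):.1f}% methylated")
    ```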

  7. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    Science.gov (United States)

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulations in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy background that often plagues automated imaging processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools provide the flexibility to SpheroidSizer in dealing with various types of spheroids and diverse quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model
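
    Once the major and minor axes have been segmented (SpheroidSizer uses an active-contour step for this), the volume estimate is simple geometry. The sketch below uses the common (pi/6)·major·minor² ellipsoid approximation; the exact expression implemented in SpheroidSizer may differ, and the axis values are illustrative.

    ```python
    # Volume estimate from segmented major/minor axes using a prolate-ellipsoid
    # approximation, V = (pi/6) * major * minor^2. Formula and axis values are
    # illustrative, not necessarily identical to SpheroidSizer's implementation.
    import math

    def spheroid_volume_mm3(major_um: float, minor_um: float) -> float:
        """Approximate spheroid volume (mm^3) from axis lengths given in micrometers."""
        major_mm, minor_mm = major_um / 1000.0, minor_um / 1000.0
        return math.pi / 6.0 * major_mm * minor_mm ** 2

    # Hypothetical axis measurements for two imaged spheroids:
    for major, minor in [(620.0, 580.0), (810.0, 640.0)]:
        print(f"{major:.0f} x {minor:.0f} um -> {spheroid_volume_mm3(major, minor):.3f} mm^3")
    ```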

  8. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection.

  9. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    International Nuclear Information System (INIS)

    Hui Su

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection

  10. IEEE 802.11e (EDCA) analysis in the presence of hidden stations

    Directory of Open Access Journals (Sweden)

    Xijie Liu

    2011-07-01

    Full Text Available The key contribution of this paper is the combined analysis of both the saturated and the non-saturated throughput of IEEE 802.11e networks in the presence of hidden stations. This approach is an extension of earlier works by other authors, which provided Markov chain analyses of the IEEE 802.11 family under various assumptions. Our approach also modifies earlier expressions for the probability that a station transmits a packet in a vulnerable period. The numerical results show the impact of the access categories on the channel throughput. Various throughput results under different mechanisms are presented.
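
    For readers unfamiliar with this family of models, the sketch below computes Bianchi-style normalized saturation throughput for basic access with a single access category and no hidden stations; it is the baseline such papers extend, not this paper's EDCA/hidden-station model, and all parameter values are illustrative.

    ```python
    # Bianchi-style saturation throughput for basic access (single access category,
    # no hidden stations). Parameter values (W, m, payload, slot and frame durations)
    # are illustrative only.

    def solve_tau(n, W=32, m=5):
        """Fixed point of tau = 2(1-2p)/[(1-2p)(W+1)+pW(1-(2p)^m)], p = 1-(1-tau)^(n-1)."""
        def residual(tau):
            p = 1.0 - (1.0 - tau) ** (n - 1)
            denom = (1 - 2 * p) * (W + 1) + p * W * (1 - (2 * p) ** m)
            return tau - 2 * (1 - 2 * p) / denom
        lo, hi = 1e-9, 0.5                       # residual < 0 at lo and > 0 at hi
        for _ in range(100):                     # bisection on tau
            mid = 0.5 * (lo + hi)
            lo, hi = (lo, mid) if residual(mid) > 0 else (mid, hi)
        tau = 0.5 * (lo + hi)
        return tau, 1.0 - (1.0 - tau) ** (n - 1)

    def saturation_throughput(n, payload_bits=8184, bitrate=1e6, slot=50e-6,
                              t_success=9e-3, t_collision=8.7e-3):
        tau, _ = solve_tau(n)
        p_tr = 1.0 - (1.0 - tau) ** n                   # at least one station transmits in a slot
        p_s = n * tau * (1.0 - tau) ** (n - 1) / p_tr   # that transmission is successful
        t_payload = payload_bits / bitrate
        denom = (1 - p_tr) * slot + p_tr * p_s * t_success + p_tr * (1 - p_s) * t_collision
        return p_tr * p_s * t_payload / denom           # normalized saturation throughput

    for n in (5, 10, 20, 50):
        print(f"n = {n:2d}: normalized saturation throughput = {saturation_throughput(n):.3f}")
    ```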

  11. Comparative analysis of the apparent saturation hysteresis approach and the domain theory of hysteresis in respect of prediction of scanning curves and air entrapment

    Science.gov (United States)

    Beriozkin, A.; Mualem, Y.

    2018-05-01

    This study theoretically analyzes the concept of apparent saturation hysteresis, combined with the Scott et al. (1983) scaling approach, as suggested by Parker and Lenhard (1987), to account for the effect of air entrapment and release on soil water hysteresis. We found that the theory of Parker and Lenhard (1987) comprises some mutually canceling mathematical operations, and when cleared of the superfluous intermediate calculations, their model reduces to the original scaling method of Scott et al. (1983), supplemented with the requirement of closure of scanning loops. Our analysis reveals that their technique of accounting for entrapped air actually has no effect on the final prediction of the effective saturation (or water content) scanning curves. Our consideration indicates that the use of the Land (1968) formula for assessing the amount of entrapped air is inconsistent with the apparent saturation concept as introduced by Parker and Lenhard (1987). In this paper, a proper routine is suggested for predicting hysteretic scanning curves of any order in the complete hysteretic domain, given the two measured main curves, and some verification tests are carried out against measured results. Accordingly, explicit closed-form formulae for direct prediction (with no need of intermediate calculation) of scanning curves up to the third order are derived to support our analysis.

  12. High-throughput, 384-well, LC-MS/MS CYP inhibition assay using automation, cassette-analysis technique, and streamlined data analysis.

    Science.gov (United States)

    Halladay, Jason S; Delarosa, Erlie Marie; Tran, Daniel; Wang, Leslie; Wong, Susan; Khojasteh, S Cyrus

    2011-08-01

    Here we describe a high capacity and high-throughput, automated, 384-well CYP inhibition assay using well-known HLM-based MS probes. We provide consistently robust IC50 values at the lead optimization stage of the drug discovery process. Our method uses the Agilent Technologies/Velocity11 BioCel 1200 system, timesaving techniques for sample analysis, and streamlined data processing steps. For each experiment, we generate IC50 values for up to 344 compounds and positive controls for five major CYP isoforms (probe substrate): CYP1A2 (phenacetin), CYP2C9 ((S)-warfarin), CYP2C19 ((S)-mephenytoin), CYP2D6 (dextromethorphan), and CYP3A4/5 (testosterone and midazolam). Each compound is incubated separately at four concentrations with each CYP probe substrate under the optimized incubation condition. Each incubation is quenched with acetonitrile containing the deuterated internal standard of the respective metabolite for each probe substrate. To minimize the number of samples to be analyzed by LC-MS/MS and reduce the amount of valuable MS runtime, we utilize the timesaving techniques of cassette analysis (pooling the incubation samples at the end of each CYP probe incubation into one) and column switching (reducing the amount of MS runtime). Here we also report on the comparison of IC50 results for five major CYP isoforms using our method compared to values reported in the literature.
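
    The IC50 values mentioned above come from fitting concentration-response data; a common way to do this is a four-parameter logistic (Hill) fit, sketched below. The probe, compound and percent-activity numbers are invented for illustration and the assay's actual data-processing pipeline is not reproduced.

    ```python
    # Four-parameter logistic (Hill) fit of four-concentration inhibition data to obtain
    # an IC50. Concentrations and percent-activity values are hypothetical.
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, bottom, top, ic50, slope):
        return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

    # Hypothetical % remaining CYP3A4 (midazolam) activity at 0.1, 1, 10, 50 uM of a test compound:
    conc = np.array([0.1, 1.0, 10.0, 50.0])
    activity = np.array([98.0, 80.0, 35.0, 12.0])

    params, _ = curve_fit(hill, conc, activity, p0=[0.0, 100.0, 5.0, 1.0], maxfev=10000)
    bottom, top, ic50, slope = params
    print(f"fitted IC50 ~ {ic50:.1f} uM (Hill slope {slope:.2f})")
    ```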

  13. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  14. Development of finite element code for the analysis of coupled thermo-hydro-mechanical behaviors of saturated-unsaturated medium

    International Nuclear Information System (INIS)

    Ohnishi, Y.; Shibata, H.; Kobayashi, A.

    1985-01-01

    A model is presented which describes the fully coupled thermo-hydro-mechanical behavior of a porous geologic medium. The mathematical formulation for the model utilizes the Biot theory for consolidation and the energy balance equation. The medium is under saturated-unsaturated flow conditions, so free surfaces are taken into consideration in the model. The model, incorporated in a finite element numerical procedure, was implemented in a two-dimensional computer code. The code was developed under the assumptions that the medium is poro-elastic and in plane strain condition; water in the ground does not change its phase; and heat is transferred by conductive and convective flow. Analytical solutions pertaining to consolidation theory for soils and rocks, thermoelasticity for solids, and hydrothermal convection theory provided verification of the stress and fluid flow couplings, respectively, in the coupled model. Several types of problems are analyzed. The first is a study of some of the effects of completely coupled thermo-hydro-mechanical behavior on the response of a saturated-unsaturated porous rock containing a buried heat source. Excavation of an underground opening which holds radioactive wastes at elevated temperatures is also modeled and analyzed. The results show that the coupling phenomena can be estimated to some degree by the numerical procedure. The computer code has a powerful ability to analyze the complex nature of the repository.

  15. The error analysis of the reverse saturation current of the diode in the modeling of photovoltaic modules

    International Nuclear Information System (INIS)

    Wang, Gang; Zhao, Ke; Qiu, Tian; Yang, Xinsheng; Zhang, Yong; Zhao, Yong

    2016-01-01

    In the modeling and simulation of photovoltaic modules, especially in calculating the reverse saturation current of the diode, the series and parallel resistances are often neglected, causing certain errors. We analyzed the errors at the open circuit point, and proposed an iterative algorithm to calculate the modified values of the reverse saturation current, series resistance and parallel resistance of the diode, in order to reduce the errors. Assuming independent irradiation and temperature effects, the irradiation-dependence and the temperature-dependence of the open circuit voltage were introduced to obtain the modified formula of the open circuit voltage under any condition. Experimental results show that this modified formula has high accuracy, even at irradiance as low as 40 W/m2. The errors of open circuit voltage were significantly reduced, indicating that this modified model is suitable for simulations of photovoltaic modules. - Highlights: • We propose a new method for modeling PV modules with higher accuracy. • The errors of open circuit voltage are significantly reduced. • I_o under any condition is calculated.
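
    The open-circuit relation at the heart of the correction discussed above can be written down directly from the single-diode model I = Iph - Io[exp((V + I·Rs)/(a·Vt)) - 1] - (V + I·Rs)/Rp. The sketch below compares the commonly used simplified reverse saturation current (Rs and Rp neglected) with the value corrected for the parallel resistance; the module parameters are illustrative and the paper's full iterative update of Io, Rs and Rp is not reproduced.

    ```python
    # Reverse saturation current at the open-circuit point (I = 0, V = Voc) of the
    # single-diode model, with and without the parallel resistance Rp. Module
    # parameters below are hypothetical.
    import math

    k, q = 1.380649e-23, 1.602176634e-19

    def reverse_saturation_current(i_ph, v_oc, rp, n_ideality, n_cells, temp_k):
        vt = n_cells * n_ideality * k * temp_k / q            # modified thermal voltage of the module
        i0_simple = i_ph / (math.exp(v_oc / vt) - 1.0)        # Rs and Rp neglected
        i0_corrected = (i_ph - v_oc / rp) / (math.exp(v_oc / vt) - 1.0)
        return i0_simple, i0_corrected

    # Illustrative 36-cell module at 25 C: Iph = 8.2 A, Voc = 21.8 V, Rp = 300 ohm.
    simple, corrected = reverse_saturation_current(8.2, 21.8, 300.0, 1.3, 36, 298.15)
    print(f"Io (simplified) = {simple:.3e} A, Io (with Rp) = {corrected:.3e} A, "
          f"relative error = {100 * (simple - corrected) / corrected:.2f}%")
    ```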

  16. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, the author introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, they demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study the single cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, they have identified individual peaks in the fluorescence electropherograms as serotonin released from the granular core on contact with the surrounding fluid.

  17. Product Chemistry and Process Efficiency of Biomass Torrefaction, Pyrolysis and Gasification Studied by High-Throughput Techniques and Multivariate Analysis

    Science.gov (United States)

    Xiao, Li

    Despite the great passion and endless efforts on the development of renewable energy from biomass, the commercialization and scale-up of biofuel production is still under pressure and facing challenges. New ideas and facilities are being tested around the world, targeting reduced cost and improved product value. Cutting-edge technologies involving analytical chemistry, statistical analysis, industrial engineering, computer simulation, mathematical modeling, etc. keep integrating modern elements into this classic research. One of those challenges of commercializing biofuel production is the complexity of the chemical composition of biomass feedstocks and the products. Because of this, feedstock selection and process optimization cannot be conducted efficiently. This dissertation attempts to further evaluate the biomass thermal decomposition process using both traditional methods and an advanced technique (Pyrolysis Molecular Beam Mass Spectrometry). Focus has been placed on generating a database of thermal decomposition products from biomass at different temperatures, finding the relationship between traditional methods and advanced techniques, evaluating process efficiency and optimizing reaction conditions, comparing typically utilized biomass feedstocks and searching for innovative species for economically viable feedstock preparation concepts, etc. Lab-scale quartz tube reactors and 80-µl stainless steel sample cups coupled with an auto-sampling system were utilized to simulate the complicated reactions that happen in real fluidized or entrained-flow reactors. The two main high-throughput analytical techniques used are Near Infrared Spectroscopy (NIR) and Pyrolysis Molecular Beam Mass Spectrometry (Py-MBMS). Mass balance, carbon balance, and product distribution are presented in detail. Thermal decomposition temperatures ranged from 200°C to 950°C. Feedstocks used in the study involve typical hardwood and softwood (red oak, white oak, yellow poplar, loblolly pine

  18. Multiplexed ChIP-Seq Using Direct Nucleosome Barcoding: A Tool for High-Throughput Chromatin Analysis.

    Science.gov (United States)

    Chabbert, Christophe D; Adjalley, Sophie H; Steinmetz, Lars M; Pelechano, Vicent

    2018-01-01

    Chromatin immunoprecipitation followed by sequencing (ChIP-Seq) or microarray hybridization (ChIP-on-chip) are standard methods for the study of transcription factor binding sites and histone chemical modifications. However, these approaches only allow profiling of a single factor or protein modification at a time. In this chapter, we present Bar-ChIP, a higher-throughput version of ChIP-Seq that relies on the direct ligation of molecular barcodes to chromatin fragments. Bar-ChIP enables the concurrent profiling of multiple DNA-protein interactions and is therefore amenable to experimental scale-up, without the need for any robotic instrumentation.

  19. Saturated Zone Colloid Transport

    Energy Technology Data Exchange (ETDEWEB)

    H. S. Viswanathan

    2004-10-07

    This scientific analysis provides retardation factors for colloids transporting in the saturated zone (SZ) and the unsaturated zone (UZ). These retardation factors represent the reversible chemical and physical filtration of colloids in the SZ. The value of the colloid retardation factor, R{sub col} is dependent on several factors, such as colloid size, colloid type, and geochemical conditions (e.g., pH, Eh, and ionic strength). These factors are folded into the distributions of R{sub col} that have been developed from field and experimental data collected under varying geochemical conditions with different colloid types and sizes. Attachment rate constants, k{sub att}, and detachment rate constants, k{sub det}, of colloids to the fracture surface have been measured for the fractured volcanics, and separate R{sub col} uncertainty distributions have been developed for attachment and detachment to clastic material and mineral grains in the alluvium. Radionuclides such as plutonium and americium sorb mostly (90 to 99 percent) irreversibly to colloids (BSC 2004 [DIRS 170025], Section 6.3.3.2). The colloid retardation factors developed in this analysis are needed to simulate the transport of radionuclides that are irreversibly sorbed onto colloids; this transport is discussed in the model report ''Site-Scale Saturated Zone Transport'' (BSC 2004 [DIRS 170036]). Although it is not exclusive to any particular radionuclide release scenario, this scientific analysis especially addresses those scenarios pertaining to evidence from waste-degradation experiments, which indicate that plutonium and americium may be irreversibly attached to colloids for the time scales of interest. A section of this report will also discuss the validity of using microspheres as analogs to colloids in some of the lab and field experiments used to obtain the colloid retardation factors. In addition, a small fraction of colloids travels with the groundwater without any significant
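
    For orientation, a relation of this general form is standard for reversible colloid attachment (the record does not write out the exact formula it uses): the colloid retardation factor is commonly expressed in terms of the attachment and detachment rate constants as

      R_col = 1 + k_att / k_det

    so colloids that attach quickly and detach slowly are strongly retarded relative to the groundwater, while R_col approaches 1 when detachment dominates.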

  20. Healthcare Costs Associated with an Adequate Intake of Sugars, Salt and Saturated Fat in Germany: A Health Econometrical Analysis.

    Directory of Open Access Journals (Sweden)

    Toni Meier

    Full Text Available Non-communicable diseases (NCDs) represent not only the major driver for quality-restricted and lost life years; NCDs and their related medical treatment costs also pose a substantial economic burden on healthcare and intra-generational tax distribution systems. The main objective of this study was therefore to quantify the economic burden of unbalanced nutrition in Germany--in particular the effects of an excessive consumption of fat, salt and sugar--and to examine different reduction scenarios on this basis. In this study, the avoidable direct cost savings in the German healthcare system attributable to an adequate intake of saturated fatty acids (SFA), salt and sugar (mono- & disaccharides, MDS) were calculated. To this end, disease-specific healthcare cost data from the official Federal Health Monitoring for the years 2002-2008 and disease-related risk factors, obtained by thoroughly searching the literature, were used. A total of 22 clinical endpoints with 48 risk-outcome pairs were considered. Direct healthcare costs attributable to an unbalanced intake of fat, salt and sugar are calculated to be 16.8 billion EUR (CI95%: 6.3-24.1 billion EUR) in the year 2008, which represents 7% (CI95% 2%-10%) of the total treatment costs in Germany (254 billion EUR). This is equal to 205 EUR per person annually. The excessive consumption of sugar poses the highest burden, at 8.6 billion EUR (CI95%: 3.0-12.1); salt ranks 2nd at 5.3 billion EUR (CI95%: 3.2-7.3) and saturated fat ranks 3rd at 2.9 billion EUR (CI95%: 32 million-4.7 billion). Predicted direct healthcare cost savings by means of a balanced intake of sugars, salt and saturated fat are substantial. However, as this study solely considered direct medical treatment costs regarding an adequate consumption of fat, salt and sugars, the actual societal and economic gains, resulting both from direct and indirect cost savings, may easily exceed 16.8 billion EUR.

  1. Healthcare Costs Associated with an Adequate Intake of Sugars, Salt and Saturated Fat in Germany: A Health Econometrical Analysis.

    Science.gov (United States)

    Meier, Toni; Senftleben, Karolin; Deumelandt, Peter; Christen, Olaf; Riedel, Katja; Langer, Martin

    2015-01-01

    Non-communicable diseases (NCDs) represent not only the major driver for quality-restricted and lost life years; NCDs and their related medical treatment costs also pose a substantial economic burden on healthcare and intra-generational tax distribution systems. The main objective of this study was therefore to quantify the economic burden of unbalanced nutrition in Germany--in particular the effects of an excessive consumption of fat, salt and sugar--and to examine different reduction scenarios on this basis. In this study, the avoidable direct cost savings in the German healthcare system attributable to an adequate intake of saturated fatty acids (SFA), salt and sugar (mono- & disaccharides, MDS) were calculated. To this end, disease-specific healthcare cost data from the official Federal Health Monitoring for the years 2002-2008 and disease-related risk factors, obtained by thoroughly searching the literature, were used. A total of 22 clinical endpoints with 48 risk-outcome pairs were considered. Direct healthcare costs attributable to an unbalanced intake of fat, salt and sugar are calculated to be 16.8 billion EUR (CI95%: 6.3-24.1 billion EUR) in the year 2008, which represents 7% (CI95% 2%-10%) of the total treatment costs in Germany (254 billion EUR). This is equal to 205 EUR per person annually. The excessive consumption of sugar poses the highest burden, at 8.6 billion EUR (CI95%: 3.0-12.1); salt ranks 2nd at 5.3 billion EUR (CI95%: 3.2-7.3) and saturated fat ranks 3rd at 2.9 billion EUR (CI95%: 32 million-4.7 billion). Predicted direct healthcare cost savings by means of a balanced intake of sugars, salt and saturated fat are substantial. However, as this study solely considered direct medical treatment costs regarding an adequate consumption of fat, salt and sugars, the actual societal and economic gains, resulting both from direct and indirect cost savings, may easily exceed 16.8 billion EUR.
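
    The per-person figure quoted above follows directly from the total when divided by Germany's population (roughly 82 million in 2008, a figure not stated in the record):

      16.8 billion EUR / 82 million persons ≈ 205 EUR per person per year.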

  2. Building blocks for the development of an interface for high-throughput thin layer chromatography/ambient mass spectrometric analysis: a green methodology.

    Science.gov (United States)

    Cheng, Sy-Chyi; Huang, Min-Zong; Wu, Li-Chieh; Chou, Chih-Chiang; Cheng, Chu-Nian; Jhang, Siou-Sian; Shiea, Jentaie

    2012-07-17

    Interfacing thin layer chromatography (TLC) with ambient mass spectrometry (AMS) has been an important area of analytical chemistry because of its capability to rapidly separate and characterize the chemical compounds. In this study, we have developed a high-throughput TLC-AMS system using building blocks to deal, deliver, and collect the TLC plate through an electrospray-assisted laser desorption ionization (ELDI) source. This is the first demonstration of the use of building blocks to construct and test the TLC-MS interfacing system. With the advantages of being readily available, cheap, reusable, and extremely easy to modify without consuming any material or reagent, the use of building blocks to develop the TLC-AMS interface is undoubtedly a green methodology. The TLC plate delivery system consists of a storage box, plate dealing component, conveyer, light sensor, and plate collecting box. During a TLC-AMS analysis, the TLC plate was sent to the conveyer from a stack of TLC plates placed in the storage box. As the TLC plate passed through the ELDI source, the chemical compounds separated on the plate would be desorbed by laser desorption and subsequently postionized by electrospray ionization. The samples, including a mixture of synthetic dyes and extracts of pharmaceutical drugs, were analyzed to demonstrate the capability of this TLC-ELDI/MS system for high-throughput analysis.

  3. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Full Text Available Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next generation, high throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments--regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency corrected analysis. When compared to other commonly used algorithms, the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
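
    As an illustration of the general idea only (a generic sketch of efficiency-corrected quantification normalized to input cell count and a universal reference sample; it is not the authors' published algorithm, and all names and numbers are hypothetical):

      # Generic sketch: efficiency-corrected qPCR quantification normalized to
      # the input sample quantity (e.g., cell count), expressed relative to a
      # universal reference sample measured in every batch.

      def relative_expression(cq_sample, cq_reference, efficiency, n_cells):
          """Expression per input cell, relative to the universal reference.

          cq_sample    -- quantification cycle of the target in the sample
          cq_reference -- quantification cycle of the same target in the
                          universal reference sample (e.g., commercial cDNA)
          efficiency   -- amplification factor per cycle (2.0 = perfect doubling)
          n_cells      -- number of cells (input quantity) behind the sample
          """
          fold_vs_reference = efficiency ** (cq_reference - cq_sample)
          return fold_vs_reference / n_cells

      # Example with made-up numbers: a target amplifying at ~95% efficiency
      # (1.95-fold per cycle) measured in a lysate of 10,000 cells.
      print(relative_expression(cq_sample=26.1, cq_reference=24.0,
                                efficiency=1.95, n_cells=10_000))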

  4. High-Throughput Image Analysis of Fibrillar Materials: A Case Study on Polymer Nanofiber Packing, Alignment, and Defects in Organic Field Effect Transistors.

    Science.gov (United States)

    Persson, Nils E; Rafshoon, Joshua; Naghshpour, Kaylie; Fast, Tony; Chu, Ping-Hsun; McBride, Michael; Risteen, Bailey; Grover, Martha; Reichmanis, Elsa

    2017-10-18

    High-throughput discovery of process-structure-property relationships in materials through an informatics-enabled empirical approach is an increasingly utilized technique in materials research due to the rapidly expanding availability of data. Here, process-structure-property relationships are extracted for the nucleation, growth, and deposition of semiconducting poly(3-hexylthiophene) (P3HT) nanofibers used in organic field effect transistors, via high-throughput image analysis. This study is performed using an automated image analysis pipeline combining existing open-source software and new algorithms, enabling the rapid evaluation of structural metrics for images of fibrillar materials, including local orientational order, fiber length density, and fiber length distributions. We observe that microfluidic processing leads to fibers that pack with unusually high density, while sonication yields fibers that pack sparsely with low alignment. This is attributed to differences in their crystallization mechanisms. P3HT nanofiber packing during thin film deposition exhibits behavior suggesting that fibers are confined to packing in two-dimensional layers. We find that fiber alignment, a feature correlated with charge carrier mobility, is driven by increasing fiber length, and that shorter fibers tend to segregate to the buried dielectric interface during deposition, creating potentially performance-limiting defects in alignment. Another barrier to perfect alignment is the curvature of P3HT fibers; we propose a mechanistic simulation of fiber growth that reconciles both this curvature and the log-normal distribution of fiber lengths inherent to the fiber populations under consideration.
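
    For context, one generic way to compute an orientational order metric from extracted fiber angles is the two-dimensional nematic order parameter; the sketch below is an illustration only and is not the authors' open-source pipeline:

      import numpy as np

      def nematic_order_2d(angles_rad):
          """2D orientational (nematic) order parameter.

          S = sqrt(<cos 2t>^2 + <sin 2t>^2), where t is the fiber angle; S is
          near 1 for well-aligned fibers and near 0 for an isotropic
          distribution. The factor of 2 makes t and t + pi equivalent, as is
          appropriate for fibers without a head or tail.
          """
          a = np.asarray(angles_rad)
          return float(np.hypot(np.cos(2 * a).mean(), np.sin(2 * a).mean()))

      rng = np.random.default_rng(0)
      aligned = rng.normal(loc=0.3, scale=0.05, size=500)   # tightly aligned fibers
      isotropic = rng.uniform(0.0, np.pi, size=500)         # random orientations
      print(nematic_order_2d(aligned), nematic_order_2d(isotropic))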

  5. Parallel workflow for high-throughput (>1,000 samples/day) quantitative analysis of human insulin-like growth factor 1 using mass spectrometric immunoassay.

    Directory of Open Access Journals (Sweden)

    Paul E Oran

    Full Text Available Insulin-like growth factor 1 (IGF1) is an important biomarker for the management of growth hormone disorders. Recently there has been rising interest in deploying mass spectrometric (MS) methods of detection for measuring IGF1. However, widespread clinical adoption of any MS-based IGF1 assay will require increased throughput and speed to justify the costs of analyses, and robust industrial platforms that are reproducible across laboratories. Presented here is an MS-based quantitative IGF1 assay with performance rating of >1,000 samples/day, and a capability of quantifying IGF1 point mutations and posttranslational modifications. The throughput of the IGF1 mass spectrometric immunoassay (MSIA) benefited from a simplified sample preparation step, IGF1 immunocapture in a tip format, and high-throughput MALDI-TOF MS analysis. The Limit of Detection and Limit of Quantification of the resulting assay were 1.5 μg/L and 5 μg/L, respectively, with intra- and inter-assay precision CVs of less than 10%, and good linearity and recovery characteristics. The IGF1 MSIA was benchmarked against commercially available IGF1 ELISA via Bland-Altman method comparison test, resulting in a slight positive bias of 16%. The IGF1 MSIA was employed in an optimized parallel workflow utilizing two pipetting robots and MALDI-TOF-MS instruments synced into one-hour phases of sample preparation, extraction and MSIA pipette tip elution, MS data collection, and data processing. Using this workflow, high-throughput IGF1 quantification of 1,054 human samples was achieved in approximately 9 hours. This rate of assaying is a significant improvement over existing MS-based IGF1 assays, and is on par with that of the enzyme-based immunoassays. Furthermore, a mutation was detected in ∼1% of the samples (SNP: rs17884626, creating an A→T substitution at position 67 of the IGF1), demonstrating the capability of IGF1 MSIA to detect point mutations and posttranslational modifications.

  6. Metagenomic analysis and functional characterization of the biogas microbiome using high throughput shotgun sequencing and a novel binning strategy

    DEFF Research Database (Denmark)

    Campanaro, Stefano; Treu, Laura; Kougias, Panagiotis

    2016-01-01

    Biogas production is an economically attractive technology that has gained momentum worldwide over the past years. Biogas is produced by a biologically mediated process, widely known as "anaerobic digestion." This process is performed by a specialized and complex microbial community, in which...... performed using >400 proteins revealed that the biogas community is a trove of new species. A new approach based on functional properties as per network representation was developed to assign roles to the microbial species. The organization of the anaerobic digestion microbiome is resembled by a funnel...... on the phylogenetic and functional characterization of the microbial community populating biogas reactors. By applying for the first time high-throughput sequencing and a novel binning strategy, the identified genes were anchored to single genomes providing a clear understanding of their metabolic pathways...

  7. Human Leukocyte Antigen Typing Using a Knowledge Base Coupled with a High-Throughput Oligonucleotide Probe Array Analysis

    Science.gov (United States)

    Zhang, Guang Lan; Keskin, Derin B.; Lin, Hsin-Nan; Lin, Hong Huang; DeLuca, David S.; Leppanen, Scott; Milford, Edgar L.; Reinherz, Ellis L.; Brusic, Vladimir

    2014-01-01

    Human leukocyte antigens (HLA) are important biomarkers because multiple diseases, drug toxicity, and vaccine responses reveal strong HLA associations. Current clinical HLA typing is an elimination process requiring serial testing. We present an alternative in situ synthesized DNA-based microarray method that contains hundreds of thousands of probes representing a complete overlapping set covering 1,610 clinically relevant HLA class I alleles accompanied by computational tools for assigning HLA type to 4-digit resolution. Our proof-of-concept experiment included 21 blood samples, 18 cell lines, and multiple controls. The method is accurate, robust, and amenable to automation. Typing errors were restricted to homozygous samples or those with very closely related alleles from the same locus, but readily resolved by targeted DNA sequencing validation of flagged samples. High-throughput HLA typing technologies that are effective, yet inexpensive, can be used to analyze the world’s populations, benefiting both global public health and personalized health care. PMID:25505899

  8. Extensive impact of saturated fatty acids on metabolic and cardiovascular profile in rats with diet-induced obesity: a canonical analysis.

    Science.gov (United States)

    Oliveira Junior, Silvio A; Padovani, Carlos R; Rodrigues, Sergio A; Silva, Nilza R; Martinez, Paula F; Campos, Dijon Hs; Okoshi, Marina P; Okoshi, Katashi; Dal-Pai, Maeli; Cicogna, Antonio C

    2013-04-15

    Although hypercaloric interventions are associated with nutritional, endocrine, metabolic, and cardiovascular disorders in obesity experiments, a rational distinction between the effects of excess adiposity and the individual roles of dietary macronutrients in relation to these disturbances has not previously been studied. This investigation analyzed the correlation between ingested macronutrients (including sucrose and saturated and unsaturated fatty acids) plus body adiposity and metabolic, hormonal, and cardiovascular effects in rats with diet-induced obesity. Normotensive Wistar-Kyoto rats were submitted to Control (CD; 3.2 Kcal/g) and Hypercaloric (HD; 4.6 Kcal/g) diets for 20 weeks followed by nutritional evaluation involving body weight and adiposity measurement. Metabolic and hormonal parameters included glycemia, insulin, insulin resistance, and leptin. Cardiovascular analysis included systolic blood pressure profile, echocardiography, morphometric study of myocardial morphology, and myosin heavy chain (MHC) protein expression. Canonical correlation analysis was used to evaluate the relationships between dietary macronutrients plus adiposity and metabolic, hormonal, and cardiovascular parameters. Although final group body weights did not differ, HD presented higher adiposity than CD. Diet induced hyperglycemia while insulin and leptin levels remained unchanged. In a cardiovascular context, systolic blood pressure increased with time only in HD. Additionally, in vivo echocardiography revealed cardiac hypertrophy and improved systolic performance in HD compared to CD; and while cardiomyocyte size was unchanged by diet, nuclear volume and collagen interstitial fraction both increased in HD. Also HD exhibited higher relative β-MHC content and β/α-MHC ratio than their Control counterparts. Importantly, body adiposity was weakly associated with cardiovascular effects, as saturated fatty acid intake was directly associated with most cardiac remodeling
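
    For readers unfamiliar with the statistical approach named above: canonical correlation analysis finds paired linear combinations of two blocks of variables (here, dietary intake plus adiposity versus metabolic, hormonal and cardiovascular outcomes) that are maximally correlated. Below is a minimal generic sketch using scikit-learn on synthetic data; it is illustrative only and does not reproduce the study's variables or data:

      import numpy as np
      from sklearn.cross_decomposition import CCA

      rng = np.random.default_rng(42)
      n_rats = 40

      # Block X: e.g., saturated fat, unsaturated fat, sucrose intake, adiposity.
      # Block Y: e.g., glycemia, insulin, leptin, systolic blood pressure.
      X = rng.normal(size=(n_rats, 4))
      Y = 0.7 * X[:, [0]] + rng.normal(scale=0.5, size=(n_rats, 4))  # shared axis

      cca = CCA(n_components=2)
      X_scores, Y_scores = cca.fit_transform(X, Y)

      # Canonical correlations: correlation between each pair of canonical variates.
      for i in range(2):
          print(i, np.corrcoef(X_scores[:, i], Y_scores[:, i])[0, 1])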

  9. Gluon saturation in a saturated environment

    International Nuclear Information System (INIS)

    Kopeliovich, B. Z.; Potashnikova, I. K.; Schmidt, Ivan

    2011-01-01

    A bootstrap equation for self-quenched gluon shadowing leads to a reduced magnitude of broadening for partons propagating through a nucleus. Saturation of small-x gluons in a nucleus, which has the form of transverse momentum broadening of projectile gluons in pA collisions in the nuclear rest frame, leads to a modification of the parton distribution functions in the beam compared with pp collisions. In nucleus-nucleus collisions all participating nucleons acquire enhanced gluon density at small x, which boosts further the saturation scale. Solution of the reciprocity equations for central collisions of two heavy nuclei demonstrates a significant, up to several times, enhancement of the saturation scale Q{sub sA}{sup 2} in AA compared with pA collisions.

  10. Throughput and Fairness of Collision Avoidance Protocols in Ad Hoc Networks

    National Research Council Canada - National Science Library

    Garcia-Luna-Aceves, J. J; Wang, Yu

    2004-01-01

    .... In Section 1, the authors present an analytical model to derive the saturation throughput of these sender-initiated collision avoidance protocols in multi-hop ad hoc networks with nodes randomly...
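
    Saturation throughput here means the throughput reached when every node always has a packet ready to send. As a concrete, deliberately simplified illustration of how such analyses are typically set up, the sketch below solves the classic Bianchi-style fixed point for a single-hop CSMA/CA network; it is not the multi-hop, sender-initiated model of this record, and the slot-time parameters are placeholder values:

      # Simplified Bianchi-style saturation throughput for a single-hop CSMA/CA
      # network with n always-backlogged nodes (illustrative parameters only).

      def saturation_throughput(n, W=32, m=5, payload=50.0, slot=1.0, ts=90.0, tc=70.0):
          def tau_from_p(p):
              # Per-node transmission probability given conditional collision prob. p.
              return 2.0 * (1.0 - 2.0 * p) / (
                  (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m))

          # Solve the fixed point tau = tau_from_p(1 - (1 - tau)^(n-1)) by bisection.
          lo, hi = 1e-9, 1.0 - 1e-9
          for _ in range(100):
              mid = 0.5 * (lo + hi)
              p = 1.0 - (1.0 - mid) ** (n - 1)
              if mid < tau_from_p(p):
                  lo = mid
              else:
                  hi = mid
          tau = 0.5 * (lo + hi)

          p_tr = 1.0 - (1.0 - tau) ** n                  # P(some node transmits in a slot)
          p_s = n * tau * (1.0 - tau) ** (n - 1) / p_tr  # P(transmission succeeds)
          denom = (1.0 - p_tr) * slot + p_tr * p_s * ts + p_tr * (1.0 - p_s) * tc
          return p_s * p_tr * payload / denom            # fraction of time carrying payload

      for n in (5, 10, 20, 50):
          print(n, round(saturation_throughput(n), 3))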

  11. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from next-generation sequencing-based studies. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (for time series of >20 time points with no delay and >30 with a delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach that integrates the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
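
    To make the computational motivation concrete, the sketch below shows a generic permutation p-value for an association score between two series; it is an illustration only, not the eLSA local trend scoring or the Markov-chain approximation described in the record:

      import numpy as np

      def permutation_pvalue(x, y, score_fn, n_perm=10_000, seed=0):
          """Right-tailed permutation p-value for an association score.

          Each permutation shuffles y, destroying any real association with x;
          the p-value is the fraction of shuffled scores at least as large as
          the observed score (with the +1 correction so p is never exactly 0).
          """
          rng = np.random.default_rng(seed)
          observed = score_fn(x, y)
          hits = sum(score_fn(x, rng.permutation(y)) >= observed for _ in range(n_perm))
          return (hits + 1) / (n_perm + 1)

      rng = np.random.default_rng(1)
      x = rng.normal(size=30)
      y = x + rng.normal(scale=0.8, size=30)
      abs_corr = lambda a, b: abs(np.corrcoef(a, b)[0, 1])
      print(permutation_pvalue(x, y, abs_corr))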

  12. High throughput screening of phenoxy carboxylic acids with dispersive solid phase extraction followed by direct analysis in real time mass spectrometry.

    Science.gov (United States)

    Wang, Jiaqin; Zhu, Jun; Si, Ling; Du, Qi; Li, Hongli; Bi, Wentao; Chen, David Da Yong

    2017-12-15

    A high-throughput, low environmental impact methodology for rapid determination of phenoxy carboxylic acids (PCAs) in water samples was developed by combining dispersive solid phase extraction (DSPE) using velvet-like graphitic carbon nitride (V-g-C3N4) and direct analysis in real time mass spectrometry (DART-MS). Due to the large surface area and good dispersity of V-g-C3N4, the DSPE of PCAs in water was completed within 20 s, and the elution of PCAs was accomplished in 20 s as well using methanol. The eluents were then analyzed and quantified using a DART ionization source coupled to a high resolution mass spectrometer, with an internal standard added to the samples. The limit of detection ranged from 0.5 ng L-1 to 2 ng L-1 on the basis of a 50 mL water sample; the recovery was 79.9-119.1%; and the relative standard deviation 0.23%-9.82% (≥5 replicates). With the ease of use and speed of DART-MS, the whole protocol can be completed within minutes, including sample preparation, extraction, elution, detection and quantitation. The methodology developed here is simple, fast, sensitive, quantitative, requires little sample preparation and consumes significantly less toxic organic solvent, and can be used for high throughput screening of PCAs and potentially other contaminants in water. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Evaluation of Colloid Retention Site Dominance in Variably Saturated Porous Media: An All Pores Pore-Scale Analysis

    Science.gov (United States)

    Morales, Veronica; Perez-Reche, Francisco; Holzner, Markus; Kinzelbach, Wolfgang

    2016-04-01

    It is well accepted that colloid and nanoparticle transport processes in porous media differ substantially between water-saturated and unsaturated conditions. Differences are frequently ascribed to particle immobilization by association with gas interfaces, as well as to restrictions of the liquid medium through which colloids are transported. Yet, the current understanding of the importance of particle retention at gas interfaces is based on observations of single pores or two-dimensional pore network representations, leaving open the question of their statistical significance when all pores in the medium are considered. In order to address this question, column experiments were performed using a model porous medium of glass beads through which silver particles were transported under conditions of varying water content and water chemistry. X-ray microtomography was subsequently employed as a non-destructive imaging technique to obtain pore-scale information for the entire column regarding: i) the presence and distribution of the main locations where colloids can become retained (water-solid, air-water, air-solid, and air-water-solid interfaces, grain-grain contacts, and the bulk liquid), ii) deposition profiles of colloids along the column classified by the available retention location, and iii) channel widths of three-dimensional pore-water network representations. The results presented provide a direct statistical evaluation of the significance of colloid retention by attachment to interfaces or by straining at contact points where multiple interfaces meet.

  14. Transcriptome-Wide Analysis of Botrytis elliptica Responsive microRNAs and Their Targets in Lilium Regale Wilson by High-Throughput Sequencing and Degradome Analysis

    Directory of Open Access Journals (Sweden)

    Xue Gao

    2017-05-01

    Full Text Available MicroRNAs, as master regulators of gene expression, have been widely identified and play crucial roles in plant-pathogen interactions. A fatal pathogen, Botrytis elliptica, causes a serious foliar disease of lily, which reduces production because of the high susceptibility of most cultivated species. However, the miRNAs related to Botrytis infection of lily, and the miRNA-mediated gene regulatory networks providing resistance to B. elliptica in lily, remain largely unexplored. To systematically dissect B. elliptica-responsive miRNAs and their target genes, three small RNA libraries were constructed from the leaves of Lilium regale, a promising Chinese wild Lilium species, which had been subjected to mock B. elliptica treatment or B. elliptica infection for 6 and 24 h. By high-throughput sequencing, 71 known miRNAs belonging to 47 conserved families and 24 novel miRNAs were identified, of which 18 miRNAs were downregulated and 13 were upregulated in response to B. elliptica. Moreover, based on the lily mRNA transcriptome, 22 targets for 9 known and 1 novel miRNAs were identified by the degradome sequencing approach. Most target genes of the B. elliptica-responsive miRNAs were involved in metabolic processes, with a few encoding different transcription factors, including ELONGATION FACTOR 1 ALPHA (EF1a) and TEOSINTE BRANCHED1/CYCLOIDEA/PROLIFERATING CELL FACTOR 2 (TCP2). Furthermore, the expression patterns of a set of B. elliptica-responsive miRNAs and their targets were validated by quantitative real-time PCR. This study represents the first transcriptome-based analysis of miRNAs responsive to B. elliptica and their targets in lily. The results reveal the possible regulatory roles of miRNAs and their targets in the lily-B. elliptica interaction, which will extend our understanding of the mechanisms of this disease in lily.

  15. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes.

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare . However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop

  16. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq—A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    Directory of Open Access Journals (Sweden)

    Karolina Chwialkowska

    2017-11-01

    Full Text Available Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation

  18. The transport behaviour of elemental mercury DNAPL in saturated porous media: analysis of field observations and two-phase flow modelling.

    Science.gov (United States)

    Sweijen, Thomas; Hartog, Niels; Marsman, Annemieke; Keijzer, Thomas J S

    2014-06-01

    Mercury is a contaminant of global concern. The use of elemental mercury in various (former) industrial processes, such as chlorine production at chlor-alkali plants, is known to have resulted in soil and groundwater contaminations worldwide. However, the subsurface transport behaviour of elemental mercury as an immiscible dense non-aqueous phase liquid (DNAPL) in porous media has received minimal attention to date, even though such insight would aid in the remediation of mercury-contaminated sites. Therefore, in this study a detailed field characterization of the elemental mercury DNAPL distribution with depth was performed together with two-phase flow modelling, using STOMP, to evaluate the dynamics of mercury DNAPL migration and the controls on its distribution in saturated porous media. Using a CPT-probe mounted with a digital camera, the in-situ mercury DNAPL depth distribution was obtained at a former chlor-alkali plant, down to 9 m below ground surface. Images revealing the presence of silvery mercury DNAPL droplets were used to quantify its distribution, characteristics and saturation, using an image analysis method. These field observations with depth were compared with results from a one-dimensional two-phase flow model simulation for the same transect. Considering the limitations of this approach, the simulations reasonably reflected the variability and range of the mercury DNAPL distribution. To further explore the impact of mercury's physical properties in comparison with more common DNAPLs, the migration of mercury and PCE DNAPL in several typical hydrological scenarios was simulated. Comparison of the simulations suggests that mercury's higher density is the overall controlling factor for its penetration into saturated porous media, despite its higher resistance to flow due to its higher viscosity. Based on these results, the hazard of spilled mercury DNAPL to cause deep contamination of groundwater systems seems larger than for any other
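
    One way to see why density dominates (a standard textbook argument for DNAPL invasion, not taken from this record): a connected DNAPL pool of height h can enter a water-saturated pore throat of effective radius r once the buoyancy-driven pressure at its base exceeds the capillary entry pressure,

      (ρ_DNAPL − ρ_w) g h ≥ P_e = 2 σ cosθ / r,

    so mercury's very high density (about 13.5 g/cm3, versus roughly 1.6 g/cm3 for PCE) sharply lowers the critical pool height needed for downward migration, whereas its higher viscosity only slows the rate of that migration.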

  19. Analysis of sources of bulk conductivity change in saturated silica sand after unbuffered TCE oxidation by permanganate.

    Science.gov (United States)

    Hort, Ryan D; Revil, André; Munakata-Marr, Junko

    2014-09-01

    Time lapse resistivity surveys could potentially improve monitoring of permanganate-based in situ chemical oxidation (ISCO) of organic contaminants such as trichloroethene (TCE) by tracking changes in subsurface conductivity that result from injection of permanganate and oxidation of the contaminant. Bulk conductivity and pore fluid conductivity changes during unbuffered TCE oxidation using permanganate are examined through laboratory measurements and conductivity modeling using PHREEQC in fluid samples and porous media samples containing silica sand. In fluid samples, oxidation of one TCE molecule produces three chloride ions and one proton, resulting in an increase in fluid electrical conductivity despite the loss of two permanganate ions in the reaction. However, in saturated sand samples in which up to 8mM TCE was oxidized, at least 94% of the fluid conductivity associated with the presence of protons was removed within 3h of sand contact, most likely through protonation of silanol groups found on the surface of the sand grains. Minor conductivity effects most likely associated with pH-dependent reductive dissolution of manganese dioxide were also observed but not accounted for in pore-fluid conductivity modeling. Unaccounted conductivity effects resulted in an under-calculation of post-reaction pore fluid conductivity of 2.1% to 5.5%. Although small increases in the porous media formation factor resulting from precipitation of manganese dioxide were detected (about 3%), these increases could not be confirmed to be statistically significant. Both injection of permanganate and oxidation of TCE cause increases in bulk conductivity that would be detectable through time-lapse resistivity surveys in field conditions. Copyright © 2014 Elsevier B.V. All rights reserved.
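
    The conductivity argument above follows from the commonly cited stoichiometry of permanganate oxidation of TCE, stated here for reference (the balanced reaction is not written out in the record):

      C2HCl3 + 2 MnO4- → 2 CO2 + 2 MnO2(s) + 3 Cl- + H+

    Two singly charged permanganate ions are consumed, but three chloride ions and one highly mobile proton are produced, so the pore fluid conductivity rises unless the protons are removed, for example by protonation of silanol groups on the sand surface.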

  20. High throughput and accurate serum proteome profiling by integrated sample preparation technology and single-run data independent mass spectrometry analysis.

    Science.gov (United States)

    Lin, Lin; Zheng, Jiaxin; Yu, Quan; Chen, Wendong; Xing, Jinchun; Chen, Chenxi; Tian, Ruijun

    2018-03-01

    Mass spectrometry (MS)-based serum proteome analysis is extremely challenging due to its high complexity and dynamic range of protein abundances. Developing a high-throughput and accurate serum proteomic profiling approach capable of analyzing large cohorts is urgently needed for biomarker discovery. Herein, we report a streamlined workflow for fast and accurate proteomic profiling from 1 μL of blood serum. The workflow combined an integrated technique for highly sensitive and reproducible sample preparation and a new data-independent acquisition (DIA)-based MS method. Compared with the standard data-dependent acquisition (DDA) approach, the optimized DIA method doubled the number of detected peptides and proteins with better reproducibility. Without protein immunodepletion and prefractionation, the single-run DIA analysis enables quantitative profiling of over 300 proteins with a 50 min gradient time. The quantified proteins span more than five orders of magnitude of abundance range and contain over 50 FDA-approved disease markers. The workflow allowed us to analyze 20 serum samples per day, with about 358 protein groups per sample being identified. A proof-of-concept study on renal cell carcinoma (RCC) serum samples confirmed the feasibility of the workflow for large-scale serum proteomic profiling and disease-related biomarker discovery. Blood serum or plasma is the predominant specimen for clinical proteomic studies, yet its analysis is extremely challenging because of its high complexity. Many past efforts in serum proteomics have focused on maximizing protein identifications, whereas few have been concerned with throughput and reproducibility. Here, we establish a rapid, robust and highly reproducible DIA-based workflow for streamlined serum proteomic profiling from 1 μL of serum. The workflow does not need protein depletion or pre-fractionation, while still being able to detect disease-relevant proteins accurately. The workflow is promising in clinical application

  1. Uplink SDMA with Limited Feedback: Throughput Scaling

    Directory of Open Access Journals (Sweden)

    Jeffrey G. Andrews

    2008-01-01

    Full Text Available Combined space division multiple access (SDMA) and scheduling exploit both spatial multiplexing and multiuser diversity, increasing throughput significantly. Both SDMA and scheduling require feedback of multiuser channel state information (CSI). This paper focuses on uplink SDMA with limited feedback, which refers to efficient techniques for CSI quantization and feedback. To quantify the throughput of uplink SDMA and derive design guidelines, the throughput scaling with system parameters is analyzed. The specific parameters considered include the numbers of users, antennas, and feedback bits. Furthermore, different SNR regimes and beamforming methods are considered. The derived throughput scaling laws are observed to change for different SNR regimes. For instance, the throughput scales logarithmically with the number of users in the high SNR regime but double logarithmically in the low SNR regime. The analysis of throughput scaling suggests guidelines for scheduling in uplink SDMA. For example, to maximize throughput scaling, scheduling should use the criterion of minimum quantization errors for the high SNR regime and maximum channel power for the low SNR regime.
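
    To give a feel for the difference between the two scaling regimes quoted above, the snippet below simply tabulates log N against log log N for growing numbers of users (a numerical illustration only, not a derivation of the scaling laws):

      import math

      # Logarithmic vs. double-logarithmic growth in the number of users N.
      for n_users in (10, 100, 1_000, 10_000, 100_000):
          print(n_users, round(math.log(n_users), 2), round(math.log(math.log(n_users)), 2))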

  2. Gold nanoparticle-mediated (GNOME) laser perforation: a new method for a high-throughput analysis of gap junction intercellular coupling.

    Science.gov (United States)

    Begandt, Daniela; Bader, Almke; Antonopoulos, Georgios C; Schomaker, Markus; Kalies, Stefan; Meyer, Heiko; Ripken, Tammo; Ngezahayo, Anaclet

    2015-10-01

    The present report evaluates the advantages of using the gold nanoparticle-mediated laser perforation (GNOME LP) technique as a computer-controlled cell optoperforation method to introduce Lucifer yellow (LY) into cells in order to analyze gap junction coupling in cell monolayers. To permeabilize GM-7373 endothelial cells grown in a 24-well plate with GNOME LP, a laser beam of 88 μm in diameter was applied in the presence of gold nanoparticles and LY. After 10 min to allow dye uptake and diffusion through gap junctions, we observed a LY-positive cell band of 179 ± 8 μm width. The presence of the gap junction channel blocker carbenoxolone during the optoperforation reduced the LY-positive band to 95 ± 6 μm. Additionally, a forskolin-related enhancement of gap junction coupling, recently found using the scrape loading technique, was also observed using GNOME LP. Further, automatic cell imaging and subsequent semi-automatic quantification of the images using a Java-based ImageJ plugin were performed in a high-throughput sequence. Moreover, GNOME LP was used on cells such as RBE4 rat brain endothelial cells, which cannot be mechanically scraped, as well as on three-dimensionally cultivated cells, opening the possibility of implementing the GNOME LP technique for analysis of gap junction coupling in tissues. We conclude that the GNOME LP technique allows a high-throughput automated analysis of gap junction coupling in cells. Moreover, this non-invasive technique could be used on monolayers that do not support mechanical scraping, as well as on cells in tissue, allowing in vivo/ex vivo analysis of gap junction coupling.

  3. Analysis of the laminar Newtonian fluid flow through a thin fracture modelled as a fluid-saturated sparsely packed porous medium

    Energy Technology Data Exchange (ETDEWEB)

    Pazanin, Igor [Zagreb Univ. (Croatia). Dept. of Mathematics; Siddheshwar, Pradeep G. [Bangalore Univ., Bengaluru (India). Dept. of Mathematics

    2017-06-01

    In this article we investigate the fluid flow through a thin fracture modelled as a fluid-saturated porous medium. We assume that the fracture has constrictions and that the flow is governed by the prescribed pressure drop between the edges of the fracture. The problem is described by the Darcy-Lapwood-Brinkman model acknowledging the Brinkman extension of the Darcy law as well as the flow inertia. Using asymptotic analysis with respect to the thickness of the fracture, we derive the explicit higher-order approximation for the velocity distribution. We make an error analysis to comment on the order of accuracy of the method used and also to provide rigorous justification for the model.
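
    For orientation, one commonly quoted form of the Darcy-Lapwood-Brinkman momentum balance for steady flow through a fluid-saturated, sparsely packed porous medium is (notation and coefficients vary between authors, and this is not necessarily the exact form analyzed in the record):

      (ρ/φ²) (u · ∇)u = −∇p − (μ/K) u + μ_e ∇²u,

    where φ is the porosity, K the permeability, μ the fluid viscosity and μ_e the effective (Brinkman) viscosity; the Darcy drag term −(μ/K)u is supplemented by the Brinkman viscous term and by the Lapwood (convective inertia) term on the left-hand side.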

  4. Rapid evaporative ionization mass spectrometry for high-throughput screening in food analysis: The case of boar taint.

    Science.gov (United States)

    Verplanken, Kaat; Stead, Sara; Jandova, Renata; Poucke, Christof Van; Claereboudt, Jan; Bussche, Julie Vanden; Saeger, Sarah De; Takats, Zoltan; Wauters, Jella; Vanhaecke, Lynn

    2017-07-01

    Boar taint is a contemporary off-odor present in meat of uncastrated male pigs. As European Member States intend to abandon surgical castration of pigs by 2018, this off-odor has attracted considerable research interest. In this study, rapid evaporative ionization mass spectrometry (REIMS) was explored for the rapid detection of boar taint in neck fat. Untargeted screening of samples (n=150) enabled discrimination between sow, tainted boar and untainted boar samples. The obtained OPLS-DA models showed excellent classification accuracy, i.e. 99% and 100% for sow and boar samples or solely boar samples, respectively. Furthermore, the obtained models demonstrated excellent validation characteristics (R2(Y)=0.872-0.969; Q2(Y)=0.756-0.917), which were confirmed by CV-ANOVA. Highly accurate and high-throughput (<10 s) classification of tainted and untainted boar samples was achieved, rendering REIMS a promising technique for predictive modelling in food safety and quality applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. miRanalyzer: an update on the detection and analysis of microRNAs in high-throughput sequencing experiments

    Science.gov (United States)

    Hackenberg, Michael; Rodríguez-Ezpeleta, Naiara; Aransay, Ana M.

    2011-01-01

    We present a new version of miRanalyzer, a web server and stand-alone tool for the detection of known and prediction of new microRNAs in high-throughput sequencing experiments. The new version has been notably improved regarding speed, scope and available features. Alignments are now based on the ultrafast short-read aligner Bowtie (granting also colour space support, allowing mismatches and improving speed) and 31 genomes, including 6 plant genomes, can now be analysed (previous version contained only 7). Differences between plant and animal microRNAs have been taken into account for the prediction models and differential expression of both, known and predicted microRNAs, between two conditions can be calculated. Additionally, consensus sequences of predicted mature and precursor microRNAs can be obtained from multiple samples, which increases the reliability of the predicted microRNAs. Finally, a stand-alone version of the miRanalyzer that is based on a local and easily customized database is also available; this allows the user to have more control on certain parameters as well as to use specific data such as unpublished assemblies or other libraries that are not available in the web server. miRanalyzer is available at http://bioinfo2.ugr.es/miRanalyzer/miRanalyzer.php. PMID:21515631

  6. Comparative analysis and validation of the malachite green assay for the high throughput biochemical characterization of terpene synthases.

    Science.gov (United States)

    Vardakou, Maria; Salmon, Melissa; Faraldos, Juan A; O'Maille, Paul E

    2014-01-01

    Terpenes are the largest group of natural products, with important and diverse biological roles, and are of tremendous economic value as fragrances, flavours and pharmaceutical agents. Class-I terpene synthases (TPSs), the dominant type of TPS enzymes, catalyze the conversion of prenyl diphosphates to often structurally diverse bioactive terpene hydrocarbons and inorganic pyrophosphate (PPi). To measure their kinetic properties, current bio-analytical methods typically rely on the direct detection of hydrocarbon products by radioactivity measurements or gas chromatography-mass spectrometry (GC-MS). In this study we employed an established, rapid colorimetric assay, the pyrophosphate/malachite green assay (MG), as an alternative means for the biochemical characterization of class-I TPS activity.
    • We describe the adaptation of the MG assay for turnover and catalytic efficiency measurements of TPSs.
    • We validate the method by direct comparison with established assays. The agreement of kcat/KM among methods makes this adaptation optimal for rapid evaluation of TPSs.
    • We demonstrate the application of the MG assay for the high-throughput screening of TPS gene libraries.
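
    For context, catalytic efficiency (kcat/KM) is typically obtained by fitting initial rates, here the rate of PPi release reported by the colorimetric readout, to the Michaelis-Menten equation. The sketch below is a generic illustration with made-up numbers, not the authors' protocol:

      import numpy as np
      from scipy.optimize import curve_fit

      def michaelis_menten(s, vmax, km):
          """Initial rate v as a function of substrate concentration s."""
          return vmax * s / (km + s)

      # Hypothetical initial-rate data: substrate (uM) vs. rate (uM PPi per min).
      s = np.array([1, 2, 5, 10, 20, 50, 100.0])
      v = np.array([0.8, 1.4, 2.6, 3.6, 4.4, 5.0, 5.2])

      (vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=(5.0, 5.0))
      enzyme_conc = 0.05                 # uM of enzyme in the assay (hypothetical)
      kcat = vmax / enzyme_conc          # turnover number, per minute
      print(f"kcat = {kcat:.1f} min^-1, KM = {km:.1f} uM, "
            f"kcat/KM = {kcat / km:.2f} uM^-1 min^-1")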

  7. High-throughput analysis of sulfatides in cerebrospinal fluid using automated extraction and UPLC-MS/MS.

    Science.gov (United States)

    Blomqvist, Maria; Borén, Jan; Zetterberg, Henrik; Blennow, Kaj; Månsson, Jan-Eric; Ståhlman, Marcus

    2017-07-01

    Sulfatides (STs) are a group of glycosphingolipids that are highly expressed in brain. Due to their importance for normal brain function and their potential involvement in neurological diseases, development of accurate and sensitive methods for their determination is needed. Here we describe a high-throughput oriented and quantitative method for the determination of STs in cerebrospinal fluid (CSF). The STs were extracted using a fully automated liquid/liquid extraction method and quantified using ultra-performance liquid chromatography coupled to tandem mass spectrometry. With the high sensitivity of the developed method, quantification of 20 ST species from only 100 μl of CSF was performed. Validation of the method showed that the STs were extracted with high recovery (90%) and could be determined with low inter- and intra-day variation. Our method was applied to a patient cohort of subjects with an Alzheimer's disease biomarker profile. Although the total ST levels were unaltered compared with an age-matched control group, we show that the ratio of hydroxylated/nonhydroxylated STs was increased in the patient cohort. In conclusion, we believe that the fast, sensitive, and accurate method described in this study is a powerful new tool for the determination of STs in clinical as well as preclinical settings. Copyright © 2017 by the American Society for Biochemistry and Molecular Biology, Inc.

  8. Patient access in plastic surgery: an operational and financial analysis of service-based interventions to improve ambulatory throughput in an academic surgery practice.

    Science.gov (United States)

    Hultman, Charles Scott; Gilland, Wendell G; Weir, Samuel

    2015-06-01

    Inefficient patient throughput in a surgery practice can result in extended new patient backlogs, excessively long cycle times in the outpatient clinics, poor patient satisfaction, decreased physician productivity, and loss of potential revenue. This project assesses the efficacy of multiple throughput interventions in an academic, plastic surgery practice at a public university. We implemented a Patient Access and Efficiency (PAcE) initiative, funded and sponsored by our health care system, to improve patient throughput in the outpatient surgery clinic. Interventions included: (1) creation of a multidisciplinary team, led by a project redesign manager, that met weekly; (2) definition of goals, metrics, and target outcomes; (3) revision of clinic templates to reflect actual demand; (4) working down patient backlog through group visits; (5) booking new patients across entire practice; (6) assigning a physician's assistant to the preoperative clinic; and (7) designating a central scheduler to coordinate flow of information. Main outcome measures included: patient satisfaction using Press-Ganey surveys; complaints reported to patient relations; time to third available appointment; size of patient backlog; monthly clinic volumes with utilization rates and supply/demand curves; "chaos" rate (cancellations plus reschedules, divided by supply, within 48 hours of booked clinic date); patient cycle times with bottleneck analysis; physician productivity measured by work Relative Value Units (wRVUs); and downstream financial effects on billing, collection, accounts receivable (A/R), and payer mix. We collected, managed, and analyzed the data prospectively, comparing the pre-PAcE period (6 months) with the PAcE period (6 months). The PAcE initiative resulted in multiple improvements across the entire plastic surgery practice. Patient satisfaction increased only slightly from 88.5% to 90.0%, but the quarterly number of complaints notably declined from 17 to 9. Time to third
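
    Written out, the "chaos" rate defined above is simply

      chaos rate = (cancellations + reschedules within 48 h of the booked date) / appointment supply,

    so that, with hypothetical numbers, 12 cancellations plus 18 reschedules against a supply of 400 booked slots gives (12 + 18) / 400 = 7.5%.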

  9. Synthetic Biomaterials to Rival Nature's Complexity-a Path Forward with Combinatorics, High-Throughput Discovery, and High-Content Analysis.

    Science.gov (United States)

    Zhang, Douglas; Lee, Junmin; Kilian, Kristopher A

    2017-10-01

    Cells in tissue receive a host of soluble and insoluble signals in a context-dependent fashion, where integration of these cues through a complex network of signal transduction cascades will define a particular outcome. Biomaterials scientists and engineers are tasked with designing materials that can at least partially recreate this complex signaling milieu towards new materials for biomedical applications. In this progress report, recent advances in high throughput techniques and high content imaging approaches that are facilitating the discovery of efficacious biomaterials are described. From microarrays of synthetic polymers, peptides and full-length proteins, to designer cell culture systems that present multiple biophysical and biochemical cues in tandem, it is discussed how the integration of combinatorics with high content imaging and analysis is essential to extracting biologically meaningful information from large scale cellular screens to inform the design of next generation biomaterials. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. High-throughput and sensitive analysis of 3-monochloropropane-1,2-diol fatty acid esters in edible oils by supercritical fluid chromatography/tandem mass spectrometry.

    Science.gov (United States)

    Hori, Katsuhito; Matsubara, Atsuki; Uchikata, Takato; Tsumura, Kazunobu; Fukusaki, Eiichiro; Bamba, Takeshi

    2012-08-10

    We have established a high-throughput and sensitive analytical method based on supercritical fluid chromatography (SFC) coupled with triple quadrupole mass spectrometry (QqQ MS) for 3-monochloropropane-1,2-diol (3-MCPD) fatty acid esters in edible oils. All analytes were successfully separated within 9 min without sample purification. The system was precise and sensitive, with a limit of detection less than 0.063 mg/kg. The recovery rate of 3-MCPD fatty acid esters spiked into oil samples was in the range of 62.68-115.23%. Furthermore, several edible oils were tested for analyzing 3-MCPD fatty acid ester profiles. This is the first report on the analysis of 3-MCPD fatty acid esters by SFC/QqQ MS. The developed method will be a powerful tool for investigating 3-MCPD fatty acid esters in edible oils. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Uncovering leaf rust responsive miRNAs in wheat (Triticum aestivum L.) using high-throughput sequencing and prediction of their targets through degradome analysis.

    Science.gov (United States)

    Kumar, Dhananjay; Dutta, Summi; Singh, Dharmendra; Prabhu, Kumble Vinod; Kumar, Manish; Mukhopadhyay, Kunal

    2017-01-01

    Deep sequencing identified 497 conserved and 559 novel miRNAs in wheat, while degradome analysis revealed 701 target genes. qRT-PCR demonstrated differential expression of miRNAs during stages of leaf rust progression. Bread wheat (Triticum aestivum L.) is an important cereal food crop feeding 30% of the world population. A major threat to wheat production is rust epidemics. This study was targeted towards identification and functional characterization of micro(mi)RNAs and their target genes in wheat in response to leaf rust ingression. High-throughput sequencing was used for transcriptome-wide identification of miRNAs and their expression profiling in response to leaf rust using mock- and pathogen-inoculated resistant and susceptible near-isogenic wheat plants. A total of 1056 mature miRNAs were identified, of which 497 miRNAs were conserved and 559 miRNAs were novel. The pathogen-inoculated resistant plants manifested more miRNAs compared with the pathogen-infected susceptible plants. The miRNA counts increased in the susceptible isoline due to leaf rust; conversely, the counts decreased in the resistant isoline in response to pathogenesis, illustrating precise spatial tuning of miRNAs during compatible and incompatible interaction. Stem-loop quantitative real-time PCR was used to profile 10 highly differentially expressed miRNAs obtained from high-throughput sequencing data. The spatio-temporal profiling validated the differential expression of miRNAs between the isolines as well as in response to pathogen infection. Degradome analysis provided 701 predicted target genes associated with defense response, signal transduction, development, metabolism, and transcriptional regulation. The obtained results indicate that wheat isolines employ diverse arrays of miRNAs that modulate their target genes during compatible and incompatible interaction. Our findings contribute to increased knowledge of the roles of microRNAs in wheat-leaf rust interactions and could help in rust

  12. High-Throughput Analysis With 96-Capillary Array Electrophoresis and Integrated Sample Preparation for DNA Sequencing Based on Laser Induced Fluorescence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Gang [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    The purpose of this research was to improve the fluorescence detection for the multiplexed capillary array electrophoresis, extend its use beyond the genomic analysis, and to develop an integrated micro-sample preparation system for high-throughput DNA sequencing. The authors first demonstrated multiplexed capillary zone electrophoresis (CZE) and micellar electrokinetic chromatography (MEKC) separations in a 96-capillary array system with laser-induced fluorescence detection. Migration times of four kinds of fluoresceins and six polyaromatic hydrocarbons (PAHs) are normalized to one of the capillaries using two internal standards. The relative standard deviations (RSD) after normalization are 0.6-1.4% for the fluoresceins and 0.1-1.5% for the PAHs. Quantitative calibration of the separations based on peak areas is also performed, again with substantial improvement over the raw data. This opens up the possibility of performing massively parallel separations for high-throughput chemical analysis for process monitoring, combinatorial synthesis, and clinical diagnosis. The authors further improved the fluorescence detection by step laser scanning. A computer-controlled galvanometer scanner is adapted for scanning a focused laser beam across a 96-capillary array for laser-induced fluorescence detection. The signal at a single photomultiplier tube is temporally sorted to distinguish among the capillaries. The limit of detection for fluorescein is 3 × 10⁻¹¹ M (S/N = 3) for 5 mW of total laser power scanned at 4 Hz. The observed cross-talk among capillaries is 0.2%. Advantages include the efficient utilization of light due to the high duty-cycle of step scan, good detection performance due to the reduction of stray light, ruggedness due to the small mass of the galvanometer mirror, low cost due to the simplicity of components, and flexibility due to the independent paths for excitation and emission.
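
    The two-internal-standard normalization described above can be reproduced with a simple two-point linear mapping of each capillary's time axis onto the reference capillary, followed by a relative standard deviation (RSD) calculation. The sketch below is a generic interpretation with invented numbers, not the dissertation's actual procedure.

        import numpy as np

        def two_point_normalize(t, std1, std2, ref_std1, ref_std2):
            """Map migration times t from one capillary onto the reference capillary's
            time axis, using the two internal standards seen at std1/std2 (this capillary)
            and at ref_std1/ref_std2 (the reference capillary)."""
            slope = (ref_std2 - ref_std1) / (std2 - std1)
            return ref_std1 + slope * (t - std1)

        # illustrative migration times (s) of one analyte in four capillaries
        analyte = np.array([182.0, 185.3, 179.8, 183.6])
        std1 = np.array([120.0, 122.1, 118.5, 121.0])   # first internal standard, per capillary
        std2 = np.array([240.0, 244.4, 237.2, 242.1])   # second internal standard, per capillary

        norm = two_point_normalize(analyte, std1, std2, ref_std1=120.0, ref_std2=240.0)
        rsd = 100 * norm.std(ddof=1) / norm.mean()
        print(f"normalized times: {norm.round(2)}, RSD = {rsd:.2f}%")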

  13. Temporal dynamics of soil microbial communities under different moisture regimes: high-throughput sequencing and bioinformatics analysis

    Science.gov (United States)

    Semenov, Mikhail; Zhuravleva, Anna; Semenov, Vyacheslav; Yevdokimov, Ilya; Larionova, Alla

    2017-04-01

    Recent climate scenarios predict not only continued global warming but also an increased frequency and intensity of extreme climatic events such as strong changes in temperature and precipitation regimes. Microorganisms are well known to be more sensitive to changes in environmental conditions than other soil chemical and physical parameters. In this study, we determined the shifts in soil microbial community structure as well as indicative taxa in soils under three moisture regimes using high-throughput Illumina sequencing and a range of bioinformatics approaches for the assessment of sequence data. Incubation experiments were performed in soil-filled (Greyic Phaeozems Albic) rhizoboxes with maize and without plants. Three contrasting moisture regimes were simulated: 1) optimal wetting (OW), watering 2-3 times per week to maintain soil moisture of 20-25% by weight; 2) periodic wetting (PW), with alternating periods of wetting and drought; and 3) constant insufficient wetting (IW), in which soil moisture of 12% by weight was permanently maintained. Sampled fresh soils were homogenized, and the total DNA of three replicates was extracted using the FastDNA® SPIN kit for Soil. DNA replicates were combined in a pooled sample and the DNA was used for PCR with specific primers for the 16S V3 and V4 regions. In order to compare variability between different samples and replicates within a single sample, some DNA replicates were treated separately. The products were purified and submitted to Illumina MiSeq sequencing. Sequence data were evaluated by alpha-diversity (Chao1 and Shannon H' diversity indexes), beta-diversity (UniFrac and Bray-Curtis dissimilarity), heatmap, tagcloud, and plot-bar analyses using the MiSeq Reporter Metagenomics Workflow and R packages (phyloseq, vegan, tagcloud). Shannon index varied in a rather narrow range (4.4-4.9) with the lowest values for microbial communities under PW treatment. Chao1 index varied from 385 to 480, being a more flexible
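
    For reference, the two alpha-diversity measures reported above can be computed from a vector of OTU counts as follows. This is a minimal Python sketch of Shannon H' and bias-corrected Chao1; the OTU counts are invented for illustration, and the R packages named in the abstract (phyloseq, vegan) provide equivalent functions.

        import numpy as np

        def shannon(counts):
            """Shannon diversity H' = -sum(p_i * ln p_i) over observed OTUs."""
            counts = np.asarray(counts, dtype=float)
            p = counts[counts > 0] / counts.sum()
            return float(-(p * np.log(p)).sum())

        def chao1(counts):
            """Bias-corrected Chao1 richness: S_obs + F1*(F1 - 1) / (2*(F2 + 1)),
            where F1 and F2 are the numbers of singleton and doubleton OTUs."""
            counts = np.asarray(counts)
            s_obs = int((counts > 0).sum())
            f1 = int((counts == 1).sum())
            f2 = int((counts == 2).sum())
            return s_obs + f1 * (f1 - 1) / (2.0 * (f2 + 1))

        otu_counts = [120, 45, 3, 1, 1, 2, 0, 7]   # hypothetical OTU counts for one sample
        print(shannon(otu_counts), chao1(otu_counts))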

  14. High-throughput pseudovirion-based neutralization assay for analysis of natural and vaccine-induced antibodies against human papillomaviruses.

    Directory of Open Access Journals (Sweden)

    Peter Sehr

    Full Text Available A highly sensitive, automated, purely add-on, high-throughput pseudovirion-based neutralization assay (HT-PBNA) with excellent repeatability and run-to-run reproducibility was developed for human papillomavirus types (HPV) 16, 18, 31, 45, 52, 58 and bovine papillomavirus type 1. Preparation of 384 well assay plates with serially diluted sera and the actual cell-based assay are separated in time, therefore batches of up to one hundred assay plates can be processed sequentially. A mean coefficient of variation (CV) of 13% was obtained for anti-HPV 16 and HPV 18 titers for a standard serum tested in a total of 58 repeats on individual plates in seven independent runs. Natural antibody response was analyzed in 35 sera from patients with HPV 16 DNA positive cervical intraepithelial neoplasia grade 2+ lesions. The new HT-PBNA is based on Gaussia luciferase with increased sensitivity compared to the previously described manual PBNA (manPBNA) based on secreted alkaline phosphatase as reporter. Titers obtained with HT-PBNA were generally higher than titers obtained with the manPBNA. A good linear correlation (R² = 0.7) was found between HT-PBNA titers and anti-HPV 16 L1 antibody-levels determined by a Luminex bead-based GST-capture assay for these 35 sera and a Kappa-value of 0.72, with only 3 discordant sera in the low titer range. In addition to natural low titer antibody responses the high sensitivity of the HT-PBNA also allows detection of cross-neutralizing antibodies induced by commercial HPV L1-vaccines and experimental L2-vaccines. When analyzing the WHO international standards for HPV 16 and 18 we determined an analytical sensitivity of 0.864 and 1.105 mIU, respectively.

  15. An UPLC-MS/MS method for highly sensitive high-throughput analysis of phytohormones in plant tissues

    Directory of Open Access Journals (Sweden)

    Balcke Gerd Ulrich

    2012-11-01

    Full Text Available Abstract Background Phytohormones are the key metabolites participating in the regulation of multiple functions of the plant organism. Among them, jasmonates, as well as abscisic and salicylic acids, are responsible for triggering and modulating plant reactions targeted against pathogens and herbivores, as well as resistance to abiotic stress (drought, UV-irradiation and mechanical wounding). These factors induce dramatic changes in phytohormone biosynthesis and transport leading to rapid local and systemic stress responses. Understanding of the underlying mechanisms is of principal interest for scientists working in various areas of plant biology. However, highly sensitive, precise and high-throughput methods for quantification of these phytohormones in small samples of plant tissues are still missing. Results Here we present an LC-MS/MS method for fast and highly sensitive determination of jasmonates, abscisic and salicylic acids. A single-step sample preparation procedure based on mixed-mode solid phase extraction was efficiently combined with essential improvements in mobile phase composition yielding higher efficiency of chromatographic separation and MS-sensitivity. This strategy resulted in a dramatic increase in overall sensitivity, allowing successful determination of phytohormones in small (less than 50 mg of fresh weight) tissue samples. The method was completely validated in terms of analyte recovery, sensitivity, linearity and precision. Additionally, it was cross-validated with a well-established GC-MS-based procedure and its applicability to a variety of plant species and organs was verified. Conclusion The method can be applied for the analyses of target phytohormones in small tissue samples obtained from any plant species and/or plant part relying on any commercially available (even less sensitive) tandem mass spectrometry instrumentation.

  16. Fluorescent-magnetic dual-encoded nanospheres: a promising tool for fast-simultaneous-addressable high-throughput analysis

    Science.gov (United States)

    Xie, Min; Hu, Jun; Wen, Cong-Ying; Zhang, Zhi-Ling; Xie, Hai-Yan; Pang, Dai-Wen

    2012-01-01

    Bead-based optical encoding or magnetic encoding techniques are promising in high-throughput multiplexed detection and separation of numerous species under complicated conditions. Therefore, a self-assembly strategy implemented in an organic solvent is put forward to fabricate fluorescent-magnetic dual-encoded nanospheres. Briefly, hydrophobic trioctylphosphine oxide-capped CdSe/ZnS quantum dots (QDs) and oleic acid-capped nano-γ-Fe2O3 magnetic particles are directly, selectively and controllably assembled on branched poly(ethylene imine)-coated nanospheres without any pretreatment, which is crucial to keep the high quantum yield of QDs and good dispersibility of γ-Fe2O3. Owing to the tunability of coating amounts of QDs and γ-Fe2O3 as well as controllable fluorescent emissions of deposited-QDs, dual-encoded nanospheres with different photoluminescent emissions and gradient magnetic susceptibility are constructed. Using this improved layer-by-layer self-assembly approach, deposition of hydrophobic nanoparticles onto hydrophilic carriers in organic media can be easily realized; meanwhile, fluorescent-magnetic dual-functional nanospheres can be further equipped with readable optical and magnetic addresses. The resultant fluorescent-magnetic dual-encoded nanospheres possess both the unique optical properties of QDs and the superparamagnetic properties of γ-Fe2O3, exhibiting good monodispersibility, huge encoding capacity and nanoscale particle size. Compared with the encoded microbeads reported by others, the nanometre scale of the dual-encoded nanospheres gives them minimum steric hindrance and higher flexibility.

  17. Variant-aware saturating mutagenesis using multiple Cas9 nucleases identifies regulatory elements at trait-associated loci.

    Science.gov (United States)

    Canver, Matthew C; Lessard, Samuel; Pinello, Luca; Wu, Yuxuan; Ilboudo, Yann; Stern, Emily N; Needleman, Austen J; Galactéros, Frédéric; Brugnara, Carlo; Kutlar, Abdullah; McKenzie, Colin; Reid, Marvin; Chen, Diane D; Das, Partha Pratim; A Cole, Mitchel; Zeng, Jing; Kurita, Ryo; Nakamura, Yukio; Yuan, Guo-Cheng; Lettre, Guillaume; Bauer, Daniel E; Orkin, Stuart H

    2017-04-01

    Cas9-mediated, high-throughput, saturating in situ mutagenesis permits fine-mapping of function across genomic segments. Disease- and trait-associated variants identified in genome-wide association studies largely cluster at regulatory loci. Here we demonstrate the use of multiple designer nucleases and variant-aware library design to interrogate trait-associated regulatory DNA at high resolution. We developed a computational tool for the creation of saturating-mutagenesis libraries with single or multiple nucleases with incorporation of variants. We applied this methodology to the HBS1L-MYB intergenic region, which is associated with red-blood-cell traits, including fetal hemoglobin levels. This approach identified putative regulatory elements that control MYB expression. Analysis of genomic copy number highlighted potential false-positive regions, thus emphasizing the importance of off-target analysis in the design of saturating-mutagenesis experiments. Together, these data establish a widely applicable high-throughput and high-resolution methodology to identify minimal functional sequences within large disease- and trait-associated regions.

  18. High throughput protein production screening

    Science.gov (United States)

    Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic sequences or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high throughput analysis of protein expression levels, interactions, and functional states.

  19. Saturation of an intra-gene pool linkage map: towards a unified consensus linkage map for fine mapping and synteny analysis in common bean.

    Science.gov (United States)

    Galeano, Carlos H; Fernandez, Andrea C; Franco-Herrera, Natalia; Cichy, Karen A; McClean, Phillip E; Vanderleyden, Jos; Blair, Matthew W

    2011-01-01

    Map-based cloning and fine mapping to find genes of interest and marker assisted selection (MAS) requires good genetic maps with reproducible markers. In this study, we saturated the linkage map of the intra-gene pool population of common bean DOR364 × BAT477 (DB) by evaluating 2,706 molecular markers including SSR, SNP, and gene-based markers. On average the polymorphism rate was 7.7% due to the narrow genetic base between the parents. The DB linkage map consisted of 291 markers with a total map length of 1,788 cM. A consensus map was built using the core mapping populations derived from inter-gene pool crosses: DOR364 × G19833 (DG) and BAT93 × JALO EEP558 (BJ). The consensus map consisted of a total of 1,010 markers mapped, with a total map length of 2,041 cM across 11 linkage groups. On average, each linkage group on the consensus map contained 91 markers of which 83% were single copy markers. Finally, a synteny analysis was carried out using our highly saturated consensus maps compared with the soybean pseudo-chromosome assembly. A total of 772 marker sequences were compared with the soybean genome. A total of 44 syntenic blocks were identified. The linkage group Pv6 presented the most diverse pattern of synteny with seven syntenic blocks, and Pv9 showed the most consistent relations with soybean with just two syntenic blocks. Additionally, a co-linear analysis using common bean transcript map information against soybean coding sequences (CDS) revealed the relationship with 787 soybean genes. The common bean consensus map has allowed us to map a larger number of markers, to obtain a more complete coverage of the common bean genome. Our results, combined with synteny relationships provide tools to increase marker density in selected genomic regions to identify closely linked polymorphic markers for indirect selection, fine mapping or for positional cloning.

  20. A simple, high throughput method to locate single copy sequences from Bacterial Artificial Chromosome (BAC libraries using High Resolution Melt analysis

    Directory of Open Access Journals (Sweden)

    Caligari Peter DS

    2010-05-01

    Full Text Available Abstract Background The high-throughput anchoring of genetic markers into contigs is required for many ongoing physical mapping projects. Multidimensional BAC pooling strategies for PCR-based screening of large insert libraries are a widely used alternative to high density filter hybridisation of bacterial colonies. To date, concerns over reliability have led most if not all groups engaged in high throughput physical mapping projects to favour BAC DNA isolation prior to amplification by conventional PCR. Results Here, we report the first combined use of Multiplex Tandem PCR (MT-PCR) and High Resolution Melt (HRM) analysis on bacterial stocks of BAC library superpools as a means of rapidly anchoring markers to BAC colonies and thereby integrating genetic and physical maps. We exemplify the approach using a BAC library of the model plant Arabidopsis thaliana. Superpools of twenty-five 384-well plates and two-dimensional matrix pools of the BAC library were prepared for marker screening. The entire procedure only requires around 3 h to anchor one marker. Conclusions A pre-amplification step during MT-PCR allows high multiplexing and increases the sensitivity and reliability of subsequent HRM discrimination. This simple gel-free protocol is more reliable, faster and far less costly than conventional PCR screening. The option to screen in parallel 3 genetic markers in one MT-PCR-HRM reaction using templates from directly pooled bacterial stocks of BAC-containing bacteria further reduces the time for anchoring markers in physical maps of species with large genomes.

  1. KUJIRA, a package of integrated modules for systematic and interactive analysis of NMR data directed to high-throughput NMR structure studies

    International Nuclear Information System (INIS)

    Kobayashi, Naohiro; Iwahara, Junji; Koshiba, Seizo; Tomizawa, Tadashi; Tochio, Naoya; Guentert, Peter; Kigawa, Takanori; Yokoyama, Shigeyuki

    2007-01-01

    The recent expansion of structural genomics has increased the demands for quick and accurate protein structure determination by NMR spectroscopy. The conventional strategy without an automated protocol can no longer satisfy the needs of high-throughput application to a large number of proteins, with each data set including many NMR spectra, chemical shifts, NOE assignments, and calculated structures. We have developed the new software KUJIRA, a package of integrated modules for the systematic and interactive analysis of NMR data, which is designed to reduce the tediousness of organizing and manipulating a large number of NMR data sets. In combination with CYANA, the program for automated NOE assignment and structure determination, we have established a robust and highly optimized strategy for comprehensive protein structure analysis. An application of KUJIRA in accordance with our new strategy was carried out by a non-expert in NMR structure analysis, demonstrating that the accurate assignment of the chemical shifts and a high-quality structure of a small protein can be completed in a few weeks. The high completeness of the chemical shift assignment and the NOE assignment achieved by the systematic analysis using KUJIRA and CYANA led, in practice, to increased reliability of the determined structure

  2. A flow cytometry-based method for a high-throughput analysis of drug-stabilized topoisomerase II cleavage complexes in human cells.

    Science.gov (United States)

    de Campos-Nebel, Marcelo; Palmitelli, Micaela; González-Cid, Marcela

    2016-09-01

    Topoisomerase II (Top2) is an important target for anticancer therapy. A variety of drugs that poison Top2, including several epipodophyllotoxins, anthracyclines, and anthracenediones, are widely used in the clinic for both hematologic and solid tumors. The poisoning of Top2 involves the formation of a reaction intermediate Top2-DNA, termed the Top2 cleavage complex (Top2cc), which is persistent in the presence of the drug and involves a 5' end of DNA covalently bound to a tyrosine from the enzyme through a phosphodiester group. Drug-induced Top2cc leads to Top2-linked DNA breaks, which are chiefly responsible for the drugs' cytotoxicity. While biochemical detection is very laborious, quantification of drug-induced Top2cc by immunofluorescence-based microscopy techniques is time consuming and requires extensive image segmentation for the analysis of a small population of cells. Here, we developed a flow cytometry-based method for the analysis of drug-induced Top2cc. This method allows a rapid analysis of a high number of cells in their cell cycle phase context. Moreover, it can be applied to almost any human cell type, including clinical samples. The methodology is useful for high-throughput analysis of drugs that poison Top2, allowing not just discrimination of the targeted Top2 isoform but also tracking of its removal. © 2016 International Society for Advancement of Cytometry.

  3. Gluon Saturation and EIC

    Energy Technology Data Exchange (ETDEWEB)

    Sichtermann, Ernst

    2016-12-15

    The fundamental structure of nucleons and nuclear matter is described by the properties and dynamics of quarks and gluons in quantum chromodynamics. Electron-nucleon collisions are a powerful method to study this structure. As one increases the energy of the collisions, the interaction process probes regions of progressively higher gluon density. This density must eventually saturate. A high-energy polarized Electron-Ion Collider (EIC) has been proposed to observe and study the saturated gluon density regime. Selected measurements will be discussed, following a brief introduction.

  4. Competitive Advantage or Market Saturation: An In-Depth Comparison of Flash-Sale Sites Through Content Analysis

    OpenAIRE

    Aday, J. B.; Phelan, K. V.

    2015-01-01

    This study aims to identify the representativeness of hospitality and service industry firms on flash-sale sites such as Groupon and LivingSocial. Currently, academic findings related to the frequency of offerings from these firms are nonexistent. This research relied upon a content analysis rubric and daily measurement of offerings from randomly selected cities represented by Groupon and LivingSocial for a period of 6 weeks. The daily offerings for specific cities on the Groupon and LivingSo...

  5. Transient simulation and sensitivity analysis for transport of radionuclides in a saturated-unsaturated groundwater flow system

    International Nuclear Information System (INIS)

    Chen, H.H.

    1980-01-01

    Radionuclide transport by groundwater flow is an important pathway in the assessment of the environmental impact of radioactive waste disposal to the biosphere. A numerical model was developed to simulate radionuclide transport by groundwater flow and predict the radionuclide discharge rate to the biosphere. A sensitivity analysis methodology was developed to quantify how sensitive the specified response of interest is to the input parameters of the radionuclide transport equation.
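
    The abstract does not state which sensitivity measure was used. A common generic choice is the normalized sensitivity coefficient S = (p/R)·(∂R/∂p), estimated below with a central finite difference; the toy response model, parameter names, and values are illustrative placeholders, not part of the original work.

        import math

        def normalized_sensitivity(model, params, name, rel_step=0.01):
            """Central-difference estimate of S = (p/R) * dR/dp for one parameter,
            where R = model(params) is the scalar response of interest."""
            p0 = params[name]
            r0 = model(params)
            hi = dict(params, **{name: p0 * (1 + rel_step)})
            lo = dict(params, **{name: p0 * (1 - rel_step)})
            return (p0 / r0) * (model(hi) - model(lo)) / (hi[name] - lo[name])

        # toy response: discharge rate attenuated by first-order decay over a travel time
        toy_model = lambda p: p["source_rate"] * math.exp(-p["decay_const"] * p["travel_time"])
        params = {"source_rate": 1.0, "decay_const": 1e-4, "travel_time": 5e3}
        print(normalized_sensitivity(toy_model, params, "travel_time"))   # about -0.5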

  6. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  7. High saturated-fat and low-fibre intake: a comparative analysis of nutrient intake in individuals with and without type 2 diabetes.

    Science.gov (United States)

    Breen, C; Ryan, M; McNulty, B; Gibney, M J; Canavan, R; O'Shea, D

    2014-02-03

    The aim of dietary modification, as a cornerstone of type 2 diabetes (T2DM) management, is to optimise metabolic control and overall health. This study describes food and nutrient intake in a sample of adults with T2DM, and compares this to recommendations, and to intake in age, sex, body mass index (BMI) and social-class matched adults without T2DM. A cross-sectional analysis of food and nutrient intake in 124 T2DM individuals (64% male; age 57.4±5.6 years, BMI 32.5±5.8 kg m⁻²) and 124 adults (age 57.4±7.0 years, BMI 31.2±5.0 kg m⁻²) with no diabetes (ND) was undertaken using a 4-day semiweighed food diary. Biochemical and anthropometric variables were also measured. While reported energy intake was similar in T2DM vs ND (1954 vs 2004 kcal per day, P=0.99), T2DM subjects consumed more total-fat (38.8% vs 35%, P<0.001), monounsaturated-fat (13.3% vs 12.2%; P=0.004), polyunsaturated-fat (6.7% vs 5.9%) and protein (… vs 17.5%, P<0.01). Both groups exceeded saturated-fat recommendations (14.0% vs 13.8%). T2DM intakes of carbohydrate (39.5% vs 42.9%), non-milk sugar (10.4% vs 15.0%) and fibre (14.4 vs 18.9 g) were significantly lower, and glycaemic load (GL) was lower (… vs 129.2; P=0.02), despite a similar glycaemic index (59.7 vs 60.1; P=0.48). T2DM individuals reported consuming significantly more wholemeal/brown/wholegrain breads, eggs, oils, vegetables, meat/meat products, savoury snacks and soups/sauces and less white breads, breakfast cereals, cakes/buns, full-fat dairy, chocolate, fruit juices, oily fish and alcohol than ND controls. Adults with T2DM made different food choices to ND adults. This resulted in a high saturated-fat diet, with a higher total-fat, monounsaturated-fat, polyunsaturated-fat and protein content and a lower GL, carbohydrate, fibre and non-milk sugar content. Dietary education should emphasise and reinforce the importance of higher fibre, fruit, vegetable and wholegrain intake and the substitution of monounsaturated for saturated-fat sources, in energy balanced

  8. Green throughput taxation

    International Nuclear Information System (INIS)

    Bruvoll, A.; Ibenholt, K.

    1998-01-01

    According to optimal taxation theory, raw materials should be taxed to capture the embedded scarcity rent in their value. To reduce both natural resource use and the corresponding emissions, or the throughput in the economic system, the best policy may be a tax on material inputs. As a first approach to throughput taxation, this paper considers a tax on intermediates in the framework of a dynamic computable general equilibrium model with environmental feedbacks. To balance the budget, payroll taxes are reduced. As a result, welfare indicators such as material consumption and leisure time consumption are reduced, while on the other hand all the environmental indicators improve. 27 refs

  9. Throughput rate study

    International Nuclear Information System (INIS)

    Ford, L.; Bailey, W.; Gottlieb, P.; Emami, F.; Fleming, M.; Robertson, D.

    1993-01-01

    The Civilian Radioactive Waste Management System (CRWMS) Management and Operating (M&O) Contractor has completed a study to analyze system-wide impacts of operating the CRWMS at varying throughput rates, including the 3000 MTU/year rate which has been assumed in the past. Impacts of throughput rate on all phases of the CRWMS operations (acceptance, transportation, storage and disposal) were evaluated. The results of the study indicate that a range from 3000 to 5000 MTU/year is preferred, based on system cost per MTU of SNF emplaced and logistics constraints

  10. High throughput, cell type-specific analysis of key proteins in human endometrial biopsies of women from fertile and infertile couples

    Science.gov (United States)

    Leach, Richard E.; Jessmon, Philip; Coutifaris, Christos; Kruger, Michael; Myers, Evan R.; Ali-Fehmi, Rouba; Carson, Sandra A.; Legro, Richard S.; Schlaff, William D.; Carr, Bruce R.; Steinkampf, Michael P.; Silva, Susan; Leppert, Phyllis C.; Giudice, Linda; Diamond, Michael P.; Armant, D. Randall

    2012-01-01

    BACKGROUND Although histological dating of endometrial biopsies provides little help for prediction or diagnosis of infertility, analysis of individual endometrial proteins, proteomic profiling and transcriptome analysis have suggested several biomarkers with altered expression arising from intrinsic abnormalities, inadequate stimulation by or in response to gonadal steroids or altered function due to systemic disorders. The objective of this study was to delineate the developmental dynamics of potentially important proteins in the secretory phase of the menstrual cycle, utilizing a collection of endometrial biopsies from women of fertile (n = 89) and infertile (n = 89) couples. METHODS AND RESULTS Progesterone receptor-B (PGR-B), leukemia inhibitory factor, glycodelin/progestagen-associated endometrial protein (PAEP), homeobox A10, heparin-binding EGF-like growth factor, calcitonin and chemokine ligand 14 (CXCL14) were measured using a high-throughput, quantitative immunohistochemical method. Significant cyclic and tissue-specific regulation was documented for each protein, as well as their dysregulation in women of infertile couples. Infertile patients demonstrated a delay early in the secretory phase in the decline of PGR-B (P localization provided important insights into the potential roles of these proteins in normal and pathological development of the endometrium that is not attainable from transcriptome analysis, establishing a basis for biomarker, diagnostic and targeted drug development for women with infertility. PMID:22215622

  11. Sequence requirements of the HIV-1 protease flap region determined by saturation mutagenesis and kinetic analysis of flap mutants

    Science.gov (United States)

    Shao, Wei; Everitt, Lorraine; Manchester, Marianne; Loeb, Daniel D.; Hutchison, Clyde A.; Swanstrom, Ronald

    1997-01-01

    The retroviral proteases (PRs) have a structural feature called the flap, which consists of a short antiparallel β-sheet with a turn. The flap extends over the substrate binding cleft and must be flexible to allow entry and exit of the polypeptide substrates and products. We analyzed the sequence requirements of the amino acids within the flap region (positions 46–56) of the HIV-1 PR. The phenotypes of 131 substitution mutants were determined using a bacterial expression system. Four of the mutant PRs with mutations in different regions of the flap were selected for kinetic analysis. Our phenotypic analysis, considered in the context of published structures of the HIV-1 PR with bound substrate analogs, shows that: (i) Met-46 and Phe-53 participate in hydrophobic interactions on the solvent-exposed face of the flap; (ii) Ile-47, Ile-54, and Val-56 participate in hydrophobic interactions on the inner face of the flap; (iii) Ile-50 has hydrophobic interactions at the distance of both the δ and γ carbons; (iv) the three glycine residues in the β-turn of the flap are virtually intolerant of substitutions. Among these mutant PRs, we have identified changes in both kcat and Km. These results establish the nature of the side chain requirements at each position in the flap and document a role for the flap in both substrate binding and catalysis. PMID:9122179
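
    As background for the kinetic comparison above, kcat and Km are the parameters of the standard Michaelis-Menten rate law; this is textbook enzyme kinetics added for orientation, not an equation quoted from the paper.

        \[
          v \;=\; \frac{k_{\mathrm{cat}}\,[E]_0\,[S]}{K_m + [S]},
          \qquad
          \frac{k_{\mathrm{cat}}}{K_m}\ \text{(catalytic efficiency / specificity constant)}
        \]

    A flap mutation can therefore lower activity by reducing turnover (kcat), by weakening substrate binding (raising Km), or both.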

  12. Model investigations for trace analysis of iodine, uranium, and technetium in saturated sodium chloride leaching solutions of stored radioactive waste

    International Nuclear Information System (INIS)

    Jegle, U.

    1989-02-01

    This paper describes the development of a time- and cost-saving chromatographic technique, which allows the matrix to be separated and the most important species to be analyzed in a leaching solution of vitrified radioactive waste. Uranium, iodine, and technetium were chosen for the model technique to be elaborated. In a first step, iodide and pertechnetate were separated from the matrix by the strongly basic AG 1X 8 anion exchange resin and then separated from each other by selective elution. The uranyl ions eluted with the sodium chloride matrix were separated from the excess of sodium chloride in a second step, again by adsorption to the strongly basic resin. The ion-selective electrode was found to be a suitable tool for iodide analysis. Pertechnetate was analysed by means of liquid scintillation. Uranium was determined by ICP-AES. (orig./RB)

  13. High-throughput sequencing and pathway analysis reveal alteration of the pituitary transcriptome by 17α-ethynylestradiol (EE2) in female coho salmon, Oncorhynchus kisutch

    Energy Technology Data Exchange (ETDEWEB)

    Harding, Louisa B. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Schultz, Irvin R. [Battelle, Marine Sciences Laboratory – Pacific Northwest National Laboratory, 1529 West Sequim Bay Road, Sequim, WA 98382 (United States); Goetz, Giles W. [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Luckenbach, J. Adam [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Young, Graham [School of Aquatic and Fishery Sciences, University of Washington, Seattle, WA 98195 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States); Goetz, Frederick W. [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, Manchester Research Station, P.O. Box 130, Manchester, WA 98353 (United States); Swanson, Penny, E-mail: penny.swanson@noaa.gov [Northwest Fisheries Science Center, National Marine Fisheries Service, National Oceanic and Atmospheric Administration, 2725 Montlake Blvd E, Seattle, WA 98112 (United States); Center for Reproductive Biology, Washington State University, Pullman, WA 98164 (United States)

    2013-10-15

    Highlights: •Studied impacts of ethynylestradiol (EE2) exposure on salmon pituitary transcriptome. •High-throughput sequencing, RNAseq, and pathway analysis were performed. •EE2 altered mRNAs for genes in circadian rhythm, GnRH, and TGFβ signaling pathways. •LH and FSH beta subunit mRNAs were most highly up- and down-regulated by EE2, respectively. •Estrogens may alter processes associated with reproductive timing in salmon. -- Abstract: Considerable research has been done on the effects of endocrine disrupting chemicals (EDCs) on reproduction and gene expression in the brain, liver and gonads of teleost fish, but information on impacts to the pituitary gland are still limited despite its central role in regulating reproduction. The aim of this study was to further our understanding of the potential effects of natural and synthetic estrogens on the brain–pituitary–gonad axis in fish by determining the effects of 17α-ethynylestradiol (EE2) on the pituitary transcriptome. We exposed sub-adult coho salmon (Oncorhynchus kisutch) to 0 or 12 ng EE2/L for up to 6 weeks and effects on the pituitary transcriptome of females were assessed using high-throughput Illumina® sequencing, RNA-Seq and pathway analysis. After 1 or 6 weeks, 218 and 670 contiguous sequences (contigs) respectively, were differentially expressed in pituitaries of EE2-exposed fish relative to control. Two of the most highly up- and down-regulated contigs were luteinizing hormone β subunit (241-fold and 395-fold at 1 and 6 weeks, respectively) and follicle-stimulating hormone β subunit (−3.4-fold at 6 weeks). Additional contigs related to gonadotropin synthesis and release were differentially expressed in EE2-exposed fish relative to controls. These included contigs involved in gonadotropin releasing hormone (GNRH) and transforming growth factor-β signaling. There was an over-representation of significantly affected contigs in 33 and 18 canonical pathways at 1 and 6 weeks

  14. High-throughput sequencing and pathway analysis reveal alteration of the pituitary transcriptome by 17α-ethynylestradiol (EE2) in female coho salmon, Oncorhynchus kisutch

    International Nuclear Information System (INIS)

    Harding, Louisa B.; Schultz, Irvin R.; Goetz, Giles W.; Luckenbach, J. Adam; Young, Graham; Goetz, Frederick W.; Swanson, Penny

    2013-01-01

    Highlights: •Studied impacts of ethynylestradiol (EE2) exposure on salmon pituitary transcriptome. •High-throughput sequencing, RNAseq, and pathway analysis were performed. •EE2 altered mRNAs for genes in circadian rhythm, GnRH, and TGFβ signaling pathways. •LH and FSH beta subunit mRNAs were most highly up- and down-regulated by EE2, respectively. •Estrogens may alter processes associated with reproductive timing in salmon. -- Abstract: Considerable research has been done on the effects of endocrine disrupting chemicals (EDCs) on reproduction and gene expression in the brain, liver and gonads of teleost fish, but information on impacts to the pituitary gland are still limited despite its central role in regulating reproduction. The aim of this study was to further our understanding of the potential effects of natural and synthetic estrogens on the brain–pituitary–gonad axis in fish by determining the effects of 17α-ethynylestradiol (EE2) on the pituitary transcriptome. We exposed sub-adult coho salmon (Oncorhynchus kisutch) to 0 or 12 ng EE2/L for up to 6 weeks and effects on the pituitary transcriptome of females were assessed using high-throughput Illumina ® sequencing, RNA-Seq and pathway analysis. After 1 or 6 weeks, 218 and 670 contiguous sequences (contigs) respectively, were differentially expressed in pituitaries of EE2-exposed fish relative to control. Two of the most highly up- and down-regulated contigs were luteinizing hormone β subunit (241-fold and 395-fold at 1 and 6 weeks, respectively) and follicle-stimulating hormone β subunit (−3.4-fold at 6 weeks). Additional contigs related to gonadotropin synthesis and release were differentially expressed in EE2-exposed fish relative to controls. These included contigs involved in gonadotropin releasing hormone (GNRH) and transforming growth factor-β signaling. There was an over-representation of significantly affected contigs in 33 and 18 canonical pathways at 1 and 6 weeks

  15. Criteria for saturated magnetization loop

    International Nuclear Information System (INIS)

    Harres, A.; Mikhov, M.; Skumryev, V.; Andrade, A.M.H. de; Schmidt, J.E.; Geshev, J.

    2016-01-01

    Proper estimation of magnetization curve parameters is vital in studying magnetic systems. In the present article, criteria for discriminating non-saturated (minor) from saturated (major) hysteresis loops are proposed. These employ the analysis of (i) derivatives of both ascending and descending branches of the loop, (ii) remanent magnetization curves, and (iii) thermomagnetic curves. Computational simulations are used in order to demonstrate their validity. Examples illustrating the applicability of these criteria to well-known real systems, namely Fe3O4 and Ni fine particles, are provided. We demonstrate that the anisotropy-field value estimated from a visual examination of an only apparently major hysteresis loop could be more than two times lower than the real one. - Highlights: • Proper estimation of hysteresis-loop parameters is vital in magnetic studies. • We propose criteria for discriminating minor from major hysteresis loops. • The criteria analyze magnetization, remanence and ZFC/FC curves and/or their derivatives. • Examples of their application on real nanoparticle systems are given. • Using the criteria could avoid twofold or larger saturation-field underestimation errors.

  16. Criteria for saturated magnetization loop

    Energy Technology Data Exchange (ETDEWEB)

    Harres, A. [Departamento de Física, UFSM, Santa Maria, 97105-900 Rio Grande do Sul (Brazil); Mikhov, M. [Faculty of Physics, University of Sofia, 1164 Sofia (Bulgaria); Skumryev, V. [Institució Catalana de Recerca i Estudis Avançats, 08010 Barcelona (Spain); Departament de Física, Universitat Autònoma de Barcelona, 08193 Barcelona (Spain); Andrade, A.M.H. de; Schmidt, J.E. [Instituto de Física, UFRGS, Porto Alegre, 91501-970 Rio Grande do Sul (Brazil); Geshev, J., E-mail: julian@if.ufrgs.br [Departament de Física, Universitat Autònoma de Barcelona, 08193 Barcelona (Spain); Instituto de Física, UFRGS, Porto Alegre, 91501-970 Rio Grande do Sul (Brazil)

    2016-03-15

    Proper estimation of magnetization curve parameters is vital in studying magnetic systems. In the present article, criteria for discriminating non-saturated (minor) from saturated (major) hysteresis loops are proposed. These employ the analysis of (i) derivatives of both ascending and descending branches of the loop, (ii) remanent magnetization curves, and (iii) thermomagnetic curves. Computational simulations are used in order to demonstrate their validity. Examples illustrating the applicability of these criteria to well-known real systems, namely Fe3O4 and Ni fine particles, are provided. We demonstrate that the anisotropy-field value estimated from a visual examination of an only apparently major hysteresis loop could be more than two times lower than the real one. - Highlights: • Proper estimation of hysteresis-loop parameters is vital in magnetic studies. • We propose criteria for discriminating minor from major hysteresis loops. • The criteria analyze magnetization, remanence and ZFC/FC curves and/or their derivatives. • Examples of their application on real nanoparticle systems are given. • Using the criteria could avoid twofold or larger saturation-field underestimation errors.
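
    A minimal sketch of criterion (i) above: for a truly major loop, the ascending and descending branches should collapse onto the same reversible high-field slope. The function below is an illustrative Python interpretation (array names, window fraction, and tolerance are assumptions), not code from the paper.

        import numpy as np

        def branches_closed_at_high_field(H_up, M_up, H_dn, M_dn, frac=0.1, tol=0.05):
            """Compare dM/dH of the ascending and descending branches over the top
            `frac` of the field range; a major (saturated) loop should show both
            branches sharing the same reversible high-field slope."""
            h_cut = H_up.max() * (1.0 - frac)
            up, dn = H_up >= h_cut, H_dn >= h_cut
            slope_up = np.gradient(M_up[up], H_up[up]).mean()
            slope_dn = np.gradient(M_dn[dn], H_dn[dn]).mean()
            scale = max(abs(slope_up), abs(slope_dn), 1e-12)
            return abs(slope_up - slope_dn) / scale < tol

    If the test fails, the loop is likely minor, and any anisotropy- or saturation-field value read from it should be treated with caution, in line with the warning in the abstract.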

  17. Quasi-three-dimensional analysis of ground water flow and dissolved multicomponent solute transport in saturated porous media

    International Nuclear Information System (INIS)

    Tang, Yi.

    1991-01-01

    A computational procedure was developed in this study to provide flexibility needed in the application of three-dimensional groundwater flow and dissolved multicomponent solute transport simulations. In the first part of this study, analytical solutions were proposed for the dissolved single-component solute transport problem. These closed form solutions were developed for homogeneous but stratified porous media. This analytical model took into account two-dimensional diffusion-advection in the main aquifer layer and one-dimensional diffusion-advection in the adjacent aquitards, as well as first order radioactive decay and linear adsorption isotherm in both aquifer and aquitards. The associated analytical solutions for solute concentration distributions in the aquifer and aquitards were obtained using Laplace Transformation and Method of Separation of Variables techniques. Next, in order to analyze the problem numerically, a quasi-three-dimensional finite element algorithm was developed based on the multilayer aquifer concept. In this phase, advection, dispersion, adsorption and first order multi-species chemical reaction terms were included to the analysis. Employing this model, without restriction on groundwater flow pattern in the multilayer aquifer system, one may analyze the complex behavior of the groundwater flow and solute movement pattern in the system. These numerical models may be utilized as calibration tools in site characterization studies, or as predictive models during the initial stages of a typical site investigation study. Through application to several test and field problems, the usefulness, accuracy and efficiency of the proposed models were demonstrated. Comparison of results with analytical solution, experimental data and other numerical methods were also discussed
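
    For orientation, a one-dimensional form of the governing equation described above (diffusion-advection with linear equilibrium adsorption and first-order radioactive decay) can be written as follows; the notation is generic and assumed here, not the author's own:

        \[
          R\,\frac{\partial C}{\partial t}
            = D\,\frac{\partial^{2} C}{\partial z^{2}}
            - v\,\frac{\partial C}{\partial z}
            - \lambda R\,C,
          \qquad
          R = 1 + \frac{\rho_b K_d}{\theta},
        \]

    where C is the dissolved concentration, D the dispersion/diffusion coefficient, v the seepage velocity, λ the decay constant, and R the retardation factor arising from the linear adsorption isotherm. In a layered model such as the one described above, equations of this type in the aquifer and aquitards are typically coupled through interface conditions.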

  18. Comparative analysis of transcriptomes in aerial stems and roots of Ephedra sinica based on high-throughput mRNA sequencing

    Directory of Open Access Journals (Sweden)

    Taketo Okada

    2016-12-01

    Full Text Available Ephedra plants are taxonomically classified as gymnosperms, and are medicinally important as the botanical origin of crude drugs and as bioresources that contain pharmacologically active chemicals. Here we show a comparative analysis of the transcriptomes of aerial stems and roots of Ephedra sinica based on high-throughput mRNA sequencing by RNA-Seq. De novo assembly of short cDNA sequence reads generated 23,358, 13,373, and 28,579 contigs longer than 200 bases from aerial stems, roots, or both aerial stems and roots, respectively. The presumed functions encoded by these contig sequences were annotated by BLAST (blastx). Subsequently, these contigs were classified based on gene ontology slims, Enzyme Commission numbers, and the InterPro database. Furthermore, comparative gene expression analysis was performed between aerial stems and roots. These transcriptome analyses revealed differences and similarities between the transcriptomes of aerial stems and roots in E. sinica. Deep transcriptome sequencing of Ephedra should open the door to molecular biological studies based on the entire transcriptome, tissue- or organ-specific transcriptomes, or targeted genes of interest.

  19. Isolation of Exosome-Like Nanoparticles and Analysis of MicroRNAs Derived from Coconut Water Based on Small RNA High-Throughput Sequencing.

    Science.gov (United States)

    Zhao, Zhehao; Yu, Siran; Li, Min; Gui, Xin; Li, Ping

    2018-03-21

    In this study, the presence of microRNAs in coconut water was identified by real-time polymerase chain reaction (PCR) based on the results of high-throughput small RNA sequencing. In addition, the differences in microRNA content between immature and mature coconut water were compared. A total of 47 known microRNAs belonging to 25 families and 14 new microRNAs were identified in coconut endosperm. Through analysis using a target gene prediction software, potential microRNA target genes were identified in the human genome. Real-time PCR showed that the level of most microRNAs was higher in mature coconut water than in immature coconut water. Then, exosome-like nanoparticles were isolated from coconut water. After ultracentrifugation, some particle structures were seen in coconut water samples using 1,1'-dioctadecyl-3,3,3',3'-tetramethylindocarbocyanine perchlorate fluorescence staining. Subsequent scanning electron microscopy observation and dynamic light scattering analysis also revealed some exosome-like nanoparticles in coconut water, and the mean diameters of the particles detected by the two methods were 13.16 and 59.72 nm, respectively. In conclusion, there are extracellular microRNAs in coconut water, and their levels are higher in mature coconut water than in immature coconut water. Some exosome-like nanoparticles were isolated from coconut water, and the diameter of these particles was smaller than that of animal-derived exosomes.

  20. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    Directory of Open Access Journals (Sweden)

    Krithika Bhuvaneshwar

    2015-01-01

    Full Text Available Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the "Globus Genomics" system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.

  1. Modeling and sensitivity analysis on the transport of aluminum oxide nanoparticles in saturated sand: effects of ionic strength, flow rate, and nanoparticle concentration.

    Science.gov (United States)

    Rahman, Tanzina; Millwater, Harry; Shipley, Heather J

    2014-11-15

    Aluminum oxide nanoparticles have been widely used in various consumer products and there are growing concerns regarding their exposure in the environment. This study deals with the modeling, sensitivity analysis and uncertainty quantification of one-dimensional transport of nano-sized (~82 nm) aluminum oxide particles in saturated sand. The transport of aluminum oxide nanoparticles was modeled using a two-kinetic-site model with a blocking function. The modeling was done at different ionic strengths, flow rates, and nanoparticle concentrations. The two sites representing fast and slow attachments along with a blocking term yielded good agreement with the experimental results from the column studies of aluminum oxide nanoparticles. The same model was used to simulate breakthrough curves under different conditions using experimental data and calculated 95% confidence bounds of the generated breakthroughs. The sensitivity analysis results showed that slow attachment was the most sensitive parameter for high influent concentrations (e.g. 150 mg/L Al2O3) and the maximum solid phase retention capacity (related to blocking function) was the most sensitive parameter for low concentrations (e.g. 50 mg/L Al2O3). Copyright © 2014 Elsevier B.V. All rights reserved.
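
    A widely used form of the two-kinetic-site attachment model with a Langmuirian blocking function (the formulation implemented, for example, in HYDRUS-1D) is sketched below; treating this as the exact equations used in the study is an assumption made for illustration:

        \[
          \frac{\rho_b}{\theta}\,\frac{\partial s_i}{\partial t}
            \;=\; k_{a,i}\,\psi_i\,C \;-\; \frac{\rho_b}{\theta}\,k_{d,i}\,s_i,
          \qquad
          \psi_i \;=\; 1 - \frac{s_i}{s_{\max,i}}, \qquad i = 1,2,
        \]

    where sites 1 and 2 represent the fast and slow attachment sites, k_a,i and k_d,i are attachment and detachment rate coefficients, ρ_b is bulk density, θ is porosity, and the blocking term ψ_i drives attachment toward zero as the retained mass s_i approaches the maximum solid-phase retention capacity s_max,i, which is the parameter reported above as most sensitive at low influent concentration.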

  2. Specification of a test problem for HYDROCOIN [Hydrologic Code Intercomparison] Level 3 Case 2: Sensitivity analysis for deep disposal in partially saturated, fractured tuff

    International Nuclear Information System (INIS)

    Prindle, R.W.

    1987-08-01

    The international Hydrologic Code Intercomparison Project (HYDROCOIN) was formed to evaluate hydrogeologic models and computer codes and their use in performance assessment for high-level radioactive waste repositories. Three principal activities in the HYDROCOIN Project are Level 1, verification and benchmarking of hydrologic codes; Level 2, validation of hydrologic models; and Level 3, sensitivity and uncertainty analyses of the models and codes. This report presents a test case defined for the HYDROCOIN Level 3 activity to explore the feasibility of applying various sensitivity-analysis methodologies to a highly nonlinear model of isothermal, partially saturated flow through fractured tuff, and to develop modeling approaches to implement the methodologies for sensitivity analysis. These analyses involve an idealized representation of a repository sited above the water table in a layered sequence of welded and nonwelded, fractured, volcanic tuffs. The analyses suggested here include one-dimensional, steady flow; one-dimensional, nonsteady flow; and two-dimensional, steady flow. Performance measures to be used to evaluate model sensitivities are also defined; the measures are related to regulatory criteria for containment of high-level radioactive waste. 14 refs., 5 figs., 4 tabs

  3. An analysis of sodium, total fat and saturated fat contents of packaged food products advertised in Bronx-based supermarket circulars.

    Science.gov (United States)

    Samuel, L; Basch, C H; Ethan, D; Hammond, R; Chiazzese, K

    2014-08-01

    Americans' consumption of sodium, fat, and saturated fat exceeds federally recommended limits for these nutrients and has been identified as a preventable leading cause of hypertension and cardiovascular disease. More than 40% of the Bronx population comprises African-Americans, who have increased risk and earlier onset of hypertension and are also genetically predisposed to salt-sensitive hypertension. This study analyzed nutrition information for packaged foods advertised in Bronx-based supermarket circulars. Federally recommended limits for sodium, saturated fat and total fat contents were used to identify foods that were high in these nutrients. The proportion of these products with respect to the total number of packaged foods was calculated. More than a third (35%) and almost a quarter (24%) of the 898 advertised packaged foods were high in saturated fat and sodium, respectively. Such foods predominantly included processed meat and fish products, fast foods, meals, entrees and side dishes. Dairy and egg products were the greatest contributors of high saturated fat. Pork and beef products, fast foods, meals, entrees and side dishes had the highest median values for sodium, total fat and saturated fat content. The high proportion of packaged foods that are high in sodium and/or saturated fat promoted through supermarket circulars highlights the need for nutrition education among consumers as well as collaborative public health measures by the food industry, community and government agencies to reduce the amounts of sodium and saturated fat in these products and limit the promotion of foods that are high in these nutrients.

  4. Close-range hyperspectral image analysis for the early detection of stress responses in individual plants in a high-throughput phenotyping platform

    Science.gov (United States)

    Mohd Asaari, Mohd Shahrimie; Mishra, Puneet; Mertens, Stien; Dhondt, Stijn; Inzé, Dirk; Wuyts, Nathalie; Scheunders, Paul

    2018-04-01

    The potential of close-range hyperspectral imaging (HSI) as a tool for detecting early drought stress responses in plants grown in a high-throughput plant phenotyping platform (HTPPP) was explored. Reflectance spectra from leaves in close-range imaging are highly influenced by plant geometry and its specific alignment towards the imaging system. This induces high uninformative variability in the recorded signals, whereas the spectral signature informing on plant biological traits remains undisclosed. A linear reflectance model that describes the effect of the distance and orientation of each pixel of a plant with respect to the imaging system was applied. By solving this model for the linear coefficients, the spectra were corrected for the uninformative illumination effects. This approach, however, was constrained by the requirement of a reference spectrum, which was difficult to obtain. As an alternative, the standard normal variate (SNV) normalisation method was applied to reduce this uninformative variability. Once the envisioned illumination effects were eliminated, the remaining differences in plant spectra were assumed to be related to changes in plant traits. To distinguish the stress-related phenomena from regular growth dynamics, a spectral analysis procedure was developed based on clustering, a supervised band selection, and a direct calculation of a spectral similarity measure against a reference. To test the significance of the discrimination between healthy and stressed plants, a statistical test was conducted using a one-way analysis of variance (ANOVA) technique. The proposed analysis technique was validated with HSI data of maize plants (Zea mays L.) acquired in a HTPPP for early detection of drought stress in maize plants. Results showed that the pre-processing of reflectance spectra with the SNV effectively reduces the variability due to the expected illumination effects. The proposed spectral analysis method on the normalized spectra successfully
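
    SNV normalization, as applied above, is a per-spectrum operation: each pixel's spectrum is centred on its own mean and scaled by its own standard deviation, which suppresses the multiplicative distance and orientation effects described in the abstract. Below is a minimal Python sketch for a hyperspectral cube of shape (rows, cols, bands); the array name and dimensions are illustrative.

        import numpy as np

        def snv(cube):
            """Standard normal variate: centre and scale each pixel spectrum independently
            (last axis = spectral bands) to reduce multiplicative illumination effects."""
            mean = cube.mean(axis=-1, keepdims=True)
            std = cube.std(axis=-1, keepdims=True)
            return (cube - mean) / np.where(std == 0, 1.0, std)

        cube = np.random.rand(64, 64, 200)   # placeholder reflectance cube (rows, cols, bands)
        cube_snv = snv(cube)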

  5. Systematic Analysis of the Association between Gut Flora and Obesity through High-Throughput Sequencing and Bioinformatics Approaches

    Directory of Open Access Journals (Sweden)

    Chih-Min Chiu

    2014-01-01

    Full Text Available Eighty-one stool samples from Taiwanese subjects were collected for analysis of the association between the gut flora and obesity. The supervised analysis showed that the most abundant genera of bacteria in normal samples (from people with a body mass index (BMI) ≤ 24) were Bacteroides (27.7%), Prevotella (19.4%), Escherichia (12%), Phascolarctobacterium (3.9%), and Eubacterium (3.5%). The most abundant genera of bacteria in case samples (with a BMI ≥ 27) were Bacteroides (29%), Prevotella (21%), Escherichia (7.4%), Megamonas (5.1%), and Phascolarctobacterium (3.8%). A principal coordinate analysis (PCoA) demonstrated that normal samples were clustered more compactly than case samples. An unsupervised analysis demonstrated that bacterial communities in the gut were clustered into two main groups: N-like and OB-like groups. Remarkably, most normal samples (78%) were clustered in the N-like group, and most case samples (81%) were clustered in the OB-like group (Fisher's P value = 1.61E-07). The results showed that bacterial communities in the gut were highly associated with obesity. This is the first study in Taiwan to investigate the association between human gut flora and obesity, and the results provide new insights into the correlation of bacteria with the rising trend in obesity.
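    The association between cluster membership and BMI group reported above is the kind of 2x2 comparison that Fisher's exact test handles directly. The sketch below shows that test in general form; the counts are hypothetical placeholders, not the study's data.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 contingency table:
# rows = BMI group (normal, case), columns = cluster (N-like, OB-like)
table = [[32, 9],   # normal samples: N-like, OB-like
         [8, 33]]   # case samples:   N-like, OB-like

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, Fisher's P = {p_value:.2e}")
```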

  6. Utility of lab-on-a-chip technology for high-throughput nucleic acid and protein analysis

    DEFF Research Database (Denmark)

    Hawtin, Paul; Hardern, Ian; Wittig, Rainer

    2005-01-01

    On-chip electrophoresis can provide size separations of nucleic acids and proteins similar to more traditional slab gel electrophoresis. Lab-on-a-chip (LoaC) systems utilize on-chip electrophoresis in conjunction with sizing calibration, sensitive detection schemes, and sophisticated data analysi...

  7. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary greatly in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  8. Identification and characterization of microRNAs related to salt stress in broccoli, using high-throughput sequencing and bioinformatics analysis.

    Science.gov (United States)

    Tian, Yunhong; Tian, Yunming; Luo, Xiaojun; Zhou, Tao; Huang, Zuoping; Liu, Ying; Qiu, Yihan; Hou, Bing; Sun, Dan; Deng, Hongyu; Qian, Shen; Yao, Kaitai

    2014-09-03

    MicroRNAs (miRNAs) are a new class of endogenous regulators of a broad range of physiological processes, which act by regulating gene expression post-transcriptionally. The brassica vegetable, broccoli (Brassica oleracea var. italica), is very popular with a wide range of consumers, but environmental stresses such as salinity are a problem worldwide in restricting its growth and yield. Little is known about the role of miRNAs in the response of broccoli to salt stress. In this study, broccoli subjected to salt stress and broccoli grown under control conditions were analyzed by high-throughput sequencing. Differential miRNA expression was confirmed by real-time reverse transcription polymerase chain reaction (RT-PCR). The prediction of miRNA targets was undertaken using the Kyoto Encyclopedia of Genes and Genomes (KEGG) Orthology (KO) database and Gene Ontology (GO)-enrichment analyses. Two libraries of small (or short) RNAs (sRNAs) were constructed and sequenced by high-throughput Solexa sequencing. A total of 24,511,963 and 21,034,728 clean reads, representing 9,861,236 (40.23%) and 8,574,665 (40.76%) unique reads, were obtained for control and salt-stressed broccoli, respectively. Furthermore, 42 putative known and 39 putative candidate miRNAs that were differentially expressed between control and salt-stressed broccoli were revealed by their read counts and confirmed by the use of stem-loop real-time RT-PCR. Amongst these, the putative conserved miRNAs, miR393 and miR855, and two putative candidate miRNAs, miR3 and miR34, were the most strongly down-regulated when broccoli was salt-stressed, whereas the putative conserved miRNA, miR396a, and the putative candidate miRNA, miR37, were the most up-regulated. Finally, analysis of the predicted gene targets of miRNAs using the GO and KO databases indicated that a range of metabolic and other cellular functions known to be associated with salt stress were up-regulated in broccoli treated with salt. A comprehensive

  9. iMir: an integrated pipeline for high-throughput analysis of small non-coding RNA data obtained by smallRNA-Seq.

    Science.gov (United States)

    Giurato, Giorgio; De Filippo, Maria Rosaria; Rinaldi, Antonio; Hashim, Adnan; Nassa, Giovanni; Ravo, Maria; Rizzo, Francesca; Tarallo, Roberta; Weisz, Alessandro

    2013-12-13

    … RNAs. In addition, iMir also allowed the identification of ~70 piRNAs (piwi-interacting RNAs), some of which were differentially expressed in proliferating vs. growth-arrested cells. The integrated data analysis pipeline described here is based on a reliable, flexible and fully automated workflow, useful to rapidly and efficiently analyze high-throughput smallRNA-Seq data, such as those produced by the most recent high-performance next-generation sequencers. iMir is available at http://www.labmedmolge.unisa.it/inglese/research/imir.

  10. Optimization of a Differential Ion Mobility Spectrometry-Tandem Mass Spectrometry Method for High-Throughput Analysis of Nicotine and Related Compounds: Application to Electronic Cigarette Refill Liquids.

    Science.gov (United States)

    Regueiro, Jorge; Giri, Anupam; Wenzl, Thomas

    2016-06-21

    Fast market penetration of electronic cigarettes is leading to an exponentially growing number of electronic refill liquids with different nicotine contents and an endless list of flavors. Therefore, rapid and simple methods allowing a fast screening of these products are necessary to detect harmful substances which can negatively impact the health of consumers. In this regard, the present work explores the capabilities of differential ion mobility spectrometry coupled to tandem mass spectrometry for high-throughput analysis of nicotine and 11 related compounds in commercial refill liquids for electronic cigarettes. The influence of main factors affecting the ion mobility separation, such as modifier types and concentration, separation voltage, and temperature, was systematically investigated. Despite small molecular weight differences among the studied compounds, a good separation was achieved in the ion mobility cell under the optimized conditions, which involved the use of ethanol as a polar gas-phase chemical modifier. Indeed, differential ion mobility was able to resolve (resolution >4) nicotine from its structural isomer anabasine without the use of any chromatographic separation. The quantitative performance of the proposed method was then evaluated, showing satisfactory precision (RSD ≤ 16%) and recoveries ranging from 85 to 100% for nicotine, and from 84 to 126% for the rest of the target analytes. Several commercial electronic cigarette refill liquids were analyzed to demonstrate the applicability of the method. In some cases, significant differences were found between labeled and measured levels of nicotine. Anatabine, cotinine, myosmine, and nornicotine were also found in some of the analyzed samples.

  11. High-throughput flow injection analysis mass spectroscopy with networked delivery of color-rendered results. 2. Three-dimensional spectral mapping of 96-well combinatorial chemistry racks.

    Science.gov (United States)

    Görlach, E; Richmond, R; Lewis, I

    1998-08-01

    For the last two years, the mass spectroscopy section of the Novartis Pharma Research Core Technology group has analyzed tens of thousands of multiple parallel synthesis samples from the Novartis Pharma Combinatorial Chemistry program, using an in-house developed automated high-throughput flow injection analysis electrospray ionization mass spectroscopy system. The electrospray spectra of these samples reflect the many structures present after the cleavage step from the solid support. The overall success of the sequential synthesis is mirrored in the purity of the expected end product, but the partial success of individual synthesis steps is evident in the impurities in the mass spectrum. However this latter reaction information, which is of considerable utility to the combinatorial chemist, is effectively hidden from view by the very large number of analyzed samples. This information is now revealed at the workbench of the combinatorial chemist by a novel three-dimensional display of each rack's complete mass spectral ion current using the in-house RackViewer Visual Basic application. Colorization of "forbidden loss" and "forbidden gas-adduct" zones, normalization to expected monoisotopic molecular weight, colorization of ionization intensity, and sorting by row or column were used in combination to highlight systematic patterns in the mass spectroscopy data.

  12. Genome-wide identification and comparative analysis of grafting-responsive mRNA in watermelon grafted onto bottle gourd and squash rootstocks by high-throughput sequencing.

    Science.gov (United States)

    Liu, Na; Yang, Jinghua; Fu, Xinxing; Zhang, Li; Tang, Kai; Guy, Kateta Malangisha; Hu, Zhongyuan; Guo, Shaogui; Xu, Yong; Zhang, Mingfang

    2016-04-01

    Grafting is an important agricultural technique widely used to improve plant growth, yield, and adaptation to either biotic or abiotic stresses. However, the molecular mechanisms underlying grafting-induced physiological processes remain unclear. Watermelon (Citrullus lanatus L.) is an important horticultural crop worldwide. Grafting technique is commonly used in watermelon production for improving its tolerance to stresses, especially to the soil-borne fusarium wilt disease. In the present study, we used high-throughput sequencing to perform a genome-wide transcript analysis of scions from watermelon grafted onto bottle gourd and squash rootstocks. Our transcriptome and digital gene expression (DGE) profiling data provided insights into the molecular aspects of gene regulation in grafted watermelon. Compared with self-grafted watermelon, there were 787 and 3485 genes differentially expressed in watermelon grafted onto bottle gourd and squash rootstocks, respectively. These genes were associated with primary and secondary metabolism, hormone signaling, transcription factors, transporters, and response to stimuli. Grafting led to changes in expression of these genes, suggesting that they may play important roles in mediating the physiological processes of grafted seedlings. The potential roles of the grafting-responsive mRNAs in diverse biological and metabolic processes were discussed. Obviously, the data obtained in this study provide an excellent resource for unraveling the mechanisms of candidate genes function in diverse biological processes and in environmental adaptation in a graft system.

  13. GLINT: a user-friendly toolset for the analysis of high-throughput DNA-methylation array data.

    Science.gov (United States)

    Rahmani, Elior; Yedidim, Reut; Shenhav, Liat; Schweiger, Regev; Weissbrod, Omer; Zaitlen, Noah; Halperin, Eran

    2017-06-15

    GLINT is a user-friendly command-line toolset for fast analysis of genome-wide DNA methylation data generated using the Illumina human methylation arrays. GLINT, which does not require any programming proficiency, allows easy execution of an Epigenome-Wide Association Study analysis pipeline under different models while accounting for known confounders in methylation data. GLINT is a command-line software package, freely available at https://github.com/cozygene/glint/releases . It requires Python 2.7 and several freely available Python packages. Further information and documentation as well as a quick start tutorial are available at http://glint-epigenetics.readthedocs.io . elior.rahmani@gmail.com or ehalperin@cs.ucla.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  14. High-Throughput Live-Cell Microscopy Analysis of Association Between Chromosome Domains and the Nucleolus in S. cerevisiae.

    Science.gov (United States)

    Wang, Renjie; Normand, Christophe; Gadal, Olivier

    2016-01-01

    Spatial organization of the genome has important impacts on all aspects of chromosome biology, including transcription, replication, and DNA repair. Frequent interactions of some chromosome domains with specific nuclear compartments, such as the nucleolus, are now well documented using genome-scale methods. However, direct measurement of distance and interaction frequency between loci requires microscopic observation of specific genomic domains and the nucleolus, followed by image analysis to allow quantification. The fluorescent repressor operator system (FROS) is an invaluable method to fluorescently tag DNA sequences and investigate chromosome position and dynamics in living cells. This chapter describes a combination of methods to define the motion and region of confinement of a locus relative to the nucleolus in the cell's nucleus, from fluorescence acquisition to automated image analysis using two dedicated pipelines.

  15. High-Throughput Analysis of Sucrose Fatty Acid Esters by Supercritical Fluid Chromatography/Tandem Mass Spectrometry

    Science.gov (United States)

    Hori, Katsuhito; Tsumura, Kazunobu; Fukusaki, Eiichiro; Bamba, Takeshi

    2014-01-01

    Supercritical fluid chromatography (SFC) coupled with triple quadrupole mass spectrometry was applied to the profiling of sucrose fatty acid esters (SEs). The SFC conditions (column and modifier gradient) were optimized for the effective separation of SEs. In the column test, a silica gel reversed-phase column was selected. Then, the method was used for the detailed characterization of commercial SEs and the successful analysis of SEs containing different fatty acids. The present method allowed for fast and high-resolution separation of monoesters to tetra-esters within a shorter time (15 min) as compared to the conventional high-performance liquid chromatography. The applicability of our method for the analysis of SEs was thus demonstrated. PMID:26819875

  16. Immunoglobulin G (IgG) Fab glycosylation analysis using a new mass spectrometric high-throughput profiling method reveals pregnancy-associated changes.

    Science.gov (United States)

    Bondt, Albert; Rombouts, Yoann; Selman, Maurice H J; Hensbergen, Paul J; Reiding, Karli R; Hazes, Johanna M W; Dolhain, Radboud J E M; Wuhrer, Manfred

    2014-11-01

    The N-linked glycosylation of the constant fragment (Fc) of immunoglobulin G has been shown to change during pathological and physiological events and to strongly influence antibody inflammatory properties. In contrast, little is known about Fab-linked N-glycosylation, carried by ∼ 20% of IgG. Here we present a high-throughput workflow to analyze Fab and Fc glycosylation of polyclonal IgG purified from 5 μl of serum. We were able to detect and quantify 37 different N-glycans by means of MALDI-TOF-MS analysis in reflectron positive mode using a novel linkage-specific derivatization of sialic acid. This method was applied to 174 samples of a pregnancy cohort to reveal Fab glycosylation features and their change with pregnancy. Data analysis revealed marked differences between Fab and Fc glycosylation, especially in the levels of galactosylation and sialylation, incidence of bisecting GlcNAc, and presence of high mannose structures, which were all higher in the Fab portion than the Fc, whereas Fc showed higher levels of fucosylation. Additionally, we observed several changes during pregnancy and after delivery. Fab N-glycan sialylation was increased and bisection was decreased relative to postpartum time points, and nearly complete galactosylation of Fab glycans was observed throughout. Fc glycosylation changes were similar to results described before, with increased galactosylation and sialylation and decreased bisection during pregnancy. We expect that the parallel analysis of IgG Fab and Fc, as set up in this paper, will be important for unraveling roles of these glycans in (auto)immunity, which may be mediated via recognition by human lectins or modulation of antigen binding. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  17. Immunoglobulin G (IgG) Fab Glycosylation Analysis Using a New Mass Spectrometric High-throughput Profiling Method Reveals Pregnancy-associated Changes*

    Science.gov (United States)

    Bondt, Albert; Rombouts, Yoann; Selman, Maurice H. J.; Hensbergen, Paul J.; Reiding, Karli R.; Hazes, Johanna M. W.; Dolhain, Radboud J. E. M.; Wuhrer, Manfred

    2014-01-01

    The N-linked glycosylation of the constant fragment (Fc) of immunoglobulin G has been shown to change during pathological and physiological events and to strongly influence antibody inflammatory properties. In contrast, little is known about Fab-linked N-glycosylation, carried by ∼20% of IgG. Here we present a high-throughput workflow to analyze Fab and Fc glycosylation of polyclonal IgG purified from 5 μl of serum. We were able to detect and quantify 37 different N-glycans by means of MALDI-TOF-MS analysis in reflectron positive mode using a novel linkage-specific derivatization of sialic acid. This method was applied to 174 samples of a pregnancy cohort to reveal Fab glycosylation features and their change with pregnancy. Data analysis revealed marked differences between Fab and Fc glycosylation, especially in the levels of galactosylation and sialylation, incidence of bisecting GlcNAc, and presence of high mannose structures, which were all higher in the Fab portion than the Fc, whereas Fc showed higher levels of fucosylation. Additionally, we observed several changes during pregnancy and after delivery. Fab N-glycan sialylation was increased and bisection was decreased relative to postpartum time points, and nearly complete galactosylation of Fab glycans was observed throughout. Fc glycosylation changes were similar to results described before, with increased galactosylation and sialylation and decreased bisection during pregnancy. We expect that the parallel analysis of IgG Fab and Fc, as set up in this paper, will be important for unraveling roles of these glycans in (auto)immunity, which may be mediated via recognition by human lectins or modulation of antigen binding. PMID:25004930

  18. Comprehensive processing of high-throughput small RNA sequencing data including quality checking, normalization, and differential expression analysis using the UEA sRNA Workbench.

    Science.gov (United States)

    Beckers, Matthew; Mohorianu, Irina; Stocks, Matthew; Applegate, Christopher; Dalmay, Tamas; Moulton, Vincent

    2017-06-01

    Recently, high-throughput sequencing (HTS) has revealed compelling details about the small RNA (sRNA) population in eukaryotes. These 20 to 25 nt noncoding RNAs can influence gene expression by acting as guides for the sequence-specific regulatory mechanism known as RNA silencing. The increase in sequencing depth and number of samples per project enables a better understanding of the role sRNAs play by facilitating the study of expression patterns. However, the intricacy of the biological hypotheses coupled with a lack of appropriate tools often leads to inadequate mining of the available data and thus an incomplete description of the biological mechanisms involved. To enable a comprehensive study of differential expression in sRNA data sets, we present a new interactive pipeline that guides researchers through the various stages of data preprocessing and analysis. This includes various tools, some of which we specifically developed for sRNA analysis, for quality checking and normalization of sRNA samples as well as tools for the detection of differentially expressed sRNAs and identification of the resulting expression patterns. The pipeline is available within the UEA sRNA Workbench, a user-friendly software package for the processing of sRNA data sets. We demonstrate the use of the pipeline on an H. sapiens data set; additional examples on a B. terrestris data set and on an A. thaliana data set are described in the Supplemental Information. A comparison with existing approaches is also included, which exemplifies some of the issues that need to be addressed for sRNA analysis and how the new pipeline may be used to do this. © 2017 Beckers et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  19. The effect of an optimized imaging flow cytometry analysis template on sample throughput in the reduced culture cytokinesis-block micronucleus assay

    International Nuclear Information System (INIS)

    Rodrigues, M.A.; Beaton-Green, L.A.; Wilkins, R.C.; Probst, C.E.

    2016-01-01

    In cases of overexposure to ionizing radiation, the cytokinesis-block micronucleus (CBMN) assay can be performed in order to estimate the dose of radiation to an exposed individual. However, in the event of a large-scale radiation accident with many potentially exposed casualties, the assay must be able to generate accurate dose estimates to within ±0.5 Gy as quickly as possible. The assay has been adapted to, validated and optimized on the ImageStreamX imaging flow cytometer. The ease of running this automated version of the CBMN assay allowed investigation into the accuracy of dose estimates after reducing the volume of whole blood cultured to 200 μl and reducing the culture time to 48 h. The data analysis template used to identify binucleated lymphocyte cells (BNCs) and micronuclei (MN) has since been optimized to improve the sensitivity and specificity of BNC and MN detection. This paper presents a re-analysis of existing data using this optimized analysis template to demonstrate that dose estimations from blinded samples can be obtained to the same level of accuracy in a shorter data collection time. Here, we show that dose estimates from blinded samples were obtained to within ±0.5 Gy of the delivered dose when data collection time was reduced by 30 min at standard culture conditions and by 15 min at reduced culture conditions. Reducing data collection time while retaining the same level of accuracy in our imaging flow cytometry-based version of the CBMN assay results in higher throughput and further increases the relevancy of the CBMN assay as a radiation biodosimeter. (authors)
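    Dose estimation in CBMN-based biodosimetry is conventionally done by fitting a linear-quadratic calibration curve of micronucleus yield versus dose and then inverting that curve for a blinded sample. The sketch below illustrates this general approach with made-up calibration points; it is not the authors' analysis template, and all numbers are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def lq(dose, c, alpha, beta):
    """Linear-quadratic yield of micronuclei per binucleated cell."""
    return c + alpha * dose + beta * dose ** 2

# Hypothetical calibration data (dose in Gy, MN per BNC) -- placeholders only
doses = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])
yields = np.array([0.01, 0.03, 0.06, 0.15, 0.28, 0.45])
(c, alpha, beta), _ = curve_fit(lq, doses, yields, p0=[0.01, 0.02, 0.02])

def estimate_dose(observed_yield):
    """Invert the fitted curve: solve beta*D**2 + alpha*D + (c - y) = 0 for D >= 0."""
    disc = alpha ** 2 - 4 * beta * (c - observed_yield)
    return (-alpha + np.sqrt(disc)) / (2 * beta)

print(f"estimated dose for a blinded sample with 0.20 MN/BNC: {estimate_dose(0.20):.2f} Gy")
```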

  20. Library construction and evaluation for site saturation mutagenesis.

    Science.gov (United States)

    Sullivan, Bradford; Walton, Adam Z; Stewart, Jon D

    2013-06-10

    We developed a method for creating and evaluating site-saturation libraries that consistently yields an average of 27.4±3.0 codons of the 32 possible within a pool of 95 transformants. This was verified by sequencing 95 members from 11 independent libraries within the gene encoding alkene reductase OYE 2.6 from Pichia stipitis. Correct PCR primer design as well as a variety of factors that increase transformation efficiency were critical contributors to the method's overall success. We also developed a quantitative analysis of library quality (Q-values) that defines library degeneracy. Q-values can be calculated from standard fluorescence sequencing data (capillary electropherograms) and the degeneracy predicted from an early stage of library construction (pooled plasmids from the initial transformation) closely matched that observed after ca. 1000 library members were sequenced. Based on this experience, we suggest that this analysis can be a useful guide when applying our optimized protocol to new systems, allowing one to focus only on good-quality libraries and reject substandard libraries at an early stage. This advantage is particularly important when lower-throughput screening techniques such as chiral-phase GC must be employed to identify protein variants with desirable properties, e.g., altered stereoselectivities or when multiple codons are targeted for simultaneous randomization. Copyright © 2013 Elsevier Inc. All rights reserved.
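    For context on the reported average of 27.4 of 32 possible codons per 95-transformant pool, the expected coverage of a perfectly unbiased NNK/NNS library can be computed directly: about 30.4 distinct codons would be expected under ideal uniform sampling, so the observed average reflects residual library bias. The short calculation below is an illustration of that expectation, not the authors' Q-value metric.

```python
import random

# Expected number of distinct codons when drawing 95 clones uniformly
# from an NNK/NNS pool of 32 codons (idealized, unbiased library).
n_codons, n_clones = 32, 95
expected_distinct = n_codons * (1 - (1 - 1 / n_codons) ** n_clones)
print(f"{expected_distinct:.1f} distinct codons expected analytically")  # ~30.4

# Monte Carlo check of the same quantity
random.seed(1)
trials = 10_000
simulated = sum(
    len({random.randrange(n_codons) for _ in range(n_clones)}) for _ in range(trials)
) / trials
print(f"{simulated:.1f} distinct codons on average in simulation")
```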

  1. DNA-, RNA-, and Protein-Based Stable-Isotope Probing for High-Throughput Biomarker Analysis of Active Microorganisms.

    Science.gov (United States)

    Jameson, Eleanor; Taubert, Martin; Coyotzi, Sara; Chen, Yin; Eyice, Özge; Schäfer, Hendrik; Murrell, J Colin; Neufeld, Josh D; Dumont, Marc G

    2017-01-01

    Stable-isotope probing (SIP) enables researchers to target active populations within complex microbial communities, which is achieved by providing growth substrates enriched in heavy isotopes, usually in the form of 13C, 18O, or 15N. After growth on the substrate and subsequent extraction of microbial biomarkers, typically nucleic acids or proteins, the SIP technique is used for the recovery and analysis of isotope-labeled biomarkers from active microbial populations. In the years following the initial development of DNA- and RNA-based SIP, it was common practice to characterize labeled populations by targeted gene analysis. Such approaches usually involved fingerprint-based analyses or sequencing of clone libraries containing 16S rRNA genes or functional marker gene amplicons. Although molecular fingerprinting remains a valuable approach for rapid confirmation of isotope labeling, recent advances in sequencing technology mean that it is possible to obtain affordable and comprehensive amplicon profiles, metagenomes, or metatranscriptomes from SIP experiments. Not only can the abundance of microbial groups be inferred from metagenomes, but researchers can bin, assemble, and explore individual genomes to build hypotheses about the metabolic capabilities of labeled microorganisms. Analysis of labeled mRNA is a more recent advance that can provide independent metatranscriptome-based analysis of active microorganisms. The power of metatranscriptomics is that mRNA abundance often correlates closely with the corresponding activity of encoded enzymes, thus providing insight into microbial metabolism at the time of sampling. Together, these advances have improved the sensitivity of SIP methods and allow the use of labeled substrates at ecologically relevant concentrations. Particularly as methods improve and costs continue to drop, we expect that the integration of SIP with multiple omics-based methods will become prevalent components of microbial ecology studies

  2. Helios: History and Anatomy of a Successful In-House Enterprise High-Throughput Screening and Profiling Data Analysis System.

    Science.gov (United States)

    Gubler, Hanspeter; Clare, Nicholas; Galafassi, Laurent; Geissler, Uwe; Girod, Michel; Herr, Guy

    2018-06-01

    We describe the main characteristics of the Novartis Helios data analysis software system (Novartis, Basel, Switzerland) for plate-based screening and profiling assays, which was designed and built about 11 years ago. It has been in productive use for more than 10 years and is one of the important standard software applications running for a large user community at all Novartis Institutes for BioMedical Research sites globally. A high degree of automation is reached by embedding the data analysis capabilities into a software ecosystem that deals with the management of samples, plates, and result data files, including automated data loading. The application provides a series of analytical procedures, ranging from very simple to advanced, which can easily be assembled by users in very flexible ways. This also includes the automatic derivation of a large set of quality control (QC) characteristics at every step. Any of the raw, intermediate, and final results and QC-relevant quantities can be easily explored through linked visualizations. Links to global assay metadata management, data warehouses, and an electronic lab notebook system are in place. Automated transfer of relevant data to data warehouses and electronic lab notebook systems are also implemented.

  3. Implementation and Analysis of a Lean Six Sigma Program in Microsurgery to Improve Operative Throughput in Perforator Flap Breast Reconstruction.

    Science.gov (United States)

    Hultman, Charles Scott; Kim, Sendia; Lee, Clara N; Wu, Cindy; Dodge, Becky; Hultman, Chloe Elizabeth; Roach, S Tanner; Halvorson, Eric G

    2016-06-01

    Perforator flaps have become a preferred method of breast reconstruction but can consume considerable resources. We examined the impact of a Six Sigma program on microsurgical breast reconstruction at an academic medical center. Using methods developed by Motorola and General Electric, we applied critical pathway planning, workflow analysis, lean manufacturing, continuous quality improvement, and defect reduction to microsurgical breast reconstruction. Primary goals were to decrease preoperative-to-cut time and total operative time, through reduced variability and improved efficiency. Secondary goals were to reduce length of stay, complications, and reoperation. The project was divided into 3 phases: (1) Pre-Six Sigma (24 months), (2) Six Sigma (10 months), and (3) Post-Six Sigma (24 months). These periods (baseline, intervention, control) were compared by Student t test and χ2 analysis. Over a 5-year period, 112 patients underwent 168 perforator flaps for breast reconstruction, by experienced microsurgeons. Total operative time decreased from 714 to 607 minutes (P …). The Six Sigma program in microsurgical breast reconstruction was associated with better operational and financial outcomes. These incremental gains were maintained over the course of the study, suggesting that these benefits were due, in part, to process improvements. However, continued reductions in total operative time and length of stay, well after the intervention period, support the possibility that a "learning curve" phenomenon may have contributed to the improvement in these outcomes.

  4. Beyond the Natural Proteome: Nondegenerate Saturation Mutagenesis-Methodologies and Advantages.

    Science.gov (United States)

    Ferreira Amaral, M M; Frigotto, L; Hine, A V

    2017-01-01

    Beyond the natural proteome, high-throughput mutagenesis offers the protein engineer an opportunity to "tweak" the wild-type activity of a protein to create a recombinant protein with required attributes. Of the various approaches available, saturation mutagenesis is one of the core techniques employed by protein engineers, and in recent times, nondegenerate saturation mutagenesis is emerging as the approach of choice. This review compares the current methodologies available for conducting nondegenerate saturation mutagenesis with traditional, degenerate saturation and briefly outlines the options available for screening the resulting libraries, to discover a novel protein with the required activity and/or specificity. © 2017 Elsevier Inc. All rights reserved.

  5. Network analysis of the microorganism in 25 Danish wastewater treatment plants over 7 years using high-throughput amplicon sequencing

    DEFF Research Database (Denmark)

    Albertsen, Mads; Larsen, Poul; Saunders, Aaron Marc

    Wastewater treatment is the world's largest biotechnological process and a perfect model system for microbial ecology, as the habitat is well defined and replicated all over the world. Extensive investigations of Danish wastewater treatment plants using fluorescent in situ hybridization have … a year, totaling over 400 samples. All samples were subjected to 16S rDNA amplicon sequencing using V13 primers on the Illumina MiSeq platform (2x300 bp) to a depth of at least 20,000 quality-filtered reads per sample. The OTUs were assigned taxonomy based on a manually curated version of the Greengenes … to link sludge and floc properties to the microbial communities. All data were subjected to extensive network analysis and multivariate statistics in R. The 16S amplicon results confirmed the finding of relatively few core groups of organisms shared by all the wastewater treatment plants …
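    A common way to build such co-occurrence networks from amplicon data is to compute pairwise correlations between OTU abundance profiles across samples and retain only strong, significant edges. The sketch below outlines that generic approach in Python; it is not the authors' R workflow, and the OTU table is random placeholder data.

```python
import numpy as np
from scipy.stats import spearmanr

# Placeholder OTU table: rows = samples, columns = OTUs (relative abundances)
rng = np.random.default_rng(42)
otu_table = rng.dirichlet(np.ones(20), size=50)   # 50 samples x 20 OTUs

# Pairwise OTU-OTU Spearman correlations and their p-values
rho, pval = spearmanr(otu_table)

# Keep edges with |rho| >= 0.6 and P < 0.01 as a simple co-occurrence network
edges = [
    (i, j, rho[i, j])
    for i in range(rho.shape[0])
    for j in range(i + 1, rho.shape[1])
    if abs(rho[i, j]) >= 0.6 and pval[i, j] < 0.01
]
print(f"{len(edges)} edges retained")
```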

  6. iMS2Flux – a high–throughput processing tool for stable isotope labeled mass spectrometric data used for metabolic flux analysis

    Directory of Open Access Journals (Sweden)

    Poskar C Hart

    2012-11-01

    Full Text Available Background: Metabolic flux analysis has become an established method in systems biology and functional genomics. The most common approach for determining intracellular metabolic fluxes is to utilize mass spectrometry in combination with stable isotope labeling experiments. However, before the mass spectrometric data can be used, they have to be corrected for biases caused by naturally occurring stable isotopes, by the analytical technique(s) employed, or by the biological sample itself. Finally, the MS data and the labeling information they contain have to be assembled into a data format usable by flux analysis software (of which several dedicated packages exist). Currently the processing of mass spectrometric data is time-consuming and error-prone, requiring peak-by-peak cut-and-paste analysis and manual curation. In order to facilitate high-throughput metabolic flux analysis, the automation of multiple steps in the analytical workflow is necessary. Results: Here we describe iMS2Flux, software developed to automate, standardize and connect the data flow between mass spectrometric measurements and flux analysis programs. This tool streamlines the transfer of data from extraction via correction tools to 13C-flux software by processing MS data from stable isotope labeling experiments. It allows the correction of large and heterogeneous MS datasets for the presence of naturally occurring stable isotopes, initial biomass and several mass spectrometry effects. Before and after data correction, several checks can be performed to ensure accurate data. The corrected data may be returned in a variety of formats, including those used by metabolic flux analysis software such as 13CFLUX, OpenFLUX and 13CFLUX2. Conclusion: iMS2Flux is a versatile, easy-to-use tool for the automated processing of mass spectrometric data containing isotope labeling information. It represents the core framework for a standardized workflow and data processing. Due to its flexibility
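    Correcting a measured mass isotopomer distribution for naturally occurring stable isotopes is commonly formulated as a linear problem: a correction matrix built from the natural-abundance envelope of the non-tracer atoms relates the true distribution to the measurement, which is then inverted. The sketch below shows that core linear-algebra step in a simplified, generic form; it is not iMS2Flux code, and the natural-abundance values and measured fractions are placeholders.

```python
import numpy as np

def correction_matrix(natural_dist, n):
    """Lower-triangular matrix that convolves a true mass isotopomer
    distribution of length n with a natural-abundance envelope."""
    C = np.zeros((n, n))
    for j in range(n):
        for k, p in enumerate(natural_dist):
            if j + k < n:
                C[j + k, j] = p
    return C

# Hypothetical natural-abundance envelope of the non-tracer atoms (M+0, M+1, M+2)
natural = [0.90, 0.08, 0.02]

# Hypothetical measured M+0..M+3 fractions of a labeled fragment
measured = np.array([0.52, 0.23, 0.17, 0.08])

C = correction_matrix(natural, len(measured))
corrected, *_ = np.linalg.lstsq(C, measured, rcond=None)
corrected = np.clip(corrected, 0, None)
corrected /= corrected.sum()          # renormalize to a proper distribution
print(corrected.round(3))
```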

  7. High throughput analysis reveals dissociable gene expression profiles in two independent neural systems involved in the regulation of social behavior

    Directory of Open Access Journals (Sweden)

    Stevenson Tyler J

    2012-10-01

    Full Text Available Background: Production of contextually appropriate social behaviors involves integrated activity across many brain regions. Many songbird species produce complex vocalizations called 'songs' that serve to attract potential mates, defend territories, and/or maintain flock cohesion. There is a series of discrete, interconnected brain regions that are essential for the successful production of song. The probability and intensity of singing behavior is influenced by the reproductive state. The objectives of this study were to compare the broad changes in gene expression in brain regions that control song production with those in a brain region that governs the reproductive state. Results: We show using microarray cDNA analysis that two discrete brain systems that are both involved in governing singing behavior show markedly different gene expression profiles. We found that cortical and basal ganglia-like brain regions that control the socio-motor production of song in birds exhibit a categorical switch in gene expression that was dependent on their reproductive state. This pattern is in stark contrast to the pattern of expression observed in a hypothalamic brain region that governs the neuroendocrine control of reproduction. Subsequent gene ontology analysis revealed marked variation in the functional categories of active genes dependent on reproductive state and anatomical localization. HVC, one cortical-like structure, displayed significant gene expression changes associated with microtubule and neurofilament cytoskeleton organization, MAP kinase activity, and steroid hormone receptor complex activity. The transitions observed in the preoptic area, a nucleus that governs the motivation to engage in singing, exhibited variation in functional categories that included thyroid hormone receptor activity, and epigenetic and angiogenetic processes. Conclusions: These findings highlight the importance of considering the temporal patterns of gene expression

  8. Towards understanding of magnetization reversal in Nd-Fe-B nanocomposites: analysis by high-throughput micromagnetic simulations

    Science.gov (United States)

    Erokhin, Sergey; Berkov, Dmitry; Ito, Masaaki; Kato, Akira; Yano, Masao; Michels, Andreas

    2018-03-01

    We demonstrate how micromagnetic simulations can be employed in order to characterize and analyze the magnetic microstructure of nanocomposites. For the example of nanocrystalline Nd-Fe-B, which is a potential material for future permanent-magnet applications, we have compared three different models for the micromagnetic analysis of this material class: (i) a description of the nanocomposite microstructure in terms of Stoner-Wohlfarth particles with and without the magnetodipolar interaction; (ii) a model based on the core-shell representation of the nanograins; (iii) the latter model including a contribution of superparamagnetic clusters. The relevant parameter spaces have been systematically scanned with the aim to establish which micromagnetic approach can most adequately describe experimental data for this material. According to our results, only the last, most sophisticated model is able to provide an excellent agreement with the measured hysteresis loop. The presented methodology is generally applicable to multiphase magnetic nanocomposites and it highlights the complex interrelationship between the microstructure, magnetic interactions, and the macroscopic magnetic properties.
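    As background on the first model class mentioned above, the hysteresis loop of a single non-interacting Stoner-Wohlfarth particle can be generated by minimizing its reduced anisotropy-plus-Zeeman energy over the magnetization angle at each field step. The sketch below does this in reduced units (field in units of the anisotropy field); it is a textbook-style illustration under those assumptions, not the authors' micromagnetic code.

```python
import numpy as np
from scipy.optimize import minimize

def energy(theta, h, psi):
    """Reduced Stoner-Wohlfarth energy: uniaxial anisotropy plus Zeeman term.
    theta: magnetization angle from the field; psi: easy-axis angle from the field;
    h: applied field in units of the anisotropy field H_K."""
    return np.sin(theta - psi) ** 2 - 2.0 * h * np.cos(theta)

def hysteresis(psi=np.deg2rad(45), h_max=1.5, n=301):
    """Sweep the reduced field down and up, following the local energy minimum
    from the previous angle; branch-following reproduces the switching jumps."""
    fields = np.concatenate([np.linspace(h_max, -h_max, n),
                             np.linspace(-h_max, h_max, n)])
    theta, loop = 0.0, []
    for h in fields:
        res = minimize(lambda t: energy(t[0], h, psi), x0=[theta], method="Nelder-Mead")
        theta = res.x[0]
        loop.append((h, np.cos(theta)))   # magnetization component along the field
    return np.array(loop)

loop = hysteresis()
print(loop[::100].round(3))   # a few (field, m) points along the loop
```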

  9. High-throughput analysis by SP-LDI-MS for fast identification of adulterations in commercial balsamic vinegars

    Energy Technology Data Exchange (ETDEWEB)

    Guerreiro, Tatiane Melina; Oliveira, Diogo Noin de; Ferreira, Mônica Siqueira; Catharino, Rodrigo Ramos, E-mail: rrc@fcm.unicamp.br

    2014-08-01

    Highlights: • Rapid identification of adulteration in balsamic vinegars. • Minimal sample preparation. • No matrix required for assisting laser desorption/ionization. • Fast sample discrimination by multivariate data analysis. - Abstract: Balsamic vinegar (BV) is a typical and valuable Italian product, worldwide appreciated thanks to its characteristic flavors and potential health benefits. Several studies have been conducted to assess physicochemical and microbial compositions of BV, as well as its beneficial properties. Due to highly-disseminated claims of antioxidant, antihypertensive and antiglycemic properties, BV is a known target for frauds and adulterations. For that matter, product authentication, certifying its origin (region or country) and thus the processing conditions, is becoming a growing concern. Striving for fraud reduction as well as quality and safety assurance, reliable analytical strategies to rapidly evaluate BV quality are very interesting, also from an economical point of view. This work employs silica plate laser desorption/ionization mass spectrometry (SP-LDI-MS) for fast chemical profiling of commercial BV samples with protected geographical indication (PGI) and identification of its adulterated samples with low-priced vinegars, namely apple, alcohol and red/white wines.

  10. High-throughput analysis by SP-LDI-MS for fast identification of adulterations in commercial balsamic vinegars

    International Nuclear Information System (INIS)

    Guerreiro, Tatiane Melina; Oliveira, Diogo Noin de; Ferreira, Mônica Siqueira; Catharino, Rodrigo Ramos

    2014-01-01

    Highlights: • Rapid identification of adulteration in balsamic vinegars. • Minimal sample preparation. • No matrix required for assisting laser desorption/ionization. • Fast sample discrimination by multivariate data analysis. - Abstract: Balsamic vinegar (BV) is a typical and valuable Italian product, worldwide appreciated thanks to its characteristic flavors and potential health benefits. Several studies have been conducted to assess physicochemical and microbial compositions of BV, as well as its beneficial properties. Due to highly-disseminated claims of antioxidant, antihypertensive and antiglycemic properties, BV is a known target for frauds and adulterations. For that matter, product authentication, certifying its origin (region or country) and thus the processing conditions, is becoming a growing concern. Striving for fraud reduction as well as quality and safety assurance, reliable analytical strategies to rapidly evaluate BV quality are very interesting, also from an economical point of view. This work employs silica plate laser desorption/ionization mass spectrometry (SP-LDI-MS) for fast chemical profiling of commercial BV samples with protected geographical indication (PGI) and identification of its adulterated samples with low-priced vinegars, namely apple, alcohol and red/white wines

  11. Highly Sensitive and High-Throughput Method for the Analysis of Bisphenol Analogues and Their Halogenated Derivatives in Breast Milk.

    Science.gov (United States)

    Niu, Yumin; Wang, Bin; Zhao, Yunfeng; Zhang, Jing; Shao, Bing

    2017-12-06

    The structural analogs of bisphenol A (BPA) and their halogenated derivatives (together termed BPs) have been found in the environment, food, and even the human body. Limited research showed that some of them exhibited toxicities that were similar to or even greater than that of BPA. Therefore, adverse health effects of BPs are expected for humans with low-dose exposure in early life. Breast milk is an excellent matrix and could reflect fetuses' and babies' exposure to contaminants. Some of the emerging BPs may be present at trace or ultratrace levels in humans. However, existing analytical methods for breast milk cannot quantify these BPs simultaneously with high sensitivity using a small sampling weight, which is important for human biomonitoring studies. In this paper, a method based on Bond Elut Enhanced Matrix Removal-Lipid purification, pyridine-3-sulfonyl chloride derivatization, and liquid chromatography electrospray tandem mass spectrometry was developed. The method requires only a small quantity of sample (200 μL) and allowed for the simultaneous determination of 24 BPs in breast milk with ultrahigh sensitivity. The limits of quantitation of the proposed method were 0.001-0.200 μg L-1, which were 1-6.7 times lower than the only study for the simultaneous analysis of bisphenol analogs in breast milk, based on a 3 g sample weight. The mean recoveries ranged from 86.11% to 119.05% with relative standard deviation (RSD) ≤ 19.5% (n = 6). Matrix effects were within 20% with RSD …; bisphenol F (BPF), bisphenol S (BPS), and bisphenol AF (BPAF) were detected. BPA was still the dominant BP, followed by BPF. This is the first report describing the occurrence of BPF and BPAF in breast milk.

  12. Automated processing of label-free Raman microscope images of macrophage cells with standardized regression for high-throughput analysis.

    Science.gov (United States)

    Milewski, Robert J; Kumagai, Yutaro; Fujita, Katsumasa; Standley, Daron M; Smith, Nicholas I

    2010-11-19

    Macrophages represent the front lines of our immune system; they recognize and engulf pathogens or foreign particles, thus initiating the immune response. Imaging macrophages presents unique challenges, as most optical techniques require labeling or staining of the cellular compartments in order to resolve organelles, and such stains or labels have the potential to perturb the cell, particularly in cases where incomplete information exists regarding the precise cellular reaction under observation. Label-free imaging techniques such as Raman microscopy are thus valuable tools for studying the transformations that occur in immune cells upon activation, both on the molecular and organelle levels. Due to extremely low signal levels, however, Raman microscopy requires sophisticated image processing techniques for noise reduction and signal extraction. To date, efficient, automated algorithms for resolving sub-cellular features in noisy, multi-dimensional image sets have not been explored extensively. We show that hybrid z-score normalization and standard regression (Z-LSR) can highlight the spectral differences within the cell and provide image contrast dependent on spectral content. In contrast to typical Raman imaging processing methods using multivariate analysis, such as singular value decomposition (SVD), our implementation of the Z-LSR method can operate nearly in real time. In spite of its computational simplicity, Z-LSR can automatically remove background and bias in the signal, improve the resolution of spatially distributed spectral differences and enable sub-cellular features to be resolved in Raman microscopy images of mouse macrophage cells. Significantly, the Z-LSR processed images automatically exhibited subcellular architectures whereas SVD, in general, requires human assistance in selecting the components of interest. The computational efficiency of Z-LSR enables automated resolution of sub-cellular features in large Raman microscopy data sets without
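    One plausible reading of the Z-LSR description above is a per-spectrum z-score normalization followed by an ordinary least-squares fit of each pixel spectrum against reference spectral components, whose coefficients then form contrast images. The sketch below implements that reading with random placeholder data and hypothetical reference spectra; it is an interpretation of the described idea, not the authors' implementation.

```python
import numpy as np

def zscore(spectra):
    """Z-score each spectrum (row) to zero mean and unit variance."""
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd

# Hypothetical data: a 32x32-pixel Raman image with 300 wavenumber channels
rng = np.random.default_rng(7)
h, w, n_bands = 32, 32, 300
image = rng.normal(size=(h * w, n_bands))

# Hypothetical reference component spectra (e.g., lipid- and protein-like bands)
references = rng.normal(size=(2, n_bands))

Z = zscore(image)
R = zscore(references)

# Least-squares coefficients of each pixel spectrum on the reference spectra;
# reshaping the coefficients gives one contrast image per reference component.
coeffs, *_ = np.linalg.lstsq(R.T, Z.T, rcond=None)      # shape (2, h*w)
contrast_images = coeffs.reshape(len(references), h, w)
print(contrast_images.shape)                             # (2, 32, 32)
```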

  13. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  14. Studies of non-isothermal flow in saturated and partially saturated porous media

    International Nuclear Information System (INIS)

    Ho, C.K.; Maki, K.S.; Glass, R.J.

    1993-01-01

    Physical and numerical experiments have been performed to investigate the behavior of nonisothermal flow in two-dimensional saturated and partially saturated porous media. The physical experiments were performed to identify non-isothermal flow fields and temperature distributions in fully saturated, half-saturated, and residually saturated two-dimensional porous media with bottom heating and top cooling. Two counter-rotating liquid-phase convective cells were observed to develop in the saturated regions of all three cases. Gas-phase convection was also evidenced in the unsaturated regions of the partially saturated experiments. TOUGH2 numerical simulations of the saturated case were found to be strongly dependent on the assumed boundary conditions of the physical system. Models including heat losses through the boundaries of the test cell produced temperature and flow fields that were in better agreement with the observed temperature and flow fields than models that assumed insulated boundary conditions. A sensitivity analysis also showed that a reduction of the bulk permeability of the porous media in the numerical simulations depressed the effects of convection, flattening the temperature profiles across the test cell

  15. Third Generation (3G) Site Characterization: Cryogenic Core Collection and High Throughput Core Analysis - An Addendum to Basic Research Addressing Contaminants in Low Permeability Zones - A State of the Science Review

    Science.gov (United States)

    2016-07-29

    [Only front-matter fragments are available for this record: figure captions ("Styrofoam insulation for keeping the core frozen during MRI"; Figure 5-2, "Schematic of reference and core setting in …"), part of the abbreviation list (Hollow-Stem Auger; HTCA, High-Throughput Core Analysis; IC, Ion Chromatograph; ID, Inner Diameter; k, Permeability; LN, Liquid Nitrogen; LNAPL, Light …), and a sentence fragment on coring ("… vibration, or 'over drilling' using a hollow-stem auger. The ratio of the length of the collected core to the depth over which the sample tube is …").]

  16. Diagnostic throughput factor analysis for en-route airspace and optimal aircraft trajectory generation based on capacity prediction and controller workload

    Science.gov (United States)

    Shin, Sanghyun

    Today's National Airspace System (NAS) is approaching its limit to cope efficiently with increasing air traffic demand. The Next Generation Air Transportation System (NextGen), with its ambitious goals, aims to make air travel more predictable, with fewer delays and less time sitting on the ground and holding in the air, in order to improve the performance of the NAS. However, the performance of the NAS is currently measured mostly using delay-based metrics, which do not capture the whole range of important factors that determine the quality and level of utilization of the NAS. The factors affecting the performance of the NAS are themselves not well defined to begin with. To address these issues, motivated by the use of throughput-based metrics in many areas such as ground transportation, wireless communication and manufacturing, this thesis identifies the major factors affecting the performance of the NAS as demand (split into flight cancellation and flight rerouting), safe separation (split into conflict and metering) and weather (studied as convective weather), through careful comparison with other applications and empirical sensitivity analysis. Additionally, the effects of the different factors on the NAS's performance are quantitatively studied using real traffic data with the Future ATM Concepts Evaluation Tool (FACET) for various sectors and centers of the NAS on different days. In this thesis we propose a diagnostic tool which can identify the factors most responsible for regions of poor and of better NAS performance. Based on the throughput factor analysis for en-route airspace, it was found that weather and controller workload are the major factors that decrease the efficiency of the airspace. Also, since resources such as air traffic controllers, infrastructure and airspace are limited, it is becoming increasingly important to use the available resources efficiently. To alleviate the impact of the weather and controller

  17. Rhizoslides: paper-based growth system for non-destructive, high throughput phenotyping of root development by means of image analysis.

    Science.gov (United States)

    Le Marié, Chantal; Kirchgessner, Norbert; Marschall, Daniela; Walter, Achim; Hund, Andreas

    2014-01-01

    … and precise evaluation of root lengths in diameter classes, but had weaknesses with respect to image segmentation and analysis of root system architecture. A new technique has been established for non-destructive root growth studies and quantification of architectural traits beyond seedling stages. However, automation of the scanning process and appropriate software remain the bottleneck for high throughput analysis.

  18. Analysis of the effects of five factors relevant to in vitro chondrogenesis of human mesenchymal stem cells using factorial design and high throughput mRNA-profiling.

    Science.gov (United States)

    Jakobsen, Rune B; Østrup, Esben; Zhang, Xiaolan; Mikkelsen, Tarjei S; Brinchmann, Jan E

    2014-01-01

    The in vitro process of chondrogenic differentiation of mesenchymal stem cells for tissue engineering has been shown to require three-dimensional culture along with the addition of differentiation factors to the culture medium. In general, this leads to a phenotype lacking some of the cardinal features of native articular chondrocytes and their extracellular matrix. The factors used vary, but regularly include members of the transforming growth factor β superfamily and dexamethasone, sometimes in conjunction with fibroblast growth factor 2 and insulin-like growth factor 1; however, the use of soluble factors to induce chondrogenesis has largely been studied on a single-factor basis. In the present study we combined a factorial quality-by-design experiment with high-throughput mRNA profiling of a customized chondrogenesis-related gene set as a tool to study in vitro chondrogenesis of human bone marrow derived mesenchymal stem cells in alginate. 48 different conditions of transforming growth factor β 1, 2 and 3, bone morphogenetic protein 2, 4 and 6, dexamethasone, insulin-like growth factor 1, fibroblast growth factor 2 and cell seeding density were included in the experiment. The analysis revealed that the best of the tested differentiation cocktails included transforming growth factor β 1 and dexamethasone. Dexamethasone acted in synergy with transforming growth factor β 1 by increasing many chondrogenic markers while directly downregulating expression of the pro-osteogenic gene osteocalcin. However, all factors beneficial to the expression of desirable hyaline cartilage markers also induced undesirable molecules, indicating that perfect chondrogenic differentiation is not achievable with the current differentiation protocols.
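    Factorial screening designs of the kind described can be laid out programmatically before an experiment. The sketch below builds a simple two-level full factorial for a handful of the factors named above; the factor names, levels and units are illustrative placeholders, and the study's actual 48-condition design is not reproduced here.

```python
import itertools
import pandas as pd

# Illustrative two-level factors (absence/presence or low/high); placeholder values
factors = {
    "TGFB1_ng_ml": [0, 10],
    "BMP2_ng_ml": [0, 100],
    "dexamethasone_nM": [0, 100],
    "FGF2_ng_ml": [0, 10],
    "cells_per_ml_alginate": [2e6, 10e6],
}

# Full factorial: every combination of levels, one run per row
design = pd.DataFrame(
    list(itertools.product(*factors.values())), columns=list(factors.keys())
)
print(len(design), "runs")   # 2**5 = 32 full-factorial runs
print(design.head())
```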

  19. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data [version 3; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Damien Correia

    2016-12-01

    Full Text Available The detection and characterization of emerging infectious agents has been a continuing public health concern. High-Throughput Sequencing (HTS) or Next-Generation Sequencing (NGS) technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Most usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS), solely to provide perspective regarding the available information. We developed an easily deployable web interface facilitating the management and bioinformatics analysis of metagenomics data samples. It was engineered to run associated and dedicated Galaxy workflows for the detection and, eventually, classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows, facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allows their relative abundance to be determined, and associates them with the most closely related organism or pathogen. The user-friendly Django-based interface links the users' input data and its metadata to a set of bio-IT resources (a Galaxy instance, together with sufficient storage and grid computing power). Galaxy is used to handle and analyze the user's input data through loading, indexing, mapping, assembly and DB searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy's main features. Metadata about samples and runs, as well as the workflow results, are stored in the LIMS. For metagenomic classification and exploration purposes, we show, as a proof of concept, that integration
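    BioBlend, mentioned above, is the Python client library for Galaxy's REST API. The minimal sketch below shows the kind of calls such an application would make (connect, list workflows, create a history, upload a file); the server URL, API key and file name are placeholders, this is not the MetaGenSense code itself, and exact method behavior may vary slightly between BioBlend versions.

```python
from bioblend.galaxy import GalaxyInstance

# Hypothetical Galaxy server and API key -- placeholders only
gi = GalaxyInstance(url="https://galaxy.example.org", key="YOUR_API_KEY")

# List the workflows visible to this account (e.g., the metagenomic pipelines)
for wf in gi.workflows.get_workflows():
    print(wf["id"], wf["name"])

# Create a history for a new sample and upload its reads into it
history = gi.histories.create_history(name="sample_001")
gi.tools.upload_file("sample_001.fastq.gz", history["id"])
print("created history:", history["id"])
```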

  20. AlphaScreen-based homogeneous assay using a pair of 25-residue artificial proteins for high-throughput analysis of non-native IgG.

    Science.gov (United States)

    Senga, Yukako; Imamura, Hiroshi; Miyafusa, Takamitsu; Watanabe, Hideki; Honda, Shinya

    2017-09-29

    Therapeutic IgG becomes unstable under various stresses in the manufacturing process. The resulting non-native IgG molecules tend to associate with each other and form aggregates. Because such aggregates not only decrease the pharmacological effect but also become a potential risk factor for immunogenicity, rapid analysis of aggregation is required for quality control of therapeutic IgG. In this study, we developed a homogeneous assay using AlphaScreen and AF.2A1. AF.2A1 is a 25-residue artificial protein that binds specifically to non-native IgG generated under chemical and physical stresses. This assay is performed in a short period of time. Our results show that AF.2A1-AlphaScreen may be used to evaluate the various types of IgG, as AF.2A1 recognizes the non-native structure in the constant region (Fc region) of IgG. The assay was effective for detection of non-native IgG, with particle size up to ca. 500 nm, generated under acid, heat, and stirring conditions. In addition, this technique is suitable for analyzing non-native IgG in CHO cell culture supernatant and mixed with large amounts of native IgG. These results indicate the potential of AF.2A1-AlphaScreen to be used as a high-throughput evaluation method for process monitoring as well as quality testing in the manufacturing of therapeutic IgG.

  1. Mapping whole-brain activity with cellular resolution by light-sheet microscopy and high-throughput image analysis (Conference Presentation)

    Science.gov (United States)

    Silvestri, Ludovico; Rudinskiy, Nikita; Paciscopi, Marco; Müllenbroich, Marie Caroline; Costantini, Irene; Sacconi, Leonardo; Frasconi, Paolo; Hyman, Bradley T.; Pavone, Francesco S.

    2016-03-01

    Mapping neuronal activity patterns across the whole brain with cellular resolution is a challenging task for state-of-the-art imaging methods. Indeed, despite a number of technological efforts, quantitative cellular-resolution activation maps of the whole brain have not yet been obtained. Many techniques are limited by coarse resolution or by a narrow field of view. High-throughput imaging methods, such as light sheet microscopy, can be used to image large specimens with high resolution and in reasonable times. However, the bottleneck is then moved from image acquisition to image analysis, since many TeraBytes of data have to be processed to extract meaningful information. Here, we present a full experimental pipeline to quantify neuronal activity in the entire mouse brain with cellular resolution, based on a combination of genetics, optics and computer science. We used a transgenic mouse strain (Arc-dVenus mouse) in which neurons which have been active in the last hours before brain fixation are fluorescently labelled. Samples were cleared with CLARITY and imaged with a custom-made confocal light sheet microscope. To perform an automatic localization of fluorescent cells on the large images produced, we used a novel computational approach called semantic deconvolution. The combined approach presented here allows quantifying the amount of Arc-expressing neurons throughout the whole mouse brain. When applied to cohorts of mice subject to different stimuli and/or environmental conditions, this method helps finding correlations in activity between different neuronal populations, opening the possibility to infer a sort of brain-wide 'functional connectivity' with cellular resolution.

  2. Analysis of the Effects of Five Factors Relevant to In Vitro Chondrogenesis of Human Mesenchymal Stem Cells Using Factorial Design and High Throughput mRNA-Profiling

    Science.gov (United States)

    Jakobsen, Rune B.; Østrup, Esben; Zhang, Xiaolan; Mikkelsen, Tarjei S.; Brinchmann, Jan E.

    2014-01-01

    The in vitro process of chondrogenic differentiation of mesenchymal stem cells for tissue engineering has been shown to require three-dimensional culture along with the addition of differentiation factors to the culture medium. In general, this leads to a phenotype lacking some of the cardinal features of native articular chondrocytes and their extracellular matrix. The factors used vary, but regularly include members of the transforming growth factor β superfamily and dexamethasone, sometimes in conjunction with fibroblast growth factor 2 and insulin-like growth factor 1; however, the use of soluble factors to induce chondrogenesis has largely been studied on a single-factor basis. In the present study we combined a factorial quality-by-design experiment with high-throughput mRNA profiling of a customized chondrogenesis-related gene set as a tool to study in vitro chondrogenesis of human bone marrow-derived mesenchymal stem cells in alginate. The experiment included 48 different combinations of transforming growth factor β 1, 2 and 3, bone morphogenetic protein 2, 4 and 6, dexamethasone, insulin-like growth factor 1, fibroblast growth factor 2 and cell seeding density. The analysis revealed that the best of the tested differentiation cocktails included transforming growth factor β 1 and dexamethasone. Dexamethasone acted in synergy with transforming growth factor β 1 by increasing many chondrogenic markers while directly downregulating expression of the pro-osteogenic gene osteocalcin. However, all factors beneficial to the expression of desirable hyaline cartilage markers also induced undesirable molecules, indicating that perfect chondrogenic differentiation is not achievable with the current differentiation protocols. PMID:24816923
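
    As an aside on the factorial approach used above, the sketch below shows how a small two-level factorial design can be enumerated and how a main effect and an interaction (synergy) term, such as a hypothetical TGF-β1 × dexamethasone interaction, would be estimated by least squares. The factor names, coding and synthetic response are illustrative assumptions; the study's actual 48-condition design and mRNA read-outs are not reproduced here.

    ```python
    # Illustrative sketch only: a tiny two-level factorial design with a synthetic
    # response, showing how a main effect and an interaction (synergy) term would
    # be estimated. Not the study's design or data.
    import itertools
    import numpy as np

    factors = ["TGFb1", "DEX"]            # coded -1 (absent) / +1 (present)
    design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

    # Synthetic response: baseline + main effects + a positive interaction (synergy).
    rng = np.random.default_rng(0)
    y = (5 + 2.0 * design[:, 0] + 1.0 * design[:, 1]
           + 1.5 * design[:, 0] * design[:, 1]
           + rng.normal(0, 0.1, len(design)))

    # Model matrix: intercept, two main effects, and the interaction column.
    X = np.column_stack([np.ones(len(design)), design, design[:, 0] * design[:, 1]])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(dict(zip(["intercept", "TGFb1", "DEX", "TGFb1:DEX"], coef.round(2))))
    ```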

  3. Analysis of the effects of five factors relevant to in vitro chondrogenesis of human mesenchymal stem cells using factorial design and high throughput mRNA-profiling.

    Directory of Open Access Journals (Sweden)

    Rune B Jakobsen

    Full Text Available The in vitro process of chondrogenic differentiation of mesenchymal stem cells for tissue engineering has been shown to require three-dimensional culture along with the addition of differentiation factors to the culture medium. In general, this leads to a phenotype lacking some of the cardinal features of native articular chondrocytes and their extracellular matrix. The factors used vary, but regularly include members of the transforming growth factor β superfamily and dexamethasone, sometimes in conjunction with fibroblast growth factor 2 and insulin-like growth factor 1; however, the use of soluble factors to induce chondrogenesis has largely been studied on a single-factor basis. In the present study we combined a factorial quality-by-design experiment with high-throughput mRNA profiling of a customized chondrogenesis-related gene set as a tool to study in vitro chondrogenesis of human bone marrow-derived mesenchymal stem cells in alginate. The experiment included 48 different combinations of transforming growth factor β 1, 2 and 3, bone morphogenetic protein 2, 4 and 6, dexamethasone, insulin-like growth factor 1, fibroblast growth factor 2 and cell seeding density. The analysis revealed that the best of the tested differentiation cocktails included transforming growth factor β 1 and dexamethasone. Dexamethasone acted in synergy with transforming growth factor β 1 by increasing many chondrogenic markers while directly downregulating expression of the pro-osteogenic gene osteocalcin. However, all factors beneficial to the expression of desirable hyaline cartilage markers also induced undesirable molecules, indicating that perfect chondrogenic differentiation is not achievable with the current differentiation protocols.

  4. A comparison of sorptive extraction techniques coupled to a new quantitative, sensitive, high throughput GC-MS/MS method for methoxypyrazine analysis in wine.

    Science.gov (United States)

    Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E

    2016-02-01

    Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng L⁻¹ range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines, IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine), have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng L⁻¹ in wine for IBMP), highly sensitive analytical methods to quantify methoxypyrazines at trace levels are necessary. Here we were able to achieve resolution of IBMP as well as IPMP, EMP, and SBMP from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques, HS-SPME (headspace solid-phase microextraction), SBSE (stir bar sorptive extraction), and HSSE (headspace sorptive extraction), were validated and compared. A 30 min extraction time was used for the HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQ) at or below 1 ng L⁻¹ for all four methoxypyrazines analyzed, i.e., LOQs at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program. Copyright © 2015
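
    The abstract reports limits of detection and quantitation but not the convention used to derive them; a commonly used calibration-based convention (an assumption here, not necessarily the authors' procedure) is:

    ```latex
    % Common calibration-based definitions (an assumption; the paper's exact
    % procedure is not stated above): \sigma is the standard deviation of the
    % blank (or of the calibration intercept) and S is the calibration-curve slope.
    \[
      \mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad
      \mathrm{LOQ} = \frac{10\,\sigma}{S}
    \]
    ```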

  5. Metabolomic and Lipidomic Analysis of Serum Samples following Curcuma longa Extract Supplementation in High-Fructose and Saturated Fat Fed Rats.

    Science.gov (United States)

    Tranchida, Fabrice; Shintu, Laetitia; Rakotoniaina, Zo; Tchiakpe, Léopold; Deyris, Valérie; Hiol, Abel; Caldarelli, Stefano

    2015-01-01

    We explored, using nuclear magnetic resonance (NMR) metabolomics and fatty acids profiling, the effects of a common nutritional complement, Curcuma longa, at a nutritionally relevant dose with human use, administered in conjunction with an unbalanced diet. Indeed, traditional food supplements have been long used to counter metabolic impairments induced by unbalanced diets. Here, rats were fed either a standard diet, a high level of fructose and saturated fatty acid (HFS) diet, a diet common to western countries and that certainly contributes to the epidemic of insulin resistance (IR) syndrome, or a HFS diet with a Curcuma longa extract (1% of curcuminoids in the extract) for ten weeks. Orthogonal projections to latent structures discriminant analysis (OPLS-DA) on the serum NMR profiles and fatty acid composition (determined by GC/MS) showed a clear discrimination between HFS groups and controls. This discrimination involved metabolites such as glucose, amino acids, pyruvate, creatine, phosphocholine/glycerophosphocholine, ketone bodies and glycoproteins as well as an increase of monounsaturated fatty acids (MUFAs) and a decrease of n-6 and n-3 polyunsaturated fatty acids (PUFAs). Although the administration of Curcuma longa did not prevent the observed increase of glucose, triglycerides, cholesterol and insulin levels, discriminating metabolites were observed between groups fed HFS alone or with addition of a Curcuma longa extract, namely some MUFA and n-3 PUFA, glycoproteins, glutamine, and methanol, suggesting that curcuminoids may act respectively on the fatty acid metabolism, the hexosamine biosynthesis pathway and alcohol oxidation. Curcuma longa extract supplementation appears to be beneficial in these metabolic pathways in rats. This metabolomic approach highlights important serum metabolites that could help in understanding further the metabolic mechanisms leading to IR.

  6. The association between lactate, mean arterial pressure, central venous oxygen saturation and peripheral temperature and mortality in severe sepsis: a retrospective cohort analysis.

    Science.gov (United States)

    Houwink, Aletta P I; Rijkenberg, Saskia; Bosman, Rob J; van der Voort, Peter H J

    2016-03-12

    During resuscitation in severe sepsis and septic shock, several goals are set. However, usually not all goals are equally met. The aim of this study is to determine the relative importance of the different goals, such as mean arterial pressure (MAP), lactate, central venous oxygen saturation (ScvO2) and central-to-forefoot temperature (delta-T), and how they relate to intensive care unit (ICU) and hospital mortality. In a retrospective cohort study in a 20-bed mixed medical and surgical ICU of a teaching hospital, we studied consecutive critically ill patients who were admitted for confirmed infection and severe sepsis or septic shock between 2008 and 2014. All validated MAP, lactate, ScvO2 and delta-T values for the first 24 hours of ICU treatment were extracted from a clinical database. Logistic regression analyses were performed on validated measurements in the first hour after admission and on mean values over 24 hours. Patients were categorized by MAP (24-hour mean below or above 65 mmHg) and lactate (24-hour mean below or above 2 mmol/l) for Cox regression analysis. Of 837 patients, 821 were eligible for analysis. All had MAP and lactate measurements. Delta-T was available for 812 patients (99%) and ScvO2 for 193 (23.5%). Admission lactate (p < 0.001) and admission MAP (p < 0.001) were independent predictors of ICU and hospital mortality. The 24-hour mean values for lactate, MAP and delta-T were all independent predictors of ICU mortality. Hospital mortality was independently predicted by the 24-hour mean lactate (odds ratio (OR) 1.34, 95% confidence interval (CI) 1.30-1.40, p = 0.001), mean MAP (OR 0.96, 95% CI 0.95-0.97, p = 0.001) and mean delta-T (OR 1.09, 95% CI 1.06-1.12, p = 0.001). Patients with a 24-hour mean lactate below 2 mmol/l and a 24-hour mean MAP above 65 mmHg had the best survival, followed by patients with a low lactate and a low MAP. Admission MAP and lactate independently predicted ICU and hospital mortality
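
    The following sketch illustrates the form of logistic-regression analysis described above, fitted on synthetic data since the patient records are not available; the variable names, coefficients and simulated outcome model are assumptions chosen only to show how odds ratios per predictor are obtained.

    ```python
    # Sketch of a mortality logistic regression of the type described above,
    # on synthetic data. Odds ratios come from exponentiating the coefficients.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 800
    df = pd.DataFrame({
        "lactate_24h": rng.gamma(shape=2.0, scale=1.2, size=n),   # mmol/L
        "map_24h": rng.normal(75, 10, size=n),                    # mmHg
        "delta_t": rng.normal(5, 2, size=n),                      # degrees C
    })
    # Synthetic outcome: higher lactate/delta-T and lower MAP raise mortality risk.
    logit = (-3 + 0.35 * df["lactate_24h"]
                 - 0.04 * (df["map_24h"] - 75)
                 + 0.10 * df["delta_t"])
    df["hospital_death"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(df[["lactate_24h", "map_24h", "delta_t"]])
    fit = sm.Logit(df["hospital_death"], X).fit(disp=0)
    odds_ratios = np.exp(fit.params)      # OR per unit change in each predictor
    print(odds_ratios.round(2))
    ```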

  7. SATURATED ZONE IN-SITU TESTING

    Energy Technology Data Exchange (ETDEWEB)

    P.W. REIMUS

    2004-11-08

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain, Nevada. The test interpretations provide estimates of flow and transport parameters used in the development of parameter distributions for total system performance assessment (TSPA) calculations. These parameter distributions are documented in ''Site-Scale Saturated Zone Flow Model'' (BSC 2004 [DIRS 170037]), ''Site-Scale Saturated Zone Transport'' (BSC 2004 [DIRS 170036]), ''Saturated Zone Colloid Transport'' (BSC 2004 [DIRS 170006]), and ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, this scientific analysis contributes the following to the assessment of the capability of the SZ to serve as part of a natural barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvial Testing Complex (ATC) located at the southwestern corner of the Nevada Test Site (NTS). The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass

  8. SATURATED ZONE IN-SITU TESTING

    International Nuclear Information System (INIS)

    REIMUS, P.W.

    2004-01-01

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain, Nevada. The test interpretations provide estimates of flow and transport parameters used in the development of parameter distributions for total system performance assessment (TSPA) calculations. These parameter distributions are documented in ''Site-Scale Saturated Zone Flow Model'' (BSC 2004 [DIRS 170037]), ''Site-Scale Saturated Zone Transport'' (BSC 2004 [DIRS 170036]), ''Saturated Zone Colloid Transport'' (BSC 2004 [DIRS 170006]), and ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, this scientific analysis contributes the following to the assessment of the capability of the SZ to serve as part of a natural barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvial Testing Complex (ATC) located at the southwestern corner of the Nevada Test Site (NTS). The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, and colloid

  9. Improvement of High-throughput Genotype Analysis After Implementation of a Dual-curve Sybr Green I-based Quantification and Normalization Procedure

    Science.gov (United States)

    The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...

  10. Saturated Zone Colloid-Facilitated Transport

    International Nuclear Information System (INIS)

    Wolfsberg, A.; Reimus, P.

    2001-01-01

    The purpose of the Saturated Zone Colloid-Facilitated Transport Analysis and Modeling Report (AMR), as outlined in its Work Direction and Planning Document (CRWMS M&O 1999a), is to provide retardation factors for colloids with irreversibly attached radionuclides, such as plutonium, in the saturated zone (SZ) between their point of entrance from the unsaturated zone (UZ) and downgradient compliance points. Although it is not exclusive to any particular radionuclide release scenario, this AMR especially addresses those scenarios pertaining to evidence from waste degradation experiments, which indicate that plutonium and perhaps other radionuclides may be irreversibly attached to colloids. This report establishes the requirements and elements of the design of a methodology for calculating colloid transport in the saturated zone at Yucca Mountain. In previous Total Systems Performance Assessment (TSPA) analyses, radionuclide-bearing colloids were assumed to be unretarded in their migration. Field experiments in fractured tuff at Yucca Mountain and in porous media at other sites indicate that colloids may, in fact, experience retardation relative to the mean pore-water velocity, suggesting that contaminants associated with colloids should also experience some retardation. Therefore, this analysis incorporates field data where available and a theoretical framework when site-specific data are not available for estimating plausible ranges of retardation factors in both saturated fractured tuff and saturated alluvium. The distributions of retardation factors for tuff and alluvium are developed in a form consistent with the Performance Assessment (PA) analysis framework for simulating radionuclide transport in the saturated zone. To improve on the work performed so far for the saturated-zone flow and transport modeling, concerted effort has been made in quantifying colloid retardation factors in both fractured tuff and alluvium. The fractured tuff analysis used recent data
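
    For orientation, the standard textbook forms of the retardation factor in a porous medium and in a discrete fracture are reproduced below; the AMR's actual colloid retardation parameterization may differ, so these are shown only as the generic quantities being estimated.

    ```latex
    % Generic retardation-factor forms (not necessarily the AMR's exact
    % parameterization). Porous medium: bulk density \rho_b, porosity \theta,
    % sorption coefficient K_d. Fracture: aperture b, surface-based sorption
    % coefficient K_a.
    \[
      R = 1 + \frac{\rho_b\, K_d}{\theta}, \qquad
      R_f = 1 + \frac{2\, K_a}{b}
    \]
    ```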

  11. Identification of miRNAs and their targets through high-throughput sequencing and degradome analysis in male and female Asparagus officinalis.

    Science.gov (United States)

    Chen, Jingli; Zheng, Yi; Qin, Li; Wang, Yan; Chen, Lifei; He, Yanjun; Fei, Zhangjun; Lu, Gang

    2016-04-12

    MicroRNAs (miRNAs), a class of non-coding small RNAs (sRNAs), regulate various biological processes. Although miRNAs have been identified and characterized in several plant species, miRNAs in Asparagus officinalis have not been reported. As a dioecious plant with homomorphic sex chromosomes, asparagus is regarded as an important model system for studying mechanisms of plant sex determination. Two independent sRNA libraries from male and female asparagus plants were sequenced with Illumina sequencing, thereby generating 4.13 and 5.88 million final clean reads, respectively. Both libraries predominantly contained 24-nt sRNAs, followed by 21-nt sRNAs. Further analysis identified 154 conserved miRNAs, which belong to 26 families, and 39 novel miRNA candidates seemed to be specific to asparagus. Comparative profiling revealed that 63 miRNAs exhibited significant differential expression between male and female plants, which was confirmed by real-time quantitative PCR analysis. Among them, 37 miRNAs were significantly up-regulated in the female library, whereas the others were preferentially expressed in the male library. Furthermore, 40 target mRNAs representing 44 conserved and seven novel miRNAs were identified in asparagus through high-throughput degradome sequencing. Functional annotation showed that these target mRNAs were involved in a wide range of developmental and metabolic processes. We identified a large set of conserved and specific miRNAs and compared their expression levels between male and female asparagus plants. Several asparagus miRNAs, which belong to the miR159, miR167, and miR172 families involved in reproductive organ development, were differentially expressed between male and female plants, as well as during flower development. Consistently, several predicted targets of asparagus miRNAs were associated with floral organ development. These findings suggest the potential roles of miRNAs in sex determination and reproductive developmental processes in

  12. Comparative analysis of miRNAs of two rapeseed genotypes in response to acetohydroxyacid synthase-inhibiting herbicides by high-throughput sequencing.

    Directory of Open Access Journals (Sweden)

    Maolong Hu

    Full Text Available Acetohydroxyacid synthase (AHAS), also called acetolactate synthase, is a key enzyme involved in the first step of the biosynthesis of the branched-chain amino acids valine, isoleucine and leucine. Acetohydroxyacid synthase-inhibiting herbicides (AHAS herbicides) are five chemical families of herbicides that inhibit AHAS enzymes, including imidazolinones (IMI), sulfonylureas (SU), pyrimidinylthiobenzoates, triazolinones and triazolopyrimidines. Five AHAS genes have been identified in rapeseed, but little information is available regarding the role of miRNAs in response to AHAS herbicides. In this study, an AHAS herbicide-tolerant genotype and a sensitive genotype were used for comparative miRNA analysis. A total of 20 small RNA libraries were obtained from these two genotypes at three time points (0 h, 24 h and 48 h) after spraying SU and IMI herbicides, with two replicates. We identified 940 conserved miRNAs and 1515 novel candidate miRNAs in Brassica napus using high-throughput sequencing methods combined with computational analysis. A total of 3284 genes were predicted to be targets of these miRNAs, and their functions were annotated using GO, KOG and KEGG. The differential expression results showed that almost twice as many differentially expressed miRNAs were found in the tolerant genotype M342 (309 miRNAs) after SU herbicide application than in the sensitive genotype N131 (164 miRNAs). In addition, 177 and 296 miRNAs were identified as differentially expressed in the sensitive and tolerant genotypes, respectively, in response to SU herbicides. The miR398 family was observed to be associated with AHAS herbicide tolerance because its expression increased in the tolerant genotype but decreased in the sensitive genotype. Moreover, 50 novel miRNAs from 39 precursors were predicted. Eight conserved miRNAs, four novel miRNAs and three target genes were validated by quantitative real-time PCR. This study not only provides novel insights into the miRNA content of AHAS herbicides

  13. Development of a fast isocratic LC-MS/MS method for the high-throughput analysis of pyrrolizidine alkaloids in Australian honey.

    Science.gov (United States)

    Griffin, Caroline T; Mitrovic, Simon M; Danaher, Martin; Furey, Ambrose

    2015-01-01

    Honey samples originating from Australia were purchased and analysed for targeted pyrrolizidine alkaloids (PAs) using a new and rapid isocratic LC-MS/MS method. This isocratic method was developed from, and is comparable with, a gradient elution method and resulted in no loss of sensitivity or reduction in chromatographic peak shape. Isocratic elution allows for significantly shorter run times (6 min), eliminates the requirement for column equilibration periods and thus has the advantage of facilitating high-throughput analysis, which is particularly important for regulatory testing laboratories. In excess of two hundred injections are possible with this new isocratic methodology within a 24-h period, which is more than a 50% improvement on all previously published methodologies. Good linear calibrations were obtained for all 10 PAs and four PA N-oxides (PANOs) in spiked honey samples (3.57-357.14 µg L⁻¹; R² ≥ 0.9987). Acceptable inter-day repeatability was achieved for the target analytes in honey, with % RSD values (n = 4) less than 7.4%. Limits of detection (LOD) and limits of quantitation (LOQ) were determined with spiked PA and PANO samples, giving an average LOD of 1.6 µg kg⁻¹ and LOQ of 5.4 µg kg⁻¹. This method was successfully applied to Australian and New Zealand honey samples sourced from supermarkets in Australia. Analysis showed that 41 of the 59 honey samples were contaminated by PAs, with the mean total sum of PAs being 153 µg kg⁻¹. Echimidine and lycopsamine were predominant and found in 76% and 88%, respectively, of the positive samples. The average daily exposures, based on the results presented in this study, were 0.051 µg kg⁻¹ bw day⁻¹ for adults and 0.204 µg kg⁻¹ bw day⁻¹ for children. These results are a cause for concern when compared with the proposed European Food Safety Authority (EFSA), Committee on Toxicity (COT) and Bundesinstitut für Risikobewertung (BfR - Federal Institute of Risk Assessment, Germany) maximum
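
    A back-of-the-envelope version of the exposure calculation reported above is sketched below. The honey intake and body weights are illustrative assumptions (they are not stated in the abstract), chosen so that the arithmetic lands near the reported adult and child figures.

    ```python
    # Illustrative exposure arithmetic of the kind reported above. Intake and
    # body-weight values are assumptions, not taken from the paper.
    mean_total_pa = 153.0      # ug PA per kg honey (mean of positive samples)

    def daily_exposure(conc_ug_per_kg, honey_intake_kg_per_day, body_weight_kg):
        """Exposure in ug per kg body weight per day."""
        return conc_ug_per_kg * honey_intake_kg_per_day / body_weight_kg

    print(round(daily_exposure(mean_total_pa, 0.020, 60.0), 3))  # adult, ~0.051
    print(round(daily_exposure(mean_total_pa, 0.020, 15.0), 3))  # child, ~0.204
    ```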

  14. Space Charge Saturated Sheath Regime and Electron Temperature Saturation in Hall Thrusters

    International Nuclear Information System (INIS)

    Raitses, Y.; Staack, D.; Smirnov, A.; Fisch, N.J.

    2005-01-01

    Secondary electron emission in Hall thrusters is predicted to lead to space charge saturated wall sheaths resulting in enhanced power losses in the thruster channel. Analysis of experimentally obtained electron-wall collision frequency suggests that the electron temperature saturation, which occurs at high discharge voltages, appears to be caused by a decrease of the Joule heating rather than by the enhancement of the electron energy loss at the walls due to a strong secondary electron emission

  15. Nitrogen saturation in stream ecosystems

    OpenAIRE

    Earl, S. R.; Valett, H. M.; Webster, J. R.

    2006-01-01

    The concept of nitrogen (N) saturation has organized the assessment of N loading in terrestrial ecosystems. Here we extend the concept to lotic ecosystems by coupling Michaelis-Menten kinetics and nutrient spiraling. We propose a series of saturation response types, which may be used to characterize the proximity of streams to N saturation. We conducted a series of short-term N releases using a tracer ((NO3)-N-15-N) to measure uptake. Experiments were conducted in streams spanning a gradient ...

  16. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have been in rapid progress. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  17. Identification of microRNAs in Caragana intermedia by high-throughput sequencing and expression analysis of 12 microRNAs and their targets under salt stress.

    Science.gov (United States)

    Zhu, Jianfeng; Li, Wanfeng; Yang, Wenhua; Qi, Liwang; Han, Suying

    2013-09-01

    In C. intermedia, 142 miRNAs were identified and 38 miRNA targets were predicted, 4 of which were validated. The expression of 12 miRNAs in salt-stressed leaves was assessed by qRT-PCR. MicroRNAs (miRNAs) are endogenous small RNAs that play important roles in various biological and metabolic processes in plants. Caragana intermedia is an important ecological and economic tree species prominent in the desert environment of west and northwest China. To date, no investigation into C. intermedia miRNAs has been reported. In this study, high-throughput sequencing of small RNAs and analysis of transcriptome data were performed to identify both conserved and novel miRNAs, as well as their target mRNA genes, in C. intermedia. Based on sequence similarity and hairpin structure prediction, 132 putative conserved miRNAs (12 of which were confirmed to form hairpin precursors) belonging to 31 known miRNA families were identified. Ten novel miRNAs (including the miRNA* sequences of three novel miRNAs) were also discovered. Furthermore, 36 potential target genes of 17 known miRNA families and 2 potential target genes of 1 novel miRNA were predicted; 4 of these were validated by 5' RACE. The expression of 12 miRNAs was validated in different tissues, and these miRNAs and five target mRNAs were assessed by qRT-PCR after salt treatment. The expression levels of seven miRNAs (cin-miR157a, cin-miR159a, cin-miR165a, cin-miR167b, cin-miR172b, cin-miR390a and cin-miR396a) were upregulated, while cin-miR398a expression was downregulated after salt treatment. The targets of cin-miR157a, cin-miR165a, cin-miR172b and cin-miR396a were downregulated and showed an approximately negative correlation with their corresponding miRNAs under salt treatment. These results will help further the understanding of miRNA regulation in response to abiotic stress in C. intermedia.
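
    The abstract reports qRT-PCR expression changes without stating the quantification scheme; the widely used 2^(−ΔΔCt) relative-quantification method (an assumption here, not necessarily the authors' normalization) would compute fold changes as:

    ```latex
    % The commonly used 2^(-ddCt) scheme (an assumption): Ct values of the miRNA
    % of interest are normalized to a reference gene, then compared between
    % salt-treated and control samples.
    \[
      \Delta C_t = C_t^{\text{miRNA}} - C_t^{\text{reference}}, \qquad
      \Delta\Delta C_t = \Delta C_t^{\text{treated}} - \Delta C_t^{\text{control}}, \qquad
      \text{fold change} = 2^{-\Delta\Delta C_t}
    \]
    ```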

  18. Saturation at Low X and Nonlinear Evolution

    International Nuclear Information System (INIS)

    Stasto, A.M.

    2002-01-01

    In this talk the results of the analytical and numerical analysis of the nonlinear Balitsky-Kovchegov equation are presented. The characteristic BFKL diffusion into the infrared regime is suppressed by the generation of the saturation scale Q_s. We identify the scaling and linear regimes for the solution. We also study the impact of subleading corrections on the nonlinear evolution. (author)
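
    For reference, the leading-order Balitsky-Kovchegov (BK) equation for the dipole amplitude discussed above is commonly written as follows; the notation is generic and may differ from that used in the talk.

    ```latex
    % Standard leading-order BK equation for the dipole amplitude N(x_{01}, Y).
    \[
      \frac{\partial N(x_{01},Y)}{\partial Y}
      = \frac{\bar{\alpha}_s}{2\pi} \int d^2x_2\,
        \frac{x_{01}^2}{x_{02}^2\, x_{12}^2}
        \Big[ N(x_{02},Y) + N(x_{12},Y) - N(x_{01},Y) - N(x_{02},Y)\, N(x_{12},Y) \Big]
    \]
    ```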

  19. Saturation and linear transport equation

    International Nuclear Information System (INIS)

    Kutak, K.

    2009-03-01

    We show that the GBW saturation model provides an exact solution to the one dimensional linear transport equation. We also show that it is motivated by the BK equation considered in the saturated regime when the diffusion and the splitting term in the diffusive approximation are balanced by the nonlinear term. (orig.)
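
    The GBW dipole cross section referred to above is commonly written in the following form (parameter values omitted; notation may differ from the paper's):

    ```latex
    % Standard form of the Golec-Biernat--Wusthoff (GBW) saturation model.
    \[
      \sigma_{\text{dip}}(x, r)
      = \sigma_0 \left[ 1 - \exp\!\left( -\frac{r^2 Q_s^2(x)}{4} \right) \right],
      \qquad Q_s^2(x) = Q_0^2 \left( \frac{x_0}{x} \right)^{\lambda}
    \]
    ```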

  20. MXIbus data throughput tests

    International Nuclear Information System (INIS)

    Botlo, M.; Dunning, J.; Jagieski, M.; Miller, L.; Romero, A.

    1992-11-01

    A series of tests were conducted to evaluate data transfer rates using the MXIbus architecture. The tests were conducted by the DAQ group in the Physics Research Division. The MXIbus from National Instruments provides a multisystem extension interface bus. It allows multiple VME chassis to be networked. Other bus architectures that can participate in the network include VXIbus, IBM PC-AT bus, Sun Sbus, Mac NuBus and stand-alone instruments with the appropriate MXIbus adapter cards. From a functional standpoint, the MXIbus provides the capability to enlarge the address space in a fashion that is transparent to the software application. The tests were designed to measure data throughput when using the MXIbus with other industry off-the-shelf hardware. This report contains discussions on: MXIbus architecture and general guidelines; the commercial hardware and software used in each set of tests; and a brief description of each set of tests, observations and conclusions

  1. High-throughput sequencing and analysis of the gill tissue transcriptome from the deep-sea hydrothermal vent mussel Bathymodiolus azoricus

    Directory of Open Access Journals (Sweden)

    Gomes Paula

    2010-10-01

    Full Text Available Abstract Background Bathymodiolus azoricus is a deep-sea hydrothermal vent mussel found in association with large faunal communities living in chemosynthetic environments at the bottom of the sea floor near the Azores Islands. Investigation of the exceptional physiological reactions that vent mussels have adopted in their habitat, including responses to environmental microbes, remains a difficult challenge for deep-sea biologists. In an attempt to reveal genes potentially involved in the deep-sea mussel innate immunity we carried out a high-throughput sequence analysis of the freshly collected B. azoricus transcriptome, using gill tissue as the primary source of immune transcripts given its strategic role in filtering potentially infectious microorganisms from the surrounding water. Additionally, a substantial EST data set was produced, and from it a comprehensive collection of genes coding for putative proteins was organized in a dedicated database, "DeepSeaVent", the first deep-sea vent animal transcriptome database based on the 454 pyrosequencing technology. Results A normalized cDNA library from gill tissue was sequenced in a full 454 GS-FLX run, producing 778,996 sequencing reads. Assembly of the high quality reads resulted in 75,407 contigs of which 3,071 were singletons. A total of 39,425 transcripts were conceptually translated into amino acid sequences, of which 22,023 matched known proteins in the NCBI non-redundant protein database, 15,839 revealed conserved protein domains through InterPro functional classification and 9,584 were assigned Gene Ontology terms. Queries conducted within the database enabled the identification of genes putatively involved in immune and inflammatory reactions which had not been previously evidenced in the vent mussel. Their physical counterpart was confirmed by semi-quantitative Reverse-Transcription-Polymerase Chain Reaction (RT-PCR) and their RNA transcription level by quantitative PCR (q

  2. Comparison of pulseoximetry oxygen saturation and arterial oxygen saturation in open heart intensive care unit

    Directory of Open Access Journals (Sweden)

    Alireza Mahoori

    2013-08-01

    Full Text Available Background: Pulseoximetry is widely used in the critical care setting, currently used to guide therapeutic interventions. Few studies have evaluated the accuracy of SPO2 (pulseoximetry oxygen saturation) in the intensive care unit after cardiac surgery. Our objective was to compare pulseoximetry with arterial oxygen saturation (SaO2) during clinical routine in such patients, and to examine the effect of mild acidosis on this relationship. Methods: In an observational prospective study, 80 patients were evaluated in the intensive care unit after cardiac surgery. SPO2 was recorded and compared with SaO2 obtained by blood gas analysis. One or serial arterial blood gas analyses (ABGs) were performed via a radial artery line while a reliable pulseoximeter signal was present. One hundred thirty-seven samples were collected, and for each blood gas analysis SaO2 and SPO2 were recorded. Results: O2 saturation as a marker of peripheral perfusion was measured by pulseoximetry (SPO2). The mean difference between arterial oxygen saturation and pulseoximetry oxygen saturation was 0.12%±1.6%. A total of 137 paired readings demonstrated good correlation (r=0.754; P<0.0001) between changes in SPO2 and those in SaO2 in samples with normal hemoglobin. Also, in forty-seven samples with mild acidosis, paired readings demonstrated good correlation (r=0.799; P<0.0001) and the mean difference between SaO2 and SPO2 was 0.05%±1.5%. Conclusion: Data showed that in patients with stable hemodynamics and good signal quality, changes in pulseoximetry oxygen saturation reliably predict equivalent changes in arterial oxygen saturation. Mild acidosis does not alter the relation between SPO2 and SaO2 to any clinically important extent. In conclusion, the pulse oximeter is useful to monitor oxygen saturation in patients with stable hemodynamics.
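
    The agreement statistics quoted above (mean difference ± SD and Pearson correlation) can be computed as in the sketch below, which uses synthetic paired readings rather than the study data; the simulated bias and spread are assumptions chosen only to mirror the reported magnitudes.

    ```python
    # Agreement statistics (mean bias +/- SD, Bland-Altman limits of agreement,
    # Pearson r) on synthetic SpO2/SaO2 pairs, not the study's data.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(1)
    sao2 = rng.uniform(92, 100, size=137)            # arterial saturation, %
    spo2 = sao2 + rng.normal(0.1, 1.6, size=137)     # pulse-oximeter reading, %

    bias = np.mean(spo2 - sao2)                      # mean difference
    sd = np.std(spo2 - sao2, ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)       # limits of agreement
    r, p = pearsonr(spo2, sao2)

    print(f"bias = {bias:.2f}%, SD = {sd:.2f}%, LoA = {loa[0]:.2f} to {loa[1]:.2f}%")
    print(f"Pearson r = {r:.3f}, p = {p:.2g}")
    ```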

  3. Cerebral time domain-NIRS: Reproducibility analysis, optical properties, hemoglobin species and tissue oxygen saturation in a cohort of adult subjects

    OpenAIRE

    Giacalone, Giacomo; Zanoletti, Marta; Contini, Davide; Rebecca, Re; Spinelli, Lorenzo; Roveri, Luisa; Torricelli, Alessandro

    2017-01-01

    The reproducibility of cerebral time-domain near-infrared spectroscopy (TD-NIRS) has not been investigated so far. Besides, reference intervals of cerebral optical properties, of absolute concentrations of deoxygenated-hemoglobin (HbR), oxygenated-hemoglobin (HbO), total hemoglobin (HbT) and tissue oxygen saturation (StO2) and their variability have not been reported. We have addressed these issues on a sample of 88 adult healthy subjects. TD-NIRS measurements at 690, 785, 830 nm were fitted ...
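
    The reported hemoglobin quantities follow the usual NIRS definitions, with tissue oxygen saturation obtained from the absolute oxy- and deoxyhemoglobin concentrations:

    ```latex
    % Standard NIRS definitions of total hemoglobin and tissue oxygen saturation
    % from the fitted absolute HbO and HbR concentrations.
    \[
      \mathrm{HbT} = \mathrm{HbO} + \mathrm{HbR}, \qquad
      \mathrm{StO_2} = \frac{\mathrm{HbO}}{\mathrm{HbO} + \mathrm{HbR}} \times 100\%
    \]
    ```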

  4. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    International Nuclear Information System (INIS)

    Chen, Jinyang; Ji, Xinghu; He, Zhike

    2015-01-01

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy was realized by connecting the components of a positive pressure input device, sample container and microfluidic chip through tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so the sample was delivered into the microchip from the sample container under the driving of positive pressure. This sample-introduction technique is so robust and compatible that it could be integrated with T-junction, flow-focus or valve-assisted droplet microchips. By choosing a PDMS adaptor with the proper dimensions, the microchip could be flexibly equipped with various types of familiar sample containers, making the sampling more straightforward without tedious sample transfer or loading. Convenient sample changing was easily achieved by moving the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied for quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through the investigation of the quenching efficiency of a ruthenium complex on the fluorescence of the QDs. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, which shows the practicability and potential in high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy of concentration gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • Multiplex DNA assay was successfully carried out in the droplet platform

  5. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jinyang; Ji, Xinghu [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); He, Zhike, E-mail: zhkhe@whu.edu.cn [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); Suzhou Institute of Wuhan University, Suzhou 215123 (China)

    2015-08-12

    In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy was realized by connecting the components of a positive pressure input device, sample container and microfluidic chip through tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so the sample was delivered into the microchip from the sample container under the driving of positive pressure. This sample-introduction technique is so robust and compatible that it could be integrated with T-junction, flow-focus or valve-assisted droplet microchips. By choosing a PDMS adaptor with the proper dimensions, the microchip could be flexibly equipped with various types of familiar sample containers, making the sampling more straightforward without tedious sample transfer or loading. Convenient sample changing was easily achieved by moving the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied for quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through the investigation of the quenching efficiency of a ruthenium complex on the fluorescence of the QDs. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, which shows the practicability and potential in high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in the microfluidic system. • A novel strategy of concentration gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • Multiplex DNA assay was successfully carried out in the droplet platform.

  6. Perturbation of longitudinal relaxation rate in rotating frame (PLRF) analysis for quantification of chemical exchange saturation transfer signal in a transient state.

    Science.gov (United States)

    Wang, Yi; Zhang, Yaoyu; Zhao, Xuna; Wu, Bing; Gao, Jia-Hong

    2017-11-01

    To develop a novel analytical method for quantification of chemical exchange saturation transfer (CEST) in the transient state. The proposed method aims to reduce the effects of non-chemical-exchange (non-CE) parameters on the CEST signal, emphasizing the effect of chemical exchange. The difference in the longitudinal relaxation rate in the rotating frame (ΔR1ρ) was calculated based on perturbation of the Z-value by R1ρ, and a saturation-pulse-amplitude-compensated exchange-dependent relaxation rate (SPACER) was determined with a high-exchange-rate approximation. In both phantom and human subject experiments, MTRasym (representative of the traditional CEST index), ΔR1ρ, and SPACER were measured, evaluated, and compared by altering the non-CE parameters in a transient-state continuous-wave CEST sequence. In line with the theoretical expectation, our experimental data demonstrate that the effects of the non-CE parameters can be more effectively reduced using the proposed indices (ΔR1ρ and SPACER) than using the traditional CEST index (MTRasym). The proposed method allows the chemical exchange weight to be better emphasized in the transient-state CEST signal, which is beneficial, in practice, for quantifying the CEST signal. Magn Reson Med 78:1711-1723, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  7. Quantitative chemical exchange saturation transfer (qCEST) MRI - omega plot analysis of RF-spillover-corrected inverse CEST ratio asymmetry for simultaneous determination of labile proton ratio and exchange rate.

    Science.gov (United States)

    Wu, Renhua; Xiao, Gang; Zhou, Iris Yuwen; Ran, Chongzhao; Sun, Phillip Zhe

    2015-03-01

    Chemical exchange saturation transfer (CEST) MRI is sensitive to labile proton concentration and exchange rate, thus allowing measurement of dilute CEST agents and microenvironmental properties. However, CEST measurement depends not only on the CEST agent properties but also on the experimental conditions. Quantitative CEST (qCEST) analysis has been proposed to address the limitations of the commonly used simplistic CEST-weighted calculation. Recent research has shown that the concomitant direct RF saturation (spillover) effect can be corrected using an inverse CEST ratio calculation. We postulated that a simplified qCEST analysis is feasible with omega plot analysis of the inverse CEST asymmetry calculation. Specifically, simulations showed that the numerically derived labile proton ratio and exchange rate were in good agreement with input values. In addition, the qCEST analysis was confirmed experimentally in a phantom with concurrent variation in CEST agent concentration and pH. We also demonstrated that the derived labile proton ratio increased linearly with creatine concentration, showing that the proposed qCEST analysis can simultaneously determine labile proton ratio and exchange rate in a relatively complex in vitro CEST system. Copyright © 2015 John Wiley & Sons, Ltd.

  8. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into high-dimensional neuroinformatic representations containing on the order of 10³-10⁴ discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high-throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high-throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  9. Landsliding in partially saturated materials

    Science.gov (United States)

    Godt, J.W.; Baum, R.L.; Lu, N.

    2009-01-01

    Rainfall-induced landslides are pervasive in hillslope environments around the world and among the most costly and deadly natural hazards. However, capturing their occurrence with scientific instrumentation in a natural setting is extremely rare. The prevailing thinking on landslide initiation, particularly for those landslides that occur under intense precipitation, is that the failure surface is saturated and has positive pore-water pressures acting on it. Most analytic methods used for landslide hazard assessment are based on the above perception and assume that the failure surface is located beneath a water table. By monitoring the pore water and soil suction response to rainfall, we observed shallow landslide occurrence under partially saturated conditions for the first time in a natural setting. We show that the partially saturated shallow landslide at this site is predictable using measured soil suction and water content and a novel unified effective stress concept for partially saturated earth materials. Copyright 2009 by the American Geophysical Union.
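
    The effective stress framework invoked above generalizes the classical Bishop form for partially saturated soil, sketched below; this is shown only for orientation and is not the paper's exact formulation.

    ```latex
    % Bishop-type effective stress for partially saturated soil (schematic, not
    % the paper's exact unified formulation). u_a and u_w are pore-air and
    % pore-water pressures; \chi is an effective-stress parameter, often taken
    % as the effective degree of saturation S_e, so the second term acts as a
    % "suction stress".
    \[
      \sigma' = (\sigma - u_a) + \chi\,(u_a - u_w), \qquad \chi \approx S_e
    \]
    ```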

  10. RNAi High-Throughput Screening of Single- and Multi-Cell-Type Tumor Spheroids: A Comprehensive Analysis in Two and Three Dimensions.

    Science.gov (United States)

    Fu, Jiaqi; Fernandez, Daniel; Ferrer, Marc; Titus, Steven A; Buehler, Eugen; Lal-Nag, Madhu A

    2017-06-01

    The widespread use of two-dimensional (2D) monolayer cultures for high-throughput screening (HTS) to identify targets in drug discovery has led to attrition in the number of drug targets being validated. Solid tumors are complex, aberrantly growing microenvironments that harness structural components from stroma, nutrients fed through vasculature, and immunosuppressive factors. Increasing evidence of stromally-derived signaling broadens the complexity of our understanding of the tumor microenvironment while stressing the importance of developing better models that reflect these interactions. Three-dimensional (3D) models may be more sensitive to certain gene-silencing events than 2D models because of their components of hypoxia, nutrient gradients, and increased dependence on cell-cell interactions and therefore are more representative of in vivo interactions. Colorectal cancer (CRC) and breast cancer (BC) models composed of epithelial cells only, deemed single-cell-type tumor spheroids (SCTS) and multi-cell-type tumor spheroids (MCTS), containing fibroblasts were developed for RNAi HTS in 384-well microplates with flat-bottom wells for 2D screening and round-bottom, ultra-low-attachment wells for 3D screening. We describe the development of a high-throughput assay platform that can assess physiologically relevant phenotypic differences between screening 2D versus 3D SCTS, 3D SCTS, and MCTS in the context of different cancer subtypes. This assay platform represents a paradigm shift in how we approach drug discovery that can reduce the attrition rate of drugs that enter the clinic.

  11. Reduced dimensionality (3,2)D NMR experiments and their automated analysis: implications to high-throughput structural studies on proteins.

    Science.gov (United States)

    Reddy, Jithender G; Kumar, Dinesh; Hosur, Ramakrishna V

    2015-02-01

    Protein NMR spectroscopy has expanded dramatically over the last decade into a powerful tool for the study of their structure, dynamics, and interactions. The primary requirement for all such investigations is sequence-specific resonance assignment. The demand now is to obtain this information as rapidly as possible and in all types of protein systems, stable/unstable, soluble/insoluble, small/big, structured/unstructured, and so on. In this context, we introduce here two reduced dimensionality experiments – (3,2)D-hNCOcanH and (3,2)D-hNcoCAnH – which enhance the previously described 2D NMR-based assignment methods quite significantly. Both the experiments can be recorded in just about 2-3 h each and hence would be of immense value for high-throughput structural proteomics and drug discovery research. The applicability of the method has been demonstrated using alpha-helical bovine apo calbindin-D9k P43M mutant (75 aa) protein. Automated assignment of this data using AUTOBA has been presented, which enhances the utility of these experiments. The backbone resonance assignments so derived are utilized to estimate secondary structures and the backbone fold using Web-based algorithms. Taken together, we believe that the method and the protocol proposed here can be used for routine high-throughput structural studies of proteins. Copyright © 2014 John Wiley & Sons, Ltd.

  12. Δ isobars and nuclear saturation

    Science.gov (United States)

    Ekström, A.; Hagen, G.; Morris, T. D.; Papenbrock, T.; Schwartz, P. D.

    2018-02-01

    We construct a nuclear interaction in chiral effective field theory with explicit inclusion of the Δ-isobar Δ(1232) degree of freedom at all orders up to next-to-next-to-leading order (NNLO). We use pion-nucleon (πN) low-energy constants (LECs) from a Roy-Steiner analysis of πN scattering data, optimize the LECs in the contact potentials up to NNLO to reproduce low-energy nucleon-nucleon scattering phase shifts, and constrain the three-nucleon interaction at NNLO to reproduce the binding energy and point-proton radius of 4He. For heavier nuclei we use the coupled-cluster method to compute binding energies, radii, and neutron skins. We find that radii and binding energies are much improved for interactions with explicit inclusion of Δ(1232), while Δ-less interactions produce nuclei that are not bound with respect to breakup into α particles. The saturation of nuclear matter is significantly improved, and its symmetry energy is consistent with empirical estimates.

  13. Nitrogen saturation in stream ecosystems.

    Science.gov (United States)

    Earl, Stevan R; Valett, H Maurice; Webster, Jackson R

    2006-12-01

    The concept of nitrogen (N) saturation has organized the assessment of N loading in terrestrial ecosystems. Here we extend the concept to lotic ecosystems by coupling Michaelis-Menten kinetics and nutrient spiraling. We propose a series of saturation response types, which may be used to characterize the proximity of streams to N saturation. We conducted a series of short-term N releases using a tracer (15NO3-N) to measure uptake. Experiments were conducted in streams spanning a gradient of background N concentration. Uptake increased in four of six streams as NO3-N was incrementally elevated, indicating that these streams were not saturated. Uptake generally corresponded to Michaelis-Menten kinetics but deviated from the model in two streams where some other growth-critical factor may have been limiting. Proximity to saturation was correlated to background N concentration but was better predicted by the ratio of dissolved inorganic N (DIN) to soluble reactive phosphorus (SRP), suggesting phosphorus limitation in several high-N streams. Uptake velocity, a reflection of uptake efficiency, declined nonlinearly with increasing N amendment in all streams. At the same time, uptake velocity was highest in the low-N streams. Our conceptual model of N transport, uptake, and uptake efficiency suggests that, while streams may be active sites of N uptake on the landscape, N saturation contributes to nonlinear changes in stream N dynamics that correspond to decreased uptake efficiency.
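
    The coupling of Michaelis-Menten kinetics with nutrient spiraling described above can be summarized by the standard relations below; the notation is generic rather than the paper's, with U the areal uptake, C the nitrate concentration, v_f the uptake velocity (uptake efficiency), Q the discharge and w the stream width.

    ```latex
    % Standard Michaelis-Menten uptake kinetics and nutrient-spiraling metrics
    % (generic forms shown for orientation; parameter values are not the study's).
    \[
      U = \frac{U_{\max}\, C}{K_m + C}, \qquad
      v_f = \frac{U}{C}, \qquad
      S_w = \frac{Q}{w\, v_f}
    \]
    ```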

  14. Intake of saturated and trans unsaturated fatty acids and risk of all cause mortality, cardiovascular disease, and type 2 diabetes: systematic review and meta-analysis of observational studies.

    Science.gov (United States)

    de Souza, Russell J; Mente, Andrew; Maroleanu, Adriana; Cozma, Adrian I; Ha, Vanessa; Kishibe, Teruko; Uleryk, Elizabeth; Budylowski, Patrick; Schünemann, Holger; Beyene, Joseph; Anand, Sonia S

    2015-08-11

    To systematically review associations between intake of saturated fat and trans unsaturated fat and all cause mortality, cardiovascular disease (CVD) and associated mortality, coronary heart disease (CHD) and associated mortality, ischemic stroke, and type 2 diabetes. Systematic review and meta-analysis. Medline, Embase, Cochrane Central Registry of Controlled Trials, Evidence-Based Medicine Reviews, and CINAHL from inception to 1 May 2015, supplemented by bibliographies of retrieved articles and previous reviews. Observational studies reporting associations of saturated fat and/or trans unsaturated fat (total, industrially manufactured, or from ruminant animals) with all cause mortality, CHD/CVD mortality, total CHD, ischemic stroke, or type 2 diabetes. Two reviewers independently extracted data and assessed study risks of bias. Multivariable relative risks were pooled. Heterogeneity was assessed and quantified. Potential publication bias was assessed and subgroup analyses were undertaken. The GRADE approach was used to evaluate quality of evidence and certainty of conclusions. For saturated fat, three to 12 prospective cohort studies for each association were pooled (five to 17 comparisons with 90,501-339,090 participants). Saturated fat intake was not associated with all cause mortality (relative risk 0.99, 95% confidence interval 0.91 to 1.09), CVD mortality (0.97, 0.84 to 1.12), total CHD (1.06, 0.95 to 1.17), ischemic stroke (1.02, 0.90 to 1.15), or type 2 diabetes (0.95, 0.88 to 1.03). There was no convincing lack of association between saturated fat and CHD mortality (1.15, 0.97 to 1.36; P=0.10). For trans fats, one to six prospective cohort studies for each association were pooled (two to seven comparisons with 12,942-230,135 participants). Total trans fat intake was associated with all cause mortality (1.34, 1.16 to 1.56), CHD mortality (1.28, 1.09 to 1.50), and total CHD (1.21, 1.10 to 1.33) but not ischemic stroke (1.07, 0.88 to 1.28) or type 2 diabetes
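    The pooling step in a meta-analysis of this kind works on the log scale with inverse-variance weights. The Python sketch below shows the fixed-effect core of that calculation for a handful of hypothetical study-level relative risks; the review itself pooled multivariable-adjusted estimates and assessed heterogeneity, which typically calls for a random-effects model with an additional between-study variance term not included here.

        import math

        # Hypothetical per-study relative risks with 95% CIs (RR, lower, upper).
        studies = [(1.10, 0.90, 1.35), (0.95, 0.80, 1.13), (1.05, 0.88, 1.25)]

        weights, log_rrs = [], []
        for rr, lo, hi in studies:
            log_rr = math.log(rr)
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
            weights.append(1.0 / se**2)                      # inverse-variance weight
            log_rrs.append(log_rr)

        pooled_log = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))
        print("pooled RR %.2f (%.2f to %.2f)" % (
            math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se)))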

  15. Development of finite element code for the analysis of coupled thermo-hydro-mechanical behaviors of a saturated-unsaturated medium

    International Nuclear Information System (INIS)

    Ohnishi, Y.; Shibata, H.; Kobayashi, A.

    1987-01-01

    A model is presented which describes fully coupled thermo-hydro-mechanical behavior of a porous geologic medium. The mathematical formulation for the model utilizes the Biot theory for the consolidation and the energy balance equation. If the medium is in the condition of saturated-unsaturated flow, then the free surfaces are taken into consideration in the model. The model, incorporated in a finite element numerical procedure, was implemented in a two-dimensional computer code. The code was developed under the assumptions that the medium is poro-elastic and in the plane strain condition; that water in the ground does not change its phase; and that heat is transferred by conductive and convective flow. Analytical solutions pertaining to consolidation theory for soils and rocks, thermoelasticity for solids and hydrothermal convection theory provided verification of stress and fluid flow couplings, respectively, in the coupled model. Several types of problems are analyzed

  16. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    Science.gov (United States)

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-05

    Data-dependent acquisition (DDA) and data-independent acquisition strategies (DIA) have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well-published, where DDA is typically applied for deep discovery and DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  17. Qgui: A high-throughput interface for automated setup and analysis of free energy calculations and empirical valence bond simulations in biological systems.

    Science.gov (United States)

    Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav

    2015-07-01

    Structural information and activity data have increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present the calculation of high-precision Arrhenius plots with Qgui, obtained by running a large number of EVB simulations, to extract the thermodynamic activation enthalpy and entropy. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. High-throughput continuous cryopump

    International Nuclear Information System (INIS)

    Foster, C.A.

    1986-01-01

    A cryopump with a unique method of regeneration which allows continuous operation at high throughput has been constructed and tested. Deuterium was pumped continuously at a throughput of 30 Torr·L/s at a speed of 2000 L/s and a compression ratio of 200. Argon was pumped at a throughput of 60 Torr·L/s at a speed of 1275 L/s. To produce continuous operation of the pump, a method of regeneration that does not thermally cycle the pump is employed. A small chamber (the "snail") passes over the pumping surface and removes the frost from it either by mechanical action with a scraper or by local heating. The material removed is topologically in a secondary vacuum system with low conductance into the primary vacuum; thus, the exhaust can be pumped at pressures up to an effective compression ratio determined by the ratio of the pumping speed to the leakage conductance of the snail. The pump, which is all-metal-sealed and dry and which regenerates every 60 s, would be an ideal system for pumping tritium. Potential fusion applications are for pump limiters, for repeating pneumatic pellet injection lines, and for the centrifuge pellet injector spin tank, all of which will require pumping tritium at high throughput. Industrial applications requiring ultraclean pumping of corrosive gases at high throughput, such as the reactive ion etch semiconductor process, may also be feasible
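    The quoted figures are tied together by the steady-state vacuum relation Q = S·P between throughput, pumping speed, and inlet pressure. The short Python check below applies it to the numbers in the abstract, reading the compression ratio in the usual sense of exhaust-to-inlet pressure (an interpretation assumed here, not stated explicitly in the record).

        # Throughput Q (Torr*L/s), pumping speed S (L/s) and inlet pressure P (Torr)
        # are related by Q = S * P at steady state.
        Q_d2, S_d2, compression = 30.0, 2000.0, 200.0   # deuterium figures from the abstract

        P_inlet = Q_d2 / S_d2                 # ~1.5e-2 Torr at the pump inlet
        P_exhaust = P_inlet * compression     # ~3 Torr effective exhaust pressure

        Q_ar, S_ar = 60.0, 1275.0             # argon figures from the abstract
        print(P_inlet, P_exhaust, Q_ar / S_ar)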

  19. A comprehensive analysis of in vitro and in vivo genetic fitness of Pseudomonas aeruginosa using high-throughput sequencing of transposon libraries.

    Directory of Open Access Journals (Sweden)

    David Skurnik

    Full Text Available High-throughput sequencing of transposon (Tn libraries created within entire genomes identifies and quantifies the contribution of individual genes and operons to the fitness of organisms in different environments. We used insertion-sequencing (INSeq to analyze the contribution to fitness of all non-essential genes in the chromosome of Pseudomonas aeruginosa strain PA14 based on a library of ∼300,000 individual Tn insertions. In vitro growth in LB provided a baseline for comparison with the survival of the Tn insertion strains following 6 days of colonization of the murine gastrointestinal tract as well as a comparison with Tn-inserts subsequently able to systemically disseminate to the spleen following induction of neutropenia. Sequencing was performed following DNA extraction from the recovered bacteria, digestion with the MmeI restriction enzyme that hydrolyzes DNA 16 bp away from the end of the Tn insert, and fractionation into oligonucleotides of 1,200-1,500 bp that were prepared for high-throughput sequencing. Changes in frequency of Tn inserts into the P. aeruginosa genome were used to quantify in vivo fitness resulting from loss of a gene. 636 genes had <10 sequencing reads in LB, thus defined as unable to grow in this medium. During in vivo infection there were major losses of strains with Tn inserts in almost all known virulence factors, as well as respiration, energy utilization, ion pumps, nutritional genes and prophages. Many new candidates for virulence factors were also identified. There were consistent changes in the recovery of Tn inserts in genes within most operons and Tn insertions into some genes enhanced in vivo fitness. Strikingly, 90% of the non-essential genes were required for in vivo survival following systemic dissemination during neutropenia. These experiments resulted in the identification of the P. aeruginosa strain PA14 genes necessary for optimal survival in the mucosal and systemic environments of a mammalian

  20. High throughput analysis of red wine and grape phenolics-adaptation and validation of methyl cellulose precipitable tannin assay and modified Somers color assay to a rapid 96 well plate format.

    Science.gov (United States)

    Mercurio, Meagan D; Dambergs, Robert G; Herderich, Markus J; Smith, Paul A

    2007-06-13

    The methyl cellulose precipitable (MCP) tannin assay and a modified version of the Somers and Evans color assay were adapted to high-throughput (HTP) analysis. To improve efficiency of the MCP tannin assay, a miniaturized 1 mL format and a HTP format using 96 well plates were developed. The Somers color assay was modified to allow the standardization of pH and ethanol concentrations of wine samples in a simple one-step dilution with a buffer solution, thus removing inconsistencies between wine matrices prior to analysis and allowing for its adaptation to a HTP format. Validation studies showed that all new formats were efficient, and results were reproducible and analogous to the original formats.

  1. Saturated Zone In-Situ Testing

    International Nuclear Information System (INIS)

    Reimus, P. W.; Umari, M. J.

    2003-01-01

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that have been conducted to test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain. The test interpretations provide estimates of flow and transport parameters that are used in the development of parameter distributions for Total System Performance Assessment (TSPA) calculations. These parameter distributions are documented in the revisions to the SZ flow model report (BSC 2003 [ 162649]), the SZ transport model report (BSC 2003 [ 162419]), the SZ colloid transport report (BSC 2003 [162729]), and the SZ transport model abstraction report (BSC 2003 [1648701]). Specifically, this scientific analysis report provides the following information that contributes to the assessment of the capability of the SZ to serve as a barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvium Testing Complex (ATC), which is located at the southwestern corner of the Nevada Test Site (NTS). The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, and

  2. Saturated Zone In-Situ Testing

    Energy Technology Data Exchange (ETDEWEB)

    P. W. Reimus; M. J. Umari

    2003-12-23

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that have been conducted to test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain. The test interpretations provide estimates of flow and transport parameters that are used in the development of parameter distributions for Total System Performance Assessment (TSPA) calculations. These parameter distributions are documented in the revisions to the SZ flow model report (BSC 2003 [ 162649]), the SZ transport model report (BSC 2003 [ 162419]), the SZ colloid transport report (BSC 2003 [162729]), and the SZ transport model abstraction report (BSC 2003 [1648701]). Specifically, this scientific analysis report provides the following information that contributes to the assessment of the capability of the SZ to serve as a barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvium Testing Complex (ATC), which is located at the southwestern corner of the Nevada Test Site (NTS). The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, and

  3. Differential effects of saturated fatty acids on the risk of metabolic syndrome: a matched case-control and meta-analysis study.

    Science.gov (United States)

    Yang, Wei-Sin; Chen, Pei-Chun; Hsu, Hsiu-Ching; Su, Ta-Chen; Lin, Hung-Ju; Chen, Ming-Fong; Lee, Yuan-Teh; Chien, Kuo-Liong

    2018-06-01

    We investigated the association between plasma saturated fatty acids (SFAs) and the risk of metabolic syndrome among ethnic Chinese adults in Taiwan who attended a health check-up center. A case-control study was conducted with 1000 cases of metabolic syndrome and 1:1 matched control participants (mean age, 54.9 ± 10.7 y; 36% females). Metabolic syndrome was defined according to the criteria of the International Diabetes Federation. Gas chromatography was used to measure the distribution of fatty acids in plasma (% of total fatty acids). Even-chain SFAs, including 14:0, 16:0, and 18:0, were associated with metabolic syndrome; the adjusted odds ratio [OR] and 95% confidence interval [CI] per standard deviation [SD] difference was 3.32 [1.98-5.59]. In contrast, very-long-chain SFAs, including 20:0, 21:0, 22:0, 23:0, and 24:0, were inversely associated with metabolic syndrome; the adjusted OR [95% CI] per SD difference was 0.67 [0.58-0.78]. The area under the receiver operating characteristic curve increased from 0.814 in the basic model to 0.815 (p = 0.54, compared with the basic model), 0.818 (p metabolic syndrome, implying that SFAs are not homogeneous in their effects. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Analysis of the saturated hydrocarbon in coal, carbonaceous mudstone and oils from the lower Jurassic coal measures in the Turpan Basin by GC/MS

    International Nuclear Information System (INIS)

    Fang Xuan; Meng Qianxiang; Sun Minzhuo; Du Li; Ding Wanren

    2005-01-01

    The saturated hydrocarbons of coal, carbonaceous mudstone and oils from the Lower Jurassic coal measures in the Turpan Basin were studied, and biomarker characteristics and coal thermal maturity were analyzed to draw the following conclusions. There are many similar biomarker characteristics between oil from the Middle-Lower Jurassic of the Turpan Basin and coal and carbonaceous mudstone in the same strata. They all contain the specific markers r-lupane, I-norbietane and C24-tetracyclic terpane, and a high content of C29-steranes. These characteristics suggest a similar source of organic matter, derived mainly from higher plants. Meanwhile, the biomarkers often used to indicate depositional environments show a high Pr/Ph ratio, little or no gammacerane and a high abundance of dibenzofurans; such biomarker distributions are indicative of a suboxic, freshwater environment. Although the coal and carbonaceous mudstone remain at low thermal maturity (Ro = 0.47-0.53), C29-ββ/(αα+ββ) sterane ratios of 0.294-0.489 and benzohopane are detected. Because these features are related to bacterial activity, bacterial degradation of organic matter may play an important role in coal-derived oil. (authors)

  5. Cerebral time domain-NIRS: reproducibility analysis, optical properties, hemoglobin species and tissue oxygen saturation in a cohort of adult subjects.

    Science.gov (United States)

    Giacalone, Giacomo; Zanoletti, Marta; Contini, Davide; Re, Rebecca; Spinelli, Lorenzo; Roveri, Luisa; Torricelli, Alessandro

    2017-11-01

    The reproducibility of cerebral time-domain near-infrared spectroscopy (TD-NIRS) has not been investigated so far. Besides, reference intervals for cerebral optical properties, for the absolute concentrations of deoxygenated hemoglobin (HbR), oxygenated hemoglobin (HbO) and total hemoglobin (HbT), and for tissue oxygen saturation (StO2), as well as their variability, have not been reported. We have addressed these issues in a sample of 88 adult healthy subjects. TD-NIRS measurements at 690, 785 and 830 nm were fitted with the diffusion model for semi-infinite homogeneous media. Reproducibility, assessed on 3 measurements taken at 5-minute intervals, ranges from 1.8 to 6.9% for each of the hemoglobin species. The mean ± SD global values of HbR, HbO, HbT and StO2 are 24 ± 7 μM, 33.3 ± 9.5 μM, 57.4 ± 15.8 μM and 58 ± 4.2%, respectively. StO2 displays the narrowest range of variability across brain regions.

  6. Meta-analysis of field-saturated hydraulic conductivity recovery following wildland fire: Applications for hydrologic model parameterization and resilience assessment

    Science.gov (United States)

    Ebel, Brian A.; Martin, Deborah

    2017-01-01

    Hydrologic recovery after wildfire is critical for restoring the ecosystem services of protecting human lives and infrastructure from hazards and delivering a water supply of sufficient quality and quantity. Recovery of soil-hydraulic properties, such as field-saturated hydraulic conductivity (Kfs), is a key factor for assessing the duration of watershed-scale flash flood and debris flow risks after wildfire. Despite the crucial role of Kfs in parameterizing numerical hydrologic models to predict the magnitude of postwildfire run-off and erosion, existing quantitative relations to predict Kfs recovery with time since wildfire are lacking. Here, we conduct meta-analyses of 5 datasets from the literature that measure or estimate Kfs with time since wildfire over durations longer than 3 years. The meta-analyses focus on fitting 2 quantitative relations (linear and non-linear logistic) to explain trends in Kfs temporal recovery. The 2 relations adequately described temporal recovery except at 1 site where macropore flow dominated infiltration and Kfs recovery. This work also suggests that Kfs can have low hydrologic resistance (large postfire changes), and moderate to high hydrologic stability (recovery time relative to disturbance recurrence interval) and resilience (recovery of hydrologic function and provision of ecosystem services). Future Kfs relations could more explicitly incorporate processes such as soil-water repellency, ground cover and soil structure regeneration, macropore recovery, and vegetation regrowth.
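    As an illustration of the non-linear logistic relation mentioned above, the Python sketch below fits a generic logistic recovery curve for Kfs as a function of time since wildfire using scipy. The functional form, parameter names, and data are placeholders chosen for clarity; the actual model parameterization and datasets are those described in the paper.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic_recovery(t, k_pre, delta, rate, t_half):
            """Kfs recovery toward the prefire value k_pre; delta is the initial deficit."""
            return k_pre - delta / (1.0 + np.exp(rate * (t - t_half)))

        # Hypothetical field-saturated hydraulic conductivity (mm/h) vs. years since fire.
        years = np.array([0.2, 0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
        kfs = np.array([4.0, 5.0, 8.0, 15.0, 24.0, 29.0, 31.0])

        params, _ = curve_fit(logistic_recovery, years, kfs, p0=[32.0, 28.0, 1.5, 2.5])
        print(dict(zip(["k_pre", "delta", "rate", "t_half"], params)))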

  7. Proton magnetic resonance studies of 5,6-saturated thymidine derivatives produced by ionizing radiation. Conformational analysis of 6-hydroxylated diastereoisomers

    Energy Technology Data Exchange (ETDEWEB)

    Cadet, J; Ducolomb, R [CEA Centre d' Etudes Nucleaires de Grenoble, 38 (France). Lab. de Radiobiologie; Hruska, F E [Manitoba Univ., Winnipeg (Canada). Dept. of Chemistry

    1979-06-20

    The conformational properties of ten 6-hydroxylated dihydrothymidine derivatives, including the various diastereoisomers of 5,6-dihydroxy-5,6-dihydrothymidine, 6-hydroxy-5,6-dihydrothymidine and 5-bromo-6-hydroxy-5,6-dihydrothymidine, have been studied by 250 MHz proton magnetic resonance in aqueous solution. A close correlation has been established between the carbon-6 configuration and the osidic conformation. The increase in the amplitude of the puckering within the furanose ring, compared to that of thymidine or 2'-deoxyuridine, is more pronounced for the levorotatory (6S) nucleosides than for the dextrorotatory (6R) diastereoisomers. The importance of the 2'-endo conformer population decreases in the following order: (-) > (+) > thymidine. The absence of destabilizing effects on the g+ rotameric population about the C(4')-C(5') bond denotes the lack of any interaction between the exocyclic hydroxymethyl group and either the 6-hydroxyl function or the 2-keto group. The 5,6-saturated nucleosides adopt a preferential anti conformation. The comparison has been extended to syn nucleosides, which show opposite trends in the sugar conformation and the g+ distribution.

  8. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  9. Effective stress principle for partially saturated media

    International Nuclear Information System (INIS)

    McTigue, D.F.; Wilson, R.K.; Nunziato, J.W.

    1984-04-01

    In support of the Nevada Nuclear Waste Storage Investigation (NNWSI) Project, we have undertaken a fundamental study of water migration in partially saturated media. One aspect of that study, on which we report here, has been to use the continuum theory of mixtures to extend the classical notion of effective stress to partially saturated media. Our analysis recovers previously proposed phenomenological representations for the effective stress in terms of the capillary pressure. The theory is illustrated by specializing to the case of linear poroelasticity, for which we calculate the deformation due to the fluid pressure in a static capillary fringe. We then examine the transient consolidation associated with liquid flow induced by an applied surface load. Settlement accompanies this flow as the liquid is redistributed by a nonlinear diffusion process. For material properties characteristic of tuff from the Nevada Test Site, these effects are found to be vanishingly small. 14 references, 7 figures, 1 table
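    A common phenomenological representation of the kind referred to above is Bishop's effective stress relation for partially saturated media, quoted here for orientation (the specific representation recovered by the mixture-theory analysis may differ in detail):

        \sigma'_{ij} \;=\; \sigma_{ij} \;-\; \bigl[\chi\,u_w + (1-\chi)\,u_a\bigr]\,\delta_{ij}, \qquad 0 \le \chi \le 1,

    where u_w and u_a are the pore-water and pore-air pressures, u_a - u_w is the capillary pressure, and χ varies with the degree of saturation; χ = 1 recovers the classical saturated form σ'_{ij} = σ_{ij} - u_w δ_{ij}.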

  10. The danish tax on saturated fat

    DEFF Research Database (Denmark)

    Jensen, Jørgen Dejgård; Smed, Sinne

    Denmark introduced a new tax on saturated fat in food products with effect from October 2011. The objective of this paper is to make an effect assessment of this tax for some of the product categories most significantly affected by the new tax, namely fats such as butter, butter-blends, margarine...... on saturated fat in food products has had some effects on the market for the considered products, in that the level of consumption of fats dropped by 10 – 20%. Furthermore, the analysis points at shifts in demand from high-price supermarkets towards low-price discount stores – a shift that seems to have been...... utilized by discount chains to raise the prices of butter and margarine by more than the pure tax increase. Due to the relatively short data period with the tax being active, interpretation of these findings from a long-run perspective should be done with considerable care. It is thus recommended to repeat...

  11. Noise and saturation properties of semiconductor quantum dot optical amplifiers

    DEFF Research Database (Denmark)

    Berg, Tommy Winther; Mørk, Jesper

    2002-01-01

    We present a detailed theoretical analysis of quantum dot optical amplifiers. Due to the presence of a reservoir of wetting layer states, the saturation and noise properties differ markedly from bulk or QW amplifiers and may be significantly improved.
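    For orientation, the textbook traveling-wave description of gain saturation in an optical amplifier reads as follows; the quantum dot model analyzed in the paper modifies this picture through the wetting-layer reservoir, so this is only the generic baseline:

        \frac{dP}{dz} \;=\; \frac{g_0\,P}{1 + P/P_{\mathrm{sat}}}
        \quad\Longrightarrow\quad
        G \;\equiv\; \frac{P_{\mathrm{out}}}{P_{\mathrm{in}}} \;=\; G_0\,\exp\!\left(-\,\frac{P_{\mathrm{out}} - P_{\mathrm{in}}}{P_{\mathrm{sat}}}\right),

    with unsaturated gain G_0 = e^{g_0 L} and saturation power P_sat; the gain drops from G_0 as the output power approaches P_sat.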

  12. Nonlinear Gain Saturation in Active Slow Light Photonic Crystal Waveguides

    DEFF Research Database (Denmark)

    Chen, Yaohui; Mørk, Jesper

    2013-01-01

    We present a quantitative three-dimensional analysis of slow-light enhanced traveling wave amplification in an active semiconductor photonic crystal waveguide. The impact of slow-light propagation on the nonlinear gain saturation of the device is investigated.

  13. Rapid determination of oxygen saturation and vascularity for cancer detection.

    Directory of Open Access Journals (Sweden)

    Fangyao Hu

    Full Text Available A rapid heuristic ratiometric analysis for estimating tissue hemoglobin concentration and oxygen saturation from measured tissue diffuse reflectance spectra is presented. The analysis was validated in tissue-mimicking phantoms and applied to clinical measurements in head and neck, cervical and breast tissues. The analysis works in two steps. First, a linear equation that translates the ratio of the diffuse reflectance at 584 nm and 545 nm into an estimate of the tissue hemoglobin concentration was developed using a Monte Carlo-based lookup table. This equation is independent of tissue scattering and oxygen saturation. Second, the oxygen saturation was estimated using non-linear logistic equations that translate the ratio of the diffuse reflectance at 539 nm to 545 nm into the tissue oxygen saturation. Correlation coefficients of 0.89 (0.86), 0.77 (0.71) and 0.69 (0.43) were obtained for the tissue hemoglobin concentration (oxygen saturation) values extracted using the full spectral Monte Carlo and the ratiometric analysis, for clinical measurements in head and neck, breast and cervical tissues, respectively. The ratiometric analysis was more than 4000 times faster than the inverse Monte Carlo analysis for estimating tissue hemoglobin concentration and oxygen saturation in simulated phantom experiments. In addition, the discriminatory power of the two analyses was similar. These results show the potential of such empirical tools to rapidly estimate tissue hemoglobin in real-time spectral imaging applications.
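    The two-step mapping described above can be sketched in a few lines of Python. The coefficients below are placeholders; in the actual method they are derived from the Monte Carlo-based lookup table, so this only illustrates the ratiometric structure (linear in the 584/545 nm ratio for hemoglobin, logistic in the 539/545 nm ratio for saturation).

        import numpy as np

        # Hypothetical calibration coefficients; the real ones come from the
        # Monte Carlo lookup table described in the abstract.
        A_THB, B_THB = 60.0, -50.0           # linear map for total hemoglobin (uM)
        C0, C1 = 40.0, 0.985                 # logistic map for oxygen saturation

        def hemoglobin_from_ratio(r_584_545):
            """Step 1: tissue hemoglobin concentration from the 584/545 nm ratio."""
            return A_THB * r_584_545 + B_THB

        def saturation_from_ratio(r_539_545):
            """Step 2: oxygen saturation (0-1) from the 539/545 nm ratio."""
            return 1.0 / (1.0 + np.exp(-C0 * (r_539_545 - C1)))

        reflectance = {539: 0.118, 545: 0.121, 584: 0.160}   # hypothetical spectrum values
        thb = hemoglobin_from_ratio(reflectance[584] / reflectance[545])
        sto2 = saturation_from_ratio(reflectance[539] / reflectance[545])
        print(thb, sto2)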

  14. Integrated analysis of RNA-binding protein complexes using in vitro selection and high-throughput sequencing and sequence specificity landscapes (SEQRS).

    Science.gov (United States)

    Lou, Tzu-Fang; Weidmann, Chase A; Killingsworth, Jordan; Tanaka Hall, Traci M; Goldstrohm, Aaron C; Campbell, Zachary T

    2017-04-15

    RNA-binding proteins (RBPs) collaborate to control virtually every aspect of RNA function. Tremendous progress has been made in the area of global assessment of RBP specificity using next-generation sequencing approaches both in vivo and in vitro. Understanding how protein-protein interactions enable precise combinatorial regulation of RNA remains a significant problem. Addressing this challenge requires tools that can quantitatively determine the specificities of both individual proteins and multimeric complexes in an unbiased and comprehensive way. One approach utilizes in vitro selection, high-throughput sequencing, and sequence-specificity landscapes (SEQRS). We outline a SEQRS experiment focused on obtaining the specificity of a multi-protein complex between Drosophila RBPs Pumilio (Pum) and Nanos (Nos). We discuss the necessary controls in this type of experiment and examine how the resulting data can be complemented with structural and cell-based reporter assays. Additionally, SEQRS data can be integrated with functional genomics data to uncover biological function. Finally, we propose extensions of the technique that will enhance our understanding of multi-protein regulatory complexes assembled onto RNA. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Multicapillary SDS-gel electrophoresis for the analysis of fluorescently labeled mAb preparations: a high throughput quality control process for the production of QuantiPlasma and PlasmaScan mAb libraries.

    Science.gov (United States)

    Székely, Andrea; Szekrényes, Akos; Kerékgyártó, Márta; Balogh, Attila; Kádas, János; Lázár, József; Guttman, András; Kurucz, István; Takács, László

    2014-08-01

    Molecular heterogeneity of mAb preparations is the result of various co- and post-translational modifications and of contaminants related to the production process. Changes in molecular composition result in alterations of functional performance; therefore, quality control and validation of therapeutic or diagnostic protein products is essential. A special case is the consistent production of mAb libraries (QuantiPlasma™ and PlasmaScan™) for proteome profiling, the quality control of which represents a challenge because of the high number of mAbs (>1000). Here, we devise a generally applicable multicapillary SDS-gel electrophoresis process for the analysis of fluorescently labeled mAb preparations for the high-throughput quality control of mAbs of the QuantiPlasma™ and PlasmaScan™ libraries. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Identification and Analysis of Red Sea Mangrove (Avicennia marina) microRNAs by High-Throughput Sequencing and Their Association with Stress Responses

    KAUST Repository

    Khraiwesh, Basel; Pugalenthi, Ganesan; Fedoroff, Nina V.

    2013-01-01

    Although RNA silencing has been studied primarily in model plants, advances in high-throughput sequencing technologies have enabled profiling of the small RNA components of many more plant species, providing insights into the ubiquity and conservatism of some miRNA-based regulatory mechanisms. Small RNAs of 20 to 24 nucleotides (nt) are important regulators of gene transcript levels by either transcriptional or by posttranscriptional gene silencing, contributing to genome maintenance and controlling a variety of developmental and physiological processes. Here, we used deep sequencing and molecular methods to create an inventory of the small RNAs in the mangrove species, Avicennia marina. We identified 26 novel mangrove miRNAs and 193 conserved miRNAs belonging to 36 families. We determined that 2 of the novel miRNAs were produced from known miRNA precursors and 4 were likely to be species-specific by the criterion that we found no homologs in other plant species. We used qRT-PCR to analyze the expression of miRNAs and their target genes in different tissue sets and some demonstrated tissue-specific expression. Furthermore, we predicted potential targets of these putative miRNAs based on a sequence homology and experimentally validated through endonucleolytic cleavage assays. Our results suggested that expression profiles of miRNAs and their predicted targets could be useful in exploring the significance of the conservation patterns of plants, particularly in response to abiotic stress. Because of their well-developed abilities in this regard, mangroves and other extremophiles are excellent models for such exploration. © 2013 Khraiwesh et al.

  17. Biphasic Study to Characterize Agricultural Biogas Plants by High-Throughput 16S rRNA Gene Amplicon Sequencing and Microscopic Analysis.

    Science.gov (United States)

    Maus, Irena; Kim, Yong Sung; Wibberg, Daniel; Stolze, Yvonne; Off, Sandra; Antonczyk, Sebastian; Pühler, Alfred; Scherer, Paul; Schlüter, Andreas

    2017-02-28

    Process surveillance within agricultural biogas plants (BGPs) was concurrently studied by high-throughput 16S rRNA gene amplicon sequencing and an optimized quantitative microscopic fingerprinting (QMF) technique. In contrast to 16S rRNA gene amplicons, digitalized microscopy is a rapid and cost-effective method that facilitates enumeration and morphological differentiation of the most significant groups of methanogens regarding their shape and characteristic autofluorescent factor 420. Moreover, the fluorescence signal mirrors cell vitality. In this study, four different BGPs were investigated. The results indicated stable process performance in the mesophilic BGPs and in the thermophilic reactor. Bacterial subcommunity characterization revealed significant differences between the four BGPs. Most remarkably, the genera Defluviitoga and Halocella dominated the thermophilic bacterial subcommunity, whereas members of another taxon, Syntrophaceticus , were found to be abundant in the mesophilic BGP. The domain Archaea was dominated by the genus Methanoculleus in all four BGPs, followed by Methanosaeta in BGP1 and BGP3. In contrast, Methanothermobacter members were highly abundant in the thermophilic BGP4. Furthermore, a high consistency between the sequencing approach and the QMF method was shown, especially for the thermophilic BGP. The differences elucidated that using this biphasic approach for mesophilic BGPs provided novel insights regarding disaggregated single cells of Methanosarcina and Methanosaeta species. Both dominated the archaeal subcommunity and replaced coccoid Methanoculleus members belonging to the same group of Methanomicrobiales that have been frequently observed in similar BGPs. This work demonstrates that combining QMF and 16S rRNA gene amplicon sequencing is a complementary strategy to describe archaeal community structures within biogas processes.

  18. Identification and analysis of red sea mangrove (Avicennia marina) microRNAs by high-throughput sequencing and their association with stress responses.

    Directory of Open Access Journals (Sweden)

    Basel Khraiwesh

    Full Text Available Although RNA silencing has been studied primarily in model plants, advances in high-throughput sequencing technologies have enabled profiling of the small RNA components of many more plant species, providing insights into the ubiquity and conservatism of some miRNA-based regulatory mechanisms. Small RNAs of 20 to 24 nucleotides (nt) are important regulators of gene transcript levels by either transcriptional or by posttranscriptional gene silencing, contributing to genome maintenance and controlling a variety of developmental and physiological processes. Here, we used deep sequencing and molecular methods to create an inventory of the small RNAs in the mangrove species, Avicennia marina. We identified 26 novel mangrove miRNAs and 193 conserved miRNAs belonging to 36 families. We determined that 2 of the novel miRNAs were produced from known miRNA precursors and 4 were likely to be species-specific by the criterion that we found no homologs in other plant species. We used qRT-PCR to analyze the expression of miRNAs and their target genes in different tissue sets and some demonstrated tissue-specific expression. Furthermore, we predicted potential targets of these putative miRNAs based on sequence homology and experimentally validated them through endonucleolytic cleavage assays. Our results suggested that expression profiles of miRNAs and their predicted targets could be useful in exploring the significance of the conservation patterns of plants, particularly in response to abiotic stress. Because of their well-developed abilities in this regard, mangroves and other extremophiles are excellent models for such exploration.

  19. Identification and Analysis of Red Sea Mangrove (Avicennia marina) microRNAs by High-Throughput Sequencing and Their Association with Stress Responses

    KAUST Repository

    Khraiwesh, Basel

    2013-04-08

    Although RNA silencing has been studied primarily in model plants, advances in high-throughput sequencing technologies have enabled profiling of the small RNA components of many more plant species, providing insights into the ubiquity and conservatism of some miRNA-based regulatory mechanisms. Small RNAs of 20 to 24 nucleotides (nt) are important regulators of gene transcript levels by either transcriptional or by posttranscriptional gene silencing, contributing to genome maintenance and controlling a variety of developmental and physiological processes. Here, we used deep sequencing and molecular methods to create an inventory of the small RNAs in the mangrove species, Avicennia marina. We identified 26 novel mangrove miRNAs and 193 conserved miRNAs belonging to 36 families. We determined that 2 of the novel miRNAs were produced from known miRNA precursors and 4 were likely to be species-specific by the criterion that we found no homologs in other plant species. We used qRT-PCR to analyze the expression of miRNAs and their target genes in different tissue sets and some demonstrated tissue-specific expression. Furthermore, we predicted potential targets of these putative miRNAs based on a sequence homology and experimentally validated through endonucleolytic cleavage assays. Our results suggested that expression profiles of miRNAs and their predicted targets could be useful in exploring the significance of the conservation patterns of plants, particularly in response to abiotic stress. Because of their well-developed abilities in this regard, mangroves and other extremophiles are excellent models for such exploration. © 2013 Khraiwesh et al.

  20. A high-throughput quantitative expression analysis of cancer-related genes in human HepG2 cells in response to limonene, a potential anticancer agent.

    Science.gov (United States)

    Hafidh, Rand R; Hussein, Saba Z; MalAllah, Mohammed Q; Abdulamir, Ahmed S; Abu Bakar, Fatimah

    2017-11-14

    Citrus bioactive compounds, as active anticancer agents, have been under focus in several studies worldwide. However, the underlying genes responsible for the anticancer potential have not been sufficiently highlighted. The current study investigated the gene expression profile of hepatocellular carcinoma (HepG2) cells after treatment with limonene. The concentration that killed 50% of HepG2 cells was used to elucidate the genetic mechanisms of limonene anticancer activity. Apoptotic induction was detected by flow cytometry and confocal fluorescence microscopy. Two pro-apoptotic events, caspase-3 activation and phosphatidylserine translocation, were visualized by confocal fluorescence microscopy. High-throughput real-time PCR was used to profile 1023 cancer-related genes in 16 different gene families related to cancer development. In comparison to untreated cells, limonene increased the percentage of apoptotic cells up to 89.61% by flow cytometry and 48.2% by fluorescence microscopy. There was significant limonene-driven differential gene expression in HepG2 cells across 15 different gene families. Limonene was shown to significantly (>2 log) up-regulate 14 genes and down-regulate 59 genes. The affected gene families, from most to least affected, were apoptosis induction, signal transduction, cancer gene augmentation, alteration in kinase expression, inflammation, DNA damage repair, and cell cycle proteins. The current study reveals that limonene could be a promising, cheap, and effective anticancer compound. The broad spectrum of limonene anticancer activity is interesting for anticancer drug development. Further research is needed to confirm the current findings and to examine the anticancer potential of limonene, along with the underlying mechanisms, in different cell lines. Copyright© Bentham Science Publishers.
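    Taking the reported ">2 log" threshold to mean an absolute log2 fold change greater than 2 (an assumption about the notation), a gene-filtering step of the kind behind the 14 up-regulated and 59 down-regulated genes can be sketched as follows, with hypothetical fold-change values:

        import math

        # Hypothetical fold changes (treated vs. untreated) for a few genes; the
        # study profiled 1,023 cancer-related genes by high-throughput real-time PCR.
        fold_changes = {"CASP3": 6.1, "BCL2": 0.14, "TP53": 1.3, "MYC": 0.21, "FAS": 4.8}

        CUTOFF = 2.0   # assumed threshold: |log2 fold change| > 2

        up, down = [], []
        for gene, fc in fold_changes.items():
            log2fc = math.log2(fc)
            if log2fc > CUTOFF:
                up.append(gene)
            elif log2fc < -CUTOFF:
                down.append(gene)

        print("up-regulated:", up)
        print("down-regulated:", down)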

  1. Saturation of the turbulent dynamo.

    Science.gov (United States)

    Schober, J; Schleicher, D R G; Federrath, C; Bovino, S; Klessen, R S

    2015-08-01

    The origin of strong magnetic fields in the Universe can be explained by amplifying weak seed fields via turbulent motions on small spatial scales and subsequently transporting the magnetic energy to larger scales. This process is known as the turbulent dynamo and depends on the properties of turbulence, i.e., on the hydrodynamical Reynolds number and the compressibility of the gas, and on the magnetic diffusivity. While we know the growth rate of the magnetic energy in the linear regime, the saturation level, i.e., the ratio of magnetic energy to turbulent kinetic energy that can be reached, is not known from analytical calculations. In this paper we present a scale-dependent saturation model based on an effective turbulent resistivity which is determined by the turnover time scale of turbulent eddies and the magnetic energy density. The magnetic resistivity increases compared to the Spitzer value and the effective scale on which the magnetic energy spectrum is at its maximum moves to larger spatial scales. This process ends when the peak reaches a characteristic wave number k☆ which is determined by the critical magnetic Reynolds number. The saturation level of the dynamo also depends on the type of turbulence and differs for the limits of large and small magnetic Prandtl numbers Pm. With our model we find saturation levels between 43.8% and 1.3% for Pm≫1 and between 2.43% and 0.135% for Pm≪1, where the higher values refer to incompressible turbulence and the lower ones to highly compressible turbulence.

  2. High-Throughput Scoring of Seed Germination.

    Science.gov (United States)

    Ligterink, Wilco; Hilhorst, Henk W M

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very informative as it lacks information about start, rate, and uniformity of germination, which are highly indicative of such traits as dormancy, stress tolerance, and seed longevity. The calculation of cumulative germination curves requires information about germination percentage at various time points. We developed the GERMINATOR package: a simple, highly cost-efficient, and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The GERMINATOR package contains three modules: (I) design of experimental setup with various options to replicate and randomize samples; (II) automatic scoring of germination based on the color contrast between the protruding radicle and seed coat on a single image; and (III) curve fitting of cumulative germination data and the extraction, recap, and visualization of the various germination parameters. GERMINATOR is a freely available package that allows the monitoring and analysis of several thousands of germination tests, several times a day by a single person.
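    Module III of such a pipeline boils down to fitting a sigmoid to cumulative germination counts and reading off parameters such as the maximum germination percentage and the time to 50% germination. The Python sketch below uses a generic Hill-type curve with invented data; it is not necessarily the exact model implemented in the GERMINATOR package.

        import numpy as np
        from scipy.optimize import curve_fit

        def hill_germination(t, g_max, t50, steepness):
            """Cumulative germination (%) vs. time; a generic sigmoid, not
            necessarily the exact curve used by the GERMINATOR package."""
            return g_max * t**steepness / (t50**steepness + t**steepness)

        # Hypothetical scoring times (h) and cumulative germination percentages.
        hours = np.array([12, 24, 36, 48, 60, 72, 96], dtype=float)
        germ = np.array([0, 5, 35, 70, 88, 93, 95], dtype=float)

        (g_max, t50, steepness), _ = curve_fit(hill_germination, hours, germ,
                                               p0=[95.0, 45.0, 5.0])
        # t50 approximates the time to 50% of maximum germination; g_max the
        # final germination percentage; steepness relates to uniformity.
        print(g_max, t50, steepness)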

  3. The Danish tax on saturated fat

    DEFF Research Database (Denmark)

    Vallgårda, Signild; Holm, Lotte; Jensen, Jørgen Dejgård

    2015-01-01

    BACKGROUND/OBJECTIVES: Health promoters have repeatedly proposed using economic policy tools, taxes and subsidies, as a means of changing consumer behaviour. As the first country in the world, Denmark introduced a tax on saturated fat in 2011. It was repealed in 2012. In this paper, we present arguments and themes involved in the debates surrounding the introduction and the repeal. SUBJECTS/METHODS: An analysis of parliamentary debates, expert reports and media coverage; key informant interviews; and a review of studies about the effects of the tax on consumer behaviour. RESULTS: A tax ... indicates that the tax was effective in changing consumer behaviour.

  4. Retinal oxygen saturation before and after glaucoma surgery.

    Science.gov (United States)

    Nitta, Eri; Hirooka, Kazuyuki; Shimazaki, Takeru; Sato, Shino; Ukegawa, Kaori; Nakano, Yuki; Tsujikawa, Akitaka

    2017-08-01

    This study compared retinal vessel oxygen saturation before and after glaucoma surgery. Retinal oxygen saturation in glaucoma patients was measured using a non-invasive spectrophotometric retinal oximeter. Adequate image quality was found in 49 of the 108 consecutive glaucoma patients recruited, with 30 undergoing trabeculectomy, 11 EX-PRESS and eight trabeculotomy. Retinal oxygen saturation measurements in the retinal arterioles and venules were performed at 1 day prior to and at approximately 10 days after surgery. Statistical analysis was performed using a Student's t-test. After glaucoma surgery, intraocular pressure (IOP) decreased from 19.8 ± 7.7 mmHg to 9.0 ± 5.7 mmHg (p glaucoma surgery had an effect on the retinal venous oxygen saturation. © 2016 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
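    With before/after measurements on the same eyes, the comparison reported above is naturally a paired one; the abstract only states that a Student's t-test was used, so the paired variant shown below is an assumption, illustrated with hypothetical IOP values.

        import numpy as np
        from scipy import stats

        # Hypothetical pre- and post-operative values for the same eyes; the study
        # compared IOP and retinal vessel oxygen saturation before and after surgery.
        iop_pre = np.array([22.0, 18.5, 25.0, 30.5, 17.0, 21.5])
        iop_post = np.array([9.5, 8.0, 12.0, 14.5, 7.0, 10.0])

        # Paired comparison (an assumption; the abstract only states "Student's t-test").
        t_stat, p_value = stats.ttest_rel(iop_pre, iop_post)
        print("mean change %.1f mmHg, p = %.4f" % ((iop_post - iop_pre).mean(), p_value))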

  5. Improving throughput of single-relay DF channel using linear constellation precoding

    KAUST Repository

    Fareed, Muhammad Mehboob

    2014-08-01

    In this letter, we propose a transmission scheme to improve the overall throughput of a cooperative communication system with single decode-and-forward relay. Symbol error rate and throughput analysis of the new scheme are presented to facilitate the performance comparison with the existing decode-and-forward relaying schemes. Simulation results are further provided to corroborate the analytical results. © 2012 IEEE.

  6. Improving throughput of single-relay DF channel using linear constellation precoding

    KAUST Repository

    Fareed, Muhammad Mehboob; Yang, Hongchuan; Alouini, Mohamed-Slim

    2014-01-01

    In this letter, we propose a transmission scheme to improve the overall throughput of a cooperative communication system with single decode-and-forward relay. Symbol error rate and throughput analysis of the new scheme are presented to facilitate the performance comparison with the existing decode-and-forward relaying schemes. Simulation results are further provided to corroborate the analytical results. © 2012 IEEE.

  7. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas; Castro, David; Foulds, Ian G.

    2013-01-01

    This work demonstrates the detection met