WorldWideScience

Sample records for saturation throughput analysis

  1. Performance Analysis of Non-saturated IEEE 802.11 DCF Networks

    Science.gov (United States)

    Zhai, Linbo; Zhang, Xiaomin; Xie, Gang

    This letter presents a queueing-theoretic model to analyze the performance of non-saturated IEEE 802.11 DCF networks. We use a closed queueing network model and derive an approximate expression for throughput that reveals the relationship between throughput and total offered load under finite traffic load conditions. The accuracy of the model is verified by extensive simulations.
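
    For context, the classical saturation counterpart that non-saturated DCF models extend is Bianchi's fixed point for the per-slot transmit probability. A minimal sketch under assumed 802.11b-style contention parameters (W = 32, m = 5; these values, and the damping constant, are illustrative assumptions, not taken from the letter):

```python
def bianchi_fixed_point(n, w_min=32, m=5, iters=2000):
    """Bianchi's fixed point for n saturated 802.11 DCF stations.

    Returns (tau, p): the per-slot transmit probability and the
    conditional collision probability.  w_min (minimum contention
    window) and m (maximum backoff stage) are assumed values.
    """
    tau = 0.1
    p = 0.0
    for _ in range(iters):
        # Collision probability seen by a tagged station.
        p = 1.0 - (1.0 - tau) ** (n - 1)
        # Transmit probability implied by the binary exponential backoff.
        tau_new = (2.0 * (1.0 - 2.0 * p)) / (
            (1.0 - 2.0 * p) * (w_min + 1)
            + p * w_min * (1.0 - (2.0 * p) ** m)
        )
        tau = 0.5 * (tau + tau_new)  # damping keeps the iteration stable
    return tau, p
```

    Saturation throughput then follows from tau and p together with the slot, SIFS/DIFS, and payload durations of the particular PHY.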

  2. Variant-aware saturating mutagenesis using multiple Cas9 nucleases identifies regulatory elements at trait-associated loci.

    Science.gov (United States)

    Canver, Matthew C; Lessard, Samuel; Pinello, Luca; Wu, Yuxuan; Ilboudo, Yann; Stern, Emily N; Needleman, Austen J; Galactéros, Frédéric; Brugnara, Carlo; Kutlar, Abdullah; McKenzie, Colin; Reid, Marvin; Chen, Diane D; Das, Partha Pratim; A Cole, Mitchel; Zeng, Jing; Kurita, Ryo; Nakamura, Yukio; Yuan, Guo-Cheng; Lettre, Guillaume; Bauer, Daniel E; Orkin, Stuart H

    2017-04-01

    Cas9-mediated, high-throughput, saturating in situ mutagenesis permits fine-mapping of function across genomic segments. Disease- and trait-associated variants identified in genome-wide association studies largely cluster at regulatory loci. Here we demonstrate the use of multiple designer nucleases and variant-aware library design to interrogate trait-associated regulatory DNA at high resolution. We developed a computational tool for the creation of saturating-mutagenesis libraries with single or multiple nucleases with incorporation of variants. We applied this methodology to the HBS1L-MYB intergenic region, which is associated with red-blood-cell traits, including fetal hemoglobin levels. This approach identified putative regulatory elements that control MYB expression. Analysis of genomic copy number highlighted potential false-positive regions, thus emphasizing the importance of off-target analysis in the design of saturating-mutagenesis experiments. Together, these data establish a widely applicable high-throughput and high-resolution methodology to identify minimal functional sequences within large disease- and trait-associated regions.

  3. Throughput Analysis of Large Wireless Networks with Regular Topologies

    Directory of Open Access Journals (Sweden)

    Hong Kezhu

    2007-01-01

    The throughput of large wireless networks with regular topologies is analyzed under two medium-access control schemes: the synchronous array method (SAM) and slotted ALOHA. The regular topologies considered are square, hexagon, and triangle. Both nonfading channels and Rayleigh fading channels are examined, as are both omnidirectional and directional antennas. Our analysis shows that SAM leads to a much higher network throughput than slotted ALOHA. The network throughput in this paper is measured in either bits-hops per second per Hertz per node or bits-meters per second per Hertz per node, and the exact connection between the two measures is shown for each topology. With these two fundamental units, the network throughput shown in this paper can serve as a reliable benchmark for future work on the throughput of large networks.
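
    The slotted-ALOHA baseline against which SAM is compared has the classical throughput S(G) = G * exp(-G) successful transmissions per slot for aggregate offered load G, a one-line sketch:

```python
import math

def slotted_aloha_throughput(G):
    """Classical slotted-ALOHA throughput: the expected fraction of
    slots carrying exactly one (successful) transmission when the
    aggregate offered load is G packets per slot (Poisson arrivals)."""
    return G * math.exp(-G)
```

    The maximum, S(1) = 1/e (about 0.368), is the well-known ceiling that scheduled schemes such as SAM are designed to beat.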

  5. High-throughput Transcriptome analysis, CAGE and beyond

    KAUST Repository

    Kodzius, Rimantas

    2008-11-25

    1. Current research
       - PhD work on discovery of new allergens
       - Postdoctoral work on transcriptional start sites:
         a) Tag-based technologies allow higher throughput
         b) CAGE technology to define promoters
         c) CAGE data analysis to understand transcription - Wo

  7. Beyond the Natural Proteome: Nondegenerate Saturation Mutagenesis-Methodologies and Advantages.

    Science.gov (United States)

    Ferreira Amaral, M M; Frigotto, L; Hine, A V

    2017-01-01

    Beyond the natural proteome, high-throughput mutagenesis offers the protein engineer an opportunity to "tweak" the wild-type activity of a protein to create a recombinant protein with required attributes. Of the various approaches available, saturation mutagenesis is one of the core techniques employed by protein engineers, and in recent times, nondegenerate saturation mutagenesis is emerging as the approach of choice. This review compares the current methodologies for conducting nondegenerate saturation mutagenesis with traditional degenerate saturation mutagenesis and briefly outlines the options available for screening the resulting libraries to discover a novel protein with the required activity and/or specificity. © 2017 Elsevier Inc. All rights reserved.

  8. IEEE 802.11e (EDCA) analysis in the presence of hidden stations

    Directory of Open Access Journals (Sweden)

    Xijie Liu

    2011-07-01

    The key contribution of this paper is a combined analysis of both the saturated and non-saturated throughput of IEEE 802.11e networks in the presence of hidden stations. This approach extends earlier works by other authors, which provided Markov chain analyses of the IEEE 802.11 family under various assumptions. Our approach also modifies earlier expressions for the probability that a station transmits a packet in a vulnerable period. The numerical results show the impact of the access categories on the channel throughput. Various throughput results under different mechanisms are presented.

  9. Throughput and Fairness of Collision Avoidance Protocols in Ad Hoc Networks

    National Research Council Canada - National Science Library

    Garcia-Luna-Aceves, J. J; Wang, Yu

    2004-01-01

    ... In Section 1, the authors present an analytical model to derive the saturation throughput of these sender-initiated collision avoidance protocols in multi-hop ad hoc networks with nodes randomly...

  10. Microscopic analysis of saturable absorbers: Semiconductor saturable absorber mirrors versus graphene

    Energy Technology Data Exchange (ETDEWEB)

    Hader, J.; Moloney, J. V. [Nonlinear Control Strategies, Inc., 3542 N. Geronimo Ave., Tucson, Arizona 85705 (United States); College of Optical Sciences, University of Arizona, Tucson, Arizona 85721 (United States); Yang, H.-J.; Scheller, M. [College of Optical Sciences, University of Arizona, Tucson, Arizona 85721 (United States); Koch, S. W. [Department of Physics and Materials Sciences Center, Philipps Universität Marburg, Renthof 5, 35032 Marburg (Germany)

    2016-02-07

    Fully microscopic many-body calculations are used to study the influence of strong sub-picosecond pulses on the carrier distributions and corresponding optical response in saturable absorbers used for mode-locking—semiconductor (quantum well) saturable absorber mirrors (SESAMs) and single layer graphene based saturable absorber mirrors (GSAMs). Unlike in GSAMs, the saturation fluence and recovery time in SESAMs show a strong spectral dependence. While the saturation fluence in the SESAM is minimal at the excitonic bandgap, the optimal recovery time and least pulse distortion due to group delay dispersion are found for excitation higher in the first subband. For excitation near the SESAM bandgap, the saturation fluence is about one tenth of that in the GSAM. At energies above the bandgap, the fluences in both systems become similar. A strong dependence of the saturation fluence on the pulse width in both systems is caused by carrier relaxation during the pulse. The recovery time in graphene is found to be about two to four times faster than that in the SESAMs. The occurrence of negative differential transmission in graphene is shown to be caused by dopant related carriers. In SESAMs, a negative differential transmission is found when exciting below the excitonic resonance where excitation induced dephasing leads to an enhancement of the absorption. Comparisons of the simulation data to the experiment show a very good quantitative agreement.

  11. Max-plus algebraic throughput analysis of synchronous dataflow graphs

    NARCIS (Netherlands)

    de Groote, Robert; Kuper, Jan; Broersma, Haitze J.; Smit, Gerardus Johannes Maria

    2012-01-01

    In this paper we present a novel approach to throughput analysis of synchronous dataflow (SDF) graphs. Our approach is based on describing the evolution of actor firing times as a linear time-invariant system in max-plus algebra. Experimental results indicate that our approach is faster than
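
    The max-plus eigenvalue underlying this style of analysis equals the maximum cycle mean of the weighted graph, and the SDF graph's throughput is its reciprocal. A minimal sketch using Karp's classical algorithm (the graph, node indices, and edge weights below are illustrative, not taken from the paper):

```python
NEG_INF = float("-inf")

def max_cycle_mean(weights, n):
    """Karp's algorithm for the maximum cycle mean of a weighted
    digraph on nodes 0..n-1, where weights[(u, v)] is the weight of
    edge u -> v.  In max-plus algebra this is the eigenvalue of the
    system matrix; the corresponding SDF throughput is 1 / result.
    Assumes every node on a cycle is reachable from node 0."""
    # D[k][v] = maximum weight of a k-edge walk from node 0 to v.
    D = [[NEG_INF] * n for _ in range(n + 1)]
    D[0][0] = 0.0
    for k in range(1, n + 1):
        for (u, v), w in weights.items():
            if D[k - 1][u] > NEG_INF:
                D[k][v] = max(D[k][v], D[k - 1][u] + w)
    best = NEG_INF
    for v in range(n):
        if D[n][v] == NEG_INF:
            continue
        best = max(best,
                   min((D[n][v] - D[k][v]) / (n - k)
                       for k in range(n) if D[k][v] > NEG_INF))
    return best
```

    For a two-actor cycle with execution times 2 and 3, the maximum cycle mean is 2.5, i.e., a throughput of one graph iteration per 2.5 time units.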

  12. Urban Saturated Power Load Analysis Based on a Novel Combined Forecasting Model

    Directory of Open Access Journals (Sweden)

    Huiru Zhao

    2015-03-01

    Analysis of urban saturated power loads is helpful to coordinate urban power grid construction and economic and social development. There are two different kinds of forecasting models: the logistic curve model focuses on the growth law of the data itself, while the multi-dimensional forecasting model considers several influencing factors as input variables. To improve forecasting performance, a novel combined forecasting model for saturated power load analysis is proposed in this paper, which combines the above two models. The weights of the two models in the combined forecasting model are optimized by a fruit fly optimization algorithm. Using Hubei Province as an example, the effectiveness of the proposed combined forecasting model is verified, demonstrating a higher forecasting accuracy. The analysis shows that the power load of Hubei Province will reach saturation in 2039, with an annual maximum power load of about 78,630 MW. The results obtained from the proposed hybrid urban saturated power load analysis model can serve as a reference for the sustainable development of urban power grids, regional economies, and society at large.

  13. Stochastic analysis of radionuclide migration in saturated-unsaturated soils

    International Nuclear Information System (INIS)

    Kawanishi, Moto

    1988-01-01

    In Japan, low-level radioactive waste (LLRW) generated from nuclear power plants will be stored centrally at the Shimokita site from 1990, and could be transferred to land disposal if its safety is positively confirmed. A reliable safety assessment method for the land disposal of LLRW is therefore needed. In this study, a stochastic model to analyze radionuclide migration in saturated-unsaturated soils was constructed. The principal results are summarized as follows. 1) We presented a generalized approach to modeling radionuclide migration in saturated-unsaturated soils as an advection-dispersion phenomenon accompanied by radionuclide decay and adsorption/desorption in soils. 2) Based on this migration model, we developed a stochastic analysis model of radionuclide migration in saturated-unsaturated soils. 3) Comparison of the simulated results with exact solutions of a few simple one-dimensional advection-dispersion problems confirmed the validity of the model. 4) Comparison of the simulated results with experimental results of radionuclide migration in a one-dimensional unsaturated soil column under rainfall showed good applicability. 5) Since a stochastic model of this kind readily represents the underlying physical phenomena and has essentially no numerical dissipation, it should be well suited to the analysis of complicated radionuclide migration in saturated-unsaturated soils. (author)

  14. Saturated and unsaturated stability analysis of slope subjected to rainfall infiltration

    Directory of Open Access Journals (Sweden)

    Gofar Nurly

    2017-01-01

    This paper presents results of saturated and unsaturated stability analyses of typical residual slopes subjected to rainfall infiltration corresponding to a 50-year rainfall return period. The slope angles considered were 45° and 70°. The saturated stability analyses were carried out for the original and critical ground water levels commonly considered by practicing engineers, using the limit equilibrium method. The unsaturated stability analyses used a coupled stress-pore-water pressure analysis to evaluate the effect of rainfall infiltration on deformation and of transient pore-water pressure on slope stability. Slope stability analyses were performed at several times during and after rainfall infiltration. Results show that the critical condition for slopes made of sandy material occurred at the end of rainfall, while for clayey material it occurred some time after the rainfall ceased. Unsaturated stability analysis of sandy soil gives a higher factor of safety because the soil never reaches saturation. Transient analysis using the unsaturated soil concept could predict the more critical condition of delayed failure in slopes made of clayey soil.
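
    For intuition about why infiltration destabilizes a slope, the textbook infinite-slope limit-equilibrium factor of safety shows how rising pore-water pressure erodes the resisting shear strength. This is a standard sketch, not the paper's coupled transient analysis; parameter values in the usage note are illustrative:

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg, u=0.0):
    """Infinite-slope limit-equilibrium factor of safety.
    c: effective cohesion (kPa); phi_deg: effective friction angle
    (degrees); gamma: soil unit weight (kN/m^3); z: slip-surface
    depth (m); beta_deg: slope angle (degrees); u: pore-water
    pressure on the slip surface (kPa)."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # Mohr-Coulomb resisting stress over driving shear stress.
    normal_eff = gamma * z * math.cos(beta) ** 2 - u
    shear = gamma * z * math.sin(beta) * math.cos(beta)
    return (c + normal_eff * math.tan(phi)) / shear
```

    For dry cohesionless soil this reduces to tan(phi)/tan(beta); a positive pore pressure u (wetting front reaching the slip surface) lowers the factor of safety, consistent with the delayed-failure behaviour described for clayey slopes.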

  15. Analysis of CBRP for UDP and TCP Traffic-Classes to measure throughput in MANETs

    Directory of Open Access Journals (Sweden)

    Hardeep Singh Rayait

    2013-01-01

    In this paper, we analyse the throughput of both TCP and UDP traffic classes for the cluster-based routing protocol (CBRP) in mobile ad hoc networks. CBRP uses a clustering structure to improve throughput, decrease average end-to-end delay, and improve the average packet delivery ratio. We simulate the routing protocol for nodes running the IEEE 802.11 MAC and analyse throughput for both UDP and TCP traffic classes. The application-layer protocol used over UDP is CBR and over TCP is FTP.

  16. Performance Analysis of IEEE 802.11 DCF and IEEE 802.11e EDCA in Non-saturation Condition

    Science.gov (United States)

    Kim, Tae Ok; Kim, Kyung Jae; Choi, Bong Dae

    We analyze the MAC performance of IEEE 802.11 DCF and 802.11e EDCA in non-saturation conditions, where a device sometimes has no packets to transmit. We assume that a flow is not generated while the previous flow is in service and that the number of packets in a flow is geometrically distributed. In this paper, we take into account a feature of non-saturated operation in the standards: the first packet arriving at an idle station may be transmitted without a preceding backoff procedure. Our approach is to model the stochastic behavior of one station as a discrete-time Markov chain. We obtain four performance measures: normalized channel throughput, average packet head-of-line (HoL) delay, expected time to complete transmission of a flow, and packet loss probability. Our results can be used for admission control to find the optimal number of stations under constraints on these measures.

  17. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, picking relevant hits from such screens and generating testable hypotheses often requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of a user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.

  18. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    Science.gov (United States)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass, and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distributions of single cells at high spatial resolution. However, limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry, a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of >10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes from both amplitude and phase images. Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy: >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes, including disease pathogenesis.

  20. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi

    2013-01-01

    This thesis focuses on the development of image analysis methods for ultra-high content analysis of high-throughput screens in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  1. Optimizing transformations for automated, high throughput analysis of flow cytometry data.

    Science.gov (United States)

    Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael

    2010-11-04

    In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce
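
    Of the transformations compared, the generalized hyperbolic arcsine illustrates the parameter sensitivity at issue. A minimal sketch (the parameter values are illustrative defaults, not the optimized ones the paper derives; flowCore's arcsinhTransform has this same functional form):

```python
import math

def arcsinh_transform(x, a=0.0, b=1.0 / 150.0, c=0.0):
    """Generalized hyperbolic arcsine transform, asinh(a + b*x) + c.
    It is roughly linear for |b*x| << 1 and logarithmic for large
    values, compressing the several-decade range of cytometry
    intensities; the cofactor 1/b sets where that transition occurs."""
    return math.asinh(a + b * x) + c
```

    Choosing b (the cofactor) poorly either squashes dim populations into the linear region or exaggerates noise near zero, which is exactly the kind of non-intuitive parameter effect the maximum likelihood criteria are meant to optimize away.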

  3. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  4. High-Throughput Method for Strontium Isotope Analysis by Multi-Collector-Inductively Coupled Plasma-Mass Spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Andrew J. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Capo, Rosemary C. [Univ. of Pittsburgh, PA (United States); Stewart, Brian W. [Univ. of Pittsburgh, PA (United States); Phan, Thai T. [Univ. of Pittsburgh, PA (United States); Jain, Jinesh C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Hakala, Alexandra [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Guthrie, George D. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2016-09-22

    This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method, as well as best practices for optimizing Sr isotope analysis by MC-ICP-MS, are presented. Lastly, this report offers tools for the handling and reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, helping avoid the data glut associated with high-sample-throughput rapid analysis.

  6. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

    Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and in standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequence analysis.

  7. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Ozcan, Aydogan, E-mail: ozcan@ucla.edu [Electrical Engineering Department, University of California, Los Angeles, California 90095 (United States); Bioengineering Department, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute (CNSI), University of California, Los Angeles, California 90095 (United States)

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  8. Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules.

    Directory of Open Access Journals (Sweden)

    Antonino Ingargiola

Full Text Available We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing, doubly labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and yield measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions.

  9. Saturated Switching Systems

    CERN Document Server

    Benzaouia, Abdellah

    2012-01-01

Saturated Switching Systems treats the problem of actuator saturation, inherent in all dynamical systems, using two approaches: positive invariance, in which the controller is designed to work within a region of non-saturating linear behaviour; and a saturation technique, which allows saturation but guarantees asymptotic stability. The results obtained are extended from the linear systems in which they were first developed to switching systems with uncertainties, 2D switching systems, switching systems with Markovian jumping, and switching systems of the Takagi-Sugeno type. The text represents a thoroughly referenced distillation of results obtained in this field during the last decade. The selected tools for the analysis and design of stabilizing controllers are multiple Lyapunov functions and linear matrix inequalities. All the results are illustrated with numerical examples and figures, many of them modelled using MATLAB®. Saturated Switching Systems will be of interest to academic researchers in con...

  10. High-Throughput Analysis of Enzyme Activities

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Guoxin [Iowa State Univ., Ames, IA (United States)

    2007-01-01

High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on a nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not need optical properties different from those of the substrate. UV absorption detection allows nearly universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue or in screening multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of an enzyme microarray is focused onto the CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on that spot gives the reaction rate. The same microarray can be used many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
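The rate-from-intensity step described above reduces to fitting a line through the recorded spot intensity over time. A minimal sketch, with invented intensity values, might look like this:

```python
# Hypothetical sketch of the rate estimation: the light intensity of an
# enzyme spot recorded by the CCD over time is fit with a straight line,
# and the slope serves as the apparent reaction rate.

def slope(times, intensities):
    """Ordinary least-squares slope of intensity vs. time."""
    n = len(times)
    mean_t = sum(times) / n
    mean_i = sum(intensities) / n
    num = sum((t - mean_t) * (y - mean_i) for t, y in zip(times, intensities))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

# Simulated spot: product accumulates at 0.5 intensity units per second.
times = [0, 1, 2, 3, 4, 5]
intensities = [10.0 + 0.5 * t for t in times]
rate = slope(times, intensities)
print(rate)  # → 0.5
```

In practice the fit would be restricted to the initial linear phase of the reaction, before substrate depletion bends the curve.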

  11. Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis

    Science.gov (United States)

    Mohamed Ismael, Hawa; Vandyck, George Kobina

The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe puts it in a unique position to provide important shipping services to vessels plying that route. This paper forecasts container throughput at the Doraleh Container Terminal through time series analysis. A selection of univariate forecasting models is used, namely the Triple Exponential Smoothing Model, the Grey Model and the Linear Regression Model. Using these three models and their combination, forecasts of container throughput through the Doraleh port were produced. The forecasting results of the three models, together with the combination forecast, are then compared using the commonly applied evaluation criteria Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression Model was the best prediction method for forecasting container throughput, since its forecast error was the smallest. Based on the regression model, a ten (10) year forecast of container throughput at DCT was made.
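The winning model and its evaluation can be sketched in a few lines. This is our own illustration with invented throughput figures, not the paper's data: fit a linear trend on an initial window, forecast the series, and score with MAD and MAPE.

```python
# Minimal sketch of trend-based throughput forecasting with MAD/MAPE
# scoring.  Throughput figures (kTEU) are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def mad(actual, pred):
    """Mean Absolute Deviation."""
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def mape(actual, pred):
    """Mean Absolute Percentage Error (%)."""
    return 100 * sum(abs(a - p) / a for a, p in zip(actual, pred)) / len(actual)

years = [1, 2, 3, 4, 5, 6]
teu = [310, 355, 390, 442, 480, 521]   # hypothetical annual throughput
a, b = fit_line(years[:5], teu[:5])    # fit on the first five years
pred = [a + b * x for x in years]      # in-sample fit plus one-step forecast
print(round(mad(teu, pred), 2), round(mape(teu, pred), 2))  # → 2.48 0.59
```

The same MAD/MAPE functions would score the exponential-smoothing and Grey model forecasts, allowing the model comparison the paper describes.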

  12. Meta-Analysis of High-Throughput Datasets Reveals Cellular Responses Following Hemorrhagic Fever Virus Infection

    Directory of Open Access Journals (Sweden)

    Gavin C. Bowick

    2011-05-01

Full Text Available The continuing use of high-throughput assays to investigate cellular responses to infection is providing a large repository of information. Due to the large number of differentially expressed transcripts, often running into the thousands, the majority of these data have not been thoroughly investigated. Advances in techniques for the downstream analysis of high-throughput datasets are providing new methods for generating hypotheses for further investigation. The large number of experimental observations, combined with databases that correlate particular genes and proteins with canonical pathways, functions and diseases, allows for the bioinformatic exploration of functional networks that may be implicated in replication or pathogenesis. Herein, we provide an example of how analysis of published high-throughput datasets of cellular responses to hemorrhagic fever virus infection can generate additional functional data. We describe enrichment of genes involved in metabolism, post-translational modification and cardiac damage; potential roles for specific transcription factors; and a conserved involvement of a pathway centered around cyclooxygenase-2. We believe that these types of analyses can provide virologists with additional hypotheses for continued investigation.

  13. Container Throughput Forecasting Using Dynamic Factor Analysis and ARIMAX Model

    Directory of Open Access Journals (Sweden)

    Marko Intihar

    2017-11-01

Full Text Available The paper examines the impact of integrating macroeconomic indicators on the accuracy of a container throughput time series forecasting model. For this purpose, dynamic factor analysis and an AutoRegressive Integrated Moving-Average model with eXogenous inputs (ARIMAX) are used. Both methodologies are integrated into a novel four-stage heuristic procedure. Firstly, dynamic factors are extracted from external macroeconomic indicators influencing the observed throughput. Secondly, a family of ARIMAX models of different orders is generated based on the derived factors. In the third stage, diagnostic and goodness-of-fit testing is applied, which includes statistical criteria such as fit performance, information criteria, and parsimony. Finally, the best model is heuristically selected and tested on real data from the Port of Koper. The results show that by incorporating macroeconomic indicators into the forecasting model, more accurate future throughput forecasts can be achieved. The model is also used to produce forecasts for the next four years, indicating more oscillatory behaviour in 2018-2020. Hence, care must be taken regarding any major investment decisions initiated by management. It is believed that the proposed model might be a useful reinforcement of the existing forecasting module in the observed port.
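The core idea of adding an eXogenous input to an autoregressive model can be sketched without the full ARIMAX machinery. The following is our own noise-free ARX(1) illustration with synthetic data, not the paper's model: throughput y depends on its own previous value plus an external indicator x, and the two coefficients are recovered by least squares.

```python
# Hedged illustration of the ARX core of an ARIMAX model:
# y[t] = phi*y[t-1] + beta*x[t], fit via the 2x2 normal equations.
# Data are synthetic; the true coefficients are phi=0.8, beta=2.0.

def fit_arx(y, x):
    """Least-squares estimates of (phi, beta) for y[t] = phi*y[t-1] + beta*x[t]."""
    ts = range(1, len(y))
    a11 = sum(y[t - 1] ** 2 for t in ts)
    a12 = sum(y[t - 1] * x[t] for t in ts)
    a22 = sum(x[t] ** 2 for t in ts)
    b1 = sum(y[t - 1] * y[t] for t in ts)
    b2 = sum(x[t] * y[t] for t in ts)
    det = a11 * a22 - a12 * a12
    phi = (b1 * a22 - b2 * a12) / det
    beta = (a11 * b2 - a12 * b1) / det
    return phi, beta

# Synthetic throughput series driven by a synthetic macro indicator x.
x = [1.0, 1.2, 0.9, 1.4, 1.1, 1.3, 1.0, 1.5, 1.2, 1.4]
y = [10.0]
for t in range(1, len(x)):
    y.append(0.8 * y[-1] + 2.0 * x[t])

phi, beta = fit_arx(y, x)
print(round(phi, 3), round(beta, 3))  # → 0.8 2.0 (noise-free data)
```

In the paper the exogenous regressors are the dynamic factors extracted from the macroeconomic indicators, and differencing and moving-average terms are added on top of this AR-plus-exogenous core.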

  14. Studies of non-isothermal flow in saturated and partially saturated porous media

    International Nuclear Information System (INIS)

    Ho, C.K.; Maki, K.S.; Glass, R.J.

    1993-01-01

Physical and numerical experiments have been performed to investigate the behavior of non-isothermal flow in two-dimensional saturated and partially saturated porous media. The physical experiments were performed to identify non-isothermal flow fields and temperature distributions in fully saturated, half-saturated, and residually saturated two-dimensional porous media with bottom heating and top cooling. Two counter-rotating liquid-phase convective cells were observed to develop in the saturated regions of all three cases. Gas-phase convection was also observed in the unsaturated regions of the partially saturated experiments. TOUGH2 numerical simulations of the saturated case were found to be strongly dependent on the assumed boundary conditions of the physical system. Models including heat losses through the boundaries of the test cell produced temperature and flow fields that were in better agreement with the observed temperature and flow fields than models that assumed insulated boundary conditions. A sensitivity analysis also showed that a reduction of the bulk permeability of the porous media in the numerical simulations depressed the effects of convection, flattening the temperature profiles across the test cell.

  15. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis

    Directory of Open Access Journals (Sweden)

    Yushen Du

    2016-11-01

Full Text Available Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions, but it is limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of the viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that is not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins for which homologous-structure information is available.
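The fitness-profiling step common to this kind of deep mutational scan can be sketched simply: each mutation's score is the log change in its frequency between the input library and the post-selection pool. The counts and mutation names below are invented for illustration; the paper's actual pipeline additionally corrects for predicted protein stability.

```python
# Simplified sketch of fitness scoring from saturation-mutagenesis
# sequencing counts: relative fitness = log2 of the frequency change
# between input library and post-selection pool.  Counts are invented.
import math

def fitness_scores(input_counts, selected_counts):
    in_total = sum(input_counts.values())
    sel_total = sum(selected_counts.values())
    scores = {}
    for mut, n_in in input_counts.items():
        n_sel = selected_counts.get(mut, 0) + 0.5   # pseudocount for dropouts
        f_in = n_in / in_total
        f_sel = n_sel / sel_total
        scores[mut] = math.log2(f_sel / f_in)
    return scores

input_counts = {"WT": 1000, "A12V": 500, "K39E": 500}
selected_counts = {"WT": 1500, "A12V": 480, "K39E": 20}
scores = fitness_scores(input_counts, selected_counts)

# Mutations lethal to replication (here the hypothetical K39E) drop out
# of the selected pool and receive strongly negative scores.
print(scores["K39E"] < scores["A12V"] < scores["WT"])  # → True
```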

  16. Gene Expression Analysis of Escherichia Coli Grown in Miniaturized Bioreactor Platforms for High-Throughput Analysis of Growth and genomic Data

    DEFF Research Database (Denmark)

    Boccazzi, P.; Zanzotto, A.; Szita, Nicolas

    2005-01-01

    Combining high-throughput growth physiology and global gene expression data analysis is of significant value for integrating metabolism and genomics. We compared global gene expression using 500 ng of total RNA from Escherichia coli cultures grown in rich or defined minimal media in a miniaturize...... cultures using just 500 ng of total RNA indicate that high-throughput integration of growth physiology and genomics will be possible with novel biochemical platforms and improved detection technologies....

  17. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry applications. Screening multiple genomic reagents targeting any given gene creates additional challenges, and so methods that prioritize individual gene targets have been developed. The article reviews many of the open-source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality-control monitoring and interpretation of the results, will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. Ease of use will be important for these tools, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.

  18. Improvements and impacts of GRCh38 human reference on high throughput sequencing data analysis.

    Science.gov (United States)

    Guo, Yan; Dai, Yulin; Yu, Hui; Zhao, Shilin; Samuels, David C; Shyr, Yu

    2017-03-01

Analyses of high throughput sequencing data start with alignment against a reference genome, which is the foundation for all re-sequencing data analyses. Each new release of the human reference genome has been augmented with improved accuracy and completeness. It is presumed that the latest release of the human reference genome, GRCh38, will contribute more to high throughput sequencing data analysis by providing greater accuracy, but the amount of improvement had not yet been quantified. We conducted a study to compare the genomic analysis results between the GRCh38 reference and its predecessor GRCh37. Through analyses of alignment, single nucleotide polymorphisms, small insertions/deletions, copy number and structural variants, we show that GRCh38 offers overall more accurate analysis of human sequencing data. More importantly, GRCh38 produced fewer false positive structural variants. In conclusion, GRCh38 is an improvement over GRCh37 not only from the genome assembly aspect, but also in yielding more reliable genomic analysis results. Copyright © 2017. Published by Elsevier Inc.

  19. Saturated Zone Colloid-Facilitated Transport

    International Nuclear Information System (INIS)

    Wolfsberg, A.; Reimus, P.

    2001-01-01

The purpose of the Saturated Zone Colloid-Facilitated Transport Analysis and Modeling Report (AMR), as outlined in its Work Direction and Planning Document (CRWMS M&O 1999a), is to provide retardation factors for colloids with irreversibly attached radionuclides, such as plutonium, in the saturated zone (SZ) between their point of entrance from the unsaturated zone (UZ) and downgradient compliance points. Although it is not exclusive to any particular radionuclide release scenario, this AMR especially addresses those scenarios pertaining to evidence from waste degradation experiments, which indicate that plutonium and perhaps other radionuclides may be irreversibly attached to colloids. This report establishes the requirements and elements of the design of a methodology for calculating colloid transport in the saturated zone at Yucca Mountain. In previous Total Systems Performance Assessment (TSPA) analyses, radionuclide-bearing colloids were assumed to be unretarded in their migration. Field experiments in fractured tuff at Yucca Mountain and in porous media at other sites indicate that colloids may, in fact, experience retardation relative to the mean pore-water velocity, suggesting that contaminants associated with colloids should also experience some retardation. Therefore, this analysis incorporates field data where available, and a theoretical framework when site-specific data are not available, for estimating plausible ranges of retardation factors in both saturated fractured tuff and saturated alluvium. The distributions of retardation factors for tuff and alluvium are developed in a form consistent with the Performance Assessment (PA) framework for simulating radionuclide transport in the saturated zone. To improve on the work performed so far for saturated-zone flow and transport modeling, a concerted effort has been made to quantify colloid retardation factors in both fractured tuff and alluvium. The fractured tuff analysis used recent data

  20. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

Gunnar Brunborg

    2014-10-01

Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, many manual steps are involved and few samples can be treated in one experiment. High-throughput modifications have been developed during recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High-throughput methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, or human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high-throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for further increases in throughput appears to be the scoring.

  1. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

Full Text Available BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis without the limitations of sample space and reagent use. However, non-commercial and user-friendly software for the management and analysis of these data is not available. RESULTS: The recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is a freely available software package that permits the automated analysis and visualization of high-throughput qPCR data. A detailed manual and a demo experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
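The standard-curve quantification that such software automates reduces to fitting Ct against log10 quantity for a dilution series, inverting the fit, and normalizing to a reference gene. The sketch below uses hypothetical Ct values and an ideal assay (about 3.32 Ct per 10-fold dilution); it illustrates the method, not the DAG Expression implementation.

```python
# Sketch of qPCR relative quantification via a standard curve:
# Ct = slope*log10(quantity) + intercept, fit from serial dilutions,
# then inverted and normalized to a reference gene.  Ct values are
# hypothetical.

def fit_standard_curve(log10_qty, cts):
    """Least-squares fit of the standard curve; returns (slope, intercept)."""
    n = len(cts)
    mx = sum(log10_qty) / n
    my = sum(cts) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(log10_qty, cts)) \
        / sum((x - mx) ** 2 for x in log10_qty)
    intercept = my - slope * mx
    return slope, intercept

def quantity(ct, slope, intercept):
    """Invert the standard curve to get a relative quantity from a Ct."""
    return 10 ** ((ct - intercept) / slope)

# 10-fold dilution series: an ideal assay loses ~3.32 Ct per decade.
dilutions = [5, 4, 3, 2, 1]                  # log10(relative quantity)
cts = [18.0, 21.32, 24.64, 27.96, 31.28]
slope, intercept = fit_standard_curve(dilutions, cts)

target_q = quantity(25.0, slope, intercept)  # gene of interest
ref_q = quantity(22.0, slope, intercept)     # reference gene
print(round(target_q / ref_q, 3))            # normalized expression
```

With multiple reference genes, the denominator would typically be the geometric mean of the reference quantities.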

  2. WormScan: a technique for high-throughput phenotypic analysis of Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mark D Mathew

Full Text Available BACKGROUND: There are four main phenotypes that are assessed in whole-organism studies of Caenorhabditis elegans: mortality, movement, fecundity and size. Procedures have been developed that focus on the digital analysis of some, but not all, of these phenotypes and may be limited by expense and low throughput. We have developed WormScan, an automated image acquisition system that allows quantitative analysis of each of these four phenotypes on standard NGM plates seeded with E. coli. This system is very easy to implement and has the capacity to be used in high-throughput analysis. METHODOLOGY/PRINCIPAL FINDINGS: Our system employs a readily available consumer-grade flatbed scanner. The method uses light stimulus from the scanner rather than physical stimulus to induce movement. With two sequential scans it is possible to quantify the induced phototactic response. To demonstrate the utility of the method, we measured the phenotypic response of C. elegans to phosphine gas exposure. We found that stimulation of movement by the light of the scanner was equivalent to physical stimulation for the determination of mortality. WormScan also provided a quantitative assessment of health for the survivors. Habituation from light stimulation of continuous scans was similar to habituation caused by physical stimulus. CONCLUSIONS/SIGNIFICANCE: There are existing systems for the automated phenotypic data collection of C. elegans. The specific advantages of our method over existing systems are high-throughput assessment of a greater range of phenotypic endpoints, including determination of mortality and quantification of the mobility of survivors. Our system is also inexpensive and very easy to implement. Even though we have focused on demonstrating the usefulness of WormScan in toxicology, it can be used in a wide range of additional C. elegans studies, including lifespan determination, development, pathology and behavior. Moreover, we have even adapted the

  3. Throughput and delay analysis of IEEE 802.15.6-based CSMA/CA protocol.

    Science.gov (United States)

    Ullah, Sana; Chen, Min; Kwak, Kyung Sup

    2012-12-01

The IEEE 802.15.6 is a new communication standard for Wireless Body Area Networks (WBANs) that focuses on a variety of medical, Consumer Electronics (CE) and entertainment applications. In this paper, the throughput and delay performance of the IEEE 802.15.6 is presented. Numerical formulas are derived to determine the maximum throughput and minimum delay limits of the IEEE 802.15.6 for an ideal channel with no transmission errors. These limits are derived for different frequency bands and data rates. Our analysis is validated by extensive simulations using a custom C++ simulator. Based on analytical and simulation results, useful conclusions are derived for network provisioning and packet size optimization for different applications.
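The structure of such a maximum-throughput limit is generic: with no transmission errors, throughput is the payload size divided by the total time one frame exchange occupies the channel. The sketch below illustrates that structure only; the overhead, inter-frame spacing, and ACK-duration values are placeholders, not the actual IEEE 802.15.6 parameters (971.4 kbps is one of the standard's narrowband data rates).

```python
# Generic sketch of a maximum-throughput limit for an error-free channel:
# S_max = payload_bits / (frame time + inter-frame spacing + ACK time).
# Timing/overhead values below are placeholders, not 802.15.6 parameters.

def max_throughput(payload_bits, data_rate_bps, overhead_bits, t_ifs_s, t_ack_s):
    t_data = (payload_bits + overhead_bits) / data_rate_bps  # frame on air
    t_total = t_data + t_ifs_s + t_ack_s                     # full exchange
    return payload_bits / t_total                            # bits per second

s = max_throughput(payload_bits=2000,
                   data_rate_bps=971_400,   # an 802.15.6 narrowband rate
                   overhead_bits=200,       # placeholder PHY+MAC overhead
                   t_ifs_s=75e-6,           # placeholder inter-frame spacing
                   t_ack_s=100e-6)          # placeholder ACK duration
print(round(s))   # strictly below data_rate_bps: overhead caps throughput
```

Sweeping `payload_bits` in such a formula is how packet-size optimization conclusions of the kind the paper draws are obtained: larger payloads amortize the fixed overhead.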

  4. Throughput and Delay Analysis of HARQ with Code Combining over Double Rayleigh Fading Channels

    KAUST Repository

    Chelli, Ali

    2018-01-15

This paper proposes the use of hybrid automatic repeat request (HARQ) with code combining (HARQ-CC) to offer reliable communications over double Rayleigh channels. The double Rayleigh fading channel is of particular interest to vehicle-to-vehicle communication systems as well as amplify-and-forward relaying and keyhole channels. This work studies the performance of HARQ-CC over double Rayleigh channels from an information-theoretic perspective. Analytical approximations are derived for the ε-outage capacity, the average number of transmissions, and the throughput of HARQ-CC. Moreover, we evaluate the delay experienced by Poisson arriving packets for HARQ-CC. We provide analytical expressions for the average waiting time, the packets' sojourn time, the average consumed power, and the energy efficiency. In our investigation, we take into account the impact of imperfect feedback on different performance metrics. Additionally, we explore the tradeoff between energy efficiency and throughput. The proposed scheme is shown to maintain the outage probability below a specified threshold ε, which ensures link reliability. Meanwhile, HARQ-CC implicitly adapts the transmission rate to the channel conditions such that the throughput is maximized. Our results demonstrate that HARQ-CC allows improving the achievable communication rate compared to fixed time diversity schemes. To maximize the throughput of HARQ-CC, the rate per HARQ round should be less than the rate required to meet the outage constraint. Our investigation of the performance of HARQ-CC over Rayleigh and double Rayleigh channels shows that double Rayleigh channels have a higher severity of fading and result in a larger degradation of the throughput. Our analysis reveals that HARQ with incremental redundancy (HARQ-IR) achieves a larger throughput compared to HARQ-CC, while HARQ-CC is simpler to implement, has a lower decoding
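The mechanics of HARQ-CC over a double Rayleigh channel lend themselves to a short Monte Carlo sketch. This is our own illustration, not the paper's analysis: Chase combining adds the per-round SNRs, decoding succeeds once the accumulated mutual information reaches the rate R, and throughput is delivered bits per channel use (round). The rate, average SNR, and round limit below are arbitrary choices.

```python
# Monte Carlo sketch of HARQ with Chase combining over a double Rayleigh
# channel: the post-combining SNR is the sum of per-round SNRs, and a
# packet is decoded once log2(1 + combined SNR) >= R.
import math
import random

def double_rayleigh_snr(avg_snr):
    """One fading realization: product of two unit-mean Rayleigh powers."""
    g1 = random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2  # chi^2, mean 2
    g2 = random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2
    return avg_snr * (g1 / 2) * (g2 / 2)

def harq_cc_throughput(rate, avg_snr, max_rounds, trials=20000):
    """Throughput in bits/s/Hz per HARQ round; failed packets deliver 0."""
    delivered_bits = 0.0
    rounds_used = 0
    for _ in range(trials):
        combined = 0.0
        for _k in range(max_rounds):
            combined += double_rayleigh_snr(avg_snr)
            rounds_used += 1
            if math.log2(1 + combined) >= rate:
                delivered_bits += rate
                break
    return delivered_bits / rounds_used

random.seed(1)
t = harq_cc_throughput(rate=2.0, avg_snr=10.0, max_rounds=4)
print(round(t, 3))   # throughput is strictly below the per-round rate R
```

Swapping the success test for accumulated mutual information, `sum(log2(1 + snr_k)) >= rate`, turns the same skeleton into an HARQ-IR simulation, which is one way to reproduce the HARQ-IR vs. HARQ-CC comparison the abstract mentions.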

  5. web cellHTS2: A web-application for the analysis of high-throughput screening data

    Directory of Open Access Journals (Sweden)

    Boutros Michael

    2010-04-01

    Full Text Available Abstract Background The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. Results The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. Conclusions The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
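One of the simplest normalization-and-ranking pipelines such tools offer is a per-plate z-score followed by sorting to produce a hit list. The sketch below is a generic illustration with invented well values, not the cellHTS2 implementation.

```python
# Sketch of per-plate z-score normalization and hit ranking for an RNAi
# screen.  Well readouts are invented; low values represent knockdowns
# that suppress the assayed signal.
import math

def z_scores(values):
    """Standard scores using the sample standard deviation."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return [(v - mean) / sd for v in values]

plate = {"siRNA_A": 0.95, "siRNA_B": 1.02, "siRNA_C": 0.40,
         "siRNA_D": 1.05, "siRNA_E": 0.98}
scores = dict(zip(plate, z_scores(list(plate.values()))))
hit_list = sorted(scores, key=scores.get)   # strongest decrease first
print(hit_list[0])  # → siRNA_C
```

Robust variants (median and MAD instead of mean and standard deviation, or B-scores) follow the same pattern and are less sensitive to strong hits inflating the plate statistics.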

  6. Gold-coated polydimethylsiloxane microwells for high-throughput electrochemiluminescence analysis of intracellular glucose at single cells.

    Science.gov (United States)

    Xia, Juan; Zhou, Junyu; Zhang, Ronggui; Jiang, Dechen; Jiang, Depeng

    2018-06-04

In this communication, a gold-coated polydimethylsiloxane (PDMS) chip with cell-sized microwells was prepared through a stamping and spraying process and applied directly for high-throughput electrochemiluminescence (ECL) analysis of intracellular glucose at single cells. Compared with the previous multiple-step fabrication of photoresist-based microwells on the electrode, the preparation process is simple and offers a fresh electrode surface for higher luminescence intensity. Higher luminescence intensity was recorded from the microwells retaining cells than from the planar region between the microwells, and this intensity was correlated with the content of intracellular glucose. The successful monitoring of intracellular glucose at single cells using this PDMS chip will provide an alternative strategy for high-throughput single-cell analysis.

  7. Uplink SDMA with Limited Feedback: Throughput Scaling

    Directory of Open Access Journals (Sweden)

    Jeffrey G. Andrews

    2008-01-01

Full Text Available Combined space division multiple access (SDMA) and scheduling exploit both spatial multiplexing and multiuser diversity, increasing throughput significantly. Both SDMA and scheduling require feedback of multiuser channel state information (CSI). This paper focuses on uplink SDMA with limited feedback, which refers to efficient techniques for CSI quantization and feedback. To quantify the throughput of uplink SDMA and derive design guidelines, the throughput scaling with system parameters is analyzed. The specific parameters considered include the numbers of users, antennas, and feedback bits. Furthermore, different SNR regimes and beamforming methods are considered. The derived throughput scaling laws are observed to change for different SNR regimes. For instance, the throughput scales logarithmically with the number of users in the high-SNR regime but double-logarithmically in the low-SNR regime. The analysis of throughput scaling suggests guidelines for scheduling in uplink SDMA. For example, to maximize throughput scaling, scheduling should use the criterion of minimum quantization error in the high-SNR regime and maximum channel power in the low-SNR regime.
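The multiuser diversity gain behind these scaling laws is easy to exhibit in a toy simulation. This is our own low-SNR illustration, not the paper's derivation: serving the user with maximum channel power in each slot yields a higher average rate than serving an arbitrary fixed user, because the scheduler rides the peaks of the fading.

```python
# Toy simulation of low-SNR multiuser-diversity scheduling: pick the user
# with maximum channel power vs. serve an arbitrary fixed user.
import math
import random

random.seed(7)
snr = 0.1                      # low-SNR regime
n_users, n_slots = 20, 5000
sum_sched, sum_rand = 0.0, 0.0
for _ in range(n_slots):
    # Rayleigh fading: channel power is exponentially distributed.
    powers = [random.expovariate(1.0) for _ in range(n_users)]
    sum_sched += math.log2(1 + snr * max(powers))   # max-power scheduling
    sum_rand += math.log2(1 + snr * powers[0])      # arbitrary fixed user
print(sum_sched > sum_rand)   # → True
```

At low SNR, log2(1 + snr*p) is roughly proportional to p, so the scheduled rate tracks the maximum of n exponentials, whose expectation grows like log n; passing that through the outer logarithm at high SNR is what turns the gain into the log log n behavior the abstract describes.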

  8. High-Throughput Analysis and Automation for Glycomics Studies.

    Science.gov (United States)

    Shubhakar, Archana; Reiding, Karli R; Gardner, Richard A; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred

This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing use of glycomics in Quality by Design studies to help optimize glycan profiles of drugs with a view to improving their clinical performance. Glycomics is also used in comparability studies to ensure consistency of glycosylation both throughout product development and between biosimilars and innovator drugs. In clinical studies there is also an expanding interest in the use of glycomics, for example in Genome-Wide Association Studies, to follow changes in glycosylation patterns of biological tissues and fluids with the progress of certain diseases. These include cancers, neurodegenerative disorders and inflammatory conditions. Despite rising activity in this field, there are significant challenges in performing large-scale glycomics studies. The requirement is accurate identification and quantitation of individual glycan structures. However, glycoconjugate samples are often very complex and heterogeneous and contain many diverse branched glycan structures. In this article we cover HTP sample preparation and derivatization methods, sample purification, robotization, optimized glycan profiling by UHPLC, MS and multiplexed CE, as well as hyphenated techniques and automated data analysis tools. Throughout, we summarize the advantages and challenges of each of these technologies. The issues considered include reliability of the methods for glycan identification and quantitation, sample throughput, labor intensity, and affordability for large sample numbers.

  9. Frequency domain performance analysis of marginally stable LTI systems with saturation

    NARCIS (Netherlands)

    Berg, van den R.A.; Pogromski, A.Y.; Rooda, J.E.; Leonov, G.; Nijmeijer, H.; Pogromsky, A.; Fradkov, A.

    2009-01-01

    In this paper we discuss the frequency domain performance analysis of a marginally stable linear time-invariant (LTI) system with saturation in the feedback loop. We present two methods, both based on the notion of convergent systems, that allow evaluation of the performance of this type of system in

  10. High-throughput gender identification of penguin species using melting curve analysis.

    Science.gov (United States)

    Tseng, Chao-Neng; Chang, Yung-Ting; Chiu, Hui-Tzu; Chou, Yii-Cheng; Huang, Hurng-Wern; Cheng, Chien-Chung; Liao, Ming-Hui; Chang, Hsueh-Wei

    2014-04-03

    Most species of penguins are sexually monomorphic, and it is therefore difficult to identify their genders visually for monitoring population stability in terms of sex-ratio analysis. In this study, we evaluated the suitability of melting curve analysis (MCA) for high-throughput gender identification of penguins. Preliminary tests indicated that the Griffiths's P2/P8 primers were not suitable for MCA. Based on sequence alignment of the Chromo-Helicase-DNA binding protein (CHD)-W and CHD-Z genes from four species of penguins (Pygoscelis papua, Aptenodytes patagonicus, Spheniscus magellanicus, and Eudyptes chrysocome), we redesigned forward primers for the CHD-W/CHD-Z-common region (PGU-ZW2) and the CHD-W-specific region (PGU-W2) to be used in combination with the reverse Griffiths's P2 primer. When tested with P. papua samples, PCR using the P2/PGU-ZW2 and P2/PGU-W2 primer sets generated two amplicons of 148 and 356 bp, respectively, which were easily resolved in 1.5% agarose gels. MCA indicated that the melting temperature (Tm) values for the P2/PGU-ZW2 and P2/PGU-W2 amplicons of P. papua samples were 79.75°C-80.5°C and 81.0°C-81.5°C, respectively. Females displayed both the ZW-common and W-specific Tm peaks, whereas males were positive only for the ZW-common peak. Taken together, our redesigned primers coupled with MCA allow precise high-throughput gender identification for P. papua, and potentially for other penguin species such as A. patagonicus, S. magellanicus, and E. chrysocome as well.
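    As a rough illustration (not part of the record above), the reported Tm windows suggest a simple peak-based classification rule: female if both the ZW-common and W-specific peaks are present, male if only the ZW-common peak is. The thresholds, tolerance, and function names below are assumptions for the sketch.

```python
# Tm windows reported for P. papua amplicons:
# ZW-common (P2/PGU-ZW2) ~79.75-80.5 C, W-specific (P2/PGU-W2) ~81.0-81.5 C.
ZW_COMMON = (79.75, 80.5)
W_SPECIFIC = (81.0, 81.5)

def in_window(tm, window, tol=0.1):
    """True if a detected melting peak falls inside the window (+/- tol)."""
    lo, hi = window
    return lo - tol <= tm <= hi + tol

def classify_sex(tm_peaks):
    """Females show both ZW-common and W-specific peaks; males only ZW-common."""
    has_zw = any(in_window(t, ZW_COMMON) for t in tm_peaks)
    has_w = any(in_window(t, W_SPECIFIC) for t in tm_peaks)
    if has_zw and has_w:
        return "female"
    if has_zw:
        return "male"
    return "undetermined"

print(classify_sex([80.1, 81.2]))  # female
print(classify_sex([80.3]))        # male
```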

  11. Data reduction for a high-throughput neutron activation analysis system

    International Nuclear Information System (INIS)

    Bowman, W.W.

    1979-01-01

    To analyze samples collected as part of a geochemical survey for the National Uranium Resource Evaluation program, Savannah River Laboratory has installed a high-throughput neutron activation analysis system. As part of that system, computer programs have been developed to reduce raw data to elemental concentrations in two steps. Program RAGS reduces gamma-ray spectra to lists of photopeak energies, peak areas, and statistical errors. Program RICHES determines the elemental concentrations from photopeak and delayed-neutron data, detector efficiencies, analysis parameters (neutron flux and activation, decay, and counting times), and spectrometric and cross-section data from libraries. Both programs have been streamlined for on-line operation with a minicomputer, each requiring approx. 64 kbytes of core.
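    The record does not give the equations RICHES uses, but the second reduction step (photopeak areas to concentrations) can be sketched with a standard comparator-style calculation: decay-correct the sample and standard peak areas to a common reference time, then scale by mass. All names and inputs below are illustrative assumptions, not the program's actual interface.

```python
import math

def decay_const(half_life_s):
    """Decay constant lambda = ln(2) / T_half."""
    return math.log(2) / half_life_s

def concentration_ppm(peak_area, peak_area_std, mass_mg, mass_std_mg,
                      conc_std_ppm, t_decay_s, t_decay_std_s, half_life_s):
    """Comparator-method activation analysis: decay-correct both the
    sample and the standard back to end of irradiation, then scale the
    standard's known concentration by the specific-activity ratio."""
    lam = decay_const(half_life_s)
    a_sample = peak_area * math.exp(lam * t_decay_s)
    a_std = peak_area_std * math.exp(lam * t_decay_std_s)
    return conc_std_ppm * (a_sample / mass_mg) / (a_std / mass_std_mg)
```

    For equal masses and decay times, doubling the sample's peak area relative to the standard simply doubles the inferred concentration.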

  12. Analysis of a Heroin Epidemic Model with Saturated Treatment Function

    Directory of Open Access Journals (Sweden)

    Isaac Mwangi Wangari

    2017-01-01

    Full Text Available A mathematical model is developed that examines how heroin addiction spreads in society. The model is formulated to take into account the treatment of heroin users by incorporating a realistic functional form that “saturates” representing the limited availability of treatment. Bifurcation analysis reveals that the model has an intrinsic backward bifurcation whenever the saturation parameter is larger than a fixed threshold. We are particularly interested in studying the model’s global stability. In the absence of backward bifurcations, Lyapunov functions can often be found and used to prove global stability. However, in the presence of backward bifurcations, such Lyapunov functions may not exist or may be difficult to construct. We make use of the geometric approach to global stability to derive a condition that ensures that the system is globally asymptotically stable. Numerical simulations are also presented to give a more complete representation of the model dynamics. Sensitivity analysis performed by Latin hypercube sampling (LHS) suggests that the effective contact rate in the population, the relapse rate of heroin users undergoing treatment, and the extent of saturation of heroin users are mechanisms fuelling heroin epidemic proliferation.
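    The LHS step mentioned above can be sketched generically: one stratified draw per equal-probability stratum and parameter, with strata shuffled independently per parameter. This is a textbook Latin hypercube sampler, not the authors' code, and the parameter bounds are made up for illustration.

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Standard LHS: for each parameter, place one point in each of
    n_samples equal-probability strata, shuffling stratum order
    independently per parameter."""
    rng = random.Random(seed)
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i, s in enumerate(strata):
            u = (s + rng.random()) / n_samples  # uniform point inside stratum s
            samples[i][d] = lo + u * (hi - lo)
    return samples

# Illustrative bounds, e.g. contact rate, relapse rate, saturation extent:
pts = latin_hypercube(100, [(0.0, 1.0), (0.0, 0.5), (0.0, 10.0)])
```

    Each simulated model run would then use one row of `pts`, and the outputs would be correlated against each parameter column to rank sensitivities.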

  13. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Full Text Available Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have also been developed. The focus of this mini review is on recent advancements in the analysis of metabolomics data, especially by utilizing Gaussian graphical models and independent component analysis.
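    As a minimal illustration of the Gaussian-graphical-model idea (distinguishing direct metabolite associations from ones merely mediated by a third metabolite), a first-order partial correlation can be computed by correlating regression residuals. This sketch is generic, not code from the review.

```python
import math

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def partial_corr(x, y, z):
    """Correlation of x and y after removing the linear effect of z:
    the pairwise building block of a Gaussian graphical model."""
    def residuals(v, z):
        n = len(v)
        mv, mz = sum(v) / n, sum(z) / n
        beta = (sum((a - mv) * (b - mz) for a, b in zip(v, z))
                / sum((b - mz) ** 2 for b in z))
        return [a - mv - beta * (b - mz) for a, b in zip(v, z)]
    return pearson(residuals(x, z), residuals(y, z))
```

    In a full model, the same quantity for all pairs given all other variables is read off the inverse covariance (precision) matrix; zero partial correlation corresponds to a missing edge in the graph.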

  14. High or low oxygen saturation and severe retinopathy of prematurity: a meta-analysis.

    Science.gov (United States)

    Chen, Minghua L; Guo, Lei; Smith, Lois E H; Dammann, Christiane E L; Dammann, Olaf

    2010-06-01

    Low oxygen saturation appears to decrease the risk of severe retinopathy of prematurity (ROP) in preterm newborns when administered during the first few weeks after birth. High oxygen saturation seems to reduce the risk at later postmenstrual ages (PMAs). However, previous clinical studies are not conclusive individually. To perform a systematic review and meta-analysis of the association between the incidence of severe ROP in premature infants and high or low target oxygen saturation measured by pulse oximetry. Studies were identified through PubMed and Embase literature searches through May 2009 by using the terms "retinopathy of prematurity and oxygen" or "retinopathy of prematurity and oxygen therapy." We selected 10 publications addressing the association between severe ROP and target oxygen saturation measured by pulse oximetry. Using a random-effects model we calculated the summary-effect estimate. We visually inspected funnel plots to examine possible publication bias. Low oxygen saturation (70%-96%) in the first several postnatal weeks was associated with a reduced risk of severe ROP (risk ratio [RR]: 0.48 [95% confidence interval (CI): 0.31-0.75]). High oxygen saturation (94%-99%) at ≥ 32 weeks' PMA was associated with a decreased risk of progression to severe ROP (RR: 0.54 [95% CI: 0.35-0.82]). A large randomized clinical trial of preterm infants with long-term developmental follow-up is warranted to confirm this meta-analytic result.
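    A random-effects pooling of risk ratios like the one described can be sketched with the DerSimonian-Laird estimator, back-calculating standard errors from the 95% CIs. This is a generic sketch; the numbers in the usage comment are illustrative, not the study data.

```python
import math

def random_effects_rr(rrs, cis):
    """DerSimonian-Laird random-effects pooling of risk ratios.
    rrs: point estimates; cis: (lower, upper) 95% CI per study.
    SEs are back-calculated from the CIs (a common approximation).
    Returns (pooled RR, lower 95% CI, upper 95% CI)."""
    logs = [math.log(r) for r in rrs]
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    w = [1 / s ** 2 for s in ses]
    fixed = sum(wi * yi for wi, yi in zip(w, logs)) / sum(w)
    # Cochran's Q and the between-study variance tau^2:
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, logs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rrs) - 1)) / c)
    w_re = [1 / (s ** 2 + tau2) for s in ses]
    pooled = sum(wi * yi for wi, yi in zip(w_re, logs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

# e.g. two hypothetical studies, each RR 0.5 (0.4-0.625):
rr, lo, hi = random_effects_rr([0.5, 0.5], [(0.4, 0.625), (0.4, 0.625)])
```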

  15. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  16. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may be extended also by the calculation of saturation activities from k0 and Q0 factors, and f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)
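    The saturation-activity correction underlying such a list can be sketched as follows. This is the textbook decay/buildup correction assumed from context, not code from the record: a measured activity is corrected back to the end of irradiation and then divided by the buildup factor (1 - e^(-lambda*t_irr)) to obtain the activity an infinitely long irradiation would have produced.

```python
import math

def saturation_activity(activity_bq, t_irr_s, t_decay_s, half_life_s):
    """Convert an activity measured t_decay seconds after an irradiation
    of length t_irr into the saturation activity (infinite-irradiation
    limit) for the same position, flux and geometry."""
    lam = math.log(2) / half_life_s
    a_end_of_irr = activity_bq * math.exp(lam * t_decay_s)
    return a_end_of_irr / (1 - math.exp(-lam * t_irr_s))

# One half-life of irradiation reaches 50% of saturation:
a_sat = saturation_activity(100.0, 3600, 0, 3600)  # -> 200.0
```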

  17. High Throughput Analysis of Photocatalytic Water Purification

    NARCIS (Netherlands)

    Sobral Romao, J.I.; Baiao Barata, David; Habibovic, Pamela; Mul, Guido; Baltrusaitis, Jonas

    2014-01-01

    We present a novel high throughput photocatalyst efficiency assessment method based on 96-well microplates and UV-Vis spectroscopy. We demonstrate the reproducibility of the method using methyl orange (MO) decomposition, and compare kinetic data obtained with those provided in the literature for

  18. Misconceptions in Reporting Oxygen Saturation

    NARCIS (Netherlands)

    Toffaletti, John; Zijlstra, Willem G.

    2007-01-01

    BACKGROUND: We describe some misconceptions that have become common practice in reporting blood gas and co-oximetry results. In 1980, oxygen saturation was incorrectly redefined in a report of a new instrument for analysis of hemoglobin (Hb) derivatives. Oxygen saturation (sO2) was redefined as the

  19. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. MS experiments produce large amounts of data, typically presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information, and the filtering can be enriched by merging experimental data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application intended to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat analysis tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl, or UniProt codes. The msBiodat analysis tool is a web-based application that allows researchers with any level of programming experience to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat analysis tool is freely available at http://msbiodata.irb.hr.
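    A toy version of the filtering step described, merging an experimental hit list with a public-database annotation lookup, might look like the sketch below. The accessions are real UniProt IDs, but the GO assignments and function names here are illustrative stand-ins, not msBiodat's actual data or API.

```python
# Stand-in for a public-database lookup (accession -> GO term set):
annotations = {
    "P69905": {"GO:0005344", "GO:0019825"},  # illustrative assignments
    "P68871": {"GO:0005344"},
    "P02768": {"GO:0005615"},
}

def filter_by_go(hits, required_go):
    """Keep only hit accessions carrying the required GO annotation;
    accessions absent from the lookup are dropped."""
    return [acc for acc in hits
            if required_go in annotations.get(acc, set())]

hits = ["P69905", "P68871", "P02768", "Q99999"]
print(filter_by_go(hits, "GO:0005344"))  # ['P69905', 'P68871']
```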

  20. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high-throughput instrumentation to analyze huge sets of clinical samples quantitatively and reproducibly in a minimum of time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be more accurate with rapid processing times. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples quantitatively and qualitatively in a reproducible manner. The present study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validates the system with a real-time sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The present study shows that the online system, capable of handling high-throughput samples in 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase of peptide quantities with increasing concentration, with much better linearity compared to the off-line method. Hence we suggest that inclusion of this online system in the proteomic pipeline will be effective in the quantification of proteins in comparative proteomics, where quantification is crucial.

  1. Seismic response analysis of the deep saturated soil deposits in Shanghai

    Science.gov (United States)

    Huang, Yu; Ye, Weimin; Chen, Zhuchang

    2009-01-01

    The Quaternary deposits in Shanghai are horizontal soil layers up to about 280 m thick in the urban area, with an annual groundwater table between 0.5 and 0.7 m from the surface. The characteristics of these deep saturated deposits may have important influences on the seismic response of the ground in Shanghai. In this paper, based on the Biot theory for porous media, the water-saturated soil deposits are modeled as a two-phase porous system consisting of solid and fluid phases. A nonlinear constitutive model is developed to describe the dynamic characteristics of the deep saturated soil deposits in Shanghai and to predict the seismic response of the ground. Subsequently, the seismic response of a typical site with 280 m deep soil layers, subjected to four base excitations (El Centro, Taft, Sunan, and Tangshan earthquakes), is analyzed with an effective stress-based finite element method and the proposed constitutive model. Special emphasis is given to the computed accelerations, excess pore-water pressures, and settlements during the seismic excitations. It has been found that the analysis can capture fundamental aspects of the ground response and produce preliminary results for seismic assessment.

  2. Annotating Protein Functional Residues by Coupling High-Throughput Fitness Profile and Homologous-Structure Analysis.

    Science.gov (United States)

    Du, Yushen; Wu, Nicholas C; Jiang, Lin; Zhang, Tianhao; Gong, Danyang; Shu, Sara; Wu, Ting-Ting; Sun, Ren

    2016-11-01

    Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available. To fully comprehend the diverse functions of a protein, it is essential to understand the functionality of individual residues. Current methods are highly dependent on evolutionary sequence conservation, which is

  3. High-throughput analysis for preparation, processing and analysis of TiO2 coatings on steel by chemical solution deposition

    International Nuclear Information System (INIS)

    Cuadrado Gil, Marcos; Van Driessche, Isabel; Van Gils, Sake; Lommens, Petra; Castelein, Pieter; De Buysser, Klaartje

    2012-01-01

    Highlights: ► High-throughput preparation of TiO2 aqueous precursors. ► Analysis of stability and surface tension. ► Deposition of TiO2 coatings. - Abstract: A high-throughput preparation, processing and analysis of titania coatings prepared by chemical solution deposition from water-based precursors at low temperature (≈250 °C) on two different types of steel substrates (Aluzinc® and bright annealed) is presented. The use of the high-throughput equipment allows fast preparation of multiple samples saving time, energy and material; and helps to test the scalability of the process. The process itself includes the use of IR curing for aqueous ceramic precursors and possibilities of using UV irradiation before the final sintering step. The IR curing method permits a much faster curing step compared to normal high temperature treatments in traditional convection devices (i.e., tube furnaces). The formulations, also prepared by high-throughput equipment, are found to be stable in the operational pH range of the substrates (6.5–8.5). Titanium alkoxides itself lack stability in pure water-based environments, but the presence of the different organic complexing agents prevents it from hydrolysis and precipitation reactions. The wetting interaction between the substrates and the various formulations is studied by the determination of the surface free energy of the substrates and the polar and dispersive components of the surface tension of the solutions. The mild temperature program used for preparation of the coatings however does not lead to the formation of pure crystalline material, necessary for the desired photocatalytic and super-hydrophilic behavior of these coatings. Nevertheless, some activity can be reported for these amorphous coatings by monitoring the discoloration of methylene blue in water under UV irradiation.

  4. Analysis of an SEIR Epidemic Model with Saturated Incidence and Saturated Treatment Function

    Directory of Open Access Journals (Sweden)

    Jinhong Zhang

    2014-01-01

    Full Text Available The dynamics of an SEIR epidemic model with a saturated incidence rate and a saturated treatment function are explored in this paper. The basic reproduction number that determines disease extinction or disease survival is given. Threshold conditions for the existence of all kinds of equilibrium points are obtained. Sufficient conditions are established for the existence of backward bifurcation. The local asymptotic stability of equilibria is verified by analyzing the eigenvalues and using the Routh-Hurwitz criterion. We also discuss the global asymptotic stability of the endemic equilibrium by an autonomous convergence theorem. The study indicates that we should improve the efficiency and enlarge the capacity of treatment to control the spread of disease. Numerical simulations are presented to support and complement the theoretical findings.
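    A minimal forward-Euler sketch of such a model is shown below. The saturated forms, incidence beta*S*I/(1+alpha*I) and treatment r*I/(1+k*I), follow the abstract's description; the specific state layout, demography terms, and parameter values are illustrative assumptions, not the paper's exact system.

```python
def simulate_seir(beta, alpha, sigma, gamma, r, k, mu,
                  s0, e0, i0, r0, dt=0.01, steps=1000):
    """Forward-Euler integration of an SEIR model with saturated
    incidence beta*S*I/(1+alpha*I) and saturated treatment r*I/(1+k*I).
    mu is a balanced birth/death rate, so total population is conserved."""
    s, e, i, rec = s0, e0, i0, r0
    n = s0 + e0 + i0 + r0
    for _ in range(steps):
        inc = beta * s * i / (1 + alpha * i)     # saturated incidence
        treat = r * i / (1 + k * i)              # saturated treatment
        ds = mu * n - inc - mu * s
        de = inc - (sigma + mu) * e
        di = sigma * e - (gamma + mu) * i - treat
        dr = gamma * i + treat - mu * rec
        s, e, i, rec = s + dt * ds, e + dt * de, i + dt * di, rec + dt * dr
    return s, e, i, rec
```

    With transmission switched off (beta = 0), the infectious compartment decays through recovery and treatment while the total population stays fixed, which makes a convenient sanity check.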

  5. Detection of Static Eccentricity Fault in Saturated Induction Motors by Air-Gap Magnetic Flux Signature Analysis Using Finite Element Method

    Directory of Open Access Journals (Sweden)

    N. Halem

    2013-06-01

    Full Text Available Unfortunately, motor current signature analysis (MCSA) cannot detect small degrees of purely static eccentricity (SE) defects, while air-gap magnetic flux signature analysis (FSA) is applied successfully. The simulation results are obtained by using the time-stepping finite element (TSFE) method. In order to show the impact of magnetic saturation upon the diagnosis of SE faults, the analysis is carried out for saturated induction motors. The index signatures of the static eccentricity fault around the fundamental and PSHs are detected successfully for the saturated motor.


  8. Saturated evanescent-wave absorption of few-layer graphene-covered side-polished single-mode fiber for all-optical switching

    Science.gov (United States)

    Peng, Kaung-Jay; Wu, Chun-Lung; Lin, Yung-Hsiang; Wang, Hwai-Yung; Cheng, Chih-Hsien; Chi, Yu-Chieh; Lin, Gong-Ru

    2018-01-01

    Using the evanescent-wave saturation effect of hydrogen-free low-temperature synthesized few-layer graphene covered on the cladding region of a side-polished single-mode fiber, a blue pump/infrared probe-based all-optical switch is demonstrated with specific wavelength-dependent probe modulation efficiency. Under the illumination of a blue laser diode at 405 nm, the few-layer graphene exhibits cross-gain modulation at different wavelengths covering the C- and L-bands. At a probe power of 0.5 mW, the L-band switching throughput power variant of 16 μW results in a probe modulation depth of 3.2%. Blue shifting the probe wavelength from 1580 to 1520 nm further enlarges the switching throughput power variant to 24 μW and enhances the probe modulation depth to 5%. Increasing the probe power from 0.5 to 1 mW further enlarges the switching throughput power variant from 25 to 58 μW, promoting a probe modulation depth of up to 5.8% at 1520 nm. In contrast, the probe modulation depth degrades from 5.1% to 1.2% as the pumping power is reduced from 85 to 24 mW, which is attributed to the saturable absorption of the few-layer graphene-based evanescent-wave absorber. The modulation depth at a wavelength of 1550 nm under a probe power of 1 mW increases from 1.2% to 5.1%, as more carriers can be excited when increasing the blue laser power from 24 to 85 mW, whereas it decreases from 5.1% to 3.3% on increasing the input probe power from 1 to 2 mW, showing an easier saturated condition at longer wavelengths.
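    The reported modulation depths are consistent with the simple ratio of switching throughput power variation to probe power (e.g. 16 μW on a 0.5 mW probe gives 3.2%); a one-line check:

```python
def modulation_depth(delta_power_w, probe_power_w):
    """Probe modulation depth: switching throughput power variation
    divided by the probe power (both in watts)."""
    return delta_power_w / probe_power_w

# Values quoted in the record above:
print(f"{modulation_depth(16e-6, 0.5e-3):.1%}")  # 3.2%
print(f"{modulation_depth(58e-6, 1e-3):.1%}")    # 5.8%
```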

  9. Applications of ambient mass spectrometry in high-throughput screening.

    Science.gov (United States)

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have progressed rapidly. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  10. HDAT: web-based high-throughput screening data analysis tools

    International Nuclear Information System (INIS)

    Liu, Rong; Hassan, Taimur; Rallo, Robert; Cohen, Yoram

    2013-01-01

    The increasing utilization of high-throughput screening (HTS) in toxicity studies of engineered nanomaterials (ENMs) requires tools for rapid and reliable processing and analyses of large HTS datasets. In order to meet this need, a web-based platform for HTS data analyses tools (HDAT) was developed that provides statistical methods suitable for ENM toxicity data. As a publicly available computational nanoinformatics infrastructure, HDAT provides different plate normalization methods, various HTS summarization statistics, self-organizing map (SOM)-based clustering analysis, and visualization of raw and processed data using both heat map and SOM. HDAT has been successfully used in a number of HTS studies of ENM toxicity, thereby enabling analysis of toxicity mechanisms and development of structure–activity relationships for ENM toxicity. The online approach afforded by HDAT should encourage standardization of and future advances in HTS as well as facilitate convenient inter-laboratory comparisons of HTS datasets. (paper)
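    Plate normalization, one of the HDAT features listed, can be illustrated with two common schemes (plate-wise z-score and percent-of-control). This is a generic sketch of the techniques, not HDAT's implementation.

```python
import statistics

def z_score_plate(values):
    """Plate-wise z-score: center each well on the plate mean and
    scale by the plate standard deviation."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

def percent_of_control(values, neg_controls):
    """Express each well as a percentage of the mean negative-control
    signal on the same plate."""
    ctrl = statistics.mean(neg_controls)
    return [100.0 * v / ctrl for v in values]
```

    Either scheme removes plate-to-plate offsets so that wells from different plates become comparable before clustering or dose-response fitting.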

  11. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  12. Space Charge Saturated Sheath Regime and Electron Temperature Saturation in Hall Thrusters

    International Nuclear Information System (INIS)

    Raitses, Y.; Staack, D.; Smirnov, A.; Fisch, N.J.

    2005-01-01

    Secondary electron emission in Hall thrusters is predicted to lead to space-charge saturated wall sheaths, resulting in enhanced power losses in the thruster channel. Analysis of experimentally obtained electron-wall collision frequencies suggests that the electron temperature saturation, which occurs at high discharge voltages, appears to be caused by a decrease of the Joule heating rather than by enhancement of the electron energy loss at the walls due to strong secondary electron emission.

  13. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy-method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective and number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot detection-tool. The spot detection-output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.
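    The per-image quantities described (intracellular colony count per host cell and the colony-size distribution) reduce to simple arithmetic on the spot-detection output. The sketch below is generic, with hypothetical bin edges; it is not the authors' Excel workflow.

```python
def colonies_per_cell(spot_counts, nuclei_counts):
    """Per-image intracellular colony burden: detected bacterial colony
    spots divided by detected cell nuclei (one entry per image)."""
    return [s / n for s, n in zip(spot_counts, nuclei_counts)]

def size_distribution(spot_areas_um2, bins=(0, 5, 10, 20, float("inf"))):
    """Histogram of colony spot areas over the given bin edges
    (half-open bins [lo, hi); edges here are illustrative)."""
    counts = [0] * (len(bins) - 1)
    for a in spot_areas_um2:
        for i in range(len(bins) - 1):
            if bins[i] <= a < bins[i + 1]:
                counts[i] += 1
                break
    return counts

# e.g. two images: 10 spots over 5 nuclei, then 0 spots over 4 nuclei
print(colonies_per_cell([10, 0], [5, 4]))  # [2.0, 0.0]
```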

  14. SATURATED ZONE IN-SITU TESTING

    Energy Technology Data Exchange (ETDEWEB)

    P.W. REIMUS

    2004-11-08

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain, Nevada. The test interpretations provide estimates of flow and transport parameters used in the development of parameter distributions for total system performance assessment (TSPA) calculations. These parameter distributions are documented in ''Site-Scale Saturated Zone Flow Model'' (BSC 2004 [DIRS 170037]), ''Site-Scale Saturated Zone Transport'' (BSC 2004 [DIRS 170036]), ''Saturated Zone Colloid Transport'' (BSC 2004 [DIRS 170006]), and ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, this scientific analysis contributes the following to the assessment of the capability of the SZ to serve as part of a natural barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvial Testing Complex (ATC) located at the southwestern corner of the Nevada Test Site (NTS). The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, and colloid transport parameters.

  15. SATURATED ZONE IN-SITU TESTING

    International Nuclear Information System (INIS)

    REIMUS, P.W.

    2004-01-01

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain, Nevada. The test interpretations provide estimates of flow and transport parameters used in the development of parameter distributions for total system performance assessment (TSPA) calculations. These parameter distributions are documented in ''Site-Scale Saturated Zone Flow Model'' (BSC 2004 [DIRS 170037]), ''Site-Scale Saturated Zone Transport'' (BSC 2004 [DIRS 170036]), ''Saturated Zone Colloid Transport'' (BSC 2004 [DIRS 170006]), and ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, this scientific analysis contributes the following to the assessment of the capability of the SZ to serve as part of a natural barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvial Testing Complex (ATC) located at the southwestern corner of the Nevada Test Site (NTS). The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, and colloid transport parameters.

  16. High Throughput Neuro-Imaging Informatics

    Directory of Open Access Journals (Sweden)

    Michael I Miller

    2013-12-01

    Full Text Available This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into high-dimensional neuroinformatic representations containing O(1E3-1E4) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science, including genomics as well as social networks. Such high throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image and non-image information for diagnosis and prognosis.

  17. Library construction and evaluation for site saturation mutagenesis.

    Science.gov (United States)

    Sullivan, Bradford; Walton, Adam Z; Stewart, Jon D

    2013-06-10

    We developed a method for creating and evaluating site-saturation libraries that consistently yields an average of 27.4±3.0 codons of the 32 possible within a pool of 95 transformants. This was verified by sequencing 95 members from 11 independent libraries within the gene encoding alkene reductase OYE 2.6 from Pichia stipitis. Correct PCR primer design as well as a variety of factors that increase transformation efficiency were critical contributors to the method's overall success. We also developed a quantitative analysis of library quality (Q-values) that defines library degeneracy. Q-values can be calculated from standard fluorescence sequencing data (capillary electropherograms) and the degeneracy predicted from an early stage of library construction (pooled plasmids from the initial transformation) closely matched that observed after ca. 1000 library members were sequenced. Based on this experience, we suggest that this analysis can be a useful guide when applying our optimized protocol to new systems, allowing one to focus only on good-quality libraries and reject substandard libraries at an early stage. This advantage is particularly important when lower-throughput screening techniques such as chiral-phase GC must be employed to identify protein variants with desirable properties, e.g., altered stereoselectivities or when multiple codons are targeted for simultaneous randomization. Copyright © 2013 Elsevier Inc. All rights reserved.
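
    The reported 27.4±3.0 distinct codons per 95 transformants can be put in context with a standard occupancy calculation: the expected number of distinct codons when sampling clones uniformly at random from 32 possibilities. This sketch is my own comparison under a uniform-sampling assumption; real libraries are biased, which is exactly what the Q-value metric quantifies.

```python
def expected_distinct(n_codons: int, n_sampled: int) -> float:
    """Expected number of distinct codons observed after sampling
    n_sampled clones uniformly from n_codons possibilities:
    each codon is missed with probability (1 - 1/n)^k."""
    return n_codons * (1.0 - (1.0 - 1.0 / n_codons) ** n_sampled)

# NNK-style pool of 32 codons, 95 sequenced transformants:
ideal = expected_distinct(32, 95)   # ~30.4 for a perfectly uniform library
```

    The gap between the observed 27.4 and the ~30.4 uniform ideal is a measure of library degeneracy, the quantity the Q-value analysis estimates from sequencing electropherograms.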

  18. Quantitative analysis of treatment process time and throughput capacity for spot scanning proton therapy

    International Nuclear Information System (INIS)

    Suzuki, Kazumichi; Sahoo, Narayan; Zhang, Xiaodong; Poenisch, Falk; Mackin, Dennis S.; Liu, Amy Y.; Wu, Richard; Zhu, X. Ronald; Gillin, Michael T.; Palmer, Matthew B.; Frank, Steven J.; Lee, Andrew K.

    2016-01-01

    Purpose: To determine the patient throughput and the overall efficiency of the spot scanning system by analyzing treatment time, equipment availability, and maximum daily capacity for the current spot scanning port at Proton Therapy Center Houston and to assess the daily throughput capacity for a hypothetical spot scanning proton therapy center. Methods: At their proton therapy center, the authors have been recording in an electronic medical record system all treatment data, including disease site, number of fields, number of fractions, delivered dose, energy, range, number of spots, and number of layers for every treatment field. The authors analyzed delivery system downtimes that had been recorded for every equipment failure and associated incidents. These data were used to evaluate the patient census, patient distribution as a function of the number of fields and total target volume, and equipment clinical availability. The duration of each treatment session from patient walk-in to patient walk-out of the spot scanning treatment room was measured for 64 patients with head and neck, central nervous system, thoracic, and genitourinary cancers. The authors retrieved data for total target volume and the numbers of layers and spots for all fields from treatment plans for a total of 271 patients (including the above 64 patients). A sensitivity analysis of daily throughput capacity was performed by varying seven parameters in a throughput capacity model. Results: The mean monthly equipment clinical availability for the spot scanning port in April 2012–March 2015 was 98.5%. Approximately 1500 patients had received spot scanning proton therapy as of March 2015. The major disease sites treated in September 2012–August 2014 were the genitourinary system (34%), head and neck (30%), central nervous system (21%), and thorax (14%), with other sites accounting for the remaining 1%. Spot scanning beam delivery time increased with total target volume and accounted for
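
    The daily-throughput model varied in the sensitivity analysis can be sketched minimally as usable room minutes divided by mean in-room time per session. The parameter values below are illustrative placeholders, not the paper's fitted inputs; only the 98.5% availability figure comes from the abstract.

```python
def daily_capacity(operating_hours: float,
                   availability: float,
                   mean_room_time_min: float) -> int:
    """Patients treatable per day in one treatment room:
    operating minutes scaled by equipment availability,
    divided by the mean walk-in-to-walk-out session time."""
    usable_minutes = operating_hours * 60.0 * availability
    return int(usable_minutes // mean_room_time_min)

# Illustrative: 16 treatment hours/day, 98.5% availability, 30 min/session.
capacity = daily_capacity(16, 0.985, 30)   # -> 31
```

    Varying each of the inputs in turn (as the authors do for seven parameters) shows which one dominates capacity; session time usually does.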

  19. MIPHENO: Data normalization for high throughput metabolic analysis.

    Science.gov (United States)

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  20. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity improved protocols for PTM enrichment and recently established pipelines...... for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis...

  1. High-Throughput Analysis and Automation for Glycomics Studies

    NARCIS (Netherlands)

    Shubhakar, A.; Reiding, K.R.; Gardner, R.A.; Spencer, D.I.R.; Fernandes, D.L.; Wuhrer, M.

    2015-01-01

    This review covers advances in analytical technologies for high-throughput (HTP) glycomics. Our focus is on structural studies of glycoprotein glycosylation to support biopharmaceutical realization and the discovery of glycan biomarkers for human disease. For biopharmaceuticals, there is increasing

  2. A novel quantitative approach for eliminating sample-to-sample variation using a hue saturation value analysis program.

    Science.gov (United States)

    Yabusaki, Katsumi; Faits, Tyler; McMullen, Eri; Figueiredo, Jose Luiz; Aikawa, Masanori; Aikawa, Elena

    2014-01-01

    As computing technology and image analysis techniques have advanced, the practice of histology has grown from a purely qualitative method to one that is highly quantified. Current image analysis software is imprecise and prone to wide variation due to common artifacts and histological limitations. In order to minimize the impact of these artifacts, a more robust method for quantitative image analysis is required. Here we present novel image analysis software, based on the hue saturation value color space, to be applied to a wide variety of histological stains and tissue types. By using hue, saturation, and value variables instead of the more common red, green, and blue variables, our software offers some distinct advantages over other commercially available programs. We tested the program by analyzing several common histological stains, performed on tissue sections that ranged from 4 µm to 10 µm in thickness, using both a red green blue color space and a hue saturation value color space. We demonstrated that our new software is a simple method for quantitative analysis of histological sections, which is highly robust to variations in section thickness, sectioning artifacts, and stain quality, eliminating sample-to-sample variation.
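
    The core HSV idea can be sketched with only the standard library: convert each RGB pixel to hue/saturation/value and count pixels falling in a stain-specific window. The thresholds and the hue window below are hypothetical illustrations, not the paper's calibrated values.

```python
import colorsys

def count_stained(pixels, hue_range=(0.9, 0.1), min_sat=0.3, min_val=0.2):
    """Count 0-255 RGB pixels whose HSV values fall inside a stain
    window. A hue window wrapping through red (e.g. eosin pink) is
    given as (lo, hi) with lo > hi."""
    lo, hi = hue_range
    n = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        in_hue = (lo <= h or h <= hi) if lo > hi else (lo <= h <= hi)
        if in_hue and s >= min_sat and v >= min_val:
            n += 1
    return n

# A saturated red pixel counts; a gray pixel (saturation 0) never does,
# regardless of brightness - the robustness HSV buys over raw RGB.
print(count_stained([(220, 40, 60), (128, 128, 128)]))  # -> 1
```

    Because stain identity lives mostly in hue while section thickness and illumination move value, thresholding in HSV is less sensitive to those artifacts than RGB thresholding.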

  3. High-throughput metagenomic technologies for complex microbial community analysis: open and closed formats.

    Science.gov (United States)

    Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa

    2015-01-27

    Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions. Copyright © 2015 Zhou et al.

  4. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    Science.gov (United States)

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
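
    A central statistical step in any such differential expression pipeline is multiple-testing control over thousands of feature-wise p-values. As one small, generic illustration (not the chapter's specific tooling), the Benjamini-Hochberg adjustment can be written as:

```python
def benjamini_hochberg(pvals):
    """BH-adjusted p-values (q-values): sort ascending, scale each
    p by n/rank, then enforce monotonicity from the largest rank down."""
    n = len(pvals)
    order = sorted(range(n), key=lambda i: pvals[i])
    adjusted = [0.0] * n
    running = 1.0
    for offset, i in enumerate(reversed(order)):
        rank = n - offset
        running = min(running, pvals[i] * n / rank)
        adjusted[i] = running
    return adjusted

# Features whose adjusted p-value falls below the chosen FDR
# (commonly 0.05) are called differentially expressed.
print(benjamini_hochberg([0.005, 0.5]))  # -> [0.01, 0.5]
```

    In practice this step sits downstream of normalization and a moderated test statistic, but the FDR logic is the same.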

  5. Targeted DNA Methylation Analysis by High Throughput Sequencing in Porcine Peri-attachment Embryos

    OpenAIRE

    MORRILL, Benson H.; COX, Lindsay; WARD, Anika; HEYWOOD, Sierra; PRATHER, Randall S.; ISOM, S. Clay

    2013-01-01

    The purpose of this experiment was to implement and evaluate the effectiveness of a next-generation sequencing-based method for DNA methylation analysis in porcine embryonic samples. Fourteen discrete genomic regions were amplified by PCR using bisulfite-converted genomic DNA derived from day 14 in vivo-derived (IVV) and parthenogenetic (PA) porcine embryos as template DNA. Resulting PCR products were subjected to high-throughput sequencing using the Illumina Genome Analyzer IIx platform...

  6. Accurate CpG and non-CpG cytosine methylation analysis by high-throughput locus-specific pyrosequencing in plants.

    Science.gov (United States)

    How-Kit, Alexandre; Daunay, Antoine; Mazaleyrat, Nicolas; Busato, Florence; Daviaud, Christian; Teyssier, Emeline; Deleuze, Jean-François; Gallusci, Philippe; Tost, Jörg

    2015-07-01

    Pyrosequencing permits accurate quantification of DNA methylation of specific regions where the proportion of the C/T polymorphism induced by sodium bisulfite treatment of DNA reflects the DNA methylation level. The commercially available high-throughput locus-specific pyrosequencing instruments allow for the simultaneous analysis of 96 samples, but restrict the DNA methylation analysis to CpG dinucleotide sites, which can be limiting in many biological systems. In contrast to mammals, where DNA methylation occurs nearly exclusively on CpG dinucleotides, plant genomes harbor DNA methylation also in other sequence contexts, including CHG and CHH motifs, which cannot be evaluated by these pyrosequencing instruments due to software limitations. Here, we present a complete pipeline for accurate CpG and non-CpG cytosine methylation analysis at single-base resolution using high-throughput locus-specific pyrosequencing. The devised approach includes the design and validation of PCR amplification on bisulfite-treated DNA and pyrosequencing assays, as well as the quantification of the methylation level at every cytosine from the raw peak intensities of the Pyrograms by two newly developed Visual Basic Applications. Our method presents accurate and reproducible results, as exemplified by the cytosine methylation analysis of the promoter regions of two tomato genes (NOR and CNR) encoding transcription regulators of fruit ripening during different stages of fruit development. Our results confirmed a significant and temporally coordinated loss of DNA methylation on specific cytosines during the early stages of fruit development in both promoters, as previously shown by WGBS. This manuscript thus describes the first high-throughput locus-specific DNA methylation analysis in plants using pyrosequencing.
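
    The per-cytosine quantification reduces to the C/(C+T) peak-intensity ratio, and the plant-specific part is classifying each cytosine's sequence context. A minimal sketch of both pieces (my own helpers, not the paper's Visual Basic applications):

```python
def methylation_level(c_intensity: float, t_intensity: float) -> float:
    """Fraction methylated at one cytosine: after bisulfite conversion,
    unmethylated C reads as T, so methylation = C / (C + T)."""
    return c_intensity / (c_intensity + t_intensity)

def cytosine_context(seq: str, i: int) -> str:
    """Classify the cytosine at index i as CpG, CHG or CHH,
    where H is any base other than G (A, C or T)."""
    assert seq[i] == "C"
    if seq[i + 1] == "G":
        return "CpG"
    if seq[i + 2] == "G":
        return "CHG"
    return "CHH"

print(cytosine_context("ACGT", 1))    # -> CpG
print(cytosine_context("ACTGA", 1))   # -> CHG
print(methylation_level(80.0, 20.0))  # -> 0.8
```

    Standard CpG-only pyrosequencing software hard-codes the first branch; the pipeline described here is what extends quantification to the other two contexts.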

  7. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

    The human genome is dynamic in structure, complicating researchers' attempts at fully understanding it. Time-series fluorescence in situ hybridization (FISH) imaging has increased our ability to observe genome structure, but due to cell-type and experimental variability these data are often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined, with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Theoretical analysis of saturation and limit cycles in short pulse FEL oscillators

    Energy Technology Data Exchange (ETDEWEB)

    Piovella, N.; Chaix, P.; Jaroszynski, D. [Commissariat a l'Energie Atomique, Bruyeres-le-Chatel (France)] [and others]

    1995-12-31

    We derive a model for the nonlinear evolution of a short pulse oscillator from low signal up to saturation in the small gain regime. This system is controlled by only two independent parameters: cavity detuning and losses. Using a closure relation, this model reduces to a closed set of 5 nonlinear partial differential equations for the EM field and moments of the electron distribution. An analysis of the linearised system allows one to define and calculate the eigenmodes characterising the small signal regime. An arbitrary solution of the complete nonlinear system can then be expanded in terms of these eigenmodes, which allows various observed nonlinear behaviours to be interpreted, including steady-state saturation, limit cycles, and transition to chaos. The single mode approximation reduces to a Landau-Ginzburg equation and yields the gain, nonlinear frequency shift, and efficiency as functions of cavity detuning and cavity losses. A generalisation to two modes gives a simple description of the limit cycle behaviour as a competition between these two modes. An analysis of the transitions to more complex dynamics is also given. Finally, the analytical results are compared to the experimental data from the FELIX experiment.
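
    For orientation, the single-mode reduction mentioned in the abstract has the generic Landau-Ginzburg form for a slowly varying complex mode amplitude A (this is the standard textbook form; the paper's specific coefficients, which depend on cavity detuning and losses, are not reproduced here):

```latex
\frac{dA}{dt} = \left(g - \alpha + i\,\delta\right) A
              - \left(\beta_r + i\,\beta_i\right)\lvert A\rvert^{2} A
```

    Here g is the small-signal gain, α the cavity losses, δ a frequency shift, and β_r, β_i the saturation and nonlinear-frequency-shift coefficients; steady-state saturation corresponds to the fixed point |A|² = (g − α)/β_r, and the two-mode generalisation is what produces the limit-cycle competition described above.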

  9. Protected Turning Movements of Noncooperative Automated Vehicles: Geometrics, Trajectories, and Saturation Flow

    Directory of Open Access Journals (Sweden)

    Xiaobo Liu

    2018-01-01

    Full Text Available This study is the first to quantify throughput (saturation flow) of noncooperative automated vehicles when performing turning maneuvers, which are critical bottlenecks in arterial road networks. We first develop a constrained optimization problem based on AVs’ kinematic behavior during a protected signal phase which considers both ABS-enabled and wheels-locked braking, as well as avoiding encroaching into oncoming traffic or past the edge of the receiving lane. We analyze noncooperative (“defensive”) behavior, in keeping with the Assured Clear Distance Ahead legal standard to which human drivers are held and to which AVs will likely also be held for the foreseeable future. We demonstrate that, under plausible behavioral parameters, AVs appear likely to have positive impacts on throughput of turning traffic streams at intersections, in the range of +0.2% (under the most conservative circumstances) to +43% for a typical turning maneuver. We demonstrate that the primary mechanism of impact of turning radius is its effect on speed, which is likely to be constrained by passenger comfort. We show heterogeneous per-lane throughput in the case of “double turn lanes.” Finally, we demonstrate limited sensitivity to the crash-risk criterion, with only a 4% difference arising from a change from 1 in 10,000 to 1 in 100,000,000. The paper concludes with a brief discussion of policy implications and future research needs.
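
    The kinematic core of the Assured Clear Distance Ahead constraint is that reaction travel plus braking distance must fit in the clear space ahead, which bounds speed and hence saturation flow. A minimal sketch with illustrative parameters (reaction time, deceleration, and vehicle length are my placeholders, not the paper's calibrated values):

```python
def stopping_distance(v: float, t_react: float, decel: float) -> float:
    """Distance (m) to stop from speed v (m/s): reaction travel
    plus braking distance v^2 / (2a)."""
    return v * t_react + v * v / (2.0 * decel)

def saturation_flow(v: float, t_react: float, decel: float,
                    veh_len: float) -> float:
    """Vehicles/hour if each follower defensively keeps a full
    stopping distance plus one vehicle length behind its leader."""
    headway_m = stopping_distance(v, t_react, decel) + veh_len
    return 3600.0 * v / headway_m

# Illustrative turning speed 8 m/s, 0.5 s reaction, 4 m/s^2 braking,
# 5 m vehicle: stopping distance = 12 m, flow = 3600*8/17 ~ 1694 veh/h.
flow = saturation_flow(8.0, 0.5, 4.0, 5.0)
```

    Because the braking term grows with v² while the numerator grows with v, flow peaks at a finite speed; the turning radius matters mainly through the comfort-limited speed it permits, as the abstract notes.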

  10. GUItars: a GUI tool for analysis of high-throughput RNA interference screening data.

    Directory of Open Access Journals (Sweden)

    Asli N Goktug

    Full Text Available High-throughput RNA interference (RNAi) screening has become a widely used approach to elucidating gene functions. However, analysis and annotation of large data sets generated from these screens has been a challenge for researchers without a programming background. Over the years, numerous data analysis methods have been produced for plate quality control and hit selection and implemented by a few open-access software packages. Recently, strictly standardized mean difference (SSMD) has become a widely used method for RNAi screening analysis, mainly due to its better control of false negative and false positive rates and its ability to quantify RNAi effects with a statistical basis. We have developed GUItars to enable researchers without a programming background to use SSMD as both a plate quality and a hit selection metric to analyze large data sets. The software is accompanied by an intuitive graphical user interface for an easy and rapid analysis workflow. SSMD analysis methods are provided to users along with the traditionally used z-score, normalized percent activity, and t-test methods for hit selection. GUItars is capable of analyzing large-scale data sets from screens with or without replicates. The software is designed to automatically generate and save numerous graphical outputs known to be among the most informative high-throughput data visualization tools, capturing plate-wise and screen-wise performances. Graphical outputs are also written in HTML format for easy access, and a comprehensive summary of screening results is written into tab-delimited output files. With GUItars, we demonstrated a robust SSMD-based analysis workflow on a 3840-gene small interfering RNA (siRNA) library and identified 200 siRNAs that increased and 150 siRNAs that decreased the assay activities with moderate to stronger effects. GUItars enables rapid analysis and illustration of data from large- or small-scale RNAi screens using SSMD and other traditional analysis methods.
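
    SSMD compares a treatment group to a reference as the mean difference scaled by the combined variability. A minimal sketch under the common assumption of independent groups (the example well values are illustrative):

```python
from math import sqrt
from statistics import mean, variance

def ssmd(sample, reference):
    """Strictly standardized mean difference for two independent
    groups: beta = (mu1 - mu2) / sqrt(s1^2 + s2^2)."""
    return (mean(sample) - mean(reference)) / sqrt(
        variance(sample) + variance(reference))

# siRNA wells vs negative-control wells (illustrative numbers);
# |beta| thresholds around 2 are commonly used for moderate hits.
beta = ssmd([5.1, 4.8, 5.3], [1.0, 1.2, 0.9])
```

    Unlike a plain z-score, SSMD accounts for the variability of both groups, which is the statistical basis the abstract credits for its better false-positive/false-negative control.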

  11. HTSstation: a web application and open-access libraries for high-throughput sequencing data analysis.

    Science.gov (United States)

    David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.

  12. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes—neural connectivity maps of the brain—using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems—reads to parallel disk arrays and writes to solid-state storage—to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992
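
    The abstract says data are distributed by partitioning a spatial index. One common way to build such an index is a Z-order (Morton) key that interleaves the bits of the x, y, z voxel coordinates so that spatially nearby blocks get numerically close keys; this sketch is an assumption about the general technique, not a statement of the project's actual scheme.

```python
def morton3(x: int, y: int, z: int, bits: int = 21) -> int:
    """Interleave the bits of (x, y, z) into one Z-order key:
    bit i of x lands at position 3i, of y at 3i+1, of z at 3i+2.
    Range-partitioning the key space then maps contiguous
    spatial blocks to cluster nodes."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

def node_for(key: int, n_nodes: int, key_bits: int = 63) -> int:
    """Assign a key to one of n_nodes contiguous key ranges."""
    return key * n_nodes >> key_bits

print(morton3(1, 0, 0), morton3(0, 1, 0), morton3(0, 0, 1))  # -> 1 2 4
```

    The payoff is locality: a cuboid of voxels touches few key ranges, so a spatial read fans out to few nodes.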

  13. The Open Connectome Project Data Cluster: Scalable Analysis and Vision for High-Throughput Neuroscience.

    Science.gov (United States)

    Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob

    2013-01-01

    We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes (neural connectivity maps of the brain) using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems (reads to parallel disk arrays and writes to solid-state storage) to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.

  14. Rapid determination of oxygen saturation and vascularity for cancer detection.

    Directory of Open Access Journals (Sweden)

    Fangyao Hu

    Full Text Available A rapid heuristic ratiometric analysis for estimating tissue hemoglobin concentration and oxygen saturation from measured tissue diffuse reflectance spectra is presented. The analysis was validated in tissue-mimicking phantoms and applied to clinical measurements in head and neck, cervical and breast tissues. The analysis works in two steps. First, a linear equation that translates the ratio of the diffuse reflectance at 584 nm and 545 nm into an estimate of the tissue hemoglobin concentration was developed using a Monte Carlo-based lookup table. This equation is independent of tissue scattering and oxygen saturation. Second, the oxygen saturation was estimated using non-linear logistic equations that translate the ratio of the diffuse reflectance spectra at 539 nm to 545 nm into the tissue oxygen saturation. Correlation coefficients of 0.89 (0.86), 0.77 (0.71) and 0.69 (0.43) were obtained for the tissue hemoglobin concentration (oxygen saturation) values extracted using the full spectral Monte Carlo and the ratiometric analysis, for clinical measurements in head and neck, breast and cervical tissues, respectively. The ratiometric analysis was more than 4000 times faster than the inverse Monte Carlo analysis for estimating tissue hemoglobin concentration and oxygen saturation in simulated phantom experiments. In addition, the discriminatory power of the two analyses was similar. These results show the potential of such empirical tools to rapidly estimate tissue hemoglobin in real-time spectral imaging applications.
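
    The two-step structure described above can be sketched as follows. Only the functional forms (linear in the 584/545 nm ratio, logistic in the 539/545 nm ratio) follow the abstract; every coefficient value below is an illustrative placeholder, since the paper's actual constants come from its Monte Carlo lookup table.

```python
from math import exp

def total_hemoglobin(r584, r545, slope=10.0, intercept=-9.0):
    """Step 1: hemoglobin estimate from a linear map of the
    584 nm / 545 nm reflectance ratio (placeholder coefficients)."""
    return slope * (r584 / r545) + intercept

def oxygen_saturation(r539, r545, k=40.0, r0=1.02):
    """Step 2: oxygen saturation from a logistic map of the
    539 nm / 545 nm reflectance ratio (placeholder coefficients)."""
    return 1.0 / (1.0 + exp(-k * (r539 / r545 - r0)))

# The logistic form keeps the saturation estimate bounded in [0, 1]
# no matter how noisy the measured ratio is.
sat = oxygen_saturation(0.51, 0.50)
```

    Each estimate costs two array lookups and a handful of arithmetic operations, which is the source of the >4000x speedup over the inverse Monte Carlo fit.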

  15. Comparison of pulseoximetry oxygen saturation and arterial oxygen saturation in open heart intensive care unit

    Directory of Open Access Journals (Sweden)

    Alireza Mahoori

    2013-08-01

    Full Text Available Background: Pulseoximetry is widely used in the critical care setting and is currently used to guide therapeutic interventions. Few studies have evaluated the accuracy of SPO2 (pulseoximetry oxygen saturation) in the intensive care unit after cardiac surgery. Our objective was to compare pulseoximetry with arterial oxygen saturation (SaO2) during clinical routine in such patients, and to examine the effect of mild acidosis on this relationship. Methods: In an observational prospective study, 80 patients were evaluated in the intensive care unit after cardiac surgery. SPO2 was recorded and compared with SaO2 obtained by blood gas analysis. One or serial arterial blood gas analyses (ABGs) were performed via a radial artery line while a reliable pulseoximeter signal was present. One hundred thirty-seven samples were collected, and for each blood gas analysis, SaO2 and SPO2 were recorded. Results: O2 saturation as a marker of peripheral perfusion was measured by pulseoximetry (SPO2). The mean difference between arterial oxygen saturation and pulseoximetry oxygen saturation was 0.12%±1.6%. A total of 137 paired readings demonstrated good correlation (r=0.754; P<0.0001) between changes in SPO2 and those in SaO2 in samples with normal hemoglobin. Also, in forty-seven samples with mild acidosis, paired readings demonstrated good correlation (r=0.799; P<0.0001), and the mean difference between SaO2 and SPO2 was 0.05%±1.5%. Conclusion: Data showed that in patients with stable hemodynamics and good signal quality, changes in pulseoximetry oxygen saturation reliably predict equivalent changes in arterial oxygen saturation. Mild acidosis does not alter the relation between SPO2 and SaO2 to any clinically important extent. In conclusion, the pulse oximeter is useful to monitor oxygen saturation in patients with stable hemodynamics.
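
    The two agreement statistics reported (mean bias between methods and Pearson correlation) can be computed for any set of paired readings with a small helper; the example readings below are illustrative, not the study's data.

```python
from math import sqrt
from statistics import mean

def agreement(spo2, sao2):
    """Return (mean bias SpO2 - SaO2, Pearson r) for paired readings."""
    diffs = [a - b for a, b in zip(spo2, sao2)]
    mx, my = mean(spo2), mean(sao2)
    sxy = sum((a - mx) * (b - my) for a, b in zip(spo2, sao2))
    sxx = sum((a - mx) ** 2 for a in spo2)
    syy = sum((b - my) ** 2 for b in sao2)
    return mean(diffs), sxy / sqrt(sxx * syy)

# Illustrative paired readings (percent saturation):
bias, r = agreement([97.0, 95.0, 99.0, 92.0], [96.5, 95.2, 98.4, 92.1])
```

    A bias near zero with a narrow spread of the differences (the ±1.6% reported above) is what justifies substituting the noninvasive reading for the blood-gas value in stable patients.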

  16. Integrative Analysis of High-throughput Cancer Studies with Contrasted Penalization

    Science.gov (United States)

    Shi, Xingjie; Liu, Jin; Huang, Jian; Zhou, Yong; Shia, BenChang; Ma, Shuangge

    2015-01-01

    In cancer studies with high-throughput genetic and genomic measurements, integrative analysis provides a way to effectively pool and analyze heterogeneous raw data from multiple independent studies and outperforms “classic” meta-analysis and single-dataset analysis. When marker selection is of interest, the genetic basis of multiple datasets can be described using the homogeneity model or the heterogeneity model. In this study, we consider marker selection under the heterogeneity model, which includes the homogeneity model as a special case and can be more flexible. Penalization methods have been developed in the literature for marker selection. This study advances beyond the published methods by introducing contrast penalties, which can accommodate the within- and across-dataset structures of covariates/regression coefficients and, by doing so, further improve marker selection performance. Specifically, we develop a penalization method that accommodates the across-dataset structures by smoothing over regression coefficients. An effective iterative algorithm, which calls an inner coordinate descent iteration, is developed. Simulations show that the proposed method outperforms the benchmark with more accurate marker identification. The analysis of breast cancer and lung cancer prognosis studies with gene expression measurements shows that the proposed method identifies genes different from those found using the benchmark and has better prediction performance. PMID:24395534

  17. Effectiveness of a high-throughput genetic analysis in the identification of responders/non-responders to CYP2D6-metabolized drugs.

    Science.gov (United States)

    Savino, Maria; Seripa, Davide; Gallo, Antonietta P; Garrubba, Maria; D'Onofrio, Grazia; Bizzarro, Alessandra; Paroni, Giulia; Paris, Francesco; Mecocci, Patrizia; Masullo, Carlo; Pilotto, Alberto; Santini, Stefano A

    2011-01-01

    Recent studies investigating the single cytochrome P450 (CYP) 2D6 allele *2A reported an association with the response to drug treatments. More genetic data can be obtained, however, with high-throughput technologies. The aim of this study is a high-throughput analysis of CYP2D6 polymorphisms to evaluate its effectiveness in identifying patient responders/non-responders to CYP2D6-metabolized drugs. An attempt to compare our results with those previously obtained with the standard analysis of CYP2D6 allele *2A was also made. Sixty blood samples from patients treated with CYP2D6-metabolized drugs, previously genotyped for the allele CYP2D6*2A, were analyzed for CYP2D6 polymorphisms with the AutoGenomics INFINITI CYP4502D6-I assay on the AutoGenomics INFINITI analyzer. A higher frequency of mutated alleles was observed in responder than in non-responder patients (75.38% vs 43.48%; p = 0.015). Thus, the presence of a mutated allele of CYP2D6 was associated with a response to CYP2D6-metabolized drugs (OR = 4.044; 95% CI 1.348-12.154). No difference was observed in the distribution of allele *2A (p = 0.320). The high-throughput genetic analysis of CYP2D6 polymorphisms better discriminates responders/non-responders than the standard analysis of the CYP2D6 allele *2A. A high-throughput genetic assay of CYP2D6 may be useful to identify patients with different clinical responses to CYP2D6-metabolized drugs.
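An odds ratio of this kind follows from a standard 2x2 contingency table. A sketch with hypothetical counts chosen only so that the mutated-allele proportions match the reported 75.38% and 43.48%; the study's actual group sizes are not given here:

```python
def odds_ratio(a, b, c, d):
    """OR for a 2x2 table:
    a = responders with mutated allele,    b = responders without,
    c = non-responders with mutated allele, d = non-responders without."""
    return (a * d) / (b * c)

# Hypothetical counts: 49/65 responders mutated (75.38%),
# 10/23 non-responders mutated (43.48%) -- illustrative only.
or_value = odds_ratio(49, 16, 10, 13)
```

With these illustrative counts the OR comes out near 4, consistent in magnitude with the reported 4.044.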

  18. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    Science.gov (United States)

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use, free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and outputs the results in two different spreadsheet forms for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis application adapted for large numbers of images. It provides high-throughput computation and a quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy background that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse-quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model.
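SpheroidSizer derives a volume from the measured major and minor axial lengths. One common approximation treats the spheroid as a prolate ellipsoid, V = (pi/6) * L * W^2; the paper's exact formula is not given here, so this is an illustrative sketch:

```python
import math

def spheroid_volume(major, minor):
    """Volume of a prolate spheroid from major (L) and minor (W) axial
    lengths, V = (pi/6) * L * W^2 -- a common approximation; the exact
    formula used by SpheroidSizer may differ."""
    return math.pi / 6.0 * major * minor ** 2

v = spheroid_volume(500.0, 400.0)  # axial lengths in micrometres
```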

  19. On Throughput Improvement of Wireless Ad Hoc Networks with Hidden Nodes

    Science.gov (United States)

    Choi, Hong-Seok; Lim, Jong-Tae

    In this letter, we present the throughput analysis of the wireless ad hoc networks based on the IEEE 802.11 MAC (Medium Access Control). Especially, our analysis includes the case with the hidden node problem so that it can be applied to the multi-hop networks. In addition, we suggest a new channel access control algorithm to maximize the network throughput and show the usefulness of the proposed algorithm through simulations.

  20. High-throughput phenotyping allows for QTL analysis of defense, symbiosis and development-related traits

    DEFF Research Database (Denmark)

    Hansen, Nina Eberhardtsen

    ...high-throughput phenotyping of whole plants. Additionally, a system for automated confocal microscopy aiming at automated detection of infection thread formation as well as detection of lateral root and nodule primordia is being developed. The objective was to use both systems in genome-wide association studies and mutant...... the analysis. Additional phenotyping of defense mutants revealed that MLO, which confers susceptibility towards Blumeria graminis in barley, is also a prime candidate for a S. trifoliorum susceptibility gene in Lotus.

  1. Research on combination forecast of port cargo throughput based on time series and causality analysis

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2013-03-01

    Purpose: The purpose of this paper is to develop a combined model, composed of a grey-forecast model and a logistic-growth-curve model, to improve the accuracy of cargo-throughput forecasts for ports. The authors also use existing data from a current port to verify the validity of the combined model. Design/methodology/approach: A literature review is undertaken to find appropriate forecast models of port cargo throughput. After researching the related models, the authors select the individual models that merit further study. Finally, the authors combine two individual models (the grey-forecast model and the logistic-growth-curve model) into one combined model to forecast port cargo throughput, and apply the model to a physical port in China to test its validity. Findings: Tested against the observed cargo-throughput data of the physical port, the results show that the combined model achieves relatively high forecast accuracy when little information is available. Furthermore, the forecasts made by the combined model are more accurate than those of either individual model. Research limitations/implications: The study provides a new combined forecast model of cargo throughput that improves forecast accuracy with relatively little information. The limitation of the model is that it requires the port's cargo throughput to follow an S-shaped trend. Practical implications: The model is not limited by external conditions such as geography or culture. It was used to predict the cargo throughput of one real port in China in 2015, providing instructive guidance for the port's development. Originality/value: This is one of the few studies to improve the accuracy of cargo-throughput forecasts using little information.
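A grey-forecast model of the GM(1,1) type, one of the two components combined above, can be sketched as follows. The algorithm fits dx1/dt + a*x1 = b on the accumulated (1-AGO) series and extrapolates; the throughput figures below are hypothetical, not the paper's data:

```python
import math

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey forecast (illustrative sketch, not the paper's exact
    model): fit the whitened equation on the accumulated series x1 and
    extrapolate the original series x0."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1 = [sum(x0[:i + 1]) for i in range(n)]
    # background values: means of consecutive accumulated points
    z = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]
    y = x0[1:]
    # closed-form least squares for x0(k) = -a*z(k) + b
    m = n - 1
    sz, sy = sum(z), sum(y)
    szz = sum(v * v for v in z)
    szy = sum(v * w for v, w in zip(z, y))
    denom = m * szz - sz * sz
    a = (sz * sy - m * szy) / denom
    b = (szz * sy - sz * szy) / denom

    def x1_hat(k):  # fitted accumulated series, x1_hat(0) = x0[0]
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

# Hypothetical annual port throughput figures (million tonnes):
hist = [212.0, 231.5, 253.1, 276.6, 302.4]
forecasts = gm11_forecast(hist, steps=2)
```

GM(1,1) reproduces near-exponential growth well, which is why the paper pairs it with a logistic growth curve to capture the later, saturating part of an S-shaped trend.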

  2. Worst-case Throughput Analysis for Parametric Rate and Parametric Actor Execution Time Scenario-Aware Dataflow Graphs

    Directory of Open Access Journals (Sweden)

    Mladen Skelin

    2014-03-01

    Scenario-aware dataflow (SADF) is a prominent tool for modeling and analysis of dynamic embedded dataflow applications. In SADF the application is represented as a finite collection of synchronous dataflow (SDF) graphs, each of which represents one possible application behaviour or scenario. A finite state machine (FSM) specifies the possible orders of scenario occurrences. The SADF model renders the tightest possible performance guarantees, but is limited by its finiteness. This means that, from a practical point of view, it can only handle dynamic dataflow applications that are characterized by a reasonably sized set of possible behaviours or scenarios. In this paper we remove this limitation for a class of SADF graphs by means of SADF model parametrization in terms of graph port rates and actor execution times. First, we formally define the semantics of the model relevant for throughput analysis based on (max,+) linear system theory and (max,+) automata. Second, by generalizing some of the existing results, we give algorithms for worst-case throughput analysis of parametric-rate and parametric-actor-execution-time acyclic SADF graphs with a fully connected, possibly infinite state transition system. Third, we demonstrate our approach on a few realistic applications from the digital signal processing (DSP) domain mapped onto an embedded multi-processor architecture.
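The (max,+) semantics underlying this kind of throughput analysis can be illustrated on a fixed (non-parametric) scenario: iterating x(k+1) = A (x) x(k) in the (max,+) semiring and measuring the asymptotic growth rate yields the maximum cycle mean, whose reciprocal is the worst-case throughput. A sketch with a hypothetical two-actor timing matrix:

```python
NEG_INF = float('-inf')  # (max,+) additive identity, "no edge"

def maxplus_matvec(A, x):
    """(max,+) matrix-vector product: y_i = max_j (A[i][j] + x[j])."""
    return [max(a + v for a, v in zip(row, x)) for row in A]

def cycle_mean(A, iters=400):
    """Estimate the (max,+) eigenvalue (maximum cycle mean) by power
    iteration from x(0) = 0; max_i x_i(N) / N converges to it for a
    strongly connected graph. Worst-case throughput is its reciprocal."""
    x = [0.0] * len(A)
    for _ in range(iters):
        x = maxplus_matvec(A, x)
    return max(x) / iters

# Hypothetical timing matrix: a self-loop of weight 2 on actor 0, and a
# cycle 0 -> 1 -> 0 with weights 5 and 3 (cycle mean (5+3)/2 = 4).
A = [[2.0, 5.0],
     [3.0, NEG_INF]]
mcm = cycle_mean(A)
throughput = 1.0 / mcm
```

The paper's contribution is doing this symbolically, with rates and execution times left as parameters rather than the fixed numbers used here.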

  3. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    Science.gov (United States)

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  4. High-throughput scoring of seed germination

    NARCIS (Netherlands)

    Ligterink, Wilco; Hilhorst, Henk W.M.

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very

  5. In-field High Throughput Phenotyping and Cotton Plant Growth Analysis Using LiDAR.

    Science.gov (United States)

    Sun, Shangpeng; Li, Changying; Paterson, Andrew H; Jiang, Yu; Xu, Rui; Robertson, Jon S; Snider, John L; Chee, Peng W

    2018-01-01

    Plant breeding programs and a wide range of plant science applications would greatly benefit from the development of in-field high-throughput phenotyping technologies. In this study, a terrestrial LiDAR-based high-throughput phenotyping system was developed. A 2D LiDAR was applied to scan plants from overhead in the field, and an RTK-GPS was used to provide spatial coordinates. Precise 3D models of scanned plants were reconstructed based on the LiDAR and RTK-GPS data. The ground plane of the 3D model was separated by the RANSAC algorithm, and a Euclidean clustering algorithm was applied to remove noise generated by weeds. After that, clean 3D surface models of cotton plants were obtained, from which three plot-level morphologic traits, including canopy height, projected canopy area, and plant volume, were derived. Canopy heights ranging from the 85th percentile to the maximum were computed from the histogram of the z coordinates of all measured points; projected canopy area was derived by projecting all points onto a ground plane; and a trapezoidal-rule-based algorithm was proposed to estimate plant volume. Results of validation experiments showed good agreement between LiDAR measurements and manual measurements for maximum canopy height, projected canopy area, and plant volume, with R2 values of 0.97, 0.97, and 0.98, respectively. The developed system was used to scan the whole field repeatedly over the period from 43 to 109 days after planting. Growth trends and growth rate curves for all three derived morphologic traits were established over the monitoring period for each cultivar. Overall, four different cultivars showed similar growth trends and growth rate patterns. Each cultivar continued to grow until ~88 days after planting, and from then on varied little. However, the actual values were cultivar specific. Correlation analysis between morphologic traits and final yield was conducted over the monitoring period. When considering each cultivar individually
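The plot-level traits described above reduce to simple computations on the cleaned point cloud. A sketch of the height-percentile and trapezoidal-rule steps with simplified inputs (the paper operates on full 3D point clouds; the band-averaging reading of the percentile step is one plausible interpretation, not a confirmed detail):

```python
def canopy_height(z_values, lower_pct=85):
    """Canopy height from the band of points between the given percentile
    and the maximum of the z coordinates, summarized by its mean (one
    reading of the paper's histogram-based method)."""
    zs = sorted(z_values)
    start = int(len(zs) * lower_pct / 100)
    top = zs[start:]
    return sum(top) / len(top)

def trapezoid_volume(areas, dz):
    """Plant volume from horizontal cross-section areas sampled every dz
    metres, integrated with the trapezoidal rule."""
    return sum(dz * 0.5 * (a + b) for a, b in zip(areas, areas[1:]))

# Hypothetical data: z readings (m) and cross-section areas (m^2) per 1 cm slice
h = canopy_height([0.5] * 80 + [0.9] * 15 + [1.0] * 5)
vol = trapezoid_volume([0.00, 0.02, 0.05, 0.06, 0.04, 0.01], 0.01)
```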

  6. Saturated Zone In-Situ Testing

    International Nuclear Information System (INIS)

    Reimus, P. W.; Umari, M. J.

    2003-01-01

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that have been conducted to test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain. The test interpretations provide estimates of flow and transport parameters that are used in the development of parameter distributions for Total System Performance Assessment (TSPA) calculations. These parameter distributions are documented in the revisions to the SZ flow model report (BSC 2003 [ 162649]), the SZ transport model report (BSC 2003 [ 162419]), the SZ colloid transport report (BSC 2003 [162729]), and the SZ transport model abstraction report (BSC 2003 [1648701]). Specifically, this scientific analysis report provides the following information that contributes to the assessment of the capability of the SZ to serve as a barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvium Testing Complex (ATC), which is located at the southwestern corner of the Nevada Test Site (NTS). 
The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, and

  7. Saturated Zone In-Situ Testing

    Energy Technology Data Exchange (ETDEWEB)

    P. W. Reimus; M. J. Umari

    2003-12-23

    The purpose of this scientific analysis is to document the results and interpretations of field experiments that have been conducted to test and validate conceptual flow and radionuclide transport models in the saturated zone (SZ) near Yucca Mountain. The test interpretations provide estimates of flow and transport parameters that are used in the development of parameter distributions for Total System Performance Assessment (TSPA) calculations. These parameter distributions are documented in the revisions to the SZ flow model report (BSC 2003 [ 162649]), the SZ transport model report (BSC 2003 [ 162419]), the SZ colloid transport report (BSC 2003 [162729]), and the SZ transport model abstraction report (BSC 2003 [1648701]). Specifically, this scientific analysis report provides the following information that contributes to the assessment of the capability of the SZ to serve as a barrier for waste isolation for the Yucca Mountain repository system: (1) The bases for selection of conceptual flow and transport models in the saturated volcanics and the saturated alluvium located near Yucca Mountain. (2) Results and interpretations of hydraulic and tracer tests conducted in saturated fractured volcanics at the C-wells complex near Yucca Mountain. The test interpretations include estimates of hydraulic conductivities, anisotropy in hydraulic conductivity, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, matrix diffusion coefficients, fracture apertures, and colloid transport parameters. (3) Results and interpretations of hydraulic and tracer tests conducted in saturated alluvium at the Alluvium Testing Complex (ATC), which is located at the southwestern corner of the Nevada Test Site (NTS). 
The test interpretations include estimates of hydraulic conductivities, storativities, total porosities, effective porosities, longitudinal dispersivities, matrix diffusion mass transfer coefficients, and

  8. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Directory of Open Access Journals (Sweden)

    Craig A Gedye

    Full Text Available Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are however currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell

  9. Cell surface profiling using high-throughput flow cytometry: a platform for biomarker discovery and analysis of cellular heterogeneity.

    Science.gov (United States)

    Gedye, Craig A; Hussain, Ali; Paterson, Joshua; Smrke, Alannah; Saini, Harleen; Sirskyj, Danylo; Pereira, Keira; Lobo, Nazleen; Stewart, Jocelyn; Go, Christopher; Ho, Jenny; Medrano, Mauricio; Hyatt, Elzbieta; Yuan, Julie; Lauriault, Stevan; Meyer, Mona; Kondratyev, Maria; van den Beucken, Twan; Jewett, Michael; Dirks, Peter; Guidos, Cynthia J; Danska, Jayne; Wang, Jean; Wouters, Bradly; Neel, Benjamin; Rottapel, Robert; Ailles, Laurie E

    2014-01-01

    Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are however currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell markers.

  10. Throughput performance analysis of multirate, multiclass S-ALOHA OFFH-CDMA packet networks

    DEFF Research Database (Denmark)

    Raddo, Thiago R.; Sanches, Anderson L.; Borges, Ben Hur V

    2015-01-01

    In this paper, we propose a new throughput expression for multirate, multiclass slotted-ALOHA optical fast frequency hopping code-division multiple-access (OFFH-CDMA) packet networks considering a Poisson distribution for packet composite arrivals. We analyze the packet throughput performance...... of a three-class OFFH-CDMA network, where multirate transmissions are achieved via manipulation of the user's code parameters. It is shown that users transmitting at low rates interfere considerably in the performance of high rate users. Finally, we perform a validation procedure to demonstrate...
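For orientation, the classical single-class slotted-ALOHA throughput under Poisson arrivals is S(G) = G * e^(-G); the paper's multirate, multiclass OFFH-CDMA expression generalizes this by also accounting for code parameters and multiple-access interference across user classes. A minimal baseline sketch:

```python
import math

def slotted_aloha_throughput(G):
    """Classical slotted-ALOHA throughput S = G * exp(-G) for Poisson
    packet arrivals with offered load G (packets/slot). Illustrative
    single-class baseline only; the paper's multiclass OFFH-CDMA
    expression is more involved."""
    return G * math.exp(-G)

# Throughput peaks at G = 1, where S = 1/e (about 0.368 packets/slot).
peak = slotted_aloha_throughput(1.0)
```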

  11. Use of azeotropic distillation for isotopic analysis of deuterium in soil water and saturate saline solution

    International Nuclear Information System (INIS)

    Santos, Antonio Vieira dos.

    1995-05-01

    The azeotropic distillation technique was adapted to extract water from soil and from saturated saline solutions, similar to sea water, for the isotopic determination of deuterium (D). A soil test was used to determine the precision and the nature of the methodology for extracting soil water for stable-isotope analysis, using azeotropic distillation and comparing it with the traditional methodology of heating under vacuum. This methodology has been very useful for several kinds of soil and saturated saline solutions. The apparatus does not have a memory effect, and the chemical reagents do not affect the isotopic composition of the soil water. (author). 43 refs., 10 figs., 12 tabs

  12. High-Throughput Fabrication of Nanocone Substrates through Polymer Injection Moulding For SERS Analysis in Microfluidic Systems

    DEFF Research Database (Denmark)

    Viehrig, Marlitt; Matteucci, Marco; Thilsted, Anil H.

    analysis. Metal-capped silicon nanopillars, fabricated through a maskless ion etch, are state-of-the-art for on-chip SERS substrates. A dense cluster of high-aspect-ratio polymer nanocones was achieved by using high-throughput polymer injection moulding over a large area replicating a silicon nanopillar...... structure. Gold-capped polymer nanocones display similar SERS sensitivity as silicon nanopillars, while being easily integrable into microfluidic chips....

  13. DAVID Knowledgebase: a gene-centered database integrating heterogeneous gene annotation resources to facilitate high-throughput gene functional analysis

    Directory of Open Access Journals (Sweden)

    Baseler Michael W

    2007-11-01

    Background: Due to the complex and distributed nature of biological research, our current biological knowledge is spread over many redundant annotation databases maintained by many independent groups. Analysts usually need to visit many of these bioinformatics databases in order to integrate comprehensive annotation information for their genes, which becomes one of the bottlenecks, particularly for the analytic task associated with a large gene list. Thus, a highly centralized and ready-to-use gene-annotation knowledgebase is in demand for high-throughput gene functional analysis. Description: The DAVID Knowledgebase is built around the DAVID Gene Concept, a single-linkage method to agglomerate tens of millions of gene/protein identifiers from a variety of public genomic resources into DAVID gene clusters. The grouping of such identifiers improves the cross-reference capability, particularly across NCBI and UniProt systems, enabling more than 40 publicly available functional annotation sources to be comprehensively integrated and centralized by the DAVID gene clusters. The simple, pair-wise, text-format files which make up the DAVID Knowledgebase are freely downloadable for various data analysis uses. In addition, a well-organized web interface allows users to query different types of heterogeneous annotations in a high-throughput manner. Conclusion: The DAVID Knowledgebase is designed to facilitate high-throughput gene functional analysis. For a given gene list, it not only provides quick access to a wide range of heterogeneous annotation data in a centralized location, but also enriches the level of biological information for an individual gene. Moreover, the entire DAVID Knowledgebase is freely downloadable or searchable at http://david.abcc.ncifcrf.gov/knowledgebase/.
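The single-linkage agglomeration behind the DAVID Gene Concept can be illustrated with a union-find sketch: any two records that share an identifier end up in the same gene cluster. This is a toy illustration of the idea with made-up identifiers, not DAVID's actual pipeline:

```python
def cluster_ids(records):
    """Single-linkage agglomeration: each record lists identifiers that
    refer to the same gene; records sharing any identifier merge into
    one cluster. Union-find with path halving."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for rec in records:
        for ident in rec[1:]:
            union(rec[0], ident)
    clusters = {}
    for x in list(parent):
        clusters.setdefault(find(x), set()).add(x)
    return sorted(map(sorted, clusters.values()))

# Hypothetical identifier records: the shared "P12345" links the first two.
recs = [("ENSG01", "P12345"), ("NM_0001", "P12345"), ("NM_0002", "Q99999")]
groups = cluster_ids(recs)
```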

  14. Acquisition and analysis of throughput rates for an operational department-wide PACS

    Science.gov (United States)

    Stewart, Brent K.; Taira, Ricky K.; Dwyer, Samuel J., III; Huang, H. K.

    1992-07-01

    The accurate prediction of image throughput is a critical issue in planning for and acquisition of any successful Picture Archiving and Communication System (PACS). Bottlenecks or design flaws can render an expensive PACS implementation useless. This manuscript presents a method for accurately predicting and measuring the image throughput of a PACS design. To create the simulation model of the planned or implemented PACS, it must first be decomposed into principal tasks. We have decomposed the entire PACS image management chain into eight subsystems. These subsystems include network transfers over three different networks (Ethernet, FDDI and UltraNet) and five software programs and/or queues: (1) transfer of image data from the imaging modality computer to the image acquisition/reformatting computer; (2) reformatting the image data into a standard image format; (3) transferring the image data from the acquisition/reformatting computer to the image archive computer; (4) updating a relational database management system over the network; (5) image processing (rotation and optimal gray-scale lookup-table calculation); (6) requesting that the image be archived; (7) image transfer from the image archive computer to a designated image display workstation; and (8) updating the local database on the image display station, separating the image header from the image data, and storing the image data on a parallel disk array. Through development of an event-logging facility and implementation of a network management package, we have acquired throughput data for each subsystem in the PACS chain. In addition, from our PACS relational database management system, we have distilled the traffic generation patterns (temporal, file size and destination) of our imaging modality devices. This data has been input into a simulation modeling package (Block-Oriented Network Simulator, BONeS) to estimate the characteristics of the modeled PACS, e.g., the throughput rates and delay time. This simulation
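As a first-order check on such a design before full simulation, a serial chain's sustained throughput is bounded by its slowest subsystem, and one image's end-to-end latency is the sum of the stage times. A simplified sketch (queueing and parallelism ignored; the study's BONeS discrete-event model captures those effects, and the per-stage times below are hypothetical):

```python
def pipeline_metrics(stage_times):
    """For a serial chain of subsystems with mean per-image service times
    (seconds), steady-state throughput is set by the bottleneck stage and
    one image's latency is the sum of stage times. A back-of-the-envelope
    bound, not a substitute for discrete-event simulation."""
    bottleneck = max(stage_times)
    return 1.0 / bottleneck, sum(stage_times)

# Hypothetical per-image times (s) for the eight subsystems listed above:
times = [12.0, 8.0, 20.0, 1.0, 6.0, 2.0, 15.0, 5.0]
rate, latency = pipeline_metrics(times)
```

Here the archive transfer (20 s) would cap sustained throughput at 0.05 images/s regardless of how fast the other stages run, which is exactly the kind of bottleneck the simulation study aims to expose.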

  15. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    Science.gov (United States)

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.

  16. Commentary: Roles for Pathologists in a High-throughput Image Analysis Team.

    Science.gov (United States)

    Aeffner, Famke; Wilson, Kristin; Bolon, Brad; Kanaly, Suzanne; Mahrt, Charles R; Rudmann, Dan; Charles, Elaine; Young, G David

    2016-08-01

Historically, pathologists perform manual evaluation of H&E- or immunohistochemically-stained slides, which can be subjective, inconsistent, and, at best, semiquantitative. As the complexity of staining and demand for increased precision of manual evaluation increase, the pathologist's assessment will include automated analyses (i.e., "digital pathology") to increase the accuracy, efficiency, and speed of diagnosis and hypothesis testing and as an important biomedical research and diagnostic tool. This commentary introduces the many roles for pathologists in designing and conducting high-throughput digital image analysis. Pathology review is central to the entire course of a digital pathology study, including experimental design, sample quality verification, specimen annotation, analytical algorithm development, and report preparation. The pathologist performs these roles by reviewing work undertaken by technicians and scientists with training and expertise in image analysis instruments and software. These roles require regular, face-to-face interactions between team members and the lead pathologist. Traditional pathology training is suitable preparation for entry-level participation on image analysis teams. The future of pathology is very exciting, with the expanding utilization of digital image analysis set to expand pathology roles in research and drug development with increasing and new career opportunities for pathologists. © 2016 by The Author(s).

  17. High throughput protein production screening

    Science.gov (United States)

Beernink, Peter T [Walnut Creek, CA]; Coleman, Matthew A [Oakland, CA]; Segelke, Brent W [San Ramon, CA]

    2009-09-08

Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages, and mutations. The methods are readily automated and can be used for high-throughput analysis of protein expression levels, interactions, and functional states.

  18. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Science.gov (United States)

    Inagaki, Soichi; Henry, Isabelle M; Lieberman, Meric C; Comai, Luca

    2015-01-01

Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.
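The junction-finding idea behind the bioinformatic tool can be illustrated with a toy sketch; the sequences and the exact-substring "alignment" below are illustrative stand-ins, not the authors' implementation:

```python
def spans_junction(mate1, mate2, genome, tdna):
    """A read pair supports a T-DNA/genome junction when one mate aligns to
    the genome and the other to the T-DNA. Exact substring matching stands
    in for real read alignment in this toy sketch."""
    def aligns(read, ref):
        return read in ref
    return (aligns(mate1, genome) and aligns(mate2, tdna)) or \
           (aligns(mate1, tdna) and aligns(mate2, genome))

genome = "ATGGCGTACCTTGACCA"          # illustrative plant sequence
tdna   = "TTACAACGTCGTGACTG"          # illustrative T-DNA border sequence
pairs  = [("GCGTACC", "ACGTCGT"),     # genome mate + T-DNA mate: junction
          ("GCGTACC", "TTGACCA")]     # both mates genomic: no junction
junction_pairs = [p for p in pairs if spans_junction(*p, genome, tdna)]
```

A real pipeline replaces the substring test with alignments against the reference genome and the T-DNA vector, but the pair-classification logic is the same.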

  19. Improving throughput of single-relay DF channel using linear constellation precoding

    KAUST Repository

    Fareed, Muhammad Mehboob

    2014-08-01

    In this letter, we propose a transmission scheme to improve the overall throughput of a cooperative communication system with single decode-and-forward relay. Symbol error rate and throughput analysis of the new scheme are presented to facilitate the performance comparison with the existing decode-and-forward relaying schemes. Simulation results are further provided to corroborate the analytical results. © 2012 IEEE.

  20. Improving throughput of single-relay DF channel using linear constellation precoding

    KAUST Repository

    Fareed, Muhammad Mehboob; Yang, Hongchuan; Alouini, Mohamed-Slim

    2014-01-01

    In this letter, we propose a transmission scheme to improve the overall throughput of a cooperative communication system with single decode-and-forward relay. Symbol error rate and throughput analysis of the new scheme are presented to facilitate the performance comparison with the existing decode-and-forward relaying schemes. Simulation results are further provided to corroborate the analytical results. © 2012 IEEE.

  1. elegantRingAnalysis An Interface for High-Throughput Analysis of Storage Ring Lattices Using elegant

    CERN Document Server

    Borland, Michael

    2005-01-01

The code elegant is widely used for simulation of linacs serving as drivers for free-electron lasers. Less well known is that elegant is also a very capable code for simulation of storage rings. In this paper, we show a newly-developed graphical user interface that allows the user to easily take advantage of these capabilities. The interface is designed for use on a Linux cluster, providing very high throughput. It can also be used on a single computer. Among the features it gives access to are basic calculations (Twiss parameters, radiation integrals), phase-space tracking, nonlinear dispersion, dynamic aperture (on- and off-momentum), frequency map analysis, and collective effects (IBS, bunch-lengthening). Using a cluster, it is easy to get highly detailed dynamic aperture and frequency map results in a surprisingly short time.

  2. Design and Performance Analysis of Multi-tier Heterogeneous Network through Coverage, Throughput and Energy Efficiency

    Directory of Open Access Journals (Sweden)

A. Shabbir

    2017-12-01

Full Text Available The unprecedented acceleration of the wireless industry strongly compels wireless operators to increase their data network throughput, capacity and coverage on an emergent basis. In upcoming 5G heterogeneous networks, the inclusion of low power nodes (LPNs) such as pico cells and femto cells to increase network throughput, capacity and coverage is gaining momentum. Addition of LPNs on such a massive level will eventually make the network densely populated with base stations (BSs). The dense deployment of BSs will lead to high operating expenditure (Op-Ex), capital expenditure (Cap-Ex) and, most importantly, high energy consumption in future generation networks. Recognizing these network issues, this research work investigates the data throughput and energy efficiency of a 5G multi-tier heterogeneous network. The network is modeled using tools from stochastic geometry. Monte Carlo results confirm that rational deployment of LPNs can contribute towards increased throughput along with better energy efficiency of the overall network.
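The stochastic-geometry workflow described above can be sketched as a small Monte Carlo experiment; the density, path-loss exponent, and SINR thresholds below are assumed values for illustration, not the paper's system model:

```python
import math
import random

def poisson(rng, lam):
    # Knuth's method; adequate for the small means used here.
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def coverage_probability(bs_density, radius, alpha, sinr_db, trials=500, seed=7):
    """Monte Carlo coverage estimate for a single network tier: base stations
    form a Poisson point process in a disc, and the user at the origin
    attaches to the strongest (nearest) BS under r^-alpha path loss with
    unit transmit power."""
    rng = random.Random(seed)
    threshold = 10 ** (sinr_db / 10)
    noise = 1e-9
    covered = 0
    for _ in range(trials):
        n = poisson(rng, bs_density * math.pi * radius ** 2)
        powers = []
        for _ in range(n):
            r = radius * math.sqrt(rng.random())   # uniform distance in the disc
            powers.append(max(r, 1.0) ** (-alpha)) # clip to avoid r -> 0 blow-up
        if not powers:
            continue                               # no BS drawn: user not covered
        signal = max(powers)
        interference = sum(powers) - signal
        if signal / (noise + interference) >= threshold:
            covered += 1
    return covered / trials

low = coverage_probability(3e-5, 500.0, 4.0, sinr_db=0.0)
high = coverage_probability(3e-5, 500.0, 4.0, sinr_db=10.0)
```

With a fixed seed the same network realizations are reused, so coverage is necessarily non-increasing in the SINR threshold.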

  3. Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model

    International Nuclear Information System (INIS)

    Tucci, P.

    2001-01-01

    This Analysis/Model Report (AMR) documents an updated analysis of water-level data performed to provide the saturated-zone, site-scale flow and transport model (CRWMS M and O 2000) with the configuration of the potentiometric surface, target water-level data, and hydraulic gradients for model calibration. The previous analysis was presented in ANL-NBS-HS-000034, Rev 00 ICN 01, Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model (USGS 2001). This analysis is designed to use updated water-level data as the basis for estimating water-level altitudes and the potentiometric surface in the SZ site-scale flow and transport model domain. The objectives of this revision are to develop computer files containing (1) water-level data within the model area (DTN: GS010908312332.002), (2) a table of known vertical head differences (DTN: GS0109083 12332.003), and (3) a potentiometric-surface map (DTN: GS010608312332.001) using an alternate concept from that presented in ANL-NBS-HS-000034, Rev 00 ICN 01 for the area north of Yucca Mountain. The updated water-level data include data obtained from the Nye County Early Warning Drilling Program (EWDP) and data from borehole USW WT-24. In addition to being utilized by the SZ site-scale flow and transport model, the water-level data and potentiometric-surface map contained within this report will be available to other government agencies and water users for ground-water management purposes. The potentiometric surface defines an upper boundary of the site-scale flow model, as well as provides information useful to estimation of the magnitude and direction of lateral ground-water flow within the flow system. Therefore, the analysis documented in this revision is important to SZ flow and transport calculations in support of total system performance assessment

  4. A high-throughput screening system for barley/powdery mildew interactions based on automated analysis of light micrographs.

    Science.gov (United States)

    Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo

    2008-01-23

    To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.

  5. Criteria for saturated magnetization loop

    International Nuclear Information System (INIS)

    Harres, A.; Mikhov, M.; Skumryev, V.; Andrade, A.M.H. de; Schmidt, J.E.; Geshev, J.

    2016-01-01

Proper estimation of magnetization curve parameters is vital in studying magnetic systems. In the present article, criteria for discriminating non-saturated (minor) from saturated (major) hysteresis loops are proposed. These employ the analysis of (i) derivatives of both ascending and descending branches of the loop, (ii) remanent magnetization curves, and (iii) thermomagnetic curves. Computational simulations are used in order to demonstrate their validity. Examples illustrating the applicability of these criteria to well-known real systems, namely Fe_3O_4 and Ni fine particles, are provided. We demonstrate that the anisotropy-field value estimated from a visual examination of an only apparently major hysteresis loop could be more than two times lower than the real one. - Highlights: • Proper estimation of hysteresis-loop parameters is vital in magnetic studies. • We propose criteria for discriminating minor from major hysteresis loops. • The criteria analyze magnetization, remanence and ZFC/FC curves and/or their derivatives. • Examples of their application on real nanoparticle systems are given. • Using the criteria could avoid twofold or larger saturation-field underestimation errors.
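The branch-comparison idea behind criterion (i) can be illustrated with a toy sketch; the closure fraction and tolerance are assumed values, not those of the paper:

```python
import math

def loop_is_saturated(h, m_asc, m_desc, closure_fraction=0.2, tol=1e-2):
    """On a truly major loop the ascending and descending branches merge well
    below the maximum applied field, so their gap over the top fraction of
    the field range should vanish; a minor loop stays open there."""
    h_max = max(h)
    gap = [abs(a - d) for hi, a, d in zip(h, m_asc, m_desc)
           if hi >= (1.0 - closure_fraction) * h_max]
    return bool(gap) and max(gap) < tol

# Toy loop branches M(H) = tanh(H -+ Hc) with Hc = 1 (arbitrary units).
def branches(h):
    return [math.tanh(x - 1.0) for x in h], [math.tanh(x + 1.0) for x in h]

h_major = [i * 0.1 for i in range(51)]   # field swept to 5*Hc: branches close
h_minor = [i * 0.1 for i in range(16)]   # field only to 1.5*Hc: loop still open
major = loop_is_saturated(h_major, *branches(h_major))
minor = loop_is_saturated(h_minor, *branches(h_minor))
```

On real data the same check is usually applied to the branch derivatives as well, since noise can mask a small residual opening in the magnetization itself.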

  6. Criteria for saturated magnetization loop

    Energy Technology Data Exchange (ETDEWEB)

    Harres, A. [Departamento de Física, UFSM, Santa Maria, 97105-900 Rio Grande do Sul (Brazil); Mikhov, M. [Faculty of Physics, University of Sofia, 1164 Sofia (Bulgaria); Skumryev, V. [Institució Catalana de Recerca i Estudis Avançats, 08010 Barcelona (Spain); Departament de Física, Universitat Autònoma de Barcelona, 08193 Barcelona (Spain); Andrade, A.M.H. de; Schmidt, J.E. [Instituto de Física, UFRGS, Porto Alegre, 91501-970 Rio Grande do Sul (Brazil); Geshev, J., E-mail: julian@if.ufrgs.br [Departament de Física, Universitat Autònoma de Barcelona, 08193 Barcelona (Spain); Instituto de Física, UFRGS, Porto Alegre, 91501-970 Rio Grande do Sul (Brazil)

    2016-03-15

Proper estimation of magnetization curve parameters is vital in studying magnetic systems. In the present article, criteria for discriminating non-saturated (minor) from saturated (major) hysteresis loops are proposed. These employ the analysis of (i) derivatives of both ascending and descending branches of the loop, (ii) remanent magnetization curves, and (iii) thermomagnetic curves. Computational simulations are used in order to demonstrate their validity. Examples illustrating the applicability of these criteria to well-known real systems, namely Fe_3O_4 and Ni fine particles, are provided. We demonstrate that the anisotropy-field value estimated from a visual examination of an only apparently major hysteresis loop could be more than two times lower than the real one. - Highlights: • Proper estimation of hysteresis-loop parameters is vital in magnetic studies. • We propose criteria for discriminating minor from major hysteresis loops. • The criteria analyze magnetization, remanence and ZFC/FC curves and/or their derivatives. • Examples of their application on real nanoparticle systems are given. • Using the criteria could avoid twofold or larger saturation-field underestimation errors.

  7. Retinal oxygen saturation before and after glaucoma surgery.

    Science.gov (United States)

    Nitta, Eri; Hirooka, Kazuyuki; Shimazaki, Takeru; Sato, Shino; Ukegawa, Kaori; Nakano, Yuki; Tsujikawa, Akitaka

    2017-08-01

This study compared retinal vessel oxygen saturation before and after glaucoma surgery. Retinal oxygen saturation in glaucoma patients was measured using a non-invasive spectrophotometric retinal oximeter. Adequate image quality was found in 49 of the 108 consecutive glaucoma patients recruited, with 30 undergoing trabeculectomy, 11 EX-PRESS and eight trabeculotomy. Retinal oxygen saturation measurements in the retinal arterioles and venules were performed at 1 day prior to and at approximately 10 days after surgery. Statistical analysis was performed using a Student's t-test. After glaucoma surgery, intraocular pressure (IOP) decreased from 19.8 ± 7.7 mmHg to 9.0 ± 5.7 mmHg. Glaucoma surgery had an effect on the retinal venous oxygen saturation. © 2016 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  8. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    Science.gov (United States)

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, thus the classical 'large p, small n' problem. The Bayesian hierarchical model, capable of borrowing strength across features within the same dataset, has been recognized as an effective tool in analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrated the superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.

  9. Saturated Zone Colloid Transport

    International Nuclear Information System (INIS)

    H. S. Viswanathan

    2004-01-01

This scientific analysis provides retardation factors for colloids transporting in the saturated zone (SZ) and the unsaturated zone (UZ). These retardation factors represent the reversible chemical and physical filtration of colloids in the SZ. The value of the colloid retardation factor, R_col, is dependent on several factors, such as colloid size, colloid type, and geochemical conditions (e.g., pH, Eh, and ionic strength). These factors are folded into the distributions of R_col that have been developed from field and experimental data collected under varying geochemical conditions with different colloid types and sizes. Attachment rate constants, k_att, and detachment rate constants, k_det, of colloids to the fracture surface have been measured for the fractured volcanics, and separate R_col uncertainty distributions have been developed for attachment and detachment to clastic material and mineral grains in the alluvium. Radionuclides such as plutonium and americium sorb mostly (90 to 99 percent) irreversibly to colloids (BSC 2004 [DIRS 170025], Section 6.3.3.2). The colloid retardation factors developed in this analysis are needed to simulate the transport of radionuclides that are irreversibly sorbed onto colloids; this transport is discussed in the model report ''Site-Scale Saturated Zone Transport'' (BSC 2004 [DIRS 170036]). Although it is not exclusive to any particular radionuclide release scenario, this scientific analysis especially addresses those scenarios pertaining to evidence from waste-degradation experiments, which indicate that plutonium and americium may be irreversibly attached to colloids for the time scales of interest. A section of this report will also discuss the validity of using microspheres as analogs to colloids in some of the lab and field experiments used to obtain the colloid retardation factors. In addition, a small fraction of colloids travels with the groundwater without any significant retardation. Radionuclides irreversibly
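For a single draw of the rate constants, reversible filtration follows the standard kinetic relation R_col = 1 + k_att/k_det; a minimal sketch (the report itself works with uncertainty distributions, and the rate values below are purely illustrative):

```python
def colloid_retardation(k_att, k_det):
    """Equilibrium retardation factor for reversible colloid filtration,
    R_col = 1 + k_att / k_det. Both rates must be in the same units
    (e.g., 1/s); k_det must be positive."""
    if k_det <= 0:
        raise ValueError("detachment rate must be positive")
    return 1.0 + k_att / k_det

# Example: attachment ten times faster than detachment retards the
# colloid plume by roughly a factor of eleven relative to groundwater.
r = colloid_retardation(k_att=1.0e-2, k_det=1.0e-3)
```

Sampling k_att and k_det from their uncertainty distributions and mapping each pair through this relation reproduces a distribution of R_col of the kind the analysis propagates downstream.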

  10. High-Throughput Analysis of T-DNA Location and Structure Using Sequence Capture.

    Directory of Open Access Journals (Sweden)

    Soichi Inagaki

Full Text Available Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA-genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.

  11. Cyber-T web server: differential analysis of high-throughput data.

    Science.gov (United States)

    Kayala, Matthew A; Baldi, Pierre

    2012-07-01

The Bayesian regularization method for high-throughput differential analysis, described in Baldi and Long (A Bayesian framework for the analysis of microarray expression data: regularized t-test and statistical inferences of gene changes. Bioinformatics 2001; 17: 509-519) and implemented in the Cyber-T web server, is one of the most widely validated. Cyber-T implements a t-test using a Bayesian framework to compute a regularized variance of the measurements associated with each probe under each condition. This regularized estimate is derived by flexibly combining the empirical measurements with a prior, or background, derived from pooling measurements associated with probes in the same neighborhood. This approach flexibly addresses problems associated with low replication levels and technology biases, not only for DNA microarrays, but also for other technologies, such as protein arrays, quantitative mass spectrometry and next-generation sequencing (RNA-seq). Here we present an update to the Cyber-T web server, incorporating several useful new additions and improvements. Several preprocessing data normalization options, including logarithmic and variance stabilizing normalization (VSN) transforms, are included. To augment two-sample t-tests, a one-way analysis of variance is implemented. Several methods for multiple tests correction, including standard frequentist methods and a probabilistic mixture model treatment, are available. Diagnostic plots allow visual assessment of the results. The web server provides comprehensive documentation and example data sets. The Cyber-T web server, with R source code and data sets, is publicly available at http://cybert.ics.uci.edu/.
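The regularized-variance idea can be sketched as follows, using a Baldi-Long-style point estimate with an assumed fixed background variance (Cyber-T instead pools the background from probes in the same neighborhood, so this is an illustration, not the server's code):

```python
import math

def regularized_ttest(x, y, v0=10, sigma0_sq=0.04):
    """Two-sample t-statistic with Bayesian-regularized variances: each
    group's empirical variance is shrunk toward a background value
    sigma0_sq using v0 pseudo-observations."""
    def reg(sample):
        n = len(sample)
        mean = sum(sample) / n
        s_sq = sum((v - mean) ** 2 for v in sample) / (n - 1)
        var = (v0 * sigma0_sq + (n - 1) * s_sq) / (v0 + n - 2)
        return mean, var, n
    mx, vx, nx = reg(x)
    my, vy, ny = reg(y)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Three replicates per condition: too few for a stable plain t-test,
# which is exactly where the regularization helps.
t = regularized_ttest([1.0, 1.2, 0.8], [2.0, 2.2, 1.8], v0=4, sigma0_sq=0.04)
```

With small n, the shrunken variance keeps probes with accidentally tiny empirical variance from producing runaway t-statistics.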

  12. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas; Castro, David; Foulds, Ian G.

    2013-01-01

This work demonstrates the detection method for a high-throughput droplet-based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads, we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card-based agglutination assay.

  13. Towards a high throughput droplet-based agglutination assay

    KAUST Repository

    Kodzius, Rimantas

    2013-10-22

This work demonstrates the detection method for a high-throughput droplet-based agglutination assay system. Using simple hydrodynamic forces to mix and aggregate functionalized microbeads, we avoid the need to use magnetic assistance or mixing structures. The concentration of our target molecules was estimated by agglutination strength, obtained through optical image analysis. Agglutination in droplets was performed with flow rates of 150 µl/min and occurred in under a minute, with potential to perform high-throughput measurements. The lowest target concentration detected in droplet microfluidics was 0.17 nM, which is three orders of magnitude more sensitive than a conventional card-based agglutination assay.

  14. Fairness analysis of throughput and delay in WLAN environments with channel diversities

    Directory of Open Access Journals (Sweden)

    Fang Shih-Hau

    2011-01-01

Full Text Available Abstract The article investigates fairness in terms of throughput and packet delays among users with diverse channel conditions due to mobility and fading effects in IEEE 802.11 WLAN (wireless local area network) environments. Our analytical results show that 802.11 CSMA/CA provides fairness among hosts with identical link qualities, regardless of whether equal or different data rates are applied. They further demonstrate that the presence of diverse channel conditions can cause significant unfairness in both throughput and packet delay, even with a link adaptation mechanism, since the available MCSs (modulation and coding schemes) are limited. The simulation results validate the accuracy of our analytical model.
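A standard way to quantify the kind of unfairness analyzed here is Jain's fairness index (our addition for illustration; the article's own analysis is model-based rather than index-based):

```python
def jain_index(allocations):
    """Jain's fairness index: 1.0 when all hosts get equal throughput
    (or delay), approaching 1/n when a single host dominates."""
    n = len(allocations)
    total = sum(allocations)
    return total * total / (n * sum(a * a for a in allocations))

equal  = jain_index([5.0, 5.0, 5.0, 5.0])   # identical link qualities
skewed = jain_index([9.0, 1.0, 1.0, 1.0])   # one good channel starves the rest
```

Comparing the index across channel-quality scenarios gives a single scalar summary of the fairness degradation the analytical model predicts.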

  15. An Analysis and Design for Nonlinear Quadratic Systems Subject to Nested Saturation

    Directory of Open Access Journals (Sweden)

    Minsong Zhang

    2013-01-01

Full Text Available This paper considers the stability problem for nonlinear quadratic systems with nested saturation on the input. The treatment of nested saturation proposed here makes use of a well-established linear differential control tool. The new conclusions encompass existing results on this issue and are less conservative than before. A simulation example illustrates the effectiveness of the established methodologies.

  16. Nonlinear Gain Saturation in Active Slow Light Photonic Crystal Waveguides

    DEFF Research Database (Denmark)

    Chen, Yaohui; Mørk, Jesper

    2013-01-01

We present a quantitative three-dimensional analysis of slow-light enhanced traveling-wave amplification in active semiconductor photonic crystal waveguides. The impact of slow-light propagation on the nonlinear gain saturation of the device is investigated.

  17. Gluon saturation in a saturated environment

    International Nuclear Information System (INIS)

    Kopeliovich, B. Z.; Potashnikova, I. K.; Schmidt, Ivan

    2011-01-01

A bootstrap equation for self-quenched gluon shadowing leads to a reduced magnitude of broadening for partons propagating through a nucleus. Saturation of small-x gluons in a nucleus, which has the form of transverse momentum broadening of projectile gluons in pA collisions in the nuclear rest frame, leads to a modification of the parton distribution functions in the beam compared with pp collisions. In nucleus-nucleus collisions all participating nucleons acquire enhanced gluon density at small x, which further boosts the saturation scale. Solution of the reciprocity equations for central collisions of two heavy nuclei demonstrates a significant enhancement of the saturation scale Q_sA^2, up to several times, in AA compared with pA collisions.

  18. High-throughput tandem mass spectrometry multiplex analysis for newborn urinary screening of creatine synthesis and transport disorders, Triple H syndrome and OTC deficiency.

    Science.gov (United States)

    Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela

    2014-09-25

Creatine synthesis and transport disorders, Triple H syndrome and ornithine transcarbamylase deficiency are treatable inborn errors of metabolism. Early screening of patients was found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. A high-throughput methodology was devised for the simultaneous analysis of creatine, guanidineacetic acid, orotic acid, uracil, creatinine and respective internal standards, using both positive and negative electrospray ionization modes, depending on the compound. The precision and accuracy were suitable for the screening of these inherited disorders by biochemical laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. High-throughput continuous cryopump

    International Nuclear Information System (INIS)

    Foster, C.A.

    1986-01-01

A cryopump with a unique method of regeneration which allows continuous operation at high throughput has been constructed and tested. Deuterium was pumped continuously at a throughput of 30 Torr.L/s at a speed of 2000 L/s and a compression ratio of 200. Argon was pumped at a throughput of 60 Torr.L/s at a speed of 1275 L/s. To produce continuous operation of the pump, a method of regeneration that does not thermally cycle the pump is employed. A small chamber (the ''snail'') passes over the pumping surface and removes the frost from it either by mechanical action with a scraper or by local heating. The material removed is topologically in a secondary vacuum system with low conductance into the primary vacuum; thus, the exhaust can be pumped at pressures up to an effective compression ratio determined by the ratio of the pumping speed to the leakage conductance of the snail. The pump, which is all-metal-sealed and dry and which regenerates every 60 s, would be an ideal system for pumping tritium. Potential fusion applications are for pump limiters, for repeating pneumatic pellet injection lines, and for the centrifuge pellet injector spin tank, all of which will require pumping tritium at high throughput. Industrial applications requiring ultraclean pumping of corrosive gases at high throughput, such as the reactive ion etch semiconductor process, may also be feasible.
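The quoted figures can be checked against the basic vacuum relation Q = S * p (throughput equals pumping speed times inlet pressure):

```python
def inlet_pressure(throughput_torr_l_s, speed_l_s):
    """Vacuum relation Q = S * p: a pump moving gas at throughput Q
    (Torr*L/s) with pumping speed S (L/s) sees inlet pressure p = Q / S
    in Torr."""
    return throughput_torr_l_s / speed_l_s

p_d2 = inlet_pressure(30.0, 2000.0)   # deuterium figures quoted above
p_ar = inlet_pressure(60.0, 1275.0)   # argon figures quoted above
```

For the deuterium case this gives an inlet pressure of 0.015 Torr, consistent with continuous high-throughput operation rather than batch regeneration.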

  20. Analysis of saturation effects on the operation of magnetic-controlled switcher type FCL

    Directory of Open Access Journals (Sweden)

    Faramarz Faghihi

    2009-12-01

Full Text Available With the extensive growth of electrical power systems, suppression of fault currents is an important subject for guaranteeing system security. Superconducting fault current limiters (SFCLs) are expected to be a practical type of power apparatus for reducing fault currents in the power system. The results show that under the normal state, the FCL has no obvious effect on the power system; under the fault state, the current-limiting inductance connected in the bias circuit is inserted into the fault circuit to limit the fault current. By regulating the bias current, the FCL voltage loss under the normal state and the fault current can be adjusted to a prescribed level. This kind of SFCL uses the nonlinear permeability of the magnetic core to create a sufficient impedance, and the transient performance considering magnetic saturation is analyzed with a Preisach model. The Preisach model, which intrinsically captures nonlinear hysteretic properties, is used as the numerical method for the analysis of saturation effects; it is able to identify isotropic and anisotropic behaviour. The main idea is to compute the magnetization vector in two steps independently, amplitude and phase. The described model yields results in qualitative agreement with the experimental results.
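The classical scalar Preisach construction invoked above can be sketched with a handful of hysterons; the switching thresholds below are illustrative, not fitted to an FCL core:

```python
def preisach_magnetization(inputs, hysterons):
    """Minimal scalar Preisach model: each hysteron (beta, alpha), with
    beta <= alpha, switches up (+1) when the input exceeds alpha and down
    (-1) when it drops below beta; the output is the average hysteron
    state, which is history-dependent."""
    states = [-1.0] * len(hysterons)   # start from negative saturation
    outputs = []
    for h in inputs:
        for i, (beta, alpha) in enumerate(hysterons):
            if h >= alpha:
                states[i] = 1.0
            elif h <= beta:
                states[i] = -1.0
            # between beta and alpha the hysteron keeps its state
        outputs.append(sum(states) / len(states))
    return outputs

hysterons = [(-0.6, 0.2), (-0.2, 0.6), (-0.4, 0.4)]
up   = preisach_magnetization([0.0, 0.3, 0.7, 1.0], hysterons)  # field ramped up
down = preisach_magnetization([1.0, 0.3, 0.0], hysterons)       # field ramped down
```

The same intermediate field (0.3) produces different outputs on the rising and falling runs, which is exactly the hysteresis a saturable FCL core exhibits.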

  1. Lipid order, saturation and surface property relationships: a study of human meibum saturation.

    Science.gov (United States)

    Mudgil, Poonam; Borchman, Douglas; Yappert, Marta C; Duran, Diana; Cox, Gregory W; Smith, Ryan J; Bhola, Rahul; Dennis, Gary R; Whitehall, John S

    2013-11-01

    Tear film stability decreases with age; however, the cause(s) of the instability are speculative. The more saturated meibum of infants may contribute to tear film stability. The meibum lipid phase transition temperature and lipid hydrocarbon chain order at physiological temperature (33 °C) decrease with increasing age. It is reasonable that stronger lipid-lipid interactions could stabilize the tear film, since these interactions must be broken for tear break-up to occur. In this study, meibum from a pool of adult donors was saturated catalytically. The influence of saturation on meibum hydrocarbon chain order was determined by infrared spectroscopy. Meibum is in an anhydrous state in the meibomian glands and on the surface of the eyelid. The influence of saturation on the surface properties of meibum was determined using Langmuir trough technology. Saturation of native human meibum did not change the minimum or maximum values of hydrocarbon chain order, so at temperatures far above or below the phase transition of human meibum, saturation does not play a role in ordering or disordering the lipid hydrocarbon chains. Saturation did, however, increase the phase transition temperature of human meibum by over 20 °C, a relatively large shift. Surface pressure-area studies showing the late take-off and higher maximum surface pressure of saturated meibum compared with native meibum suggest that the saturated meibum film is quite molecularly ordered (stiff molecular arrangement) and elastic (molecules are able to rearrange during compression and expansion) compared with native meibum films, which are more fluid, in agreement with the infrared spectroscopic results of this study. In saturated meibum, the formation of compacted, ordered islands of lipids above the surfactant layer would be expected to decrease the rate of evaporation compared with the more fluid and loosely packed native meibum. Higher surface pressure observed with films of saturated meibum compared to native meibum

  2. Simultaneous analysis of saturated and unsaturated fatty acids present in pequi fruits by capillary electrophoresis

    Directory of Open Access Journals (Sweden)

    Patrícia M. de Castro Barra

    2013-01-01

    Full Text Available In the current study, an alternative method has been proposed for simultaneous analysis of palmitic, stearic, oleic, linoleic, and linolenic acids by capillary zone electrophoresis (CZE) using indirect detection. The background electrolyte (BGE) used for the analysis of these fatty acids (FAs) consisted of 15.0 mmol L−1 NaH2PO4/Na2HPO4 at pH 6.86, 4.0 mmol L−1 SDBS, 8.3 mmol L−1 Brij 35, 45% v/v acetonitrile (ACN), and 2.1% n-octanol. Quantification of the FAs was performed using a response factor approach, which provided a high analytical throughput for the real sample. The CZE method, which was applied successfully for the analysis of pequi pulp, has advantages such as short analysis time, absence of lipid fraction extraction and derivatization steps, and no significant difference in the 95% confidence intervals for FA quantification results, compared to the gas chromatography official method (AOCS Ce 1h-05).
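The response-factor idea used for quantification can be sketched in a few lines: each fatty acid's response factor is calibrated once against an internal standard, after which a sample concentration follows from a single peak-area ratio. All areas and concentrations below are invented for illustration, not the paper's data.

```python
# Hedged sketch of response-factor quantification with an internal standard.

def response_factor(area_std, conc_std, area_is, conc_is):
    """RF = (A_analyte / A_IS) / (C_analyte / C_IS), from a standard mix."""
    return (area_std / area_is) / (conc_std / conc_is)

def quantify(area_sample, area_is, conc_is, rf):
    """Invert the RF definition to recover the analyte concentration."""
    return (area_sample / area_is) * conc_is / rf

# Calibration run (invented numbers), then a sample run for "oleic acid"
rf_oleic = response_factor(area_std=1500.0, conc_std=0.50,
                           area_is=1200.0, conc_is=0.40)
c_oleic = quantify(area_sample=900.0, area_is=1100.0, conc_is=0.40,
                   rf=rf_oleic)
```

Because only area ratios enter, run-to-run variations in injection volume largely cancel, which is what makes the approach fast for real samples.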

  3. A sensitivity analysis on seismic tomography data with respect to CO2 saturation of a CO2 geological sequestration field

    Science.gov (United States)

    Park, Chanho; Nguyen, Phung K. T.; Nam, Myung Jin; Kim, Jongwook

    2013-04-01

    Monitoring CO2 migration and storage in geological formations is important not only for the stability of geological sequestration of CO2 but also for efficient management of CO2 injection. In particular, geophysical methods permit in situ observation of CO2, both to assess the potential leakage of CO2 and to improve reservoir description, as well as to monitor the development of geologic discontinuities (i.e., faults, cracks, joints, etc.). Geophysical monitoring can be based on wireline logging for well-scale monitoring (high resolution, narrow area of investigation) or on surface surveys for basin-scale monitoring (low resolution, wide area of investigation). In the meantime, crosswell tomography offers reservoir-scale monitoring that bridges the resolution gap between well logs and surface measurements. This study focuses on reservoir-scale monitoring based on crosswell seismic tomography, aiming to describe details of reservoir structure and to monitor the migration of reservoir fluids (water and CO2). For the monitoring, we first make a sensitivity analysis on crosswell seismic tomography data with respect to CO2 saturation. For the sensitivity analysis, Rock Physics Models (RPMs) are constructed by calculating the density and the P- and S-wave velocities of a virtual CO2 injection reservoir. Since the seismic velocity of the reservoir changes appreciably with CO2 saturation only when the saturation is less than about 20%, and is insensitive to further changes above that level, the sensitivity analysis is mainly made for CO2 saturations below 20%. For precise simulation of seismic tomography responses for the constructed RPMs, we developed a time-domain 2D elastic modeling scheme based on the finite difference method with a staggered grid, employing a convolutional perfectly matched layer boundary condition. We further make a comparison between the sensitivities of seismic tomography and surface measurements for RPMs to analyse resolution
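The claim that velocity is mainly sensitive below roughly 20% CO2 saturation can be illustrated with a standard Gassmann fluid-substitution sketch: with a Reuss (uniform) fluid mix, the pore-fluid bulk modulus collapses as soon as a little compressible CO2 is added, so Vp drops sharply at low saturation and then flattens. All moduli and densities below are assumed round numbers (GPa, g/cm3), not the study's RPM values.

```python
import math

# Illustrative rock-physics sketch: P-velocity vs CO2 saturation via
# Gassmann fluid substitution with a Reuss-averaged pore fluid.

def vp_vs_saturation(s_co2, k_dry=2.6, mu=3.0, k_min=36.0, phi=0.3,
                     k_brine=2.6, k_co2=0.04, rho_grain=2.65,
                     rho_brine=1.0, rho_co2=0.5):
    # Reuss (isostress) average for the brine/CO2 fluid bulk modulus
    k_fl = 1.0 / ((1 - s_co2) / k_brine + s_co2 / k_co2)
    # Gassmann equation for the saturated-rock bulk modulus
    b = (1 - k_dry / k_min) ** 2
    k_sat = k_dry + b / (phi / k_fl + (1 - phi) / k_min - k_dry / k_min ** 2)
    # Bulk density of grains plus mixed pore fluid
    rho = (1 - phi) * rho_grain + phi * ((1 - s_co2) * rho_brine
                                         + s_co2 * rho_co2)
    return math.sqrt((k_sat + 4.0 * mu / 3.0) / rho)  # km/s in these units

v0, v20, v80 = (vp_vs_saturation(s) for s in (0.0, 0.2, 0.8))
```

Almost all of the velocity change happens between 0% and 20% saturation; from 20% to 80% the change is an order of magnitude smaller, which is why the sensitivity analysis concentrates on the low-saturation range.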

  4. Digital Biomass Accumulation Using High-Throughput Plant Phenotype Data Analysis.

    Science.gov (United States)

    Rahaman, Md Matiur; Ahsan, Md Asif; Gillani, Zeeshan; Chen, Ming

    2017-09-01

    Biomass is an important phenotypic trait in functional ecology and growth analysis. The typical methods for measuring biomass are destructive, and they require numerous individuals to be cultivated for repeated measurements. With the advent of image-based high-throughput plant phenotyping facilities, non-destructive biomass measuring methods have attempted to overcome this problem. Thus, the estimation of the biomass of individual plants from their digital images is becoming more important. In this paper, we propose an approach to biomass estimation based on image-derived phenotypic traits. Several image-based biomass studies state that plant biomass can be estimated simply as a linear function of the projected plant area in images. However, we modeled the plant volume as a function of plant area, plant compactness, and plant age to generalize the linear biomass model. The obtained results confirm the proposed model, which can explain most of the observed variance during image-derived biomass estimation. Moreover, only a small difference was observed between actual and estimated digital biomass, which indicates that our proposed approach can be used to estimate digital biomass accurately.
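The generalized model described above (volume as a function of area, compactness, and age rather than area alone) amounts to a multiple linear regression. The sketch below fits such a model by ordinary least squares on synthetic plants; the coefficients and data are invented, not the paper's.

```python
# Sketch of the generalized linear biomass model: fit
# volume = b0 + b1*area + b2*compactness + b3*age by least squares.

def solve(a, b):
    """Tiny Gauss-Jordan elimination for the normal equations."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def fit_biomass(rows, volume):
    """rows: design matrix [1, area, compactness, age] per plant."""
    k = len(rows[0])
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(k)]
           for i in range(k)]
    atb = [sum(r[i] * v for r, v in zip(rows, volume)) for i in range(k)]
    return solve(ata, atb)

# Synthetic plants generated from volume = 0.5*area + 2.0*compactness + 0.1*age
data = [[1.0, a, c, t] for a in (5.0, 10.0, 20.0)
                       for c in (0.4, 0.7)
                       for t in (7.0, 14.0, 21.0)]
vol = [0.5 * r[1] + 2.0 * r[2] + 0.1 * r[3] for r in data]
coef = fit_biomass(data, vol)
```

On noise-free synthetic data the fit recovers the generating coefficients exactly, which is a quick sanity check before applying such a model to real image-derived traits.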

  5. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.
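The free/total logic behind non-depletive SPME can be sketched numerically: the fiber is first calibrated in protein-free buffer, where all analyte is free; the same fiber then reports the free concentration in plasma, and the protein-bound fraction follows by difference. This is a simplified textbook SPME relationship with invented numbers, not the paper's calibration.

```python
# Hedged sketch of non-depletive SPME quantitation (illustrative numbers).

def fiber_constant(n_extracted_buffer, conc_buffer):
    """Amount extracted per unit free concentration, calibrated in buffer
    (valid when extraction is negligible relative to the sample)."""
    return n_extracted_buffer / conc_buffer

def protein_binding(n_extracted_plasma, conc_total_plasma, fc):
    """Free concentration in plasma and the bound (protein-binding) fraction."""
    free = n_extracted_plasma / fc
    return free, 1.0 - free / conc_total_plasma

# Calibration in buffer: 0.8 ng extracted at 100 ng/mL (all free)
fc = fiber_constant(0.8, 100.0)
# Plasma at the same total concentration extracts less: most drug is bound
free, bound = protein_binding(0.12, 100.0, fc)
```

The smaller extracted amount in plasma directly encodes the free fraction, which is why depletion must stay negligible for the calibration to transfer between matrices.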

  6. Noise and saturation properties of semiconductor quantum dot optical amplifiers

    DEFF Research Database (Denmark)

    Berg, Tommy Winther; Mørk, Jesper

    2002-01-01

    We present a detailed theoretical analysis of quantum dot optical amplifiers. Due to the presence of a reservoir of wetting layer states, the saturation and noise properties differ markedly from bulk or QW amplifiers and may be significantly improved.

  7. Determination of saturation functions and wettability for chalk based on measured fluid saturations

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, D.; Bech, N.; Moeller Nielsen, C.

    1998-08-01

    The end effect of displacement experiments on low-permeability porous media is used for determination of relative permeability functions and capillary pressure functions. Saturation functions for a drainage process are determined from a primary drainage experiment. A reversal of the flooding direction creates an intrinsic imbibition process in the sample, which enables determination of imbibition saturation functions. The saturation functions are determined by a parameter estimation technique. Scanning effects are modelled by the method of Killough. Saturation profiles are determined by NMR. (au)

  8. Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data.

    Science.gov (United States)

    Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo

    2011-12-15

    High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or broad signals, CLIP-Seq and RNA-Seq. We prove the effectiveness of Pyicos to select for significant signals and show that its accuracy is comparable and sometimes superior to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS datatypes through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Open-source software, with tutorials and protocol files, is available at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy eduardo.eyras@upf.edu Supplementary data are available at Bioinformatics online.

  9. Throughput rate study

    International Nuclear Information System (INIS)

    Ford, L.; Bailey, W.; Gottlieb, P.; Emami, F.; Fleming, M.; Robertson, D.

    1993-01-01

    The Civilian Radioactive Waste Management System (CRWMS) Management and Operating (M&O) Contractor has completed a study to analyze system-wide impacts of operating the CRWMS at varying throughput rates, including the 3000 MTU/year rate which has been assumed in the past. Impacts of throughput rate on all phases of the CRWMS operations (acceptance, transportation, storage and disposal) were evaluated. The results of the study indicate that a range from 3000 to 5000 MTU/year is preferred, based on system cost per MTU of SNF emplaced and logistics constraints

  10. Throughput Analysis on 3-Dimensional Underwater Acoustic Network with One-Hop Mobile Relay

    Science.gov (United States)

    Zhong, Xuefeng; Fan, Jiasheng; Guan, Quansheng; Ji, Fei; Yu, Hua

    2018-01-01

    Underwater acoustic communication network (UACN) has been considered as an essential infrastructure for ocean exploitation. Performance analysis of UACN is important in underwater acoustic network deployment and management. In this paper, we analyze the network throughput of three-dimensional randomly deployed transmitter–receiver pairs. Due to the long delay of acoustic channels, complicated networking protocols with heavy signaling overhead may not be appropriate. In this paper, we consider only one-hop or two-hop transmission, to save the signaling cost. That is, we assume the transmitter sends the data packet to the receiver by one-hop direct transmission, or by two-hop transmission via mobile relays. We derive the closed-form formulation of packet delivery rate with respect to the transmission delay and the number of transmitter–receiver pairs. The correctness of the derivation results are verified by computer simulations. Our analysis indicates how to obtain a precise tradeoff between the delay constraint and the network capacity. PMID:29337911

  11. Throughput Analysis on 3-Dimensional Underwater Acoustic Network with One-Hop Mobile Relay.

    Science.gov (United States)

    Zhong, Xuefeng; Chen, Fangjiong; Fan, Jiasheng; Guan, Quansheng; Ji, Fei; Yu, Hua

    2018-01-16

    Underwater acoustic communication network (UACN) has been considered as an essential infrastructure for ocean exploitation. Performance analysis of UACN is important in underwater acoustic network deployment and management. In this paper, we analyze the network throughput of three-dimensional randomly deployed transmitter-receiver pairs. Due to the long delay of acoustic channels, complicated networking protocols with heavy signaling overhead may not be appropriate. In this paper, we consider only one-hop or two-hop transmission, to save the signaling cost. That is, we assume the transmitter sends the data packet to the receiver by one-hop direct transmission, or by two-hop transmission via mobile relays. We derive the closed-form formulation of packet delivery rate with respect to the transmission delay and the number of transmitter-receiver pairs. The correctness of the derivation results are verified by computer simulations. Our analysis indicates how to obtain a precise tradeoff between the delay constraint and the network capacity.
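The one-hop/two-hop tradeoff analyzed in these two records can be mimicked with a toy Monte Carlo, not the paper's closed-form derivation: nodes are dropped uniformly in a cube, each hop has an assumed maximum acoustic range, and a packet counts as delivered when its total propagation delay meets the deadline. All geometry and range parameters are invented.

```python
import math
import random

# Toy Monte Carlo of one-hop vs relay-assisted two-hop delivery underwater.

SOUND_SPEED = 1500.0  # m/s, nominal speed of sound in seawater

def delivery_rate(deadline_s, hop_range=1500.0, n_trials=2000,
                  side=3000.0, seed=1):
    """Fraction of packets delivered within the deadline, allowing one relay."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_trials):
        tx, rx, relay = (tuple(rng.uniform(0, side) for _ in range(3))
                         for _ in range(3))
        d = math.dist(tx, rx)
        if d <= hop_range and d / SOUND_SPEED <= deadline_s:
            ok += 1                      # one-hop direct transmission
            continue
        l1, l2 = math.dist(tx, relay), math.dist(relay, rx)
        if max(l1, l2) <= hop_range and (l1 + l2) / SOUND_SPEED <= deadline_s:
            ok += 1                      # two-hop transmission via the relay
    return ok / n_trials

rates = [delivery_rate(t) for t in (0.5, 1.5, 3.0)]
```

Sweeping the deadline reproduces the qualitative tradeoff the records describe: a looser delay constraint raises the delivery rate, and the relay only helps pairs that are out of direct range but still reachable in two hops within the deadline.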

  12. Saturated Zone Colloid Transport

    Energy Technology Data Exchange (ETDEWEB)

    H. S. Viswanathan

    2004-10-07

    This scientific analysis provides retardation factors for colloids transporting in the saturated zone (SZ) and the unsaturated zone (UZ). These retardation factors represent the reversible chemical and physical filtration of colloids in the SZ. The value of the colloid retardation factor, R_col, is dependent on several factors, such as colloid size, colloid type, and geochemical conditions (e.g., pH, Eh, and ionic strength). These factors are folded into the distributions of R_col that have been developed from field and experimental data collected under varying geochemical conditions with different colloid types and sizes. Attachment rate constants, k_att, and detachment rate constants, k_det, of colloids to the fracture surface have been measured for the fractured volcanics, and separate R_col uncertainty distributions have been developed for attachment and detachment to clastic material and mineral grains in the alluvium. Radionuclides such as plutonium and americium sorb mostly (90 to 99 percent) irreversibly to colloids (BSC 2004 [DIRS 170025], Section 6.3.3.2). The colloid retardation factors developed in this analysis are needed to simulate the transport of radionuclides that are irreversibly sorbed onto colloids; this transport is discussed in the model report ''Site-Scale Saturated Zone Transport'' (BSC 2004 [DIRS 170036]). Although it is not exclusive to any particular radionuclide release scenario, this scientific analysis especially addresses those scenarios pertaining to evidence from waste-degradation experiments, which indicate that plutonium and americium may be irreversibly attached to colloids for the time scales of interest. A section of this report will also discuss the validity of using microspheres as analogs to colloids in some of the lab and field experiments used to obtain the colloid retardation factors. In addition, a small fraction of colloids travels with the groundwater without any significant
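To make concrete how a retardation factor enters a downstream transport calculation: with reversible filtration, colloids effectively move at the groundwater velocity divided by R_col, so breakthrough is delayed by that factor. The path length, velocity, and R_col value below are invented illustrations; real SZ transport uses the full site-scale model cited in the record.

```python
# Toy use of a colloid retardation factor in a 1-D travel-time estimate.

def breakthrough_time(path_m, gw_velocity_m_per_yr, r_col):
    """Arrival time of the colloid front over a flow path, in years."""
    return path_m * r_col / gw_velocity_m_per_yr

t_unretarded = breakthrough_time(5000.0, 10.0, 1.0)   # conservative tracer
t_colloid = breakthrough_time(5000.0, 10.0, 8.0)      # retarded colloid
```

The uncertainty distributions of R_col described above would translate directly into a distribution of such breakthrough times.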

  13. Pheno2Geno - High-throughput generation of genetic markers and maps from molecular phenotypes for crosses between inbred strains.

    Science.gov (United States)

    Zych, Konrad; Li, Yang; van der Velde, Joeri K; Joosen, Ronny V L; Ligterink, Wilco; Jansen, Ritsert C; Arends, Danny

    2015-02-19

    Genetic markers and maps are instrumental in quantitative trait locus (QTL) mapping in segregating populations. The resolution of QTL localization depends on the number of informative recombinations in the population and how well they are tagged by markers. Larger populations and denser marker maps are better for detecting and locating QTLs. Marker maps that are initially too sparse can be saturated or derived de novo from high-throughput omics data, (e.g. gene expression, protein or metabolite abundance). If these molecular phenotypes are affected by genetic variation due to a major QTL they will show a clear multimodal distribution. Using this information, phenotypes can be converted into genetic markers. The Pheno2Geno tool uses mixture modeling to select phenotypes and transform them into genetic markers suitable for construction and/or saturation of a genetic map. Pheno2Geno excludes candidate genetic markers that show evidence for multiple possibly epistatically interacting QTL and/or interaction with the environment, in order to provide a set of robust markers for follow-up QTL mapping. We demonstrate the use of Pheno2Geno on gene expression data of 370,000 probes in 148 A. thaliana recombinant inbred lines. Pheno2Geno is able to saturate the existing genetic map, decreasing the average distance between markers from 7.1 cM to 0.89 cM, close to the theoretical limit of 0.68 cM (with 148 individuals we expect a recombination every 100/148=0.68 cM); this pinpointed almost all of the informative recombinations in the population. The Pheno2Geno package makes use of genome-wide molecular profiling and provides a tool for high-throughput de novo map construction and saturation of existing genetic maps. Processing of the showcase dataset takes less than 30 minutes on an average desktop PC. Pheno2Geno improves QTL mapping results at no additional laboratory cost and with minimum computational effort. Its results are formatted for direct use in R/qtl, the leading R
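The core Pheno2Geno idea, converting a bimodal molecular phenotype into a genetic marker, can be sketched with a tiny one-dimensional two-means split standing in for the package's mixture modeling. The simulated population below (two genotype classes with well-separated expression means) is invented for illustration.

```python
import random

# Sketch: turn a major-QTL-driven bimodal expression trait into an A/B marker.

def phenotype_to_marker(values, n_iter=50):
    """Split a bimodal trait into two genotype classes with 1-D two-means."""
    lo, hi = min(values), max(values)          # initial component centers
    for _ in range(n_iter):
        a = [v for v in values if abs(v - lo) <= abs(v - hi)]
        b = [v for v in values if abs(v - lo) > abs(v - hi)]
        if not a or not b:
            break
        lo, hi = sum(a) / len(a), sum(b) / len(b)
    return ['A' if abs(v - lo) <= abs(v - hi) else 'B' for v in values]

rng = random.Random(0)
truth = ['A'] * 70 + ['B'] * 78                # simulated RIL genotypes
expr = [rng.gauss(2.0 if g == 'A' else 6.0, 0.5) for g in truth]
marker = phenotype_to_marker(expr)
accuracy = sum(m == t for m, t in zip(marker, truth)) / len(truth)
```

When the two components are well separated, as for a trait under a major QTL, the inferred marker genotypes match the true ones almost perfectly; traits with overlapping or multimodal distributions are the ones Pheno2Geno filters out.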

  14. A Perturbation Analysis of Harmonics Generation from Saturated Elements in Power Systems

    Science.gov (United States)

    Kumano, Teruhisa

    Nonlinear phenomena such as saturation of magnetic flux have considerable effects on power system analysis. It is reported that a failure in a real 500 kV system triggered islanding operation, where the resulting even harmonics caused malfunctions in protective relays. It is also reported that the major origin of this wave distortion is unidirectional magnetization of the transformer iron core. Time simulation is widely used today to analyze this type of phenomena, but it has basically two shortcomings. One is that the time simulation takes too much computing time in the vicinity of inflection points of the saturation characteristic curve, because an iterative procedure such as Newton-Raphson (N-R) must be used, and such methods tend to be caught in ill-conditioned numerical hunting. The other is that such simulation methods sometimes do not help intuitive understanding of the studied phenomenon, because the whole set of nonlinear equations is treated in matrix form and not properly divided into understandable parts, as is done for linear systems. This paper proposes a new computation scheme based on the so-called perturbation method. Magnetic saturation of the iron cores of a generator and a transformer is taken into account. The proposed method addresses the first shortcoming of the N-R based time simulation stated above: no iterative process is used to reduce the equation residual; instead, a perturbation series is computed, which makes the method free from the ill-conditioning problem. Users have only to calculate the perturbation terms one by one until the necessary accuracy is reached. In the numerical example treated in the present paper, the first-order perturbation already gives reasonably high accuracy, which means very fast computing. In the numerical study three nonlinear elements are considered. Calculated results are almost identical to those of the conventional Newton-Raphson based time simulation, which shows the validity of the method. The
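The iteration-free flavor of the perturbation approach can be shown on a toy equation, not the paper's power-system model: solve v = L*i + k*i**3 for i, where the cubic term plays the role of magnetic saturation and k is the small parameter. Expanding i = i0 + k*i1 + k^2*i2 and matching powers of k gives each correction in closed form, with no Newton iteration at all.

```python
# Toy comparison: closed-form perturbation series vs Newton-Raphson.

def perturbation_current(v, L, k):
    """Second-order perturbation solution of v = L*i + k*i**3."""
    i0 = v / L                    # zeroth order: the unsaturated solution
    i1 = -i0 ** 3 / L             # first-order correction (from k^1 terms)
    i2 = 3 * i0 ** 5 / L ** 2     # second-order correction (from k^2 terms)
    return i0 + k * i1 + k ** 2 * i2

def newton_current(v, L, k, tol=1e-12, max_iter=100):
    """Reference Newton-Raphson solution of the same equation."""
    i = v / L
    for _ in range(max_iter):
        f = L * i + k * i ** 3 - v
        if abs(f) < tol:
            break
        i -= f / (L + 3 * k * i ** 2)
    return i

i_pert = perturbation_current(v=1.0, L=1.0, k=0.1)
i_newt = newton_current(v=1.0, L=1.0, k=0.1)
```

For mild saturation (small k) the low-order series already sits close to the iterated solution, which mirrors the record's observation that first-order perturbation gave good accuracy at a fraction of the cost.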

  15. High throughput integrated thermal characterization with non-contact optical calorimetry

    Science.gov (United States)

    Hou, Sichao; Huo, Ruiqing; Su, Ming

    2017-10-01

    Commonly used thermal analysis tools such as calorimeter and thermal conductivity meter are separated instruments and limited by low throughput, where only one sample is examined each time. This work reports an infrared based optical calorimetry with its theoretical foundation, which is able to provide an integrated solution to characterize thermal properties of materials with high throughput. By taking time domain temperature information of spatially distributed samples, this method allows a single device (infrared camera) to determine the thermal properties of both phase change systems (melting temperature and latent heat of fusion) and non-phase change systems (thermal conductivity and heat capacity). This method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities have been determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.

  16. Mechanics of non-saturated soils

    International Nuclear Information System (INIS)

    Coussy, O.; Fleureau, J.M.

    2002-01-01

    This book presents the different ways to approach the mechanics of non-saturated soils, from the physico-chemical aspect to the mechanical aspect, from experiment to theoretical modeling, from the laboratory to the engineering works, and from the microscopic scale to the macroscopic one. Contents: water and its representation; experimental bases of the behaviour of non-saturated soils; transfer laws in non-saturated environments; energy approach of the behaviour of non-saturated soils; homogenization for non-saturated soils; plasticity and hysteresis; dams and backfills; engineered barriers. (J.S.)

  17. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    Science.gov (United States)

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable even though it requires minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers, between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 +/- 6.0% between clades and of 21.0 +/- 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.
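The within-clade versus between-clade comparison reported above can be re-created in miniature: each individual is a 0/1 vector of 49 markers, genetic difference is the fraction of mismatched markers, and strains differ in their marker frequencies. The frequencies, flip pattern, and sample sizes below are invented, not the study's data.

```python
import random

# Illustrative within- vs between-strain marker difference computation.

def difference(a, b):
    """Fraction of mismatched markers between two fingerprints."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def simulate_strain(rng, freqs, n=10):
    """n individuals, each a 0/1 vector drawn from the strain's frequencies."""
    return [[1 if rng.random() < p else 0 for p in freqs] for _ in range(n)]

rng = random.Random(7)
freq_a = [rng.random() for _ in range(49)]           # 49 scoreable markers
# Second strain: shift a third of the marker frequencies
freq_b = [1 - p if i % 3 == 0 else p for i, p in enumerate(freq_a)]
strain_a = simulate_strain(rng, freq_a)
strain_b = simulate_strain(rng, freq_b)

within = [difference(a, b) for i, a in enumerate(strain_a)
          for b in strain_a[i + 1:]]
between = [difference(a, b) for a in strain_a for b in strain_b]
mean_within = sum(within) / len(within)
mean_between = sum(between) / len(between)
```

A clear gap between the between-group and within-group means is exactly the signal that lets cluster analysis separate individuals into clades by geographical origin.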

  18. Torque Analysis With Saturation Effects for Non-Salient Single-Phase Permanent-Magnet Machines

    DEFF Research Database (Denmark)

    Lu, Kaiyuan; Ritchie, Ewen

    2011-01-01

    The effects of saturation on torque production for non-salient, single-phase, permanent-magnet machines are studied in this paper. An analytical torque equation is proposed to predict the instantaneous torque with saturation effects. Compared to the existing methods, it is computationally faster. The analytical predictions agree with finite-element results and with experimental results obtained on a prototype single-phase permanent-magnet machine.

  19. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    Technological advances in mass spectrometry and meticulous method development have produced several shotgun lipidomic approaches capable of characterizing lipid species by direct analysis of total lipid extracts. Shotgun lipidomics by hybrid quadrupole time-of-flight mass spectrometry allows the absolute quantification of hundreds of molecular glycerophospholipid species, glycerolipid species, sphingolipid species and sterol lipids. Future applications in clinical cohort studies demand detailed lipid molecule information and the application of high-throughput lipidomics platforms. In this review we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by microfluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer

  20. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analysis ...

  1. Quantitative digital image analysis of chromogenic assays for high throughput screening of alpha-amylase mutant libraries.

    Science.gov (United States)

    Shankar, Manoharan; Priyadharshini, Ramachandran; Gunasekaran, Paramasamy

    2009-08-01

    An image analysis-based method for high-throughput screening of an alpha-amylase mutant library using chromogenic assays was developed. Assays were performed in microplates, and high-resolution images of the assay plates were read using the Virtual Microplate Reader (VMR) script to quantify the concentration of the chromogen. This method is fast and sensitive in quantifying 0.025-0.3 mg starch/ml as well as 0.05-0.75 mg glucose/ml. It was also an effective screening method for improved alpha-amylase activity, with a coefficient of variation of 18%.
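The quantification step behind such an assay reduces to a standard curve: mean well intensity from the plate image is calibrated against known glucose standards with a straight line, and unknown wells are read off the inverted fit. The intensities below are made up; a real run would extract them from the plate photograph.

```python
# Sketch of standard-curve quantification for a chromogenic plate assay.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y = slope*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

conc_std = [0.05, 0.15, 0.30, 0.45, 0.60, 0.75]      # mg glucose/mL
signal_std = [12.0, 31.0, 60.5, 89.0, 120.0, 148.5]  # mean well intensity

slope, intercept = linear_fit(conc_std, signal_std)
# Invert the calibration for a sample well with intensity 95.0
unknown = (95.0 - intercept) / slope
```

The same inversion applied to every well of a high-resolution plate image is what turns the image into a table of substrate or product concentrations.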

  2. Femtosecond all-optical parallel logic gates based on tunable saturable to reverse saturable absorption in graphene-oxide thin films

    International Nuclear Information System (INIS)

    Roy, Sukhdev; Yadav, Chandresh

    2013-01-01

    A detailed theoretical analysis of the ultrafast transition from saturable absorption (SA) to reverse saturable absorption (RSA) is presented for graphene-oxide thin films with femtosecond laser pulses at 800 nm. An increase in pulse intensity leads to switching from SA to RSA with increased contrast, due to two-photon-absorption-induced excited-state absorption. Theoretical results are in good agreement with reported experimental results. Interestingly, it is also shown that an increase in concentration results in an RSA to SA transition. The switching has been optimized to design parallel all-optical femtosecond NOT, AND, OR, XOR, and the universal NAND and NOR logic gates.
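A minimal phenomenological picture of the SA-to-RSA switch combines a saturable ground-state term with an intensity-dependent (two-photon / excited-state) term: absorption first falls with intensity, reaches a transparency maximum, and then rises again. The parameter values below are illustrative, not fitted to graphene oxide.

```python
# Phenomenological intensity-dependent absorption: SA term + RSA term.

def absorption(i, a0=1.0, i_sat=1.0, beta=0.05):
    """Saturable ground-state absorption plus an RSA-like term beta*I."""
    return a0 / (1.0 + i / i_sat) + beta * i

intens = [0.1 * n for n in range(1, 400)]
alphas = [absorption(i) for i in intens]
# Intensity of maximum transparency: below it SA dominates, above it RSA.
i_min = intens[alphas.index(min(alphas))]
```

For these parameters the analytic minimum sits at I = sqrt(a0*i_sat/beta) - i_sat, about 3.47 in normalized units; the contrast between the SA and RSA branches around that point is what the logic-gate designs exploit.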

  3. High-Throughput Scoring of Seed Germination.

    Science.gov (United States)

    Ligterink, Wilco; Hilhorst, Henk W M

    2017-01-01

    High-throughput analysis of seed germination for phenotyping large genetic populations or mutant collections is very labor-intensive and would highly benefit from an automated setup. Although very often used, the total germination percentage after a nominated period of time is not very informative, as it lacks information about the start, rate, and uniformity of germination, which are highly indicative of traits such as dormancy, stress tolerance, and seed longevity. The calculation of cumulative germination curves requires information about the germination percentage at various time points. We developed the GERMINATOR package: a simple, highly cost-efficient, and flexible procedure for high-throughput automatic scoring and evaluation of germination that can be implemented without the use of complex robotics. The GERMINATOR package contains three modules: (I) design of the experimental setup, with various options to replicate and randomize samples; (II) automatic scoring of germination based on the color contrast between the protruding radicle and the seed coat in a single image; and (III) curve fitting of cumulative germination data and the extraction, recap, and visualization of the various germination parameters. GERMINATOR is a freely available package that allows the monitoring and analysis of several thousand germination tests, several times a day, by a single person.
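The curve-fitting module's idea can be sketched as follows: cumulative germination counts are fitted with a sigmoidal (here three-parameter Hill) curve, and the biologically meaningful parameters (maximum germination, t50, steepness) are read from the fit. The crude grid search below stands in for GERMINATOR's actual fitting routine, and the counts are invented.

```python
# Sketch: extract (g_max, t50, h) from a cumulative germination time series.

def hill(t, g_max, t50, h):
    """Cumulative germination (%) at time t for a 3-parameter Hill curve."""
    return g_max * t ** h / (t50 ** h + t ** h)

def fit_germination(times, germ):
    """Crude grid-search least-squares fit returning (g_max, t50, h)."""
    best, best_err = None, float('inf')
    for g_max in (80, 85, 90, 95, 100):
        for t50 in [i / 2 for i in range(2, 21)]:      # 1.0 .. 10.0 days
            for h in (1, 2, 3, 4, 5, 6):
                err = sum((hill(t, g_max, t50, h) - g) ** 2
                          for t, g in zip(times, germ))
                if err < best_err:
                    best, best_err = (g_max, t50, h), err
    return best

times = [1, 2, 3, 4, 5, 6, 7]            # days after sowing
germ = [2, 15, 45, 72, 85, 89, 90]       # cumulative germination (%)
g_max, t50, h = fit_germination(times, germ)
```

Unlike a single end-point percentage, the fitted parameters capture the start, rate, and maximum of germination, which is exactly the extra information the record argues for.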

  4. High-throughput analysis of amino acids in plant materials by single quadrupole mass spectrometry

    DEFF Research Database (Denmark)

    Dahl-Lassen, Rasmus; van Hecke, Jan Julien Josef; Jørgensen, Henning

    2018-01-01

    that it is very time consuming, with typical chromatographic run times of 70 min or more. Results: We have here developed a high-throughput method for analysis of amino acid profiles in plant materials. The method combines classical protein hydrolysis and derivatization with fast separation by UHPLC and detection by a single quadrupole (QDa) mass spectrometer. The chromatographic run time is reduced to 10 min, and the precision, accuracy and sensitivity of the method are in line with other recent methods utilizing advanced and more expensive mass spectrometers. The sensitivity of the method is at least a factor 10 ... reducing the overall analytical costs compared to methods based on more advanced mass spectrometers.

  5. Modeling of gain saturation effects in active semiconductor photonic crystal waveguides

    DEFF Research Database (Denmark)

    Chen, Yaohui; Mørk, Jesper

    2012-01-01

    In this paper, we present a theoretical analysis of slow-light enhanced light amplification in an active semiconductor photonic crystal line defect waveguide. The impact of enhanced light-matter interactions on carrier-depletion-induced modal gain saturation is investigated.

  6. Green throughput taxation

    International Nuclear Information System (INIS)

    Bruvoll, A.; Ibenholt, K.

    1998-01-01

    According to optimal taxation theory, raw materials should be taxed to capture the embedded scarcity rent in their value. To reduce both natural resource use and the corresponding emissions, or the throughput in the economic system, the best policy may be a tax on material inputs. As a first approach to throughput taxation, this paper considers a tax on intermediates in the framework of a dynamic computable general equilibrium model with environmental feedbacks. To balance the budget, payroll taxes are reduced. As a result, welfare indicators such as material consumption and leisure time consumption are reduced, while all the environmental indicators improve. 27 refs

  7. Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)

    Energy Technology Data Exchange (ETDEWEB)

    Hura, Greg L.; Menon, Angeli L.; Hammel, Michal; Rambo, Robert P.; Poole II, Farris L.; Tsutakawa, Susan E.; Jenney Jr, Francis E.; Classen, Scott; Frankel, Kenneth A.; Hopkins, Robert C.; Yang, Sungjae; Scott, Joseph W.; Dillard, Bret D.; Adams, Michael W. W.; Tainer, John A.

    2009-07-20

    We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.
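    A standard early step in automated SAXS data analysis is Guinier fitting to extract the radius of gyration Rg from the low-q region, where ln I(q) = ln I0 - (Rg^2/3) q^2 is linear in q^2. A minimal sketch on synthetic, noise-free data (the pipeline's own analysis chain is of course more elaborate):

```python
import numpy as np

rg_true, i0_true = 20.0, 1000.0       # Angstrom, arbitrary intensity units (synthetic)
q = np.linspace(0.005, 0.05, 40)      # 1/Angstrom, low-q region
intensity = i0_true * np.exp(-(q * rg_true) ** 2 / 3.0)

# Guinier fit: ln I(q) = ln I0 - (Rg^2 / 3) * q^2 is linear in q^2
mask = q * rg_true < 1.3              # conventional validity range of the approximation
slope, intercept = np.polyfit(q[mask] ** 2, np.log(intensity[mask]), 1)
rg_fit = np.sqrt(-3.0 * slope)
i0_fit = np.exp(intercept)
print(round(rg_fit, 2), round(i0_fit, 1))
```

    On real data the same fit is run over the q*Rg < 1.3 window after buffer subtraction, and deviations from linearity flag the aggregation or unfolding mentioned above.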

  8. Bulk segregant analysis by high-throughput sequencing reveals a novel xylose utilization gene from Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Jared W Wenger

    2010-05-01

    Fermentation of xylose is a fundamental requirement for the efficient production of ethanol from lignocellulosic biomass sources. Although they aggressively ferment hexoses, it has long been thought that native Saccharomyces cerevisiae strains cannot grow fermentatively or non-fermentatively on xylose. Population surveys have uncovered a few naturally occurring strains that are weakly xylose-positive, and some S. cerevisiae strains have been genetically engineered to ferment xylose, but no strain, either natural or engineered, has yet been reported to ferment xylose as efficiently as glucose. Here, we used a medium-throughput screen to identify Saccharomyces strains that can increase in optical density when xylose is presented as the sole carbon source. We identified 38 strains that have this xylose utilization phenotype, including strains of S. cerevisiae, other sensu stricto members, and hybrids between them. All the S. cerevisiae xylose-utilizing strains we identified are wine yeasts, and for those that could produce meiotic progeny, the xylose phenotype segregates as a single gene trait. We mapped this gene by Bulk Segregant Analysis (BSA) using tiling microarrays and high-throughput sequencing. The gene is a putative xylitol dehydrogenase, which we name XDH1, and is located in the subtelomeric region of the right end of chromosome XV in a region not present in the S288c reference genome. We further characterized the xylose phenotype by performing gene expression microarrays and by genetically dissecting the endogenous Saccharomyces xylose pathway. We have demonstrated that natural S. cerevisiae yeasts are capable of utilizing xylose as the sole carbon source, characterized the genetic basis for this trait as well as the endogenous xylose utilization pathway, and demonstrated the feasibility of BSA using high-throughput sequencing.

  9. Independent component analysis applied to pulse oximetry in the estimation of the arterial oxygen saturation (SpO2) - a comparative study

    DEFF Research Database (Denmark)

    Jensen, Thomas; Duun, Sune Bro; Larsen, Jan

    2009-01-01

    We examine various independent component analysis (ICA) digital signal processing algorithms for estimating the arterial oxygen saturation (SpO2) as measured by a reflective pulse oximeter. The ICA algorithms examined are FastICA, Maximum Likelihood ICA (ICAML), Molgedey and Schuster ICA (ICAMS), and Mean Field ICA (ICAMF). The signal processing includes pre-processing bandpass filtering to eliminate noise, and post-processing by calculating the SpO2. The algorithms are compared to the commercial state-of-the-art algorithm Discrete Saturation Transform (DST) by Masimo Corporation...

  10. SITE-SCALE SATURATED ZONE TRANSPORT

    International Nuclear Information System (INIS)

    S. KELLER

    2004-01-01

    This work provides a site-scale transport model for calculating radionuclide transport in the saturated zone (SZ) at Yucca Mountain, for use in the abstractions model in support of ''Total System Performance Assessment for License Application'' (TSPA-LA). The purpose of this model report is to provide documentation for the components of the site-scale SZ transport model in accordance with administrative procedure AP-SIII.10Q, Models. The initial documentation of this model report was conducted under the ''Technical Work Plan For: Saturated Zone Flow and Transport Modeling and Testing'' (BSC 2003 [DIRS 163965]). The model report has been revised in accordance with the ''Technical Work Plan For: Natural System--Saturated Zone Analysis and Model Report Integration'', Section 2.1.1.4 (BSC 2004 [DIRS 171421]) to incorporate Regulatory Integration Team comments. All activities listed in the technical work plan that are appropriate to the transport model are documented in this report and are described in Section 2.1.1.4 (BSC 2004 [DIRS 171421]). This report documents: (1) the advection-dispersion transport model including matrix diffusion (Sections 6.3 and 6.4); (2) a description and validation of the transport model (Sections 6.3 and 7); (3) the numerical methods for simulating radionuclide transport (Section 6.4); (4) the parameters (sorption coefficient, Kd) and their uncertainty distributions used for modeling radionuclide sorption (Appendices A and C); (5) the parameters used for modeling colloid-facilitated radionuclide transport (Table 4-1, Section 6.4.2.6, and Appendix B); and (6) alternative conceptual models and their dispositions (Section 6.6). The intended use of this model is to simulate transport in saturated fractured porous rock (double porosity) and alluvium. The particle-tracking method of simulating radionuclide transport is incorporated in the finite-volume heat and mass transfer numerical analysis (FEHM) computer code (FEHM V2.20, STN: 10086

  11. High-throughput characterization methods for lithium batteries

    Directory of Open Access Journals (Sweden)

    Yingchun Lyu

    2017-09-01

    The development of high-performance lithium ion batteries requires the discovery of new materials and the optimization of key components. In contrast to the traditional one-by-one method, a high-throughput method can synthesize and characterize a large number of compositionally varying samples, accelerating the discovery, development and optimization of materials. Because of rapid progress in thin-film and automatic control technologies, thousands of compounds with different compositions can now be synthesized rapidly, even in a single experiment. However, the lack of rapid or combinatorial characterization technologies to match high-throughput synthesis methods limits the application of high-throughput technology. Here, we review a series of representative high-throughput characterization methods used in lithium batteries, including high-throughput structural and electrochemical characterization methods and rapid measuring technologies based on synchrotron light sources.

  12. Femoral venous oxygen saturation is no surrogate for central venous oxygen saturation

    NARCIS (Netherlands)

    van Beest, Paul A.; van der Schors, Alice; Liefers, Henriëtte; Coenen, Ludo G. J.; Braam, Richard L.; Habib, Najib; Braber, Annemarije; Scheeren, Thomas W. L.; Kuiper, Michaël A.; Spronk, Peter E.

    2012-01-01

    Objective: The purpose of our study was to determine if central venous oxygen saturation and femoral venous oxygen saturation can be used interchangeably during surgery and in critically ill patients. Design: Prospective observational controlled study. Setting: Nonacademic university-affiliated

  13. Pathway Processor 2.0: a web resource for pathway-based analysis of high-throughput data.

    Science.gov (United States)

    Beltrame, Luca; Bianco, Luca; Fontana, Paolo; Cavalieri, Duccio

    2013-07-15

    Pathway Processor 2.0 is a web application designed to analyze high-throughput datasets, including but not limited to microarray and next-generation sequencing, using a pathway-centric logic. In addition to well-established methods such as the Fisher's test and impact analysis, Pathway Processor 2.0 offers innovative methods that convert gene expression into pathway expression, leading to the identification of differentially regulated pathways in a dataset of choice. Pathway Processor 2.0 is available as a web service at http://compbiotoolbox.fmach.it/pathwayProcessor/. Sample datasets to test the functionality can be used directly from the application. duccio.cavalieri@fmach.it Supplementary data are available at Bioinformatics online.

  14. Student throughput variables and properties: Varying cohort sizes

    Directory of Open Access Journals (Sweden)

    Lucas C.A. Stoop

    2017-11-01

    A recent research paper described how student throughput variables and properties combine to explain the behaviour of stationary or simplified throughput systems. Such behaviour can be understood in terms of the locus of a point in the triangular admissible region of the H-S plane, where H represents headcounts and S successful credits, each depending on the system properties at that point. The efficiency of the student throughput process is given by the ratio S/H. Simplified throughput systems are characterised by stationary graduation and dropout patterns of students as well as by annual intakes of student cohorts of equal size. The effect of varying the size of the annual intakes of student cohorts is reported on here. The observations made lead to the establishment of a more generalised student throughput theory which includes the simplified theory as a special case. The generalised theory still retains the notion of a triangular admissible region in the H-S plane but with the size and shape of the triangle depending on the size of the student cohorts. The ratio S/H again emerges as the process efficiency measure for throughput systems in general with unchanged roles assigned to important system properties. This theory provides for a more fundamental understanding of student throughput systems encountered in real life. Significance: A generalised stationary student throughput theory through varying cohort sizes allows for a far better understanding of real student throughput systems.

  15. A Barcoding Strategy Enabling Higher-Throughput Library Screening by Microscopy.

    Science.gov (United States)

    Chen, Robert; Rishi, Harneet S; Potapov, Vladimir; Yamada, Masaki R; Yeh, Vincent J; Chow, Thomas; Cheung, Celia L; Jones, Austin T; Johnson, Terry D; Keating, Amy E; DeLoache, William C; Dueber, John E

    2015-11-20

    Dramatic progress has been made in the design and build phases of the design-build-test cycle for engineering cells. However, the test phase usually limits throughput, as many outputs of interest are not amenable to rapid analytical measurements. For example, phenotypes such as motility, morphology, and subcellular localization can be readily measured by microscopy, but analysis of these phenotypes is notoriously slow. To increase throughput, we developed microscopy-readable barcodes (MiCodes) composed of fluorescent proteins targeted to discernible organelles. In this system, a unique barcode can be genetically linked to each library member, making possible the parallel analysis of phenotypes of interest via microscopy. As a first demonstration, we MiCoded a set of synthetic coiled-coil leucine zipper proteins to allow an 8 × 8 matrix to be tested for specific interactions in micrographs consisting of mixed populations of cells. A novel microscopy-readable two-hybrid fluorescence localization assay for probing candidate interactions in the cytosol was also developed using a bait protein targeted to the peroxisome and a prey protein tagged with a fluorescent protein. This work introduces a generalizable, scalable platform for making microscopy amenable to higher-throughput library screening experiments, thereby coupling the power of imaging with the utility of combinatorial search paradigms.

  16. On the water saturation calculation in hydrocarbon sandstone reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Stalheim, Stein Ottar

    2002-07-01

    The main goal of this work was to identify the most important uncertainty sources in water saturation calculation and examine the possibility of developing new Sw equations, or methods to remove weaknesses and uncertainties in existing Sw equations. Due to the need for industrial applicability of the equations we aimed for results with the following properties: The accuracy in Sw should increase compared with existing Sw equations. The equations should be simple to use in petrophysical evaluations. The equations should be based on conventional logs and use as few input parameters as possible. The equations should be numerically stable. This thesis includes an uncertainty and sensitivity analysis of the most common Sw equations. The results are addressed in chapter 3 and were intended to find the most important uncertainty sources in water saturation calculation. To increase the knowledge of the relationship between Rt and Sw in hydrocarbon sandstone reservoirs and to understand how the pore geometry affects the conductivity (n and m) of the rock, a theoretical study was done. It was also an aim to examine the possibility of developing new Sw equations (or investigating an effective medium model) valid in hydrocarbon sandstone reservoirs. The results are presented in paper 1. A new equation for water saturation calculation in clean sandstone oil reservoirs is addressed in paper 2. A recommendation for best practice of water saturation calculation in non water wet formation is addressed in paper 3. Finally a new equation for water saturation calculation in thinly interbedded sandstone/mudstone reservoirs is presented in paper 4. The papers are titled: 1) Is the saturation exponent n a constant. 2) A New Model for Calculating Water Saturation In 3) Influence of wettability on water saturation modeling. 4) Water Saturation Calculations in Thinly Interbedded Sandstone/mudstone Reservoirs.
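    The classical starting point for the Sw equations discussed here is Archie's equation, Sw = ((a*Rw)/(phi^m * Rt))^(1/n). A small sketch with invented log readings, which also shows how sensitive the result is to the saturation exponent n (the question raised in paper 1):

```python
def archie_sw(rt, rw, phi, a=1.0, m=2.0, n=2.0):
    """Archie water saturation for a clean sandstone (illustrative defaults).

    rt: true formation resistivity (ohm*m), rw: formation water resistivity
    (ohm*m), phi: porosity (fraction); a, m, n are the tortuosity factor,
    cementation exponent and saturation exponent."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# invented log readings: 20% porosity, Rw = 0.05 ohm*m, Rt = 10 ohm*m
print(round(archie_sw(10.0, 0.05, 0.20), 3))  # -> 0.354

# sensitivity to the saturation exponent n
for n_exp in (1.8, 2.0, 2.2):
    print(n_exp, round(archie_sw(10.0, 0.05, 0.20, n=n_exp), 3))
```

    A swing of n from 1.8 to 2.2 shifts the computed Sw by several saturation percentage points here, which is why treating n as a constant is itself an uncertainty source.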

  17. High Throughput Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Argonne's high throughput facility provides highly automated and parallel approaches to material and materials chemistry development. The facility allows scientists...

  18. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Background: The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results: This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion: MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.

  20. High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.

    Directory of Open Access Journals (Sweden)

    Guangbo Liu

    Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium Acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long-established method. However, a reliable and optimized high-throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid-handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high-throughput molecular and/or genetic analysis of yeast.

  1. Throughput capacity of the Asbestos Conversion Unit

    International Nuclear Information System (INIS)

    Hyman, M.H.

    1996-10-01

    An engineering assessment is presented for factors that could significantly limit the throughput capacity of the Asbestos Conversion Unit. The assessment focuses mainly on volumetric throughput capacity (and related mass rate and feed density), and energy input. Important conclusions that were reached during this assessment are that the throughput is limited by feed densification capability and that the design energy input rating appears to be adequate

  2. Chromatographic Monoliths for High-Throughput Immunoaffinity Isolation of Transferrin from Human Plasma

    Directory of Open Access Journals (Sweden)

    Irena Trbojević-Akmačić

    2016-06-01

    Changes in protein glycosylation are related to different diseases and have potential as diagnostic and prognostic disease biomarkers. Transferrin (Tf) glycosylation changes are a common marker for congenital disorders of glycosylation. However, the biological interindividual variability of Tf N-glycosylation and the genes involved in glycosylation regulation are not known. Therefore, a high-throughput Tf isolation method and large-scale glycosylation studies are needed in order to address these questions. Due to their unique chromatographic properties, the use of chromatographic monoliths enables a very fast analysis cycle, thus significantly increasing sample preparation throughput. Here, we describe the characterization of novel immunoaffinity-based monolithic columns in a 96-well plate format for specific high-throughput purification of human Tf from blood plasma. We optimized the isolation and glycan preparation procedure for subsequent ultra performance liquid chromatography (UPLC) analysis of Tf N-glycosylation and managed to increase the sensitivity approximately three times compared to initial experimental conditions, with very good reproducibility.

  3. Throughput assurance of wireless body area networks coexistence based on stochastic geometry.

    Directory of Open Access Journals (Sweden)

    Ruixia Liu

    Wireless body area networks (WBANs) are expected to influence the traditional medical model by assisting caretakers with health telemonitoring. Within WBANs, the transmit power of the nodes should be as small as possible owing to their limited energy capacity, but should be sufficiently large to guarantee the quality of the signal at the receiving nodes. When multiple WBANs coexist in a small area, the communication reliability and overall throughput can be seriously affected due to resource competition and interference. We show that the total network throughput largely depends on the WBAN distribution density (λp), the transmit power of their nodes (Pt), and their carrier-sensing threshold (γ). Using stochastic geometry, a joint carrier-sensing threshold and power control strategy is proposed to meet the demand of coexisting WBANs based on the IEEE 802.15.4 standard. Given different network distributions and carrier-sensing thresholds, the proposed strategy derives a minimum transmit power according to the varying surrounding environment. We obtain expressions for the transmission success probability and throughput adopting this strategy. Using numerical examples, we show that the joint carrier-sensing threshold and transmit power strategy can effectively improve the overall system throughput and reduce interference. Additionally, this paper studies the effects of a guard zone on the throughput using a Matern hard-core point process (HCPP) type II model. Theoretical analysis and simulation results show that the HCPP model can increase the success probability and throughput of networks.
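    The Matern type II hard-core model used for the guard zone can be simulated directly by dependent thinning of a parent Poisson process: each point gets a random mark and survives only if no point within the hard-core radius carries a smaller mark. A sketch with invented intensities (no edge correction), comparing the empirical retained density against the closed-form Matern II intensity (1 - exp(-λπr²))/(πr²):

```python
import numpy as np

rng = np.random.default_rng(7)
lam, r, side = 50.0, 0.1, 4.0  # parent intensity, hard-core radius, window side (invented)

n = rng.poisson(lam * side * side)
pts = rng.uniform(0.0, side, size=(n, 2))
marks = rng.uniform(size=n)

# Matern type II thinning: keep a point only if no other point within
# distance r carries a smaller mark
d2 = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
conflict = (d2 < r * r) & (marks[None, :] < marks[:, None])
np.fill_diagonal(conflict, False)
keep = ~conflict.any(axis=1)

est = keep.sum() / (side * side)
theory = (1.0 - np.exp(-lam * np.pi * r * r)) / (np.pi * r * r)
print(round(est, 2), round(theory, 2))
```

    The empirical density sits close to the closed-form value (slightly above it here, since points near the window edge see fewer competitors), which is the mechanism by which a guard zone caps interferer density.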

  4. High-throughput technology for novel SO2 oxidation catalysts

    International Nuclear Information System (INIS)

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method of monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also overview some elements used for prescreening and those remaining after the screening of the first catalyst generations. (topical review)

  5. TCP Throughput Profiles Using Measurements over Dedicated Connections

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL; Liu, Qiang [ORNL; Sen, Satyabrata [ORNL; Towsley, Don [University of Massachusetts, Amherst; Vardoyan, Gayane [University of Massachusetts, Amherst; Kettimuthu, R. [Argonne National Laboratory (ANL); Foster, Ian [University of Chicago

    2017-06-01

    Wide-area data transfers in high-performance computing infrastructures are increasingly being carried over dynamically provisioned dedicated network connections that provide high capacities with no competing traffic. We present extensive TCP throughput measurements and time traces over a suite of physical and emulated 10 Gbps connections with 0-366 ms round-trip times (RTTs). Contrary to the general expectation, they show significant statistical and temporal variations, in addition to the overall dependencies on the congestion control mechanism, buffer size, and the number of parallel streams. We analyze several throughput profiles that have highly desirable concave regions wherein the throughput decreases slowly with RTTs, in stark contrast to the convex profiles predicted by various TCP analytical models. We present a generic throughput model that abstracts the ramp-up and sustainment phases of TCP flows, which provides insights into qualitative trends observed in measurements across TCP variants: (i) slow-start followed by well-sustained throughput leads to concave regions; (ii) large buffers and multiple parallel streams expand the concave regions in addition to improving the throughput; and (iii) stable throughput dynamics, indicated by a smoother Poincare map and smaller Lyapunov exponents, lead to wider concave regions. These measurements and analytical results together enable us to select a TCP variant and its parameters for a given connection to achieve high throughput with statistical guarantees.
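    The ramp-up/sustainment abstraction can be reduced to a toy model: slow start doubles the per-RTT window until the path rate is reached, after which the transfer proceeds at the peak rate, so the ramp cost grows with RTT while the sustainment phase does not. The parameters below (10 Gbps path, 64 KB initial window, 100 GB transfer) are illustrative inventions, not the paper's calibrated model; the qualitative point is the slow decline of average throughput with RTT for large, well-sustained transfers.

```python
def avg_throughput_gbps(size_gb, rtt_s, peak_gbps=10.0, init_window_kb=64.0):
    """Ramp-up + sustainment abstraction of a single TCP flow (illustrative).

    Slow start sends one window per RTT and doubles it until the path rate
    peak_gbps is reached; the remainder of the transfer is sustained at peak."""
    size_bits = size_gb * 8e9
    peak_bps = peak_gbps * 1e9
    window_bits = init_window_kb * 8e3
    sent, t = 0.0, 0.0
    # exponential ramp-up: one window per RTT, doubling each round
    while window_bits < peak_bps * rtt_s and sent + window_bits < size_bits:
        sent += window_bits
        t += rtt_s
        window_bits *= 2.0
    t += (size_bits - sent) / peak_bps  # sustainment phase at the peak rate
    return size_bits / t / 1e9

for rtt_ms in (10, 100, 366):
    print(rtt_ms, round(avg_throughput_gbps(100.0, rtt_ms / 1000.0), 2))
```

    Because the ramp takes only O(log(BDP)) rounds, a long transfer loses little average throughput even at 366 ms RTT, which is the concave-region behaviour described above; small transfers or poorly sustained flows would instead show the convex decay predicted by classical loss-based models.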

  6. An RNA-Based Fluorescent Biosensor for High-Throughput Analysis of the cGAS-cGAMP-STING Pathway.

    Science.gov (United States)

    Bose, Debojit; Su, Yichi; Marcus, Assaf; Raulet, David H; Hammond, Ming C

    2016-12-22

    In mammalian cells, the second messenger (2'-5',3'-5') cyclic guanosine monophosphate-adenosine monophosphate (2',3'-cGAMP) is produced by the cytosolic DNA sensor cGAMP synthase (cGAS), and subsequently bound by the stimulator of interferon genes (STING) to trigger interferon response. Thus, the cGAS-cGAMP-STING pathway plays a critical role in pathogen detection, as well as pathophysiological conditions including cancer and autoimmune disorders. However, studying and targeting this immune signaling pathway has been challenging due to the absence of tools for high-throughput analysis. We have engineered an RNA-based fluorescent biosensor that responds to 2',3'-cGAMP. The resulting "mix-and-go" cGAS activity assay shows excellent statistical reliability as a high-throughput screening (HTS) assay and distinguishes between direct and indirect cGAS inhibitors. Furthermore, the biosensor enables quantitation of 2',3'-cGAMP in mammalian cell lysates. We envision this biosensor-based assay as a resource to study the cGAS-cGAMP-STING pathway in the context of infectious diseases, cancer immunotherapy, and autoimmune diseases.
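    The statistical reliability of an HTS assay is conventionally summarized by the Z'-factor computed from positive and negative control wells (Zhang, Chung and Oldenburg, 1999). A sketch on simulated plate-reader values; the fluorescence numbers below are invented, not from the paper:

```python
import numpy as np

def z_prime(pos, neg):
    # Z'-factor: screening-window quality from positive/negative control wells
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

rng = np.random.default_rng(0)
# hypothetical readings: active cGAS reaction (high signal) vs no-enzyme controls
pos = rng.normal(1000.0, 40.0, 32)
neg = rng.normal(200.0, 30.0, 32)
z = z_prime(pos, neg)
print(round(z, 2))  # Z' > 0.5 is conventionally regarded as an excellent assay
```

    A well-separated signal window with tight controls, as simulated here, lands in the Z' > 0.5 regime that qualifies an assay for large-scale screening.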

  7. High-throughput analysis of endogenous fruit glycosyl hydrolases using a novel chromogenic hydrogel substrate assay

    DEFF Research Database (Denmark)

    Schückel, Julia; Kracun, Stjepan Kresimir; Lausen, Thomas Frederik

    2017-01-01

    A broad range of enzyme activities can be found in a wide range of different fruits and fruiting bodies, but there is a lack of methods where many samples can be handled in a high-throughput and efficient manner. In particular, plant polysaccharide degrading enzymes – glycosyl hydrolases (GHs) – play ... led to a more profound understanding of the importance of GH activity and regulation, yet current methods for determining glycosyl hydrolase activity are lacking in throughput and fail to keep up with the data output from transcriptome research. Here we present the use of a versatile, easy...

  8. Third harmonic current injection into highly saturated multi-phase machines

    Directory of Open Access Journals (Sweden)

    Klute Felix

    2017-03-01

    One advantage of multi-phase machines is the possibility to use the third harmonic of the rotor flux for additional torque generation. This effect can be maximised for Permanent Magnet Synchronous Machines (PMSMs) with a high third harmonic content in the magnet flux. This paper discusses the effects of third harmonic current injection (THCI) on a five-phase PMSM with a conventional magnet shape, depending on saturation. The effects of THCI in five-phase machines are shown in a 2D FEM model in Ansys Maxwell and verified by measurement results. The results of the FEM model are analysed analytically using the Park model. It is shown in simulation and measurement that the torque improvement by THCI increases significantly with the saturation level, as the amplitude of the third harmonic flux linkage increases with the saturation level, but the phase shift of the rotor flux linkage has to be considered. This paper gives a detailed analysis of the saturation mechanisms of PMSMs, which can be used for optimizing the efficiency at operating points of high saturation, without using special magnet shapes.
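    In the usual two-plane dq model of a five-phase PMSM, torque is T = (5/2)·p·(ψ1·iq1 + 3·ψ3·iq3), so injecting current into the third-harmonic plane pays off more as the third-harmonic flux linkage ψ3 grows with saturation. A sketch with invented machine parameters (not the paper's machine), splitting a fixed current budget (equal copper loss) between the two planes and reporting the best achievable torque gain over pure fundamental excitation:

```python
import numpy as np

# illustrative parameters (assumed, not from the paper)
p = 3          # pole pairs
psi1 = 0.10    # Wb, fundamental PM flux linkage
i_tot = 10.0   # A, current budget at equal copper loss

def torque(i1, i3, psi3):
    # five-phase dq model: fundamental plane plus third-harmonic plane
    return 2.5 * p * (psi1 * i1 + 3.0 * psi3 * i3)

for psi3 in (0.005, 0.012, 0.020):          # third-harmonic flux rises with saturation
    a = np.linspace(0.0, np.pi / 2, 2001)   # split of the current budget
    t = torque(i_tot * np.cos(a), i_tot * np.sin(a), psi3)
    gain = t.max() / torque(i_tot, 0.0, psi3) - 1.0
    print(psi3, f"{100 * gain:.1f}%")
```

    With i1² + i3² held constant the optimal gain is sqrt(ψ1² + 9ψ3²)/ψ1 - 1, so tripling ψ3 raises the THCI benefit from about 1% to well over 10% in this toy example, mirroring the saturation dependence reported above.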

  9. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments--regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms, the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
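    The core idea (an efficiency-corrected quantity normalized to the input sample quantity and to a universal reference cDNA run on the same plate) can be sketched as below; this illustrates the principle only, not the authors' exact algorithm, and all Cq values and cell counts are invented:

```python
def expression_per_cell(cq_sample, cq_ref, n_cells, eff=2.0):
    """Input-quantity-normalized expression (sketch of the principle).

    eff: amplification efficiency (2.0 = perfect doubling per cycle);
    cq_ref: Cq of the universal reference cDNA measured on the same
    plate/instrument, cancelling batch- and instrument-specific offsets;
    n_cells: input sample quantity the result is normalized to."""
    return eff ** (cq_ref - cq_sample) / n_cells

# the same biology measured on two instruments with a 1.5-cycle systematic offset:
a = expression_per_cell(cq_sample=24.0, cq_ref=20.0, n_cells=1e4)
b = expression_per_cell(cq_sample=25.5, cq_ref=21.5, n_cells=1e4)
print(a == b)  # -> True: reference normalization removes the instrument offset
```

    Because sample and reference shift together, the difference cq_ref - cq_sample is instrument-independent, and dividing by the cell count removes dependence on control gene expression, which is the property claimed above.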

  10. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies: small pore microchanne...

  11. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    Science.gov (United States)

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in a rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  12. Management of High-Throughput DNA Sequencing Projects: Alpheus.

    Science.gov (United States)

    Miller, Neil A; Kingsmore, Stephen F; Farmer, Andrew; Langley, Raymond J; Mudge, Joann; Crow, John A; Gonzalez, Alvaro J; Schilkey, Faye D; Kim, Ryan J; van Velkinburgh, Jennifer; May, Gregory D; Black, C Forrest; Myers, M Kathy; Utsey, John P; Frost, Nicholas S; Sugarbaker, David J; Bueno, Raphael; Gullans, Stephen R; Baxter, Susan M; Day, Steve W; Retzel, Ernest F

    2008-12-26

    High-throughput DNA sequencing has enabled systems biology to begin to address areas in health, agricultural and basic biological research. Concomitant with the opportunities is an absolute necessity to manage significant volumes of high-dimensional and inter-related data and analysis. Alpheus is an analysis pipeline, database and visualization software for use with massively parallel DNA sequencing technologies that feature multi-gigabase throughput characterized by relatively short reads, such as Illumina-Solexa (sequencing-by-synthesis), Roche-454 (pyrosequencing) and Applied Biosystems' SOLiD (sequencing-by-ligation). Alpheus enables alignment to reference sequence(s), detection of variants and enumeration of sequence abundance, including expression levels in transcriptome sequence. Alpheus is able to detect several types of variants, including non-synonymous and synonymous single nucleotide polymorphisms (SNPs), insertions/deletions (indels), premature stop codons, and splice isoforms. Variant detection is aided by the ability to filter variant calls based on consistency, expected allele frequency, sequence quality, coverage, and variant type in order to minimize false positives while maximizing the identification of true positives. Alpheus also enables comparisons of genes with variants between cases and controls or bulk segregant pools. Sequence-based differential expression comparisons can be developed, with data export to SAS JMP Genomics for statistical analysis.
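The variant-filtering idea described above can be illustrated with a minimal sketch; the field names and thresholds are hypothetical choices for this example, not Alpheus defaults:

```python
def pass_filters(v, min_cov=10, min_qual=20.0, expected_af=0.5, af_tol=0.2):
    """Keep a variant call only if it clears coverage, quality, and
    expected-allele-frequency checks (thresholds are illustrative)."""
    return (v["coverage"] >= min_cov
            and v["quality"] >= min_qual
            and abs(v["allele_freq"] - expected_af) <= af_tol)

calls = [
    {"coverage": 30, "quality": 45.0, "allele_freq": 0.48},  # passes
    {"coverage": 5,  "quality": 50.0, "allele_freq": 0.50},  # low coverage
    {"coverage": 40, "quality": 45.0, "allele_freq": 0.05},  # likely artifact
]
kept = [v for v in calls if pass_filters(v)]
```

Tightening the allele-frequency tolerance trades sensitivity for specificity, which is exactly the false-positive/true-positive balance the abstract describes.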

  13. Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.

    Science.gov (United States)

    Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong

    2008-04-01

    The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.

  14. Speeding-up exchange-mediated saturation transfer experiments by Fourier transform

    Energy Technology Data Exchange (ETDEWEB)

    Carneiro, Marta G.; Reddy, Jithender G.; Griesinger, Christian; Lee, Donghan, E-mail: dole@nmr.mpibpc.mpg.de [Max-Planck Institute for Biophysical chemistry, Department of NMR-based Structural Biology (Germany)

    2015-11-15

    Protein motions over various time scales are crucial for protein function. NMR relaxation dispersion experiments play a key role in explaining these motions. However, the study of slow conformational changes with lowly populated states remained elusive. The recently developed exchange-mediated saturation transfer experiments allow the detection and characterization of such motions, but require extensive measurement time. Here we show that, by making use of Fourier transform, the total acquisition time required to measure an exchange-mediated saturation transfer profile can be reduced by twofold, provided that linear prediction is applied. In addition, we demonstrate that the analytical solution for R1ρ experiments can be used for fitting the exchange-mediated saturation transfer profile. Furthermore, we show that simultaneous analysis of exchange-mediated saturation transfer profiles with two different radio-frequency field strengths is required for accurate and precise characterization of the exchange process and the exchanging states.

  15. Speeding-up exchange-mediated saturation transfer experiments by Fourier transform

    International Nuclear Information System (INIS)

    Carneiro, Marta G.; Reddy, Jithender G.; Griesinger, Christian; Lee, Donghan

    2015-01-01

    Protein motions over various time scales are crucial for protein function. NMR relaxation dispersion experiments play a key role in explaining these motions. However, the study of slow conformational changes with lowly populated states remained elusive. The recently developed exchange-mediated saturation transfer experiments allow the detection and characterization of such motions, but require extensive measurement time. Here we show that, by making use of Fourier transform, the total acquisition time required to measure an exchange-mediated saturation transfer profile can be reduced by twofold, provided that linear prediction is applied. In addition, we demonstrate that the analytical solution for R1ρ experiments can be used for fitting the exchange-mediated saturation transfer profile. Furthermore, we show that simultaneous analysis of exchange-mediated saturation transfer profiles with two different radio-frequency field strengths is required for accurate and precise characterization of the exchange process and the exchanging states.

  16. Water saturation in shaly sands: logging parameters from log-derived values

    International Nuclear Information System (INIS)

    Miyairi, M.; Itoh, T.; Okabe, F.

    1976-01-01

    The methods are presented for determining the relation of porosity to formation factor and that of true resistivity of formation to water saturation, which were investigated through the log interpretation of one of the oil and gas fields of the northern Japan Sea. The values of the coefficients "a" and "m" in the porosity-formation factor relation are derived from a cross-plot of porosity and resistivity of formation corrected by clay content. The saturation exponent "n" is determined from a cross-plot of porosity and resistivity index on the assumption that the product of porosity and irreducible water saturation is constant. The relation of porosity to irreducible water saturation is also investigated from core analysis. The new logging parameters determined from the methods, a = 1, m = 2, n = 1.4, improved the values of water saturation by 6 percent on average, and made it easy to distinguish the points which belong to the productive zone from ones belonging to the nonproductive zone.
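The reported parameters (a = 1, m = 2, n = 1.4) plug directly into the standard Archie relations; the sketch below is a generic illustration of those relations with hypothetical input values, not the paper's clay-corrected workflow:

```python
def formation_factor(phi, a=1.0, m=2.0):
    """Archie formation factor F = a / phi**m."""
    return a / phi ** m

def water_saturation(phi, rw, rt, a=1.0, m=2.0, n=1.4):
    """Archie water saturation Sw = (F * Rw / Rt) ** (1/n)."""
    return (formation_factor(phi, a, m) * rw / rt) ** (1.0 / n)

# Hypothetical point: 20% porosity, Rw = 0.05 ohm-m, Rt = 10 ohm-m
sw = water_saturation(0.20, 0.05, 10.0)
```

Note how the low saturation exponent n = 1.4 (versus the common n = 2) raises the computed water saturation for the same resistivity ratio.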

  17. Repulsion-based model for contact angle saturation in electrowetting.

    Science.gov (United States)

    Ali, Hassan Abdelmoumen Abdellah; Mohamed, Hany Ahmed; Abdelgawad, Mohamed

    2015-01-01

    We introduce a new model for contact angle saturation phenomenon in electrowetting on dielectric systems. This new model attributes contact angle saturation to repulsion between trapped charges on the cap and base surfaces of the droplet in the vicinity of the three-phase contact line, which prevents these surfaces from converging during contact angle reduction. This repulsion-based saturation is similar to repulsion between charges accumulated on the surfaces of conducting droplets which causes the well known Coulombic fission and Taylor cone formation phenomena. In our model, both the droplet and dielectric coating were treated as lossy dielectric media (i.e., having finite electrical conductivities and permittivities) contrary to the more common assumption of a perfectly conducting droplet and perfectly insulating dielectric. We used theoretical analysis and numerical simulations to find actual charge distribution on droplet surface, calculate repulsion energy, and minimize energy of the total system as a function of droplet contact angle. Resulting saturation curves were in good agreement with previously reported experimental results. We used this proposed model to predict effect of changing liquid properties, such as electrical conductivity, and system parameters, such as thickness of the dielectric layer, on the saturation angle, which also matched experimental results.

  18. Comparison of scale analysis and numerical simulation for saturated zone convective mixing processes

    International Nuclear Information System (INIS)

    Oldenburg, C.M.

    1998-01-01

    Scale analysis can be used to predict a variety of quantities arising from natural systems where processes are described by partial differential equations. For example, scale analysis can be applied to estimate the effectiveness of convective mixing on the dilution of contaminants in groundwater. Scale analysis involves substituting simple quotients for partial derivatives and identifying and equating the dominant terms in an order-of-magnitude sense. For free convection due to sidewall heating of saturated porous media, scale analysis shows that vertical convective velocity in the thermal boundary layer region is proportional to the Rayleigh number, horizontal convective velocity is proportional to the square root of the Rayleigh number, and thermal boundary layer thickness is proportional to the inverse square root of the Rayleigh number. These scale analysis estimates are corroborated by numerical simulations of an idealized system. A scale analysis estimate of mixing time for a tracer mixing by hydrodynamic dispersion in a convection cell also agrees well with numerical simulation for two different Rayleigh numbers. Scale analysis for the heating-from-below scenario produces estimates of maximum velocity one-half as large as the sidewall case. At small values of the Rayleigh number, this estimate is confirmed by numerical simulation. For larger Rayleigh numbers, simulation results suggest maximum velocities are similar to the sidewall heating scenario. In general, agreement between scale analysis estimates and numerical simulation results serves to validate the method of scale analysis. The application is to radioactive waste repositories.
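The quoted scalings translate directly into order-of-magnitude estimates; normalizing velocities by the thermal diffusion scale α/H is an assumption made here for illustration:

```python
import math

def sidewall_scales(ra, alpha, height):
    """Order-of-magnitude estimates from scale analysis of sidewall
    heating of a saturated porous layer (illustrative normalization):
      v_vertical   ~ Ra        * alpha / H
      v_horizontal ~ Ra**0.5   * alpha / H
      boundary layer thickness ~ H / Ra**0.5
    """
    u_ref = alpha / height  # thermal diffusion velocity scale
    return {
        "v_vertical": ra * u_ref,
        "v_horizontal": math.sqrt(ra) * u_ref,
        "boundary_layer": height / math.sqrt(ra),
    }

# Ra = 100, thermal diffusivity 1e-6 m^2/s, layer height 10 m
s = sidewall_scales(ra=100.0, alpha=1e-6, height=10.0)
```

Doubling the Rayleigh number doubles the vertical velocity estimate but only increases the horizontal velocity by √2, consistent with the proportionalities stated in the abstract.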

  19. Preliminary High-Throughput Metagenome Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or which fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  20. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    Science.gov (United States)

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland FIMM was established in 2010 to serve as a national and international academic screening unit providing access to state-of-the-art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate based chemical screening and high content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening is performed at different scales, primarily in multiwell plate-based assays with a wide range of readout possibilities, with a focus on ultraminiaturization to allow for affordable screening for the academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is being focused on high throughput systems biological platforms for functional profiling of patient cells in personalized and precision medicine projects.

  1. Effect of partial saturation of bonded neo magnet on the automotive accessory motor

    Directory of Open Access Journals (Sweden)

    Nimitkumar K. Sheth

    2017-05-01

    Full Text Available In this paper the effects of using a partially magnetized bonded neo (NdFeB) magnet in an automotive accessory motor are presented. The potential reason for partial saturation of the bonded neo magnet is explained and a simple method to ensure saturation of the magnet is discussed. A magnetizing fixture design using 2-D Finite element analysis (FEA) is presented. The motor performance at various magnet saturation levels has been estimated using the 2-D FEA. Details of the thermal demagnetization test adopted by the automotive industry are also discussed and results of the motor performance for four saturation levels are detailed. These results indicate that the effect of demagnetization is more adverse in a motor with partially saturated magnets.

  2. Effect of partial saturation of bonded neo magnet on the automotive accessory motor

    Science.gov (United States)

    Sheth, Nimitkumar K.; Angara, Raghu C. S. Babu

    2017-05-01

    In this paper the effects of using a partially magnetized bonded neo (NdFeB) magnet in an automotive accessory motor are presented. The potential reason for partial saturation of the bonded neo magnet is explained and a simple method to ensure saturation of the magnet is discussed. A magnetizing fixture design using the 2-D Finite element analysis (FEA) is presented. The motor performance at various magnet saturation levels has been estimated using the 2-D FEA. Details of the thermal demagnetization test adopted by the automotive industry are also discussed and results of the motor performance for four saturation levels are detailed. These results indicate that the effect of demagnetization is more adverse in a motor with partially saturated magnets.

  3. Saturation analysis studies of corticosteroid levels in normal Greek subjects and in subjects with haemolytic diseases

    International Nuclear Information System (INIS)

    Vyzantiadis, A.

    1975-07-01

    Between 1970 and 1974 a saturation analysis for cortisol in plasma and free cortisol in urine, and a radioimmunoassay method for aldosterone in plasma and urine, were developed. In order to permit a comparative evaluation it was necessary to study corticosteroids, diurnal rhythm and the probable effect of a siesta on this rhythm both in normal subjects and in patients suffering from hemic diseases, in particular from sickle-cell anemia. Saturation assay for cortisol, using serum from pregnant women as a source of transcortin, and radioimmunoassay for aldosterone were the basic methods used. Serum cortisol was estimated twice a day (8-9 a.m. and 5-6 p.m.). Cortisol and aldosterone were also estimated in serum and in urine before and after adrenal stimulation with ACTH. No significant influence of a siesta on the diurnal rhythm of cortisol was observed, nor did the levels of serum cortisol or the diurnal rhythm appear affected in congenital hemolytic anemias following adrenal stimulation. The report lists experimental results briefly and refers to a paper in which these are published in more detail.

  4. ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)

    Science.gov (United States)

    US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...

  5. Combinatorial approach toward high-throughput analysis of direct methanol fuel cells.

    Science.gov (United States)

    Jiang, Rongzhong; Rong, Charles; Chu, Deryn

    2005-01-01

    A 40-member array of direct methanol fuel cells (with stationary fuel and convective air supplies) was generated by electrically connecting the fuel cells in series. High-throughput analysis of these fuel cells was realized by fast screening of voltages between the two terminals of a fuel cell at constant current discharge. A large number of voltage-current curves (200) were obtained by screening the voltages through multiple small-current steps. A Gaussian distribution was used to statistically analyze the large number of experimental data. The standard deviation (σ) of the voltages of these fuel cells increased linearly with discharge current. The voltage-current curves at various fuel concentrations were simulated with an empirical equation of voltage versus current and a linear equation of σ versus current. The simulated voltage-current curves fitted the experimental data well. With increasing methanol concentration from 0.5 to 4.0 M, the Tafel slope of the voltage-current curves (at σ = 0.0) changed from 28 to 91 mV·dec⁻¹, the cell resistance from 2.91 to 0.18 Ω, and the power output from 3 to 18 mW·cm⁻².
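An empirical polarization model of the kind described (a Tafel slope plus an ohmic term) can be fitted by linear least squares, since the model is linear in its parameters. The model form and numbers below are a hedged sketch, not the authors' exact equation:

```python
import numpy as np

def cell_voltage(i, e0, b, r):
    """Illustrative DMFC polarization model: V = E0 - b*log10(i) - r*i,
    with Tafel slope b in V/decade and cell resistance r in ohms."""
    return e0 - b * np.log10(i) - r * i

# Synthetic polarization data from assumed parameters
i = np.linspace(0.01, 0.5, 50)                 # current, A
v = cell_voltage(i, e0=0.6, b=0.060, r=0.5)    # voltage, V

# Linear in (E0, b, r): fit with least squares in the basis [1, -log10(i), -i]
A = np.column_stack([np.ones_like(i), -np.log10(i), -i])
e0_fit, b_fit, r_fit = np.linalg.lstsq(A, v, rcond=None)[0]
```

On noiseless data the fit recovers the assumed Tafel slope and resistance exactly; with real screening data the residuals feed the σ-versus-current statistics described above.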

  6. Saturated Zone Flow and Transport Expert Elicitation Project

    Energy Technology Data Exchange (ETDEWEB)

    Coppersmith, Kevin J.; Perman, Roseanne C.

    1998-01-01

    This report presents results of the Saturated Zone Flow and Transport Expert Elicitation (SZEE) project for Yucca Mountain, Nevada. This project was sponsored by the US Department of Energy (DOE) and managed by Geomatrix Consultants, Inc. (Geomatrix), for TRW Environmental Safety Systems, Inc. The DOE's Yucca Mountain Site Characterization Project (referred to as the YMP) is intended to evaluate the suitability of the site for construction of a mined geologic repository for the permanent disposal of spent nuclear fuel and high-level radioactive waste. The SZEE project is one of several that involve the elicitation of experts to characterize the knowledge and uncertainties regarding key inputs to the Yucca Mountain Total System Performance Assessment (TSPA). The objective of the current project was to characterize the uncertainties associated with certain key issues related to the saturated zone system in the Yucca Mountain area and downgradient region. An understanding of saturated zone processes is critical to evaluating the performance of the potential high-level nuclear waste repository at Yucca Mountain. A major goal of the project was to capture the uncertainties involved in assessing the saturated flow processes, including uncertainty in both the models used to represent the physical processes controlling saturated zone flow and transport, and the parameter values used in the models. So that the analysis included a wide range of perspectives, multiple individual judgments were elicited from members of an expert panel. The panel members, who were experts from within and outside the Yucca Mountain project, represented a range of experience and expertise. A deliberate process was followed in facilitating interactions among the experts, in training them to express their uncertainties, and in eliciting their interpretations. The resulting assessments and probability distributions, therefore, provide a reasonable aggregate representation of the knowledge and

  7. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g. data from studies based on next-generation sequencing technology. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
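A minimal sketch of local trend scoring and the slow permutation test that the proposed approximation replaces (simplified here: no time delay, and all trend-step mismatches scored alike):

```python
import random

def trend(series):
    """Map a series to trend steps: +1 up, -1 down, 0 flat."""
    return [(b > a) - (b < a) for a, b in zip(series, series[1:])]

def local_trend_score(x, y):
    """Best local run score: +1 for a matching trend step, -1 otherwise,
    resetting at zero (the classic max-subarray recursion, no delay)."""
    best = cur = 0
    for a, b in zip(trend(x), trend(y)):
        cur = max(0, cur + (1 if a == b else -1))
        best = max(best, cur)
    return best

def permutation_pvalue(x, y, n_perm=1000, seed=0):
    """Naive permutation estimate of P(score >= observed)."""
    rng = random.Random(seed)
    observed = local_trend_score(x, y)
    y_shuf, hits = list(y), 0
    for _ in range(n_perm):
        rng.shuffle(y_shuf)
        hits += local_trend_score(x, y_shuf) >= observed
    return (hits + 1) / (n_perm + 1)
```

Each p-value costs n_perm rescoring passes, which is exactly why a closed-form tail approximation is needed for all-versus-all comparisons across thousands of series.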

  8. Fault tolerant control of systems with saturations

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    2013-01-01

    This paper presents a framework for fault tolerant controllers (FTC) that includes input saturation. The controller architecture known from FTC, based on the Youla-Jabr-Bongiorno-Kucera (YJBK) parameterization, is extended to handle input saturation. Applying this controller architecture in connection with faulty systems including input saturation gives an additional YJBK transfer function related to the input saturation. In the fault-free case, this additional YJBK transfer function can be applied directly for optimizing the feedback loop around the input saturation. In the faulty case, the design problem is a mixed design problem involving both parametric faults and input saturation.

  9. WormSizer: high-throughput analysis of nematode size and shape.

    Directory of Open Access Journals (Sweden)

    Brad T Moore

    Full Text Available The fundamental phenotypes of growth rate, size and morphology are the result of complex interactions between genotype and environment. We developed a high-throughput software application, WormSizer, which computes size and shape of nematodes from brightfield images. Existing methods for estimating volume either coarsely model the nematode as a cylinder or assume the worm shape or opacity is invariant. Our estimate is more robust to changes in morphology or optical density as it only assumes radial symmetry. This open source software is written as a plugin for the well-known image-processing framework Fiji/ImageJ. It may therefore be extended easily. We evaluated the technical performance of this framework, and we used it to analyze growth and shape of several canonical Caenorhabditis elegans mutants in a developmental time series. We confirm quantitatively that a Dumpy (Dpy) mutant is short and fat and that a Long (Lon) mutant is long and thin. We show that daf-2 insulin-like receptor mutants are larger than wild-type upon hatching but grow slowly, and WormSizer can distinguish dauer larvae from normal larvae. We also show that a Small (Sma) mutant is actually smaller than wild-type at all stages of larval development. WormSizer works with Uncoordinated (Unc) and Roller (Rol) mutants as well, indicating that it can be used with mutants despite behavioral phenotypes. We used our complete data set to perform a power analysis, giving users a sense of how many images are needed to detect different effect sizes. Our analysis confirms and extends existing phenotypic characterization of well-characterized mutants, demonstrating the utility and robustness of WormSizer.
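The radial-symmetry assumption reduces volume estimation to a solid of revolution along the worm's midline; this sketch shows the idea, not the plugin's actual implementation:

```python
import math

def worm_volume(radii, ds):
    """Volume assuming radial symmetry about the midline: treat the worm
    as a stack of short cylinders sampled every `ds` along the midline,
    V = pi * sum(r_i**2) * ds  (a sketch of the solid-of-revolution idea,
    not WormSizer's code)."""
    return math.pi * sum(r * r for r in radii) * ds

# Sanity check: a straight "worm" of radius 2 and length 10 (100 samples)
v = worm_volume([2.0] * 100, ds=0.1)
```

Because only the per-sample radius enters the sum, the estimate is unaffected by bends in the midline or changes in optical density, which is the robustness the abstract claims.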

  10. High-throughput verification of transcriptional starting sites by Deep-RACE

    DEFF Research Database (Denmark)

    Olivarius, Signe; Plessy, Charles; Carninci, Piero

    2009-01-01

    We present a high-throughput method for investigating the transcriptional starting sites of genes of interest, which we named Deep-RACE (Deep–rapid amplification of cDNA ends). Taking advantage of the latest sequencing technology, it allows the parallel analysis of multiple genes and is free...

  11. A conifer-friendly high-throughput α-cellulose extraction method for δ13C and δ18O stable isotope ratio analysis

    Science.gov (United States)

    Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.

    2012-12-01

    Wood stable isotope ratios (δ13C and δ18O) offer insight to water source and plant water use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and to its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the Southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined the recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while minimizing labor requirements to 2-3 days per batch of samples.
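Once δ13C is measured on the extracted α-cellulose, water-use efficiency inference typically starts from the standard Farquhar discrimination relations; the coefficients a = 4.4‰ and b = 27‰ below are textbook values used purely for illustration, not parameters from this study:

```python
def discrimination(delta_air, delta_plant):
    """Carbon isotope discrimination (permil):
       Delta = (delta_air - delta_plant) / (1 + delta_plant / 1000)"""
    return (delta_air - delta_plant) / (1.0 + delta_plant / 1000.0)

def ci_over_ca(delta_air, delta_plant, a=4.4, b=27.0):
    """Simple Farquhar model Delta = a + (b - a) * ci/ca, solved for
    ci/ca; a lower ci/ca implies higher intrinsic water-use efficiency."""
    return (discrimination(delta_air, delta_plant) - a) / (b - a)

# Typical values: air ~ -8 permil, C3 plant cellulose ~ -27 permil
ratio = ci_over_ca(-8.0, -27.0)
```

A drought year that raises cellulose δ13C (less negative) lowers Δ and hence ci/ca, which reads as an increase in intrinsic WUE in the tree-ring record.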

  12. Investigating the Factors Affecting the Zahedan Aquifer's Hydrogeochemistry Using Factor Analysis, Saturation Indices and Composite Diagrams' Methods

    Directory of Open Access Journals (Sweden)

    J. Dowlati

    2014-12-01

    Full Text Available The Zahedan aquifer is located in the northern part of the Zahedan watershed. It is essential to evaluate the quality of its groundwater resources, as they provide part of the drinking, agricultural and industrial water of this city. In order to carry out groundwater quality monitoring, assess the controlling processes and determine the cation and anion sources of the groundwater, 26 wells were sampled and water quality parameters were measured. The results of the analysis showed that almost all of the samples were very saline, with electrical conductivity varying from 1,359 to 12,620 μS cm−1. In the Zahedan aquifer, sodium, chloride and sulfate were the predominant cation and anions respectively, and sodium-chloride (Na-Cl) and sodium-sulfate (Na-SO4) were the dominant types of the groundwater. The factor analysis of the sample results indicates that two factors, one natural and one human, controlled about 83.30% and 74.37% of the quality variations of the groundwater in October and February, respectively. The first and major factor, related to the natural processes of ion exchange and dissolution, had positive loadings of EC, Ca2+, Mg2+, Na+, Cl-, K+ and SO42- and controls 65.25% of the quality variations of the groundwater in October and 58.82% in February. The second factor, related to Ca2+ and NO3-, constituted 18.05% of the quality variations in October and 15.56% in February, and given the urban development and limited agricultural development in the aquifer, is attributed to human activities. For the samples collected in October, the saturation indices of calcite, gypsum and dolomite minerals showed saturated conditions; in February, calcite and dolomite showed saturated conditions for more than 60% and 90% of samples respectively, while the gypsum index revealed under-saturated conditions for almost all samples. The under-saturated condition of the Zahedan groundwater aquifer results from insufficient residence time for the water to dissolve the minerals.
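The saturation index used in records like this one is a one-line calculation. A minimal sketch, with hypothetical activities and a commonly tabulated gypsum solubility product (none of these numbers are data from the study):

```python
import math

def saturation_index(iap: float, ksp: float) -> float:
    """SI = log10(IAP / Ksp): negative means under-saturated (the water can
    still dissolve the mineral), ~0 saturated, positive over-saturated."""
    return math.log10(iap / ksp)

# Gypsum (CaSO4·2H2O): IAP = a(Ca2+) * a(SO4^2-) * a(H2O)^2
a_ca, a_so4, a_h2o = 2.0e-3, 1.5e-3, 1.0   # assumed example activities
ksp_gypsum = 10 ** -4.58                   # commonly tabulated log Ksp for gypsum

si = saturation_index(a_ca * a_so4 * a_h2o**2, ksp_gypsum)
print(round(si, 2))  # -0.94: under-saturated, the condition reported for gypsum
```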

  13. Analysis of a microscale 'Saturation Phase-change Internal Carnot Engine'

    International Nuclear Information System (INIS)

    Lurie, Eli; Kribus, Abraham

    2010-01-01

    A micro heat engine, based on a cavity filled with a stationary working fluid under liquid-vapor saturation conditions and encapsulated by two membranes, is described and analyzed. This engine design is easy to produce using MEMS technologies and is operated with external heating and cooling. The motion of the membranes is controlled such that the internal pressure and temperature are constant during the heat addition and removal processes, and thus the fluid executes a true internal Carnot cycle. A model of this Saturation Phase-change Internal Carnot Engine (SPICE) was developed including thermodynamic, mechanical and heat transfer aspects. The efficiency and maximum power of the engine are derived. The maximum power point is fixed in a three-parameter space, and operation at this point leads to maximum power density that scales with the inverse square of the engine dimension. Inclusion of the finite heat capacity of the engine wall leads to a strong dependence of performance on engine frequency, and the existence of an optimal frequency. Effects of transient reverse heat flow, and 'parasitic heat' that does not participate in the thermodynamic cycle are observed.
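Because the SPICE fluid executes a true internal Carnot cycle, its ideal efficiency is bounded by the textbook Carnot limit. A minimal sketch with assumed example temperatures (not parameters from the paper):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Ideal Carnot efficiency eta = 1 - T_cold / T_hot (kelvin)."""
    if t_hot_k <= t_cold_k:
        raise ValueError("heat source must be hotter than the sink")
    return 1.0 - t_cold_k / t_hot_k

eta = carnot_efficiency(t_hot_k=400.0, t_cold_k=300.0)
print(round(eta, 3))  # 0.25
```

The paper's stronger result is about power density: operated at its maximum-power point, power density scales with the inverse square of the engine dimension, which is what favors the microscale version.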

  14. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are low-throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification. Hence they are limited to counting a low number of foci per cell (5 foci per nucleus), as the quantification process is extremely labour intensive. Therefore we have developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged, in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended maxima transform based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
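The core counting step can be illustrated in a few lines. This is not the paper's extended-maxima algorithm; it is a simplified stand-in that thresholds a synthetic 3D "nucleus" volume and counts connected bright blobs, with all intensities and positions invented for the example:

```python
import numpy as np
from scipy import ndimage

volume = np.zeros((20, 40, 40))                      # synthetic z, y, x stack
volume[10, 10, 10] = 1.0                             # focus 1
volume[10, 25, 30] = 1.0                             # focus 2
volume[5, 5, 35] = 1.0                               # focus 3
volume = ndimage.gaussian_filter(volume, sigma=1.5)  # blur deltas into blob-like foci

mask = volume > 0.5 * volume.max()                   # relative intensity threshold
labels, n_foci = ndimage.label(mask)                 # 3D connected-component labelling
print(n_foci)  # 3
```

A real pipeline would add per-nucleus segmentation and the extended-maxima step so that touching foci of different brightness are still separated.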

  15. Error of image saturation in the structured-light method.

    Science.gov (United States)

    Qi, Zhaoshuai; Wang, Zhao; Huang, Junhui; Xing, Chao; Gao, Jianmin

    2018-01-01

    In the phase-measuring structured-light method, image saturation will induce large phase errors. Usually, by selecting proper system parameters (such as the phase-shift number, exposure time, projection intensity, etc.), the phase error can be reduced. However, due to lack of a complete theory of phase error, there is no rational principle or basis for the selection of the optimal system parameters. For this reason, the phase error due to image saturation is analyzed completely, and the effects of the two main factors, including the phase-shift number and saturation degree, on the phase error are studied in depth. In addition, the selection of optimal system parameters is discussed, including the proper range and the selection principle of the system parameters. The error analysis and the conclusion are verified by simulation and experiment results, and the conclusion can be used for optimal parameter selection in practice.

  16. High-throughput bioinformatics with the Cyrille2 pipeline system

    Directory of Open Access Journals (Sweden)

    de Groot Joost CW

    2008-02-01

    Full Text Available Abstract Background Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: (1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; (2) the Scheduler, which forms the functional core of the system and which tracks what data enters the system and determines what jobs must be scheduled for execution; and (3) the Executor, which searches for scheduled jobs and executes these on a compute cluster. Conclusion The Cyrille2 system is an extensible, modular system, implementing the stated requirements. Cyrille2 enables easy creation and execution of high-throughput, flexible bioinformatics pipelines.
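The Scheduler/Executor split described above can be sketched in miniature. This is not Cyrille2 itself (a database-backed cluster system); the in-memory queue, job names and dataset name are illustrative assumptions:

```python
from collections import deque

class Scheduler:
    """Turns newly arrived data into pending jobs, one per pipeline step."""
    def __init__(self):
        self.queue = deque()

    def on_new_data(self, dataset: str, steps: list) -> None:
        for step in steps:                 # chain the steps for this dataset
            self.queue.append((dataset, step))

class Executor:
    """Picks up scheduled jobs and runs them (here: just records them)."""
    def __init__(self, scheduler: Scheduler):
        self.scheduler = scheduler
        self.log = []

    def run_all(self) -> None:
        while self.scheduler.queue:
            dataset, step = self.scheduler.queue.popleft()
            self.log.append(f"{step}({dataset})")  # stand-in for a cluster job

sched = Scheduler()
sched.on_new_data("reads.fastq", ["preprocess", "align", "annotate"])
ex = Executor(sched)
ex.run_all()
print(ex.log)  # ['preprocess(reads.fastq)', 'align(reads.fastq)', 'annotate(reads.fastq)']
```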

  17. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection

    DEFF Research Database (Denmark)

    Ernstsen, Christina Lundgaard; Login, Frédéric H.; Jensen, Helene Halkjær

    2017-01-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy-method was established to quantify the number and size of intracellular bacteria...

  18. Nitrogen saturation in stream ecosystems

    OpenAIRE

    Earl, S. R.; Valett, H. M.; Webster, J. R.

    2006-01-01

    The concept of nitrogen (N) saturation has organized the assessment of N loading in terrestrial ecosystems. Here we extend the concept to lotic ecosystems by coupling Michaelis-Menten kinetics and nutrient spiraling. We propose a series of saturation response types, which may be used to characterize the proximity of streams to N saturation. We conducted a series of short-term N releases using a tracer ((NO3)-N-15-N) to measure uptake. Experiments were conducted in streams spanning a gradient ...

  19. Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps

    Science.gov (United States)

    Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.

    2017-10-01

    The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
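The "skim a huge volume down to the relevant subset" step is conceptually a streaming filter. A plain-Python stand-in (the real service uses Hadoop/Spark; the log format and pattern below are invented for the example):

```python
import re

raw_metrics = [
    "2017-03-01 host=lxplus042 service=eos level=INFO  latency_ms=12",
    "2017-03-01 host=lxplus077 service=eos level=ERROR latency_ms=950",
    "2017-03-02 host=lxplus042 service=afs level=ERROR latency_ms=430",
]

def skim(lines, pattern):
    """Keep only the small fraction of records relevant for later analysis."""
    rx = re.compile(pattern)
    return [line for line in lines if rx.search(line)]

errors = skim(raw_metrics, r"service=eos .*level=ERROR")
print(len(errors))  # 1
```

In Spark the same idea is a `filter` over an RDD or DataFrame, applied before any expensive aggregation or modelling.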

  20. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  1. Micromechanics of non-active clays in saturated state and DEM modelling

    Directory of Open Access Journals (Sweden)

    Pagano Arianna Gea

    2017-01-01

    Full Text Available The paper presents a conceptual micromechanical model for the 1-D compression behaviour of non-active clays in a saturated state. An experimental investigation was carried out on kaolin clay samples saturated with fluids of different pH and dielectric permittivity. The effect of pore fluid characteristics on the one-dimensional compressibility behaviour of kaolin was investigated. A three-dimensional Discrete Element Method (DEM) was implemented in order to simulate the response of saturated kaolin observed during the experiments. A complex contact model was introduced, considering both the mechanical and physico-chemical microscopic interactions between clay particles. A simple analysis with spherical particles only was performed as a preliminary step in the DEM study in the elastic regime.

  2. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    ...into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format, after the high-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom-made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can...

  3. High throughput and accurate serum proteome profiling by integrated sample preparation technology and single-run data independent mass spectrometry analysis.

    Science.gov (United States)

    Lin, Lin; Zheng, Jiaxin; Yu, Quan; Chen, Wendong; Xing, Jinchun; Chen, Chenxi; Tian, Ruijun

    2018-03-01

    Mass spectrometry (MS)-based serum proteome analysis is extremely challenging due to its high complexity and dynamic range of protein abundances. Developing a high-throughput and accurate serum proteomic profiling approach capable of analyzing large cohorts is urgently needed for biomarker discovery. Herein, we report a streamlined workflow for fast and accurate proteomic profiling from 1 μL of blood serum. The workflow combines an integrated technique for highly sensitive and reproducible sample preparation with a new data-independent acquisition (DIA)-based MS method. Compared with the standard data-dependent acquisition (DDA) approach, the optimized DIA method doubled the number of detected peptides and proteins with better reproducibility. Without protein immunodepletion and prefractionation, the single-run DIA analysis enables quantitative profiling of over 300 proteins with a 50 min gradient time. The quantified proteins span more than five orders of magnitude of abundance range and contain over 50 FDA-approved disease markers. The workflow allowed us to analyze 20 serum samples per day, with about 358 protein groups per sample being identified. A proof-of-concept study on renal cell carcinoma (RCC) serum samples confirmed the feasibility of the workflow for large-scale serum proteomic profiling and disease-related biomarker discovery. Blood serum or plasma is the predominant specimen for clinical proteomic studies, while its analysis is extremely challenging due to its high complexity. Many efforts have been made in the past to maximize protein identifications in serum proteomics, whereas few have been concerned with throughput and reproducibility. Here, we establish a rapid, robust and highly reproducible DIA-based workflow for streamlined serum proteomic profiling from 1 μL of serum. The workflow does not need protein depletion and pre-fractionation, while still being able to detect disease-relevant proteins accurately. The workflow is promising in clinical application.

  4. Robust Throughput Boosting for Low Latency Dynamic Partial Reconfiguration

    DEFF Research Database (Denmark)

    Nannarelli, Alberto; Re, M.; Cardarilli, Gian Carlo

    2017-01-01

    Reducing the configuration time of portions of an FPGA at run time is crucial in contemporary FPGA-based accelerators. In this work, we propose a method to increase the throughput for FPGA dynamic partial reconfiguration by using standard IP blocks. The throughput is increased by over-clocking the...

  5. PFP total process throughput calculation and basis of estimate

    International Nuclear Information System (INIS)

    SINCLAIR, J.C.

    1999-01-01

    The PFP Process Throughput Calculation and Basis of Estimate document provides the calculated value and basis of estimate for process throughput associated with material stabilization operations conducted in 234-52 Building. The process throughput data provided reflects the best estimates of material processing rates consistent with experience at the Plutonium Finishing Plant (PFP) and other U.S. Department of Energy (DOE) sites. The rates shown reflect demonstrated capacity during "full" operation. They do not reflect impacts of building down time. Therefore, these throughput rates need to have a Total Operating Efficiency (TOE) factor applied.

  6. Infra-red thermography for high throughput field phenotyping in Solanum tuberosum.

    Directory of Open Access Journals (Sweden)

    Ankush Prashar

    Full Text Available The rapid development of genomic technology has made high throughput genotyping widely accessible but the associated high throughput phenotyping is now the major limiting factor in genetic analysis of traits. This paper evaluates the use of thermal imaging for the high throughput field phenotyping of Solanum tuberosum for differences in stomatal behaviour. A large multi-replicated trial of a potato mapping population was used to investigate the consistency in genotypic rankings across different trials and across measurements made at different times of day and on different days. The results confirmed a high degree of consistency between the genotypic rankings based on relative canopy temperature on different occasions. Genotype discrimination was enhanced both through normalising data by expressing genotype temperatures as differences from image means and through the enhanced replication obtained by using overlapping images. A Monte Carlo simulation approach was used to confirm the magnitude of genotypic differences that it is possible to discriminate. The results showed a clear negative association between canopy temperature and final tuber yield for this population, when grown under ample moisture supply. We have therefore established infrared thermography as an easy, rapid and non-destructive screening method for evaluating large population trials for genetic analysis. We also envisage this approach as having great potential for evaluating plant response to stress under field conditions.
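The normalization the study relies on, expressing each genotype's canopy temperature as a difference from its image mean so that images taken at different times of day become comparable, is simple to show. Genotype names and temperatures below are invented example values:

```python
# Two thermal images of the same three genotypes under different ambient conditions.
image_a = {"g1": 24.1, "g2": 25.0, "g3": 24.4}   # degrees C, warm midday image
image_b = {"g1": 18.2, "g2": 19.1, "g3": 18.5}   # cooler morning image

def normalize(image: dict) -> dict:
    """Express each genotype temperature as a deviation from the image mean."""
    mean = sum(image.values()) / len(image)
    return {g: round(t - mean, 2) for g, t in image.items()}

print(normalize(image_a))  # {'g1': -0.4, 'g2': 0.5, 'g3': -0.1}
print(normalize(image_b))  # identical deviations despite the colder image
```

After normalization the genotypic ranking is the same in both images, which is the consistency across times of day and days that the trial reports.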

  7. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amount of data which often requires bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expression, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease predisposing variants in the next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as the Open Source software (https://github.com/pmadanecki/htdp).
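HTDP itself is a Java GUI program, but the kind of operation it performs on character-delimited column data is easy to illustrate. A hypothetical stand-in in Python (column names, the minimum-depth condition and the data are invented for the example):

```python
import csv
import io

# Tab-delimited column data of the sort HTDP filters and merges.
variants = "chrom\tpos\tdepth\nchr1\t100\t8\nchr1\t200\t35\nchr2\t50\t40\n"

def filter_rows(tsv_text: str, column: str, minimum: int) -> list:
    """Keep only rows whose numeric `column` is at least `minimum`."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return [row for row in reader if int(row[column]) >= minimum]

kept = filter_rows(variants, "depth", 30)
print([row["pos"] for row in kept])  # ['200', '50']
```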

  8. Semiconductor saturable absorbers for ultrafast terahertz signals

    DEFF Research Database (Denmark)

    Hoffmann, Matthias C.; Turchinovich, Dmitry

    2010-01-01

    We demonstrate saturable absorber behavior of n-type semiconductors GaAs, GaP, and Ge in the terahertz (THz) frequency range at room temperature using nonlinear THz spectroscopy. The saturation mechanism is based on a decrease in electron conductivity of semiconductors at high electron momentum states, due to conduction band nonparabolicity and scattering into satellite valleys in strong THz fields. Saturable absorber parameters, such as linear and nonsaturable transmission, and saturation fluence, are extracted by fits to a classic saturable absorber model. Further, we observe THz pulse...

  9. The danish tax on saturated fat

    DEFF Research Database (Denmark)

    Jensen, Jørgen Dejgård; Smed, Sinne

    Denmark introduced a new tax on saturated fat in food products with effect from October 2011. The objective of this paper is to make an effect assessment of this tax for some of the product categories most significantly affected by the new tax, namely fats such as butter, butter-blends, margarine... The tax on saturated fat in food products has had some effects on the market for the considered products, in that the level of consumption of fats dropped by 10-20%. Furthermore, the analysis points at shifts in demand from high-price supermarkets towards low-price discount stores, a shift that seems to have been utilized by discount chains to raise the prices of butter and margarine by more than the pure tax increase. Due to the relatively short data period with the tax being active, interpretation of these findings from a long-run perspective should be done with considerable care. It is thus recommended to repeat...

  10. Application of high-throughput sequencing in understanding human oral microbiome related with health and disease

    OpenAIRE

    Chen, Hui; Jiang, Wen

    2014-01-01

    The oral microbiome is one of the most diverse habitats in the human body and is closely related to oral health and disease. As sequencing techniques have developed, high-throughput sequencing has become a popular approach applied to oral microbial analysis. Oral bacterial profiles have been studied to explore the relationship between microbial diversity and oral diseases such as caries and periodontal disease. This review describes the application of high-throughput sequencing for characterizati...

  11. Nitrogen saturation in stream ecosystems.

    Science.gov (United States)

    Earl, Stevan R; Valett, H Maurice; Webster, Jackson R

    2006-12-01

    The concept of nitrogen (N) saturation has organized the assessment of N loading in terrestrial ecosystems. Here we extend the concept to lotic ecosystems by coupling Michaelis-Menten kinetics and nutrient spiraling. We propose a series of saturation response types, which may be used to characterize the proximity of streams to N saturation. We conducted a series of short-term N releases using a tracer (15NO3-N) to measure uptake. Experiments were conducted in streams spanning a gradient of background N concentration. Uptake increased in four of six streams as NO3-N was incrementally elevated, indicating that these streams were not saturated. Uptake generally corresponded to Michaelis-Menten kinetics but deviated from the model in two streams where some other growth-critical factor may have been limiting. Proximity to saturation was correlated to background N concentration but was better predicted by the ratio of dissolved inorganic N (DIN) to soluble reactive phosphorus (SRP), suggesting phosphorus limitation in several high-N streams. Uptake velocity, a reflection of uptake efficiency, declined nonlinearly with increasing N amendment in all streams. At the same time, uptake velocity was highest in the low-N streams. Our conceptual model of N transport, uptake, and uptake efficiency suggests that, while streams may be active sites of N uptake on the landscape, N saturation contributes to nonlinear changes in stream N dynamics that correspond to decreased uptake efficiency.
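The model the study fits is standard Michaelis-Menten kinetics: areal uptake U = Umax·C/(Km + C), with uptake velocity vf = U/C as the efficiency measure that declines toward saturation. A sketch with hypothetical parameter values (not the streams' fitted constants):

```python
def uptake(c: float, umax: float = 100.0, km: float = 25.0) -> float:
    """Michaelis-Menten areal uptake U = Umax * C / (Km + C)."""
    return umax * c / (km + c)

def uptake_velocity(c: float, umax: float = 100.0, km: float = 25.0) -> float:
    """Uptake efficiency vf = U / C; declines nonlinearly as C rises."""
    return uptake(c, umax, km) / c

low, high = uptake_velocity(5.0), uptake_velocity(100.0)
print(round(low, 2), round(high, 2))  # 3.33 0.8
```

The nonlinear drop in vf from low to high concentration is exactly the loss of uptake efficiency the authors use to gauge a stream's proximity to N saturation.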

  12. Intelligent Approach for Analysis of Respiratory Signals and Oxygen Saturation in the Sleep Apnea/Hypopnea Syndrome

    Science.gov (United States)

    Moret-Bonillo, Vicente; Alvarez-Estévez, Diego; Fernández-Leal, Angel; Hernández-Pereira, Elena

    2014-01-01

    This work deals with the development of an intelligent approach for clinical decision making in the diagnosis of the Sleep Apnea/Hypopnea Syndrome, SAHS, from the analysis of respiratory signals and oxygen saturation in arterial blood, SaO2. In order to accomplish the task the proposed approach makes use of different artificial intelligence techniques and reasoning processes being able to deal with imprecise data. These reasoning processes are based on fuzzy logic and on temporal analysis of the information. The developed approach also takes into account the possibility of artifacts in the monitored signals. Detection and characterization of signal artifacts allows detection of false positives. Identification of relevant diagnostic patterns and temporal correlation of events is performed through the implementation of temporal constraints. PMID:25035712

  13. Brain oxygen saturation assessment in neonates using T2-prepared blood imaging of oxygen saturation and near-infrared spectroscopy

    DEFF Research Database (Denmark)

    Alderliesten, Thomas; De Vis, Jill B; Lemmers, Petra Ma

    2017-01-01

    ...saturation in the sagittal sinus (R2 = 0.49, p = 0.023), but no significant correlations could be demonstrated with frontal and whole-brain cerebral blood flow. These results suggest that measuring oxygen saturation by T2-prepared blood imaging of oxygen saturation is feasible, even in neonates. Strong... sinus. A strong linear relation was found between the oxygen saturation measured by magnetic resonance imaging and the oxygen saturation measured by near-infrared spectroscopy (R2 = 0.64, p ..., and magnetic resonance imaging measures of frontal cerebral blood flow, whole brain cerebral blood flow and venous oxygen saturation in the sagittal sinus (R2 = 0.71, 0.50, 0.65; p ...

  14. New knowledge on the temperature-entropy saturation boundary slope of working fluids

    International Nuclear Information System (INIS)

    Su, Wen; Zhao, Li; Deng, Shuai

    2017-01-01

    The slope of the temperature-entropy saturation boundary of working fluids has a significant effect on the thermodynamic performance of cycle processes. However, few studies have analyzed the saturated slope of cycle working fluids in terms of molecular structure and mixture composition. Thus, in this contribution, an analytical expression for the slope of the saturation curve is obtained from the highly accurate Helmholtz energy equation. 14 pure working fluids and three typical binary mixtures are employed to analyze the influence of molecular groups and mixture composition on the saturated slope, according to the correlated parameters of the Helmholtz energy equation. The calculated results indicate a preliminary trend: as the number of molecular groups increases, the positive liquid slope of pure fluids increases, and the vapor slope takes a positive sign over a narrow temperature range. In particular, for binary mixtures the liquid slope generally lies between those of the corresponding pure fluids, while the vapor slope can be made infinite by judiciously mixing dry and wet fluids. The analysis of the mixtures' saturated slope shows that all three types of vapor slope can be obtained by adjusting the mixture composition. - Highlights: • The saturated slope is derived from the Helmholtz function for working fluids. • The effect of molecular structure on the saturated slope is analyzed. • The variation of saturated slope with the mixture composition is investigated.
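The quantity being derived can be sketched in one line of standard thermodynamics (this is the generic chain-rule form, not necessarily the paper's exact expression). Along the saturated liquid or vapor line the entropy is s = s(T, ρ_sat(T)), and both partial derivatives follow from the Helmholtz energy a(T, ρ) via s = −(∂a/∂T)_ρ:

```latex
\left(\frac{ds}{dT}\right)_{\mathrm{sat}}
  = \left(\frac{\partial s}{\partial T}\right)_{\!\rho}
  + \left(\frac{\partial s}{\partial \rho}\right)_{\!T}
    \frac{d\rho_{\mathrm{sat}}}{dT},
\qquad
\left(\frac{dT}{ds}\right)_{\mathrm{sat}}
  = \left[\left(\frac{ds}{dT}\right)_{\mathrm{sat}}\right]^{-1}.
```

A dry fluid has a positive vapor-side dT/ds, a wet fluid a negative one; the "infinite" vapor slope obtainable by mixing corresponds to (ds/dT)_sat passing through zero.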

  15. Saturation at Low X and Nonlinear Evolution

    International Nuclear Information System (INIS)

    Stasto, A.M.

    2002-01-01

    In this talk the results of the analytical and numerical analysis of the nonlinear Balitsky-Kovchegov equation are presented. The characteristic BFKL diffusion into the infrared regime is suppressed by the generation of the saturation scale Qs. We identify the scaling and linear regimes of the solution. We also study the impact of subleading corrections on the nonlinear evolution. (author)

  16. Palm Oil Consumption Increases LDL Cholesterol Compared with Vegetable Oils Low in Saturated Fat in a Meta-Analysis of Clinical Trials.

    Science.gov (United States)

    Sun, Ye; Neelakantan, Nithya; Wu, Yi; Lote-Oke, Rashmi; Pan, An; van Dam, Rob M

    2015-07-01

    Palm oil contains a high amount of saturated fat compared with most other vegetable oils, but studies have reported inconsistent effects of palm oil on blood lipids. We systematically reviewed the effect of palm oil consumption on blood lipids compared with other cooking oils using data from clinical trials. We searched PubMed and the Cochrane Library for trials of at least 2 wk duration that compared the effects of palm oil consumption with any of the predefined comparison oils: vegetable oils low in saturated fat, trans fat-containing partially hydrogenated vegetable oils, and animal fats. Data were pooled by using random-effects meta-analysis. Palm oil significantly increased LDL cholesterol by 0.24 mmol/L (95% CI: 0.13, 0.35 mmol/L; I(2) = 83.2%) compared with vegetable oils low in saturated fat. This effect was observed in randomized trials (0.31 mmol/L; 95% CI: 0.20, 0.42 mmol/L) but not in nonrandomized trials (0.03 mmol/L; 95% CI: -0.15, 0.20 mmol/L; P-difference = 0.02). Among randomized trials, only modest heterogeneity in study results remained after considering the test oil dose and the comparison oil type (I(2) = 27.5%). Palm oil increased HDL cholesterol by 0.02 mmol/L (95% CI: 0.01, 0.04 mmol/L; I(2) = 49.8%) compared with vegetable oils low in saturated fat and by 0.09 mmol/L (95% CI: 0.06, 0.11 mmol/L; I(2) = 47.8%) compared with trans fat-containing oils. Palm oil consumption results in higher LDL cholesterol than do vegetable oils low in saturated fat and higher HDL cholesterol than do trans fat-containing oils in humans. The effects of palm oil on blood lipids are as expected on the basis of its high saturated fat content, which supports the reduction in palm oil use by replacement with vegetable oils low in saturated and trans fat. This systematic review was registered with the PROSPERO registry at http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42012002601#.VU3wvSGeDRZ as CRD42012002601. © 2015 American Society for Nutrition.
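The pooling step behind numbers like "0.24 mmol/L (95% CI: 0.13, 0.35)" is a random-effects meta-analysis. A sketch in the DerSimonian-Laird style (a standard estimator; the abstract states random-effects but not which estimator, and the effect sizes and standard errors below are invented, not the trial data):

```python
import math

def pool_random_effects(effects, ses):
    """DerSimonian-Laird random-effects pooling of per-study effects."""
    w = [1 / se**2 for se in ses]                       # fixed-effect weights
    k = len(effects)
    y_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return pooled, math.sqrt(1 / sum(w_star))

pooled, se = pool_random_effects([0.30, 0.20, 0.25], [0.05, 0.08, 0.06])
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se         # 95% confidence interval
print(round(pooled, 3))
```

The I² values quoted in the abstract summarize how much of the spread in study results is heterogeneity rather than sampling error; when τ² = 0 the random-effects result collapses to the fixed-effect one.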

  17. Gold nanoparticle-mediated (GNOME) laser perforation: a new method for a high-throughput analysis of gap junction intercellular coupling.

    Science.gov (United States)

    Begandt, Daniela; Bader, Almke; Antonopoulos, Georgios C; Schomaker, Markus; Kalies, Stefan; Meyer, Heiko; Ripken, Tammo; Ngezahayo, Anaclet

    2015-10-01

    The present report evaluates the advantages of using the gold nanoparticle-mediated laser perforation (GNOME LP) technique as a computer-controlled cell optoperforation to introduce Lucifer yellow (LY) into cells in order to analyze gap junction coupling in cell monolayers. To permeabilize GM-7373 endothelial cells grown in a 24-well plate with GNOME LP, a laser beam of 88 μm in diameter was applied in the presence of gold nanoparticles and LY. After 10 min to allow dye uptake and diffusion through gap junctions, we observed a LY-positive cell band of 179 ± 8 μm width. The presence of the gap junction channel blocker carbenoxolone during the optoperforation reduced the LY-positive band to 95 ± 6 μm. Additionally, a forskolin-related enhancement of gap junction coupling, recently found using the scrape-loading technique, was also observed using GNOME LP. Further, automatic cell imaging and a subsequent semi-automatic quantification of the images using a Java-based ImageJ plugin were performed in a high-throughput sequence. Moreover, GNOME LP was used on cells such as RBE4 rat brain endothelial cells, which cannot be mechanically scraped, as well as on three-dimensionally cultivated cells, opening the possibility of implementing the GNOME LP technique for analysis of gap junction coupling in tissues. We conclude that the GNOME LP technique allows a high-throughput automated analysis of gap junction coupling in cells. Moreover, this non-invasive technique could be used on monolayers that do not support mechanical scraping as well as on cells in tissue, allowing an in vivo/ex vivo analysis of gap junction coupling.
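The readout in this record is the width of the LY-positive band. A toy stand-in for that quantification step (the actual analysis is an ImageJ plugin; the mask geometry and pixel size here are synthetic example values):

```python
import numpy as np

pixel_um = 5.0                       # assumed pixel size in micrometres
mask = np.zeros((100, 100), dtype=bool)
mask[:, 40:76] = True                # synthetic vertical LY-positive band, 36 px wide

cols_positive = mask.any(axis=0)     # columns that contain stained cells
width_um = cols_positive.sum() * pixel_um
print(width_um)  # 180.0
```

Comparing such widths between conditions (e.g. with and without carbenoxolone) is what turns the dye-spread image into a coupling measurement.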

  18. High-throughput quantitative biochemical characterization of algal biomass by NIR spectroscopy; multiple linear regression and multivariate linear regression analysis.

    Science.gov (United States)

    Laurens, L M L; Wolfrum, E J

    2013-12-18

    One of the challenges associated with microalgal biomass characterization and the comparison of microalgal strains and conversion processes is the rapid determination of the composition of algae. We have developed and applied a high-throughput screening technology based on near-infrared (NIR) spectroscopy for the rapid and accurate determination of algal biomass composition. We show that NIR spectroscopy can accurately predict the full composition using multivariate linear regression analysis of varying lipid, protein, and carbohydrate content of algal biomass samples from three strains. We also demonstrate a high quality of predictions of an independent validation set. A high-throughput 96-well configuration for spectroscopy gives equally good prediction relative to a ring-cup configuration, and thus, spectra can be obtained from as little as 10-20 mg of material. We found that lipids exhibit a dominant, distinct, and unique fingerprint in the NIR spectrum that allows for the use of single and multiple linear regression of respective wavelengths for the prediction of the biomass lipid content. This is not the case for carbohydrate and protein content, and thus, the use of multivariate statistical modeling approaches remains necessary.
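    The calibration described above regresses measured composition on NIR spectra. A self-contained sketch of multivariate linear regression on synthetic spectra (the real study used measured spectra of three algal strains; the sample counts, wavelength indices, and noise levels here are invented for illustration):

    ```python
    import numpy as np

    # Synthetic stand-in for an NIR calibration set: 120 samples x 20 wavelengths,
    # with lipid content encoded linearly at two "fingerprint" wavelengths.
    rng = np.random.default_rng(0)
    n_samples, n_wl = 120, 20
    lipid = rng.uniform(10, 40, n_samples)              # % dry weight (synthetic)
    spectra = rng.normal(0.0, 0.01, (n_samples, n_wl))
    spectra[:, 8] += 0.005 * lipid                      # dominant lipid band
    spectra[:, 15] += 0.002 * lipid                     # weaker secondary band

    # Multivariate linear regression: ordinary least squares of content on spectra.
    X = np.hstack([spectra, np.ones((n_samples, 1))])   # intercept column
    coef, *_ = np.linalg.lstsq(X, lipid, rcond=None)
    pred = X @ coef
    r2 = 1.0 - np.sum((lipid - pred) ** 2) / np.sum((lipid - lipid.mean()) ** 2)
    ```

    When a single constituent has a distinct fingerprint band (as the record reports for lipids), even one or two wavelengths regress well; constituents without such bands need the full multivariate model, which is the record's point about protein and carbohydrate.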

  19. Creation of a small high-throughput screening facility.

    Science.gov (United States)

    Flak, Tod

    2009-01-01

    The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.

  20. Generalized Encoding CRDSA: Maximizing Throughput in Enhanced Random Access Schemes for Satellite

    Directory of Open Access Journals (Sweden)

    Manlio Bacco

    2014-12-01

    This work starts from an analysis of the literature on Random Access protocols with contention resolution, such as Contention Resolution Diversity Slotted Aloha (CRDSA), and introduces a possible enhancement, named Generalized Encoding Contention Resolution Diversity Slotted Aloha (GE-CRDSA). GE-CRDSA aims at improving the aggregate throughput when the system load is below 50%, exploiting the opportunity to transmit an optimal combination of information and parity packets frame by frame. This paper shows the improvement in terms of throughput, by performing traffic estimation and adaptive selection of information and parity rates, when a satellite network undergoes a variable traffic load profile.
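    For context on the throughput figures these schemes improve upon: plain slotted ALOHA peaks at 1/e ≈ 0.37 packets per slot, and CRDSA-family protocols push beyond that via packet replicas and interference cancellation. The classical baseline is a one-liner (simulating the GE-CRDSA gains themselves is beyond a sketch):

    ```python
    import math

    def slotted_aloha_throughput(g):
        """Expected successful packets per slot for slotted ALOHA at offered load g
        (Poisson arrivals): S = G * exp(-G)."""
        return g * math.exp(-g)

    # Throughput is maximised at G = 1, giving 1/e ~ 0.368 packets/slot.
    peak = slotted_aloha_throughput(1.0)
    ```

    The record's "system load less than 50%" regime corresponds to the left side of this curve, where CRDSA-style replica transmission has spare capacity to exploit.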

  1. Gas hydrate saturations estimated from fractured reservoir at Site NGHP-01-10, Krishna-Godavari Basin, India

    Science.gov (United States)

    Lee, M.W.; Collett, T.S.

    2009-01-01

    During the Indian National Gas Hydrate Program Expedition 01 (NGHP-01), one of the richest marine gas hydrate accumulations was discovered at Site NGHP-01-10 in the Krishna-Godavari Basin. The occurrence of concentrated gas hydrate at this site is primarily controlled by the presence of fractures. Assuming the resistivity of gas hydrate-bearing sediments is isotropic, the conventional Archie analysis using the logging while drilling resistivity log yields gas hydrate saturations greater than 50% (as high as ~80%) of the pore space for the depth interval between ~25 and ~160 m below seafloor. On the other hand, gas hydrate saturations estimated from pressure cores from nearby wells were less than ~26% of the pore space. Although intrasite variability may contribute to the difference, the primary cause of the saturation difference is attributed to the anisotropic nature of the reservoir due to gas hydrate in high-angle fractures. Archie's law can be used to estimate gas hydrate saturations in an anisotropic reservoir, with additional information such as elastic velocities to constrain the Archie cementation parameter m and the saturation exponent n. Theory indicates that m and n depend on the direction of the measurement relative to fracture orientation, as well as depending on gas hydrate saturation. By using higher values of m and n in the resistivity analysis for fractured reservoirs, the difference between saturation estimates is significantly reduced, although a sizable difference remains. To better understand the nature of fractured reservoirs, wireline P and S wave velocities were also incorporated into the analysis.
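    The Archie analysis described above inverts Archie's law, Rt = a·Rw·φ^(−m)·Sw^(−n), for water saturation Sw, with hydrate taken as 1 − Sw. A sketch of that inversion (the resistivity, porosity, and parameter values below are hypothetical; the record's point is that raising m and n for fractured, anisotropic reservoirs lowers the estimate):

    ```python
    def hydrate_saturation(rt, rw, phi, a=1.0, m=2.0, n=2.0):
        """Gas-hydrate saturation of pore space from Archie's law.

        rt:  measured formation resistivity (ohm-m)
        rw:  pore-water resistivity (ohm-m)
        phi: porosity (fraction); a, m, n: Archie parameters.
        """
        sw = (a * rw / (rt * phi ** m)) ** (1.0 / n)  # water saturation
        return 1.0 - sw                               # hydrate fills the rest

    # Higher m and n (fractured-reservoir case) reduce the estimated saturation
    # for the same resistivity log, in the direction the record describes.
    s_iso = hydrate_saturation(rt=20.0, rw=0.25, phi=0.5, m=2.0, n=2.0)
    s_fr = hydrate_saturation(rt=20.0, rw=0.25, phi=0.5, m=2.5, n=2.5)
    ```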

  2. Accuracy in the quantification of chemical exchange saturation transfer (CEST) and relayed nuclear Overhauser enhancement (rNOE) saturation transfer effects.

    Science.gov (United States)

    Zhang, Xiao-Yong; Wang, Feng; Li, Hua; Xu, Junzhong; Gochberg, Daniel F; Gore, John C; Zu, Zhongliang

    2017-07-01

    Accurate quantification of chemical exchange saturation transfer (CEST) effects, including dipole-dipole mediated relayed nuclear Overhauser enhancement (rNOE) saturation transfer, is important for applications and studies of molecular concentration and transfer rate (and thereby pH or temperature). Although several quantification methods, such as Lorentzian difference (LD) analysis, multiple-pool Lorentzian fits, and the three-point method, have been extensively used in several preclinical and clinical applications, the accuracy of these methods has not been evaluated. Here we simulated multiple-pool Z spectra containing the pools that contribute to the main CEST and rNOE saturation transfer signals in the brain, numerically fit them using the different methods, and then compared their derived CEST metrics with the known solute concentrations and exchange rates. Our results show that the LD analysis overestimates contributions from amide proton transfer (APT) and intermediate exchanging amine protons; the three-point method significantly underestimates both APT and rNOE saturation transfer at -3.5 ppm (NOE(-3.5)). The multiple-pool Lorentzian fit is more accurate than the other two methods, but only at lower irradiation powers (≤1 μT at 9.4 T) within the range of our simulations. At higher irradiation powers, this method is also inaccurate because of the presence of a fast exchanging CEST signal that has a non-Lorentzian lineshape. Quantitative parameters derived from in vivo images of rodent brain tumor obtained using an irradiation power of 1 μT were also compared. Our results demonstrate that all three quantification methods show similar contrasts between tumor and contralateral normal tissue for both APT and the NOE(-3.5). However, the quantified values of the three methods are significantly different. Our work provides insight into the fitting accuracy obtainable in a complex tissue model and provides guidelines for evaluating other newly developed
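    The Lorentzian difference (LD) idea evaluated above can be illustrated with a simulated Z-spectrum: model the direct-water line as a Lorentzian and read the CEST/rNOE effects as the residual. This sketch uses analytic pools with invented amplitudes and widths (no fitting, and far simpler than the multi-pool simulations in the record):

    ```python
    import numpy as np

    def lorentzian(offset_ppm, amp, fwhm_ppm, center_ppm=0.0):
        """Lorentzian lineshape for one saturation pool."""
        half = fwhm_ppm / 2.0
        return amp * half ** 2 / (half ** 2 + (offset_ppm - center_ppm) ** 2)

    offsets = np.linspace(-6, 6, 241)                   # Z-spectrum frequency axis (ppm)
    water = lorentzian(offsets, amp=0.9, fwhm_ppm=2.0)  # direct water saturation
    apt = lorentzian(offsets, 0.05, 1.0, 3.5)           # amide (APT) pool at +3.5 ppm
    noe = lorentzian(offsets, 0.08, 3.0, -3.5)          # rNOE pool at -3.5 ppm
    z = 1.0 - (water + apt + noe)                       # simulated Z-spectrum

    # LD analysis: reference Lorentzian (water only) minus the measured spectrum.
    ld = (1.0 - lorentzian(offsets, 0.9, 2.0)) - z
    ```

    In this idealised case the residual recovers the APT and rNOE pools exactly; the record's finding is that on realistic spectra LD overestimates contributions because the reference fit absorbs parts of the broad pools.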

  3. The simple fool's guide to population genomics via RNA-Seq: An introduction to high-throughput sequencing data analysis

    DEFF Research Database (Denmark)

    De Wit, P.; Pespeni, M.H.; Ladner, J.T.

    2012-01-01

    to Population Genomics via RNA-seq' (SFG), a document intended to serve as an easy-to-follow protocol, walking a user through one example of high-throughput sequencing data analysis of nonmodel organisms. It is by no means an exhaustive protocol, but rather serves as an introduction to the bioinformatic methods...... used in population genomics, enabling a user to gain familiarity with basic analysis steps. The SFG consists of two parts. This document summarizes the steps needed and lays out the basic themes for each and a simple approach to follow. The second document is the full SFG, publicly available at http://sfg.......stanford.edu, that includes detailed protocols for data processing and analysis, along with a repository of custom-made scripts and sample files. Steps included in the SFG range from tissue collection to de novo assembly, blast annotation, alignment, gene expression, functional enrichment, SNP detection, principal components...

  4. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S rRNA gene amplicon sequencing can be used to reveal factors of importance for the operation of full-scale nutrient removal plants related to settling problems and floc properties. Using optimized DNA extraction protocols, indexed primers and our in-house Illumina platform, we prepared multiple samples...... be correlated to the presence of the species that are regarded as “strong” and “weak” floc formers. In conclusion, 16S rRNA gene amplicon sequencing provides a high-throughput approach for a rapid and cheap community profiling of activated sludge that in combination with multivariate statistics can be used

  5. Application of ToxCast High-Throughput Screening and ...

    Science.gov (United States)

    Slide presentation at the SETAC annual meeting on High-Throughput Screening and Modeling Approaches to Identify Steroidogenesis Disruptors.

  6. Relating oxygen partial pressure, saturation and content: the haemoglobin-oxygen dissociation curve.

    Science.gov (United States)

    Collins, Julie-Ann; Rudenski, Aram; Gibson, John; Howard, Luke; O'Driscoll, Ronan

    2015-09-01

    The delivery of oxygen by arterial blood to the tissues of the body has a number of critical determinants including blood oxygen concentration (content), saturation (SO2) and partial pressure, haemoglobin concentration and cardiac output, including its distribution. The haemoglobin-oxygen dissociation curve, a graphical representation of the relationship between oxygen saturation and oxygen partial pressure, helps us to understand some of the principles underpinning this process. Historically this curve was derived from very limited data based on blood samples from small numbers of healthy subjects which were manipulated in vitro and ultimately determined by equations such as those described by Severinghaus in 1979. In a study of 3524 clinical specimens, we found that this equation estimated the SO2 in blood from patients with normal pH and SO2 >70% with remarkable accuracy and, to our knowledge, this is the first large-scale validation of this equation using clinical samples. Oxygen saturation by pulse oximetry (SpO2) is nowadays the standard clinical method for assessing arterial oxygen saturation, providing a convenient, pain-free means of continuously assessing oxygenation, provided the interpreting clinician is aware of important limitations. The use of pulse oximetry reduces the need for arterial blood gas analysis (SaO2) as many patients who are not at risk of hypercapnic respiratory failure or metabolic acidosis and have acceptable SpO2 do not necessarily require blood gas analysis. While arterial sampling remains the gold-standard method of assessing ventilation and oxygenation, in those patients in whom blood gas analysis is indicated, arterialised capillary samples also have a valuable role in patient care. The clinical role of venous blood gases however remains less well defined.
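    The Severinghaus (1979) equation validated above has a compact closed form; a sketch of it (function name mine; the record reports it is accurate at normal pH for SO2 >70%, and less reliable outside that range):

    ```python
    def severinghaus_so2(po2_mmhg):
        """Estimate haemoglobin oxygen saturation (fraction, 0-1) from PO2 in mmHg.

        Severinghaus (1979) empirical fit to the normal-pH adult
        haemoglobin-oxygen dissociation curve:
            SO2 = 1 / (23400 / (PO2^3 + 150*PO2) + 1)
        """
        p = po2_mmhg
        return 1.0 / (23400.0 / (p ** 3 + 150.0 * p) + 1.0)
    ```

    The curve reproduces the familiar landmarks: roughly half-saturation near the normal P50 of ~27 mmHg and near-full saturation at a normal arterial PO2 of 100 mmHg.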

  7. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection.

  8. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    International Nuclear Information System (INIS)

    Hui Su

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence detection in two different areas, heterogeneous catalysts screening and single cell study. First, we introduced laser-induced imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts to explore the use of this high-throughput screening technique in discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or the destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in oxidation of naphthalene, we demonstrated LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm² for 40-μm wells. This experimental set-up also can screen solid catalysts via near infrared thermography detection.

  9. Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model

    International Nuclear Information System (INIS)

    K. Rehfeldt

    2004-01-01

    This report is an updated analysis of water-level data performed to provide the ''Saturated Zone Site-Scale Flow Model'' (BSC 2004 [DIRS 170037]) (referred to as the saturated zone (SZ) site-scale flow model or site-scale SZ flow model in this report) with the configuration of the potentiometric surface, target water-level data, and hydraulic gradients for calibration of groundwater flow models. This report also contains an expanded discussion of uncertainty in the potentiometric-surface map. The analysis of the potentiometric data presented in Revision 00 of this report (USGS 2001 [DIRS 154625]) provides the configuration of the potentiometric surface, target heads, and hydraulic gradients for the calibration of the SZ site-scale flow model (BSC 2004 [DIRS 170037]). Revision 01 of this report (USGS 2004 [DIRS 168473]) used updated water-level data for selected wells through the year 2000 as the basis for estimating water-level altitudes and the potentiometric surface in the SZ site-scale flow and transport model domain based on an alternative interpretation of perched water conditions. That revision developed computer files containing: water-level data within the model area (DTN: GS010908312332.002); a table of known vertical head differences (DTN: GS010908312332.003); and a potentiometric-surface map (DTN: GS010608312332.001) using an alternative concept from that presented by USGS (2001 [DIRS 154625]) for the area north of Yucca Mountain. The updated water-level data presented in USGS (2004 [DIRS 168473]) include data obtained from the Nye County Early Warning Drilling Program (EWDP) Phases I and II and data from Borehole USW WT-24. This document is based on Revision 01 (USGS 2004 [DIRS 168473]) and expands the discussion of uncertainty in the potentiometric-surface map. This uncertainty assessment includes an analysis of the impact of more recent water-level data and the impact of adding data from the EWDP Phases III and IV wells. In addition to being utilized

  10. Assessing species saturation: conceptual and methodological challenges.

    Science.gov (United States)

    Olivares, Ingrid; Karger, Dirk N; Kessler, Michael

    2018-05-07

    Is there a maximum number of species that can coexist? Intuitively, we assume an upper limit to the number of species in a given assemblage, or that a lineage can produce, but defining and testing this limit has proven problematic. Herein, we first outline seven general challenges of studies on species saturation, most of which are independent of the actual method used to assess saturation. Among these are the challenge of defining saturation conceptually and operationally, the importance of setting an appropriate referential system, and the need to discriminate among patterns, processes and mechanisms. Second, we list and discuss the methodological approaches that have been used to study species saturation. These approaches vary in time and spatial scales, and in the variables and assumptions needed to assess saturation. We argue that assessing species saturation is possible, but that many studies conducted to date have conceptual and methodological flaws that prevent us from currently attaining a good idea of the occurrence of species saturation. © 2018 Cambridge Philosophical Society.

  11. High throughput imaging cytometer with acoustic focussing.

    Science.gov (United States)

    Zmijan, Robert; Jonnalagadda, Umesh S; Carugo, Dario; Kochi, Yu; Lemm, Elizabeth; Packham, Graham; Hill, Martyn; Glynne-Jones, Peter

    2015-10-31

    We demonstrate an imaging flow cytometer that uses acoustic levitation to assemble cells and other particles into a sheet structure. This technique enables a high resolution, low noise CMOS camera to capture images of thousands of cells with each frame. While ultrasonic focussing has previously been demonstrated for 1D cytometry systems, extending the technology to a planar, much higher throughput format and integrating imaging is non-trivial, and represents a significant jump forward in capability, leading to diagnostic possibilities not achievable with current systems. A galvo mirror is used to track the images of the moving cells permitting exposure times of 10 ms at frame rates of 50 fps with motion blur of only a few pixels. At 80 fps, we demonstrate a throughput of 208 000 beads per second. We investigate the factors affecting motion blur and throughput, and demonstrate the system with fluorescent beads, leukaemia cells and a chondrocyte cell line. Cells require more time to reach the acoustic focus than beads, resulting in lower throughputs; however a longer device would remove this constraint.
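    A quick consistency check of the quoted figures grounds the claim of imaging thousands of cells per frame: at the stated frame rate, the demonstrated bead throughput corresponds to a few thousand particles in each image.

    ```python
    frame_rate_hz = 80           # frames per second quoted in the record
    throughput_per_s = 208_000   # beads per second demonstrated at 80 fps

    # Particles that must be present in each frame to sustain that throughput.
    beads_per_frame = throughput_per_s / frame_rate_hz   # 2600 per frame
    ```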

  12. Landsliding in partially saturated materials

    Science.gov (United States)

    Godt, J.W.; Baum, R.L.; Lu, N.

    2009-01-01

    Rainfall-induced landslides are pervasive in hillslope environments around the world and among the most costly and deadly natural hazards. However, capturing their occurrence with scientific instrumentation in a natural setting is extremely rare. The prevailing thinking on landslide initiation, particularly for those landslides that occur under intense precipitation, is that the failure surface is saturated and has positive pore-water pressures acting on it. Most analytic methods used for landslide hazard assessment are based on the above perception and assume that the failure surface is located beneath a water table. By monitoring the pore water and soil suction response to rainfall, we observed shallow landslide occurrence under partially saturated conditions for the first time in a natural setting. We show that the partially saturated shallow landslide at this site is predictable using measured soil suction and water content and a novel unified effective stress concept for partially saturated earth materials. Copyright 2009 by the American Geophysical Union.
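    The "unified effective stress" for partially saturated materials extends the classical saturated form to account for matric suction. A common textbook form is Bishop's effective stress, σ′ = (σ − u_a) + χ(u_a − u_w), with χ often approximated by the effective degree of saturation; the sketch below uses that form with hypothetical pressures (it is not the paper's field data or its exact formulation):

    ```python
    def bishop_effective_stress(sigma, u_air, u_water, chi):
        """Bishop effective stress for partially saturated soil (all values in kPa).

        chi: effective-stress parameter in [0, 1], often taken as the effective
        degree of saturation (0 = dry, 1 = fully saturated).
        """
        suction = u_air - u_water              # matric suction, positive when unsaturated
        return (sigma - u_air) + chi * suction

    # Under partial saturation (u_water < u_air), suction adds apparent strength;
    # as the soil wets toward positive pore pressure, that strength is lost,
    # which is why rainfall can drive failure even before full saturation.
    s_unsat = bishop_effective_stress(sigma=50.0, u_air=0.0, u_water=-10.0, chi=0.5)
    s_wet = bishop_effective_stress(sigma=50.0, u_air=0.0, u_water=5.0, chi=1.0)
    ```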

  13. Recipe for residual oil saturation determination

    Energy Technology Data Exchange (ETDEWEB)

    Guillory, A.J.; Kidwell, C.M.

    1979-01-01

    In 1978, Shell Oil Co., in conjunction with the US Department of Energy, conducted a residual oil saturation study in a deep, hot, high-pressured Gulf Coast reservoir. The work was conducted prior to initiation of a CO2 tertiary recovery pilot. Many problems had to be resolved prior to and during the residual oil saturation determination. The problems confronted are outlined such that the procedure can be used much like a cookbook in designing future studies in similar reservoirs. Primary discussion centers around planning and results of a log-inject-log operation used as the prime method to determine the residual oil saturation. Several independent methods were used to calculate the residual oil saturation in the subject well in an interval between 12,910 ft (3935 m) and 12,920 ft (3938 m). In general, these numbers were in good agreement and indicated a residual oil saturation between 22% and 24%. 10 references.

  14. Saturation behavior: a general relationship described by a simple second-order differential equation.

    Science.gov (United States)

    Kepner, Gordon R

    2010-04-13

    The numerous natural phenomena that exhibit saturation behavior, e.g., ligand binding and enzyme kinetics, have been approached, to date, via empirical and particular analyses. This paper presents a mechanism-free, and assumption-free, second-order differential equation, designed only to describe a typical relationship between the variables governing these phenomena. It develops a mathematical model for this relation, based solely on the analysis of the typical experimental data plot and its saturation characteristics. Its utility complements the traditional empirical approaches. For the general saturation curve, described in terms of its independent (x) and dependent (y) variables, a second-order differential equation is obtained that applies to any saturation phenomena. It shows that the driving factor for the basic saturation behavior is the probability of the interactive site being free, which is described quantitatively. Solving the equation relates the variables in terms of the two empirical constants common to all these phenomena, the initial slope of the data plot and the limiting value at saturation. A first-order differential equation for the slope emerged that led to the concept of the effective binding rate at the active site and its dependence on the calculable probability the interactive site is free. These results are illustrated using specific cases, including ligand binding and enzyme kinetics. This leads to a revised understanding of how to interpret the empirical constants, in terms of the variables pertinent to the phenomenon under study. The second-order differential equation revealed the basic underlying relations that describe these saturation phenomena, and the basic mathematical properties of the standard experimental data plot. It was shown how to integrate this differential equation, and define the common basic properties of these phenomena. The results regarding the importance of the slope and the new perspectives on the empirical
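    The two empirical constants named above (the initial slope and the limiting value at saturation) suffice to fix the standard hyperbolic saturation curve. As an illustrative reading consistent with the description, with initial slope \(S_0\) and limit \(y_{\max}\) (this is a reconstruction for illustration, not reproduced from the paper):

    ```latex
    % Hyperbolic saturation curve fixed by initial slope S_0 and limit y_max:
    y(x) = \frac{y_{\max}\,x}{\,y_{\max}/S_0 + x\,}
    % Its slope is governed by the fraction of interactive sites still free:
    \frac{dy}{dx} = S_0\left(1 - \frac{y}{y_{\max}}\right)^{2}
    % Eliminating x yields a mechanism-free second-order differential equation:
    \frac{d^{2}y}{dx^{2}} = -\,\frac{2}{\,y_{\max}-y\,}\left(\frac{dy}{dx}\right)^{2}
    ```

    Here \(1 - y/y_{\max}\) plays the role of the probability that the interactive site is free, matching the paper's account of the driving factor, and specialising \(S_0\) and \(y_{\max}\) recovers familiar cases such as Michaelis-Menten kinetics (\(S_0 = V_{\max}/K_m\), \(y_{\max} = V_{\max}\)).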

  15. Tracking Controller for Intrinsic Output Saturated Systems in Presence of Amplitude and Rate Input Saturations

    DEFF Research Database (Denmark)

    Chater, E.; Giri, F.; Guerrero, Josep M.

    2014-01-01

    We consider the problem of controlling plants that are subject to multiple saturation constraints. Especially, we are interested in linear systems whose input is subject to amplitude and rate constraints of saturation type. Furthermore, the considered system's output is also subject to an intrinsic...

  16. Impact of sample saturation on the detected porosity of hardened concrete using low temperature calorimetry

    DEFF Research Database (Denmark)

    Wu, Min; Johannesson, Björn

    2014-01-01

    The present work studied the impact of sample saturation on the analysis of pore volume and pore size distribution by low temperature (micro-)calorimetry. The theoretical background was examined, which emphasizes that the freezing/melting temperature of water/ice confined in non-fully saturated pores...

  17. Systematic instrumental errors between oxygen saturation analysers in fetal blood during deep hypoxemia.

    Science.gov (United States)

    Porath, M; Sinha, P; Dudenhausen, J W; Luttkus, A K

    2001-05-01

    During a study of artificially produced deep hypoxemia in fetal cord blood, systematic errors of three different oxygen saturation analysers were evaluated against a reference CO oximeter. The oxygen tensions (PO2) of 83 pre-heparinized fetal blood samples from umbilical veins were reduced by tonometry to 1.3 kPa (10 mm Hg) and 2.7 kPa (20 mm Hg). The oxygen saturation (SO2) was determined (n=1328) on a reference CO oximeter (ABL625, Radiometer Copenhagen) and on three tested instruments (two CO oximeters: Chiron865, Bayer Diagnostics, and ABL700, Radiometer Copenhagen; and a portable blood gas analyser, i-STAT, Abbott). The CO oximeters measure the oxyhemoglobin and the reduced hemoglobin fractions by absorption spectrophotometry. The i-STAT system calculates the oxygen saturation from the measured pH, PO2, and PCO2. The measurements were performed in duplicate. Statistical evaluation focused on the differences between duplicate measurements and on systematic instrumental errors in oxygen saturation analysis compared to the reference CO oximeter. After tonometry, the median saturation dropped to 32.9% at a PO2=2.7 kPa (20 mm Hg), defined as saturation range 1, and to 10% SO2 at a PO2=1.3 kPa (10 mm Hg), defined as range 2. With decreasing SO2, all devices showed an increased difference between duplicate measurements. ABL625 and ABL700 showed the closest agreement between instruments (0.25% SO2 bias at saturation range 1 and -0.33% SO2 bias at saturation range 2). Chiron865 indicated higher saturation values than ABL625 (3.07% SO2 bias at saturation range 1 and 2.28% SO2 bias at saturation range 2). Calculated saturation values (i-STAT) were more than 30% lower than the measured values of ABL625. The disagreement among CO oximeters was small but increased under deep hypoxemia. The calculated values were unacceptably low.

  18. Development of automated analytical systems for large throughput

    International Nuclear Information System (INIS)

    Ernst, P.C.; Hoffman, E.L.

    1982-01-01

    The need to be able to handle a large throughput of samples for neutron activation analysis has led to the development of automated counting and sample handling systems. These are coupled with available computer-assisted INAA techniques to perform a wide range of analytical services on a commercial basis. A fully automated delayed neutron counting system and a computer controlled pneumatic transfer for INAA use are described, as is a multi-detector gamma-spectroscopy system. (author)

  19. Effects of Perfluorooctanoic Acid on Metabolic Profiles in Brain and Liver of Mouse Revealed by a High-throughput Targeted Metabolomics Approach

    Science.gov (United States)

    Yu, Nanyang; Wei, Si; Li, Meiying; Yang, Jingping; Li, Kan; Jin, Ling; Xie, Yuwei; Giesy, John P.; Zhang, Xiaowei; Yu, Hongxia

    2016-04-01

    Perfluorooctanoic acid (PFOA), a perfluoroalkyl acid, can result in hepatotoxicity and neurobehavioral effects in animals. The metabolome, which serves as a connection among transcriptome, proteome and toxic effects, provides pathway-based insights into effects of PFOA. Since understanding of changes in the metabolic profile during hepatotoxicity and neurotoxicity was still incomplete, a high-throughput targeted metabolomics approach (278 metabolites) was used to investigate effects of exposure to PFOA for 28 d on brain and liver of male Balb/c mice. Results of multivariate statistical analysis indicated that PFOA caused alterations in metabolic pathways in exposed individuals. Pathway analysis suggested that PFOA affected metabolism of amino acids, lipids, carbohydrates and energetics. Ten and 18 metabolites were identified as potential unique biomarkers of exposure to PFOA in brain and liver, respectively. In brain, PFOA affected concentrations of neurotransmitters, including serotonin, dopamine, norepinephrine, and glutamate, which provides novel insights into mechanisms of PFOA-induced neurobehavioral effects. In liver, profiles of lipids revealed involvement of β-oxidation and biosynthesis of saturated and unsaturated fatty acids in PFOA-induced hepatotoxicity, while alterations in arachidonic acid metabolism suggest the potential of PFOA to cause an inflammatory response in liver. These results provide insight into the mechanism and biomarkers for PFOA-induced effects.

  20. Inversion degree and saturation magnetization of different nanocrystalline cobalt ferrites

    International Nuclear Information System (INIS)

    Concas, G.; Spano, G.; Cannas, C.; Musinu, A.; Peddis, D.; Piccaluga, G.

    2009-01-01

    The inversion degree of a series of nanocrystalline samples of CoFe2O4 ferrites has been evaluated by a combined study, which exploits the saturation magnetization at 4.2 K and 57Fe Moessbauer spectroscopy. The samples, prepared by sol-gel autocombustion, have different thermal history and particle size. The differences observed in the saturation magnetization of these samples are explained in terms of different inversion degrees, as confirmed by the analysis of the components in the Moessbauer spectra. It is notable that the inversion degrees of the samples investigated are set among the highest values reported in the literature.
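    Why saturation magnetization tracks the inversion degree can be illustrated with the textbook two-sublattice Néel estimate for a cobalt-ferrite spinel, using assumed spin-only moments (5 μB for Fe3+, 3 μB for Co2+); this is a standard idealisation, not the paper's analysis:

    ```python
    MU_CO, MU_FE = 3.0, 5.0   # assumed spin-only moments (Bohr magnetons)

    def net_moment(inversion):
        """Two-sublattice Neel estimate of the CoFe2O4 moment per formula unit.

        inversion: fraction i of Co2+ on octahedral (B) sites; 1.0 = fully inverse.
        Cation distribution: A (tetrahedral): (1-i) Co + i Fe;
                             B (octahedral):  i Co + (2-i) Fe.
        The sublattices are antiparallel, so the net moment is M_B - M_A.
        """
        i = inversion
        m_a = (1 - i) * MU_CO + i * MU_FE
        m_b = i * MU_CO + (2 - i) * MU_FE
        return m_b - m_a   # simplifies to 7 - 4*i Bohr magnetons
    ```

    The estimate runs from 7 μB for the normal spinel down to 3 μB for the fully inverse one, so even modest differences in inversion degree shift the saturation magnetization measurably, which is the effect the record exploits.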

  1. Using ALFA for high throughput, distributed data transmission in the ALICE O2 system

    Science.gov (United States)

    Wegrzynek, A.; ALICE Collaboration

    2017-10-01

    ALICE (A Large Ion Collider Experiment) is a heavy-ion detector designed to study the physics of strongly interacting matter (the Quark-Gluon Plasma) at the CERN LHC (Large Hadron Collider). ALICE has been successfully collecting physics data in Run 2 since spring 2015. In parallel, preparations for a major upgrade of the computing system, called O2 (Online-Offline), scheduled for the Long Shutdown 2 in 2019-2020, are being made. One of the major requirements of the system is the capacity to transport data between so-called FLPs (First Level Processors), equipped with readout cards, and the EPNs (Event Processing Nodes), performing data aggregation, frame building and partial reconstruction. It is foreseen to have 268 FLPs dispatching data to 1500 EPNs with an average output of 20 Gb/s each. Overall, the O2 processing system will operate at terabits per second of throughput while handling millions of concurrent connections. The ALFA framework will standardize and handle software related tasks such as readout, data transport, frame building, calibration, online reconstruction and more in the upgraded computing system. ALFA supports two data transport libraries: ZeroMQ and nanomsg. This paper discusses the efficiency of ALFA in terms of high throughput data transport. The tests were performed with multiple FLPs pushing data to multiple EPNs. The transfer was done using push-pull communication patterns and two socket configurations: bind, connect. The set of benchmarks was prepared to get the most performant results on each hardware setup. The paper presents the measurement process and final results - data throughput combined with computing resources usage as a function of block size. The high number of nodes and connections in the final setup may cause race conditions that can lead to uneven load balancing and poor scalability. The performed tests allow us to validate whether the traffic is distributed evenly over all receivers. It also measures the behaviour of
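    The quoted per-node figures are consistent with the "terabits per second" claim; a quick check of the arithmetic (assuming, as the record states, even distribution of FLP output across the EPNs):

    ```python
    flp_count = 268          # First Level Processors
    epn_count = 1500         # Event Processing Nodes
    flp_output_gbps = 20.0   # average output per FLP, Gb/s

    # System-wide throughput and the average ingest rate each EPN must sustain.
    aggregate_tbps = flp_count * flp_output_gbps / 1000.0   # ~5.4 Tb/s total
    per_epn_gbps = flp_count * flp_output_gbps / epn_count  # ~3.6 Gb/s per EPN
    ```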

  2. Effects of the Strain Rate Sensitivity and Strain Hardening on the Saturated Impulse of Plates

    Directory of Open Access Journals (Sweden)

    Ling Zhu

    Full Text Available Abstract This paper studies the stiffening effects of the material strain rate sensitivity and strain hardening on the saturated impulse of elastic, perfectly plastic plates. The finite element (FE) code ABAQUS is employed to simulate the elastoplastic response of square plates under a rectangular pressure pulse. Rigid-plastic analyses for the saturated impulse, which consider strain rate sensitivity and strain hardening, are conducted. Satisfactory agreement between the finite element models (FEM) and the predictions of the rigid-plastic analysis is obtained, which verifies that the proposed rigid-plastic methods are effective in solving problems that include strain rate sensitivity and strain hardening. Quantitative results for the scale effect of the strain rate sensitivity are given. The results suggest that two general stiffening factors n1 and n2, which characterize the strain rate sensitivity and strain hardening effects, respectively, can be defined. The saturated displacement is inversely proportional to the stiffening factors (i.e., n1 and n2), and the saturated impulse is inversely proportional to the square roots of the stiffening factors (i.e., n1 and n2). Formulae for displacement and saturated impulse are proposed based on the empirical analysis.
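One plausible compact reading of the proportionalities stated in this abstract, in assumed notation (the paper's own symbols are not reproduced here), is:

```latex
% w_s = saturated displacement, I_s = saturated impulse,
% n_1, n_2 = stiffening factors for strain rate sensitivity
%            and strain hardening, respectively.
w_s \propto \frac{1}{n_1\, n_2},
\qquad
I_s \propto \frac{1}{\sqrt{n_1}\,\sqrt{n_2}}
```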

  3. High-throughput peptide mass fingerprinting and protein macroarray analysis using chemical printing strategies

    International Nuclear Information System (INIS)

    Sloane, A.J.; Duff, J.L.; Hopwood, F.G.; Wilson, N.L.; Smith, P.E.; Hill, C.J.; Packer, N.H.; Williams, K.L.; Gooley, A.A.; Cole, R.A.; Cooley, P.W.; Wallace, D.B.

    2001-01-01

    We describe a 'chemical printer' that uses piezoelectric pulsing for rapid and accurate microdispensing of picolitre volumes of fluid for proteomic analysis of 'protein macroarrays'. Unlike positive transfer and pin transfer systems, our printer dispenses fluid in a non-contact process that ensures that the fluid source cannot be contaminated by substrate during a printing event. We demonstrate automated delivery of enzyme and matrix solutions for on-membrane protein digestion and subsequent peptide mass fingerprinting (pmf) analysis directly from the membrane surface using matrix-assisted laser-desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS). This approach bypasses the more commonly used multi-step procedures, thereby permitting more rapid protein identification. We also highlight the advantage of printing different chemistries onto an individual protein spot for multiple microscale analyses. This ability is particularly useful when detailed characterisation of rare and valuable samples is required. Using a combination of PNGase F and trypsin we have mapped sites of N-glycosylation using on-membrane digestion strategies. We also demonstrate the ability to print multiple serum samples in a micro-ELISA format and rapidly screen a protein macroarray of human blood plasma for pathogen-derived antigens. We anticipate that the 'chemical printer' will be a major component of proteomic platforms for high-throughput protein identification and characterisation with widespread applications in biomedical and diagnostic discovery.

  4. A rapid enzymatic assay for high-throughput screening of adenosine-producing strains

    Science.gov (United States)

    Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei

    2015-01-01

    Adenosine is a major local regulator of tissue function and industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay of adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutrition components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike and recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid and high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
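The quantification step of such an assay reduces to a linear standard curve: ADA converts adenosine to inosine plus NH3 (1:1), NH3 gives the indophenol colour, and absorbance maps back to adenosine concentration. A minimal sketch, with entirely illustrative standard-curve numbers (not taken from the paper):

```python
# Map sample absorbance to adenosine concentration via a linear
# standard curve. All concentrations/absorbances are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical adenosine standards (mM) and their absorbance readings.
standards_mm = [0.0, 0.2, 0.4, 0.8, 1.6]
absorbance = [0.02, 0.18, 0.34, 0.66, 1.30]

slope, intercept = fit_line(standards_mm, absorbance)

def adenosine_mm(a_sample):
    """Adenosine concentration (mM) inferred from a sample's absorbance."""
    return (a_sample - intercept) / slope

print(round(adenosine_mm(0.50), 2))  # -> 0.6
```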

  5. Saturation and linear transport equation

    International Nuclear Information System (INIS)

    Kutak, K.

    2009-03-01

    We show that the GBW saturation model provides an exact solution to the one-dimensional linear transport equation. We also show that it is motivated by the BK equation considered in the saturated regime, when the diffusion and splitting terms in the diffusive approximation are balanced by the nonlinear term. (orig.)
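For reference, the GBW (Golec-Biernat-Wüsthoff) model referred to above parameterizes the dipole cross section in the standard form below; the fit-parameter values are not given in this abstract:

```latex
% r = dipole transverse size, x = Bjorken x, Q_s(x) = saturation scale,
% \sigma_0, x_0, \lambda = fitted parameters of the GBW model.
\sigma_{\mathrm{dip}}(x, r)
  = \sigma_0 \left[ 1 - \exp\!\left( -\frac{r^2 Q_s^2(x)}{4} \right) \right],
\qquad
Q_s^2(x) = Q_0^2 \left( \frac{x_0}{x} \right)^{\lambda}
```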

  6. Raman-Activated Droplet Sorting (RADS) for Label-Free High-Throughput Screening of Microalgal Single-Cells.

    Science.gov (United States)

    Wang, Xixian; Ren, Lihui; Su, Yetian; Ji, Yuetong; Liu, Yaoping; Li, Chunyu; Li, Xunrong; Zhang, Yi; Wang, Wei; Hu, Qiang; Han, Danxiang; Xu, Jian; Ma, Bo

    2017-11-21

    Raman-activated cell sorting (RACS) has attracted increasing interest, yet throughput remains one major factor limiting its broader application. Here we present an integrated Raman-activated droplet sorting (RADS) microfluidic system for functional screening of live cells in a label-free and high-throughput manner, employing the astaxanthin (AXT)-producing industrial microalga Haematococcus pluvialis (H. pluvialis) as a model. Raman microspectroscopy analysis of individual cells is carried out prior to their microdroplet encapsulation, which is then directly coupled to DEP-based droplet sorting. To validate the system, H. pluvialis cells containing different levels of AXT were mixed and underwent RADS. The AXT-hyperproducing cells were sorted with an accuracy of 98.3%, an eight-fold enrichment ratio, and a throughput of ∼260 cells/min. Of the RADS-sorted cells, 92.7% remained alive and able to proliferate, equivalent to the unsorted cells. Thus, RADS achieves a much higher throughput than existing RACS systems, preserves the vitality of cells, and facilitates seamless coupling with downstream manipulations such as single-cell sequencing and cultivation.

  7. Optical tools for high-throughput screening of abrasion resistance of combinatorial libraries of organic coatings

    Science.gov (United States)

    Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.

    2002-02-01

    Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8x6 arrays of coating elements that are deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of coatings is evaluated with respect to their resistance to wear abrasion because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The developed automated system is optimized for the analysis of diffusively scattered light that corresponds to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides capability for the reliable ranking of coatings performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.

  8. High Throughput Transcriptomics @ USEPA (Toxicology ...

    Science.gov (United States)

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  9. Building blocks for the development of an interface for high-throughput thin layer chromatography/ambient mass spectrometric analysis: a green methodology.

    Science.gov (United States)

    Cheng, Sy-Chyi; Huang, Min-Zong; Wu, Li-Chieh; Chou, Chih-Chiang; Cheng, Chu-Nian; Jhang, Siou-Sian; Shiea, Jentaie

    2012-07-17

    Interfacing thin layer chromatography (TLC) with ambient mass spectrometry (AMS) has been an important area of analytical chemistry because of its capability to rapidly separate and characterize chemical compounds. In this study, we have developed a high-throughput TLC-AMS system using building blocks to deal, deliver, and collect TLC plates through an electrospray-assisted laser desorption ionization (ELDI) source. This is the first demonstration of the use of building blocks to construct and test a TLC-MS interfacing system. With the advantages of being readily available, cheap, reusable, and extremely easy to modify without consuming any material or reagent, the use of building blocks to develop the TLC-AMS interface is undoubtedly a green methodology. The TLC plate delivery system consists of a storage box, plate dealing component, conveyer, light sensor, and plate collecting box. During a TLC-AMS analysis, the TLC plate was sent to the conveyer from a stack of TLC plates placed in the storage box. As the TLC plate passed through the ELDI source, the chemical compounds separated on the plate were desorbed by laser desorption and subsequently postionized by electrospray ionization. Samples including a mixture of synthetic dyes and extracts of pharmaceutical drugs were analyzed to demonstrate the capability of this TLC-ELDI/MS system for high-throughput analysis.

  10. High-throughput analysis of ammonia oxidiser community composition via a novel, amoA-based functional gene array.

    Directory of Open Access Journals (Sweden)

    Guy C J Abell

    Full Text Available Advances in microbial ecology research are more often than not limited by the capabilities of available methodologies. Aerobic autotrophic nitrification is one of the most important and well studied microbiological processes in terrestrial and aquatic ecosystems. We have developed and validated a microbial diagnostic microarray based on the ammonia-monooxygenase subunit A (amoA) gene, enabling the in-depth analysis of the community structure of bacterial and archaeal ammonia oxidisers. The amoA microarray has been successfully applied to analyse nitrifier diversity in marine, estuarine, soil and wastewater treatment plant environments. The microarray has moderate costs for labour and consumables and enables the analysis of hundreds of environmental DNA or RNA samples per week per person. The array has been thoroughly validated with a range of individual and complex targets (amoA clones and environmental samples, respectively), combined with parallel analysis using traditional sequencing methods. The moderate cost and high throughput of the microarray make it possible to adequately address broader questions of the ecology of microbial ammonia oxidation requiring high sample numbers and high resolution of the community composition.

  11. Time-motion analysis of factors affecting patient throughput in an MR imaging center

    International Nuclear Information System (INIS)

    O'Donohue, J.; Enzmann, D.R.

    1986-01-01

    The high cost of MR imaging makes efficient use essential. In an effort to increase patient throughput, attention has been focused on shortening the imaging time through reductions in matrix size and number of excitations, and through the use of newer "fast imaging" techniques. Less attention has been given to other time-consuming aspects not directly related to imaging time. The authors undertook a time-motion study using a daily log of minute-by-minute activities associated with an MR imaging examination. The times required for the following components of the examination were measured: total study time, examination set-up time, intrastudy physician "image review" time, and interstudy patient turnover time. The time lost to claustrophobic reactions, patients' failure to appear for scheduled examinations, unanticipated patient care (sedation, reassurance), and equipment malfunction was also analyzed. Actual imaging time accounted for a relatively small proportion (42%) of total study time. Other factors such as intrastudy image review time (15%), interstudy patient turnover time (11%), and time lost due to claustrophobic reactions, patients' failure to appear for scheduled examinations, and equipment malfunction contributed significantly to the total study time. Simple solutions to these problems can contribute greatly to increasing patient throughput.
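The component shares reported in this abstract (imaging 42%, image review 15%, turnover 11%) make the throughput arithmetic easy to illustrate; the 60-minute average study time below is an assumed figure, not one from the study.

```python
# Illustrative time budget for one MR examination, using the component
# shares from the abstract; the remainder covers set-up, delays, etc.
TOTAL_MIN = 60.0  # assumed average total study time, minutes
shares = {"imaging": 0.42, "image review": 0.15, "turnover": 0.11}

minutes = {k: v * TOTAL_MIN for k, v in shares.items()}

# If interstudy turnover were cut in half, each study slot shrinks by
# half the turnover time, and daily throughput rises proportionally.
saved = minutes["turnover"] / 2
gain = TOTAL_MIN / (TOTAL_MIN - saved) - 1
print(f"turnover saving: {saved:.1f} min per study")
print(f"throughput gain: {gain:.1%}")
```

Even this modest change yields a throughput gain of roughly 6%, which is the abstract's point: non-imaging time is where the easy wins are.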

  12. The Danish tax on saturated fat

    DEFF Research Database (Denmark)

    Jensen, Jørgen Dejgård; Smed, Sinne

    2013-01-01

    and oils. This assessment is done by conducting an econometric analysis on weekly food purchase data from a large household panel dataset (GfK Consumer Tracking Scandinavia), spanning the period from January 2008 until July 2012. The econometric analysis suggests that the introduction of the tax on saturated...... and fats, a shift that seems to have been utilized by discount chains to raise the prices of butter and margarine by more than the pure tax increase. Due to the relatively short data period with the tax being active, interpretation of these findings from a long-run perspective should be done...... with considerable care. It is thus recommended to repeat – and broaden – the analysis at a later stage, when data are available for a longer period after the introduction of the fat tax....

  13. Bacterial Pathogens and Community Composition in Advanced Sewage Treatment Systems Revealed by Metagenomics Analysis Based on High-Throughput Sequencing

    Science.gov (United States)

    Lu, Xin; Zhang, Xu-Xiang; Wang, Zhu; Huang, Kailong; Wang, Yuan; Liang, Weigang; Tan, Yunfei; Liu, Bo; Tang, Junying

    2015-01-01

    This study used 454 pyrosequencing, Illumina high-throughput sequencing and metagenomic analysis to investigate bacterial pathogens and their potential virulence in a sewage treatment plant (STP) applying both conventional and advanced treatment processes. Pyrosequencing and Illumina sequencing consistently demonstrated that the genus Arcobacter accounted for over 43.42% of the total abundance of potential pathogens in the STP. At the species level, the potential pathogens Arcobacter butzleri, Aeromonas hydrophila and Klebsiella pneumoniae dominated in raw sewage, which was also confirmed by quantitative real-time PCR. Illumina sequencing also revealed the prevalence of various types of pathogenicity islands and virulence proteins in the STP. Most of the potential pathogens and virulence factors were eliminated in the STP, and the removal efficiency mainly depended on the oxidation ditch. Compared with sand filtration, magnetic resin achieved higher removals of most of the potential pathogens and virulence factors. However, the presence of residual A. butzleri in the final effluent still deserves concern. The findings indicate that sewage acts as an important source of environmental pathogens, but STPs can effectively control their spread in the environment. Joint use of the high-throughput sequencing technologies is considered a reliable method for a deep and comprehensive overview of environmental bacterial virulence. PMID:25938416

  14. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
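The abstract says the image-analysis pipeline computes each fish's orientation. A common way to do this (an assumed approach for illustration, not necessarily the authors' method) is to take the principal axis of the segmented silhouette via second-order central image moments:

```python
# Body-axis orientation of a segmented blob (list of pixel coordinates)
# from second-order central moments. Illustrative sketch only.
import math

def orientation_deg(pixels):
    """Principal-axis angle of a pixel set, in degrees."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    mu20 = sum((x - cx) ** 2 for x, _ in pixels) / n
    mu02 = sum((y - cy) ** 2 for _, y in pixels) / n
    mu11 = sum((x - cx) * (y - cy) for x, y in pixels) / n
    return math.degrees(0.5 * math.atan2(2 * mu11, mu20 - mu02))

# A horizontally elongated blob should come out at ~0 degrees:
blob = [(x, 0) for x in range(20)] + [(x, 1) for x in range(20)]
print(orientation_deg(blob))  # -> 0.0
```

Rheotaxis can then be scored as the fraction of detected fish whose angle falls within a tolerance of the upstream direction.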

  15. High Throughput PBTK: Open-Source Data and Tools for ...

    Science.gov (United States)

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  16. Effective stress principle for partially saturated media

    International Nuclear Information System (INIS)

    McTigue, D.F.; Wilson, R.K.; Nunziato, J.W.

    1984-04-01

    In support of the Nevada Nuclear Waste Storage Investigation (NNWSI) Project, we have undertaken a fundamental study of water migration in partially saturated media. One aspect of that study, on which we report here, has been to use the continuum theory of mixtures to extend the classical notion of effective stress to partially saturated media. Our analysis recovers previously proposed phenomenological representations for the effective stress in terms of the capillary pressure. The theory is illustrated by specializing to the case of linear poroelasticity, for which we calculate the deformation due to the fluid pressure in a static capillary fringe. We then examine the transient consolidation associated with liquid flow induced by an applied surface load. Settlement accompanies this flow as the liquid is redistributed by a nonlinear diffusion process. For material properties characteristic of tuff from the Nevada Test Site, these effects are found to be vanishingly small. 14 references, 7 figures, 1 table
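The classical phenomenological representation alluded to in this abstract is commonly written as Bishop's effective stress; the notation below is assumed, not taken from the report:

```latex
% \sigma  = total stress, u_a = pore-air pressure, u_w = pore-water pressure,
% \chi    = effective-stress parameter (\chi = 1 at full saturation),
% u_a - u_w = capillary (matric) suction.
\sigma' = (\sigma - u_a) + \chi\,(u_a - u_w)
```

At full saturation ($\chi = 1$) this reduces to Terzaghi's classical effective stress $\sigma' = \sigma - u_w$.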

  17. High-throughput functional screening of steroid substrates with wild-type and chimeric P450 enzymes.

    Science.gov (United States)

    Urban, Philippe; Truan, Gilles; Pompon, Denis

    2014-01-01

    The promiscuity of a collection of enzymes consisting of 31 wild-type and synthetic variants of CYP1A enzymes was evaluated using a series of 14 steroids and 2 steroid-like chemicals, namely nootkatone, a terpenoid, and mifepristone, a drug. For each enzyme-substrate couple, the initial steady-state velocity of metabolite formation was determined at a saturating substrate concentration. To that end, a high-throughput approach was designed involving automated incubations in 96-well microplates, with sixteen 6-point kinetics per microplate, and data acquisition using an LC/MS system accepting 96-well microplates for injection. The resulting dataset was used for multivariate statistics aimed at sorting out the correlations between the tested enzyme variants and their ability to metabolize steroid substrates. Functional classifications of both the CYP1A enzyme variants and the steroid substrate structures were obtained, allowing the delineation of global structural features for both substrate recognition and regioselectivity of oxidation.

  18. Gluon Saturation and EIC

    Energy Technology Data Exchange (ETDEWEB)

    Sichtermann, Ernst

    2016-12-15

    The fundamental structure of nucleons and nuclear matter is described by the properties and dynamics of quarks and gluons in quantum chromodynamics. Electron-nucleon collisions are a powerful method to study this structure. As one increases the energy of the collisions, the interaction process probes regions of progressively higher gluon density. This density must eventually saturate. A high-energy polarized Electron-Ion Collider (EIC) has been proposed to observe and study the saturated gluon density regime. Selected measurements will be discussed, following a brief introduction.

  19. Super-Hydrophobic High Throughput Electrospun Cellulose Acetate (CA) Nanofibrous Mats as Oil Selective Sorbents

    Science.gov (United States)

    Han, Chao

    The threat of oil pollution increases with the expansion of oil exploration and production activities, as well as industrial growth around the world. The use of sorbents is a common method of dealing with oil spills. In this work, an advanced sorbent technology is described. A series of non-woven cellulose acetate (CA) nanofibrous mats with a 3D fibrous structure were synthesized by a novel high-throughput electrospinning technique. The precursors were solutions of CA in acetic acid-acetone at various concentrations. Among them, the 15.0% CA mat exhibits a superhydrophobic surface, with a water contact angle of 128.95°. Its oil sorption capacity is many times higher than that of the best commercial sorbent available on the market. It also showed good buoyancy on water both as a dry mat and as an oil-saturated mat. In addition, it is biodegradable, easily available and easily manufactured, making the CA nanofibrous mat an excellent candidate as an oil sorbent for oil-spill treatment in water.

  20. ImmuneDB: a system for the analysis and exploration of high-throughput adaptive immune receptor sequencing data.

    Science.gov (United States)

    Rosenfeld, Aaron M; Meng, Wenzhao; Luning Prak, Eline T; Hershberg, Uri

    2017-01-15

    As high-throughput sequencing of B cells becomes more common, the need for tools to analyze the large quantity of data also increases. This article introduces ImmuneDB, a system for analyzing vast amounts of heavy chain variable region sequences and exploring the resulting data. It can take raw FASTA/FASTQ data as input, identify genes, determine clones, construct lineages, and provide information such as selection pressure and mutation analysis. It uses an industry-leading database, MySQL, to provide fast analysis and avoid the complexities of using error-prone flat files. ImmuneDB is freely available at http://immunedb.com. A demo of the ImmuneDB web interface is available at http://immunedb.com/demo. Contact: Uh25@drexel.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  1. Throughput-Based Traffic Steering in LTE-Advanced HetNet Deployments

    DEFF Research Database (Denmark)

    Gimenez, Lucas Chavarria; Kovacs, Istvan Z.; Wigard, Jeroen

    2015-01-01

    The objective of this paper is to propose traffic steering solutions that aim at optimizing the end-user throughput. Two different implementations of an active mode throughput-based traffic steering algorithm for Heterogeneous Networks (HetNet) are introduced. One that always forces handover of t...... throughput is generally higher, reaching values of 36% and 18% for the medium- and high-load conditions....

  2. A high throughput mass spectrometry screening analysis based on two-dimensional carbon microfiber fractionation system.

    Science.gov (United States)

    Ma, Biao; Zou, Yilin; Xie, Xuan; Zhao, Jinhua; Piao, Xiangfan; Piao, Jingyi; Yao, Zhongping; Quinto, Maurizio; Wang, Gang; Li, Donghao

    2017-06-09

    A novel high-throughput, solvent-saving and versatile integrated two-dimensional microscale carbon fiber/active carbon fiber system (2DμCFs), which allows a simple and rapid separation of compounds into low-polarity, medium-polarity and high-polarity fractions, has been coupled with ambient ionization-mass spectrometry (ESI-Q-TOF-MS and ESI-QqQ-MS) for screening and quantitative analyses of real samples. 2DμCFs led to a substantial reduction of interferences and minimization of ionization suppression effects, thus increasing the sensitivity and the screening capabilities of the subsequent MS analysis. The method has been applied to the analysis of Schisandra chinensis extracts, obtaining with a single injection a simultaneous determination of 33 compounds of different polarities, such as organic acids, lignans, and flavonoids, in less than 7 min, at low pressures and using small solvent amounts. The method was also validated using 10 model compounds, giving limits of detection (LODs) ranging from 0.3 to 30 ng mL-1, satisfactory recoveries (from 75.8 to 93.2%) and reproducibilities (relative standard deviations, RSDs, from 1.40 to 8.06%). Copyright © 2017 Elsevier B.V. All rights reserved.
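The validation metrics quoted in this abstract (spike recovery and RSD) are standard quantities; a minimal sketch of how they are computed, using entirely hypothetical replicate readings (not data from the paper):

```python
# Spike recovery and relative standard deviation (RSD) from replicates.
import statistics

def recovery_pct(measured_spiked, measured_blank, spiked_amount):
    """Percentage of the added analyte actually measured."""
    return 100.0 * (measured_spiked - measured_blank) / spiked_amount

def rsd_pct(replicates):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical example: blank reads 2.0 ng/mL, 10.0 ng/mL is spiked,
# and five replicates of the spiked sample read:
reps = [11.1, 10.8, 11.4, 10.9, 11.3]
print(f"recovery: {recovery_pct(statistics.mean(reps), 2.0, 10.0):.1f}%")
print(f"RSD:      {rsd_pct(reps):.2f}%")
```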

  3. High-throughput genetic analysis in a cohort of patients with Ocular Developmental Anomalies

    Directory of Open Access Journals (Sweden)

    Suganya Kandeeban

    2017-10-01

    Full Text Available Anophthalmia and microphthalmia (A/M) are developmental ocular malformations in which the eye fails to form or is smaller than normal, with both genetic and environmental etiology. Microphthalmia is often associated with additional ocular anomalies, most commonly coloboma or cataract [1, 2]. A/M has a combined incidence of 1-3.2 cases per 10,000 live births in Caucasians [3, 4]. The spectrum of genetic abnormalities (chromosomal and molecular) associated with these ocular developmental defects is being investigated in the current study. A detailed pedigree analysis and ophthalmic examination were documented for the enrolled patients, followed by blood collection and DNA extraction. The strategies for genetic analysis included chromosomal analysis by conventional and array-based (Affymetrix CytoScan HD array) methods, targeted re-sequencing of candidate genes, and whole exome sequencing (WES) on an Illumina HiSeq 2500. WES was done in families excluded for mutations in candidate genes. Twenty-four samples (microphthalmia (M): 5, anophthalmia (A): 7, coloboma: 2, M&A: 1, microphthalmia and coloboma/other ocular features: 9) were initially analyzed using conventional Giemsa-trypsin-Giemsa banding, of which 4 samples revealed gross chromosomal aberrations (deletions in 3q26.3-28, 11p13 (N=2) and 11q23 regions). Targeted re-sequencing of candidate genes showed mutations in the CHX10, PAX6, FOXE3, ABCB6 and SHH genes in 6 samples. High-throughput array-based chromosomal analysis revealed aberrations in 4 samples (17q21dup (n=2), 8p11del (n=2)). Overall, genetic alterations in known candidate genes are seen in 50% of the study subjects. Whole exome sequencing was performed in samples that were excluded for mutations in candidate genes, and the results are discussed.

  4. High-Throughput Quantitative Proteomic Analysis of Dengue Virus Type 2 Infected A549 Cells

    Science.gov (United States)

    Chiu, Han-Chen; Hannemann, Holger; Heesom, Kate J.; Matthews, David A.; Davidson, Andrew D.

    2014-01-01

    Disease caused by dengue virus is a global health concern with up to 390 million individuals infected annually worldwide. There are no vaccines or antiviral compounds available to either prevent or treat dengue disease which may be fatal. To increase our understanding of the interaction of dengue virus with the host cell, we analyzed changes in the proteome of human A549 cells in response to dengue virus type 2 infection using stable isotope labelling in cell culture (SILAC) in combination with high-throughput mass spectrometry (MS). Mock and infected A549 cells were fractionated into nuclear and cytoplasmic extracts before analysis to identify proteins that redistribute between cellular compartments during infection and reduce the complexity of the analysis. We identified and quantified 3098 and 2115 proteins in the cytoplasmic and nuclear fractions respectively. Proteins that showed a significant alteration in amount during infection were examined using gene enrichment, pathway and network analysis tools. The analyses revealed that dengue virus infection modulated the amounts of proteins involved in the interferon and unfolded protein responses, lipid metabolism and the cell cycle. The SILAC-MS results were validated for a select number of proteins over a time course of infection by Western blotting and immunofluorescence microscopy. Our study demonstrates for the first time the power of SILAC-MS for identifying and quantifying novel changes in cellular protein amounts in response to dengue virus infection. PMID:24671231

  5. High-throughput quantitative proteomic analysis of dengue virus type 2 infected A549 cells.

    Directory of Open Access Journals (Sweden)

    Han-Chen Chiu

    Full Text Available Disease caused by dengue virus is a global health concern with up to 390 million individuals infected annually worldwide. There are no vaccines or antiviral compounds available to either prevent or treat dengue disease which may be fatal. To increase our understanding of the interaction of dengue virus with the host cell, we analyzed changes in the proteome of human A549 cells in response to dengue virus type 2 infection using stable isotope labelling in cell culture (SILAC) in combination with high-throughput mass spectrometry (MS). Mock and infected A549 cells were fractionated into nuclear and cytoplasmic extracts before analysis to identify proteins that redistribute between cellular compartments during infection and reduce the complexity of the analysis. We identified and quantified 3098 and 2115 proteins in the cytoplasmic and nuclear fractions respectively. Proteins that showed a significant alteration in amount during infection were examined using gene enrichment, pathway and network analysis tools. The analyses revealed that dengue virus infection modulated the amounts of proteins involved in the interferon and unfolded protein responses, lipid metabolism and the cell cycle. The SILAC-MS results were validated for a select number of proteins over a time course of infection by Western blotting and immunofluorescence microscopy. Our study demonstrates for the first time the power of SILAC-MS for identifying and quantifying novel changes in cellular protein amounts in response to dengue virus infection.

  6. Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Rehfeldt

    2004-10-08

    This report is an updated analysis of water-level data performed to provide the "Saturated Zone Site-Scale Flow Model" (BSC 2004 [DIRS 170037]) (referred to as the saturated zone (SZ) site-scale flow model or site-scale SZ flow model in this report) with the configuration of the potentiometric surface, target water-level data, and hydraulic gradients for calibration of groundwater flow models. This report also contains an expanded discussion of uncertainty in the potentiometric-surface map. The analysis of the potentiometric data presented in Revision 00 of this report (USGS 2001 [DIRS 154625]) provides the configuration of the potentiometric surface, target heads, and hydraulic gradients for the calibration of the SZ site-scale flow model (BSC 2004 [DIRS 170037]). Revision 01 of this report (USGS 2004 [DIRS 168473]) used updated water-level data for selected wells through the year 2000 as the basis for estimating water-level altitudes and the potentiometric surface in the SZ site-scale flow and transport model domain, based on an alternative interpretation of perched water conditions. That revision developed computer files containing: water-level data within the model area (DTN: GS010908312332.002); a table of known vertical head differences (DTN: GS010908312332.003); and a potentiometric-surface map (DTN: GS010608312332.001) using an alternative concept from that presented by USGS (2001 [DIRS 154625]) for the area north of Yucca Mountain. The updated water-level data presented in USGS (2004 [DIRS 168473]) include data obtained from the Nye County Early Warning Drilling Program (EWDP) Phases I and II and data from Borehole USW WT-24. This document is based on Revision 01 (USGS 2004 [DIRS 168473]) and expands the discussion of uncertainty in the potentiometric-surface map. This uncertainty assessment includes an analysis of the impact of more recent water-level data and the impact of adding data from the EWDP Phases III and IV wells.

  7. Parallel workflow for high-throughput (>1,000 samples/day) quantitative analysis of human insulin-like growth factor 1 using mass spectrometric immunoassay.

    Directory of Open Access Journals (Sweden)

    Paul E Oran

    Full Text Available Insulin-like growth factor 1 (IGF1) is an important biomarker for the management of growth hormone disorders. Recently there has been rising interest in deploying mass spectrometric (MS) methods of detection for measuring IGF1. However, widespread clinical adoption of any MS-based IGF1 assay will require increased throughput and speed to justify the costs of analyses, and robust industrial platforms that are reproducible across laboratories. Presented here is an MS-based quantitative IGF1 assay with performance rating of >1,000 samples/day, and a capability of quantifying IGF1 point mutations and posttranslational modifications. The throughput of the IGF1 mass spectrometric immunoassay (MSIA) benefited from a simplified sample preparation step, IGF1 immunocapture in a tip format, and high-throughput MALDI-TOF MS analysis. The Limit of Detection and Limit of Quantification of the resulting assay were 1.5 μg/L and 5 μg/L, respectively, with intra- and inter-assay precision CVs of less than 10%, and good linearity and recovery characteristics. The IGF1 MSIA was benchmarked against commercially available IGF1 ELISA via Bland-Altman method comparison test, resulting in a slight positive bias of 16%. The IGF1 MSIA was employed in an optimized parallel workflow utilizing two pipetting robots and MALDI-TOF-MS instruments synced into one-hour phases of sample preparation, extraction and MSIA pipette tip elution, MS data collection, and data processing. Using this workflow, high-throughput IGF1 quantification of 1,054 human samples was achieved in approximately 9 hours. This rate of assaying is a significant improvement over existing MS-based IGF1 assays, and is on par with that of the enzyme-based immunoassays. Furthermore, a mutation was detected in ∼1% of the samples (SNP: rs17884626, creating an A→T substitution at position 67 of the IGF1, demonstrating the capability of IGF1 MSIA to detect point mutations and posttranslational modifications.
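A Bland-Altman method comparison like the one used to benchmark the MSIA against ELISA computes per-sample differences, their mean (the bias), and 95% limits of agreement. A sketch with hypothetical paired measurements; note the study reports a *relative* bias (16%), which would use percent differences rather than the absolute differences shown here:

```python
import statistics

def bland_altman(a, b):
    """Bland-Altman comparison of two assays on the same samples:
    returns (mean bias, 95% limits of agreement = bias +/- 1.96 SD)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical IGF1 concentrations (ug/L) for four shared samples.
msia = [102.0, 215.0, 330.0, 428.0]
elisa = [100.0, 210.0, 320.0, 415.0]
bias, loa = bland_altman(msia, elisa)
```

A positive bias means the first assay reads systematically higher; agreement is acceptable when the limits of agreement are clinically tolerable.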

  8. Energy dependent saturable and reverse saturable absorption in cube-like polyaniline/polymethyl methacrylate film

    Energy Technology Data Exchange (ETDEWEB)

    Thekkayil, Remyamol [Department of Chemistry, Indian Institute of Space Science and Technology, Valiamala, Thiruvananthapuram 695 547 (India); Philip, Reji [Light and Matter Physics Group, Raman Research Institute, C.V. Raman Avenue, Bangalore 560 080 (India); Gopinath, Pramod [Department of Physics, Indian Institute of Space Science and Technology, Valiamala, Thiruvananthapuram 695 547 (India); John, Honey, E-mail: honey@iist.ac.in [Department of Chemistry, Indian Institute of Space Science and Technology, Valiamala, Thiruvananthapuram 695 547 (India)

    2014-08-01

    Solid films of cube-like polyaniline synthesized by an inverse microemulsion polymerization method have been fabricated in a transparent PMMA host by an in situ free radical polymerization technique, and are characterized by spectroscopic and microscopic techniques. The nonlinear optical properties are studied by the open aperture Z-scan technique employing 5 ns (532 nm) and 100 fs (800 nm) laser pulses. At the relatively lower laser pulse energy of 5 μJ, the film shows saturable absorption both in the nanosecond and femtosecond excitation domains. An interesting switchover from saturable absorption to reverse saturable absorption is observed at 532 nm when the energy of the nanosecond laser pulses is increased. The nonlinear absorption coefficient increases with increase in polyaniline concentration, with low optical limiting threshold, as required for a good optical limiter. - Highlights: • Synthesized cube-like polyaniline nanostructures. • Fabricated polyaniline/PMMA nanocomposite films. • At 5 μJ energy, saturable absorption is observed in both the ns and fs regimes. • Switchover from SA to RSA is observed as the energy of the laser beam increases. • Film (0.1 wt% polyaniline) shows high β_eff (230 cm GW⁻¹) and low limiting threshold at 150 μJ.
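The switchover from saturable to reverse saturable absorption is commonly modelled in open-aperture Z-scan analysis with an intensity-dependent absorption coefficient (a standard phenomenological form, not taken from this paper):

```latex
\alpha(I) = \frac{\alpha_0}{1 + I/I_s} + \beta_{\mathrm{eff}}\, I
```

where α₀ is the linear absorption coefficient, I_s the saturation intensity, and β_eff the effective nonlinear absorption coefficient. At low intensity the first term dominates and transmission increases with intensity (SA); as I grows, the β_eff·I term takes over and transmission falls (RSA), which is the behaviour reported here at higher nanosecond pulse energies.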

  9. High-throughput differentiation of heparin from other glycosaminoglycans by pyrolysis mass spectrometry.

    Science.gov (United States)

    Nemes, Peter; Hoover, William J; Keire, David A

    2013-08-06

    Sensors with high chemical specificity and enhanced sample throughput are vital to screening food products and medical devices for chemical or biochemical contaminants that may pose a threat to public health. For example, the rapid detection of oversulfated chondroitin sulfate (OSCS) in heparin could prevent reoccurrence of heparin adulteration that caused hundreds of severe adverse events including deaths worldwide in 2007-2008. Here, rapid pyrolysis is integrated with direct analysis in real time (DART) mass spectrometry to rapidly screen major glycosaminoglycans, including heparin, chondroitin sulfate A, dermatan sulfate, and OSCS. The results demonstrate that, compared to traditional liquid chromatography-based analyses, pyrolysis mass spectrometry achieved at least 250-fold higher sample throughput and was compatible with samples volume-limited to about 300 nL. Pyrolysis yielded an abundance of fragment ions (e.g., 150 different m/z species), many of which were specific to the parent compound. Using multivariate and statistical data analysis models, these data enabled facile differentiation of the glycosaminoglycans with high throughput. After method development was completed, authentically contaminated samples obtained during the heparin crisis by the FDA were analyzed in a blinded manner for OSCS contamination. The lower limit of differentiation and detection were 0.1% (w/w) OSCS in heparin and 100 ng/μL (20 ng) OSCS in water, respectively. For quantitative purposes the linear dynamic range spanned approximately 3 orders of magnitude. Moreover, this chemical readout was successfully employed to find clues in the manufacturing history of the heparin samples that can be used for surveillance purposes. The presented technology and data analysis protocols are anticipated to be readily adaptable to other chemical and biochemical agents and volume-limited samples.

  10. MetaUniDec: High-Throughput Deconvolution of Native Mass Spectra

    Science.gov (United States)

    Reid, Deseree J.; Diesing, Jessica M.; Miller, Matthew A.; Perry, Scott M.; Wales, Jessica A.; Montfort, William R.; Marty, Michael T.

    2018-04-01

    The expansion of native mass spectrometry (MS) methods for both academic and industrial applications has created a substantial need for analysis of large native MS datasets. Existing software tools are poorly suited for high-throughput deconvolution of native electrospray mass spectra from intact proteins and protein complexes. The UniDec Bayesian deconvolution algorithm is uniquely well suited for high-throughput analysis due to its speed and robustness but was previously tailored towards individual spectra. Here, we optimized UniDec for deconvolution, analysis, and visualization of large data sets. This new module, MetaUniDec, centers around the hierarchical data format 5 (HDF5) for storing datasets, which significantly improves speed, portability, and file size. It also includes code optimizations to improve speed and a new graphical user interface for visualization, interaction, and analysis of data. To demonstrate the utility of MetaUniDec, we applied the software to analyze automated collision voltage ramps with a small bacterial heme protein and large lipoprotein nanodiscs. Upon increasing collisional activation, bacterial heme-nitric oxide/oxygen binding (H-NOX) protein shows a discrete loss of bound heme, and nanodiscs show a continuous loss of lipids and charge. By using MetaUniDec to track changes in peak area or mass as a function of collision voltage, we explore the energetic profile of collisional activation in an ultra-high mass range Orbitrap mass spectrometer.
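The collision-voltage ramp analysis described above reduces, per species, to tracking a peak's area across voltages. A toy sketch (hypothetical numbers, not MetaUniDec's API) that locates the voltage at which a bound-species peak has lost half its initial area:

```python
def voltage_at_half_loss(ramp):
    """Given (collision_voltage, peak_area) pairs for a bound-species peak,
    return the first voltage where the area falls below half its initial
    value, or None if it never does."""
    ramp = sorted(ramp)                # order by collision voltage
    initial = ramp[0][1]
    for voltage, area in ramp:
        if area < 0.5 * initial:
            return voltage
    return None

# Hypothetical ramp for an H-NOX heme-bound peak.
ramp = [(0, 100.0), (10, 95.0), (20, 70.0), (30, 40.0), (40, 10.0)]
v50 = voltage_at_half_loss(ramp)
```

Comparing such half-loss voltages across complexes gives a simple scalar summary of the energetic profile of collisional activation.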

  11. Potential for saturated ground-water system contamination at the Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Stone, R.; Ruggieri, M.R.; Rogers, L.L.; Emerson, D.O.; Buddemeier, R.W.

    1982-01-01

    A program of hydrogeologic investigation has been carried out to determine the likelihood of contaminant movement to the saturated zone from near the ground surface at Lawrence Livermore National Laboratory (LLNL). A companion survey of potential contaminant sources was also conducted at the LLNL. Water samples from selected LLNL wells were analyzed to test the water quality in the uppermost part of the saturated zone, which is from 14 to 48 m (45 to 158 ft) beneath the surface. Only nitrate and tritium were found in concentrations above natural background. In one well, the nitrate was slightly more concentrated than the drinking water limit. The nitrate source has not been found. The tritium in all ground-water samples from wells was found to be far less concentrated than the drinking water limit. The extent of infiltration of surface water was traced with environmental tritium. The thickness and stratigraphy of the unsaturated zone beneath the LLNL and nearby area were determined with specially constructed wells and boreholes. Well hydrograph analysis indicated where infiltration of surface water reached the saturated ground-water system. The investigation indicates that water infiltrating from the surface, through alluvial deposits, reaches the saturated zone along the course of Arroyo Seco, Arroyo Las Positas, and from the depression near the center of the site where seasonal water accumulates. Several potential contaminant sources were identified, and it is likely that contaminants could move from near the ground surface to the saturated zone beneath LLNL. Additional ground-water sampling and analysis will be performed and ongoing investigations will provide estimates of the speed with which potential contaminants can flow laterally in the saturated zone beneath LLNL. 34 references, 61 figures, 16 tables

  12. High throughput screening method for assessing heterogeneity of microorganisms

    NARCIS (Netherlands)

    Ingham, C.J.; Sprenkels, A.J.; van Hylckama Vlieg, J.E.T.; Bomer, Johan G.; de Vos, W.M.; van den Berg, Albert

    2006-01-01

    The invention relates to the field of microbiology. Provided is a method which is particularly powerful for High Throughput Screening (HTS) purposes. More specifically, a high-throughput method for determining heterogeneity or interactions of microorganisms is provided.

  13. Modelling suction instabilities in soils at varying degrees of saturation

    Directory of Open Access Journals (Sweden)

    Buscarnera Giuseppe

    2016-01-01

    Full Text Available Wetting paths imparted by the natural environment and/or human activities affect the state of soils in the near-surface, promoting transitions across different regimes of saturation. This paper discusses a set of techniques aimed at quantifying the role of hydrologic processes on the hydro-mechanical stability of soil specimens subjected to saturation events. Emphasis is given to the mechanical conditions leading to coupled flow/deformation instabilities. For this purpose, energy balance arguments for three-phase systems are used to derive second-order work expressions applicable to various regimes of saturation. Controllability analyses are then performed to relate such work input with constitutive singularities that reflect the loss of strength under coupled and/or uncoupled hydro-mechanical forcing. A suction-dependent plastic model is finally used to track the evolution of stability conditions in samples subjected to wetting, thus quantifying the growth of the potential for coupled failure modes upon increasing degree of saturation. These findings are eventually linked with the properties of the field equations that govern pore pressure transients, thus disclosing a conceptual link between the onset of coupled hydro-mechanical failures and the evolution of suction with time. Such results point out that mathematical instabilities caused by a non-linear suction dependent behaviour play an important role in the advanced constitutive and/or numerical tools that are commonly used for the analysis of geomechanical problems in the unsaturated zone, and further stress that the relation between suction transients and soil deformations is a key factor for the interpretation of runaway failures caused by intense saturation events.
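The stability criterion behind these controllability analyses is the second-order work condition, shown here in its classical single-phase (Hill) form; the paper derives extended expressions applicable to three-phase systems:

```latex
d^2 W = d\sigma_{ij}\, d\varepsilon_{ij} > 0
```

Loss of positiveness of d²W for some admissible loading program signals a potential hydro-mechanical instability, which is why suction-driven changes in the constitutive response can trigger runaway failure as saturation increases.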

  14. Lead discovery for mammalian elongation of long chain fatty acids family 6 using a combination of high-throughput fluorescent-based assay and RapidFire mass spectrometry assay

    International Nuclear Information System (INIS)

    Takamiya, Mari; Sakurai, Masaaki; Teranishi, Fumie; Ikeda, Tomoko; Kamiyama, Tsutomu; Asai, Akira

    2016-01-01

    A high-throughput RapidFire mass spectrometry assay is described for elongation of very long-chain fatty acids family 6 (Elovl6). Elovl6 is a microsomal enzyme that regulates the elongation of C12-16 saturated and monounsaturated fatty acids. Elovl6 may be a new therapeutic target for fat metabolism disorders such as obesity, type 2 diabetes, and nonalcoholic steatohepatitis. To identify new Elovl6 inhibitors, we developed a high-throughput fluorescence screening assay in 1536-well format. However, a number of false positives caused by fluorescent interference have been identified. To pick up the real active compounds among the primary hits from the fluorescence assay, we developed a RapidFire mass spectrometry assay and a conventional radioisotope assay. These assays have the advantage of detecting the main products directly without using fluorescent-labeled substrates. As a result, 276 compounds (30%) of the primary hits (921 compounds) in a fluorescence ultra-high-throughput screening method were identified as common active compounds in these two assays. It is concluded that both methods are very effective to eliminate false positives. Compared with the radioisotope method using an expensive ¹⁴C-labeled substrate, the RapidFire mass spectrometry method using unlabeled substrates is a high-accuracy, high-throughput method. In addition, some of the hit compounds selected from the screening inhibited cellular fatty acid elongation in HEK293 cells expressing Elovl6 transiently. This result suggests that these compounds may be promising lead candidates for therapeutic drugs. Ultra-high-throughput fluorescence screening followed by a RapidFire mass spectrometry assay was a suitable strategy for lead discovery against Elovl6. - Highlights: • A novel assay for elongation of very-long-chain fatty acids 6 (Elovl6) is proposed. • RapidFire mass spectrometry (RF-MS) assay is useful to select real screening hits. • RF-MS assay is proved to be beneficial because of

  15. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    Science.gov (United States)

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
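The export step GlycoExtractor performs, turning processed profile data (peak number, peak areas, glucose unit values) into JSON or CSV rather than disconnected files, can be sketched with standard-library tools (field names and values hypothetical):

```python
import csv
import io
import json

# Hypothetical processed glycan-profile records for one sample.
peaks = [
    {"peak": 1, "area": 12.5, "gu": 5.2},
    {"peak": 2, "area": 47.1, "gu": 6.8},
]

# JSON export: one machine-readable document per sample set.
as_json = json.dumps(peaks)

# CSV export: tabular form for spreadsheet or statistics tools.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["peak", "area", "gu"])
writer.writeheader()
writer.writerows(peaks)
as_csv = buf.getvalue()
```

Emitting a structured format such as JSON is what makes downstream high-throughput interpretation possible without manual refinement of exported files.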

  16. Extensive impact of saturated fatty acids on metabolic and cardiovascular profile in rats with diet-induced obesity: a canonical analysis.

    Science.gov (United States)

    Oliveira Junior, Silvio A; Padovani, Carlos R; Rodrigues, Sergio A; Silva, Nilza R; Martinez, Paula F; Campos, Dijon Hs; Okoshi, Marina P; Okoshi, Katashi; Dal-Pai, Maeli; Cicogna, Antonio C

    2013-04-15

    Although hypercaloric interventions are associated with nutritional, endocrine, metabolic, and cardiovascular disorders in obesity experiments, a rational distinction between the effects of excess adiposity and the individual roles of dietary macronutrients in relation to these disturbances has not previously been studied. This investigation analyzed the correlation between ingested macronutrients (including sucrose and saturated and unsaturated fatty acids) plus body adiposity and metabolic, hormonal, and cardiovascular effects in rats with diet-induced obesity. Normotensive Wistar-Kyoto rats were submitted to Control (CD; 3.2 Kcal/g) and Hypercaloric (HD; 4.6 Kcal/g) diets for 20 weeks followed by nutritional evaluation involving body weight and adiposity measurement. Metabolic and hormonal parameters included glycemia, insulin, insulin resistance, and leptin. Cardiovascular analysis included systolic blood pressure profile, echocardiography, morphometric study of myocardial morphology, and myosin heavy chain (MHC) protein expression. Canonical correlation analysis was used to evaluate the relationships between dietary macronutrients plus adiposity and metabolic, hormonal, and cardiovascular parameters. Although final group body weights did not differ, HD presented higher adiposity than CD. Diet induced hyperglycemia while insulin and leptin levels remained unchanged. In a cardiovascular context, systolic blood pressure increased with time only in HD. Additionally, in vivo echocardiography revealed cardiac hypertrophy and improved systolic performance in HD compared to CD; and while cardiomyocyte size was unchanged by diet, nuclear volume and collagen interstitial fraction both increased in HD. Also HD exhibited higher relative β-MHC content and β/α-MHC ratio than their Control counterparts. Importantly, body adiposity was weakly associated with cardiovascular effects, as saturated fatty acid intake was directly associated with most cardiac remodeling
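As a simplified, univariate stand-in for the canonical correlation analysis used in the study (which relates whole *sets* of dietary and cardiovascular variables at once), the association between saturated fatty acid intake and a single cardiac remodeling index can be quantified with a Pearson correlation; all values here are hypothetical:

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical per-animal saturated fat intake vs. a remodeling index.
sat_fat = [1.0, 2.0, 3.0, 4.0]
remodel = [0.9, 2.1, 2.9, 4.1]
r = pearson_r(sat_fat, remodel)
```

An r close to 1 would mirror the paper's finding that saturated fatty acid intake, more than adiposity itself, tracks the cardiac remodeling measures.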

  17. HTTK R Package v1.4 - JSS Article on HTTK: R Package for High-Throughput Toxicokinetics

    Data.gov (United States)

    U.S. Environmental Protection Agency — httk: High-Throughput Toxicokinetics Functions and data tables for simulation and statistical analysis of chemical toxicokinetics ("TK") using data obtained from...

  18. Validation of a Microscale Extraction and High Throughput UHPLC-QTOF-MS Analysis Method for Huperzine A in Huperzia

    Science.gov (United States)

    Cuthbertson, Daniel; Piljac-Žegarac, Jasenka; Lange, Bernd Markus

    2011-01-01

    Herein we report on an improved method for the microscale extraction of huperzine A (HupA), an acetylcholinesterase-inhibiting alkaloid, from as little as 3 mg of tissue homogenate from the clubmoss Huperzia squarrosa (G. Forst.) Trevis with 99.95 % recovery. We also validated a novel UHPLC-QTOF-MS method for the high-throughput analysis of H. squarrosa extracts in only 6 min, which, in combination with the very low limit of detection (20 pg on column) and the wide linear range for quantification (20 to 10,000 pg on column), allow for a highly efficient screening of extracts containing varying amounts of HupA. Utilization of this methodology has the potential to conserve valuable plant resources. PMID:22275140

  19. Machine learning in computational biology to accelerate high-throughput protein expression.

    Science.gov (United States)

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for analysis of further expression and solubility datasets. Supplementary data are available at Bioinformatics online.
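Two of the three key properties cited (hydropathy and aromaticity) can be computed directly from a protein sequence, for example as the Kyte-Doolittle GRAVY score and the Phe/Trp/Tyr fraction. This is a sketch of such feature extraction, not the authors' pipeline, and the example sequence is arbitrary:

```python
# Kyte-Doolittle hydropathy values for the 20 standard amino acids.
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def gravy(seq):
    """Grand average of hydropathy: mean Kyte-Doolittle value."""
    return sum(KD[aa] for aa in seq) / len(seq)

def aromaticity(seq):
    """Fraction of aromatic residues (Phe, Trp, Tyr)."""
    return sum(seq.count(aa) for aa in "FWY") / len(seq)

seq = "MKTAYIAKQR"   # arbitrary example fragment
g = gravy(seq)
a = aromaticity(seq)
```

Features like these would then feed a classifier trained on observed expression/solubility outcomes.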

  20. Galaxy Workflows for Web-based Bioinformatics Analysis of Aptamer High-throughput Sequencing Data

    Directory of Open Access Journals (Sweden)

    William H Thiel

    2016-01-01

    Full Text Available Development of RNA and DNA aptamers for diagnostic and therapeutic applications is a rapidly growing field. Aptamers are identified through iterative rounds of selection in a process termed SELEX (Systematic Evolution of Ligands by EXponential enrichment). High-throughput sequencing (HTS) revolutionized the modern SELEX process by identifying millions of aptamer sequences across multiple rounds of aptamer selection. However, these vast aptamer HTS datasets necessitated bioinformatics techniques. Herein, we describe a semiautomated approach to analyze aptamer HTS datasets using the Galaxy Project, a web-based open source collection of bioinformatics tools that were originally developed to analyze genome, exome, and transcriptome HTS data. Using a series of Workflows created in the Galaxy webserver, we demonstrate efficient processing of aptamer HTS data and compilation of a database of unique aptamer sequences. Additional Workflows were created to characterize the abundance and persistence of aptamer sequences within a selection and to filter sequences based on these parameters. A key advantage of this approach is that the online nature of the Galaxy webserver and its graphical interface allow for the analysis of HTS data without the need to compile code or install multiple programs.
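The abundance/persistence bookkeeping those Workflows perform can be illustrated in a few lines: for each unique aptamer sequence, count the number of selection rounds in which it appears (toy three-mers here; real inputs would be HTS reads):

```python
from collections import Counter

def persistence(rounds):
    """rounds: list of per-round sequence lists.
    Returns {sequence: number of rounds in which it appears}."""
    seen = Counter()
    for rnd in rounds:
        for seq in set(rnd):   # count each sequence once per round
            seen[seq] += 1
    return seen

rounds = [
    ["AAC", "GGT", "AAC"],         # round 1
    ["AAC", "TTG"],                # round 2
    ["AAC", "GGT", "AAC", "AAC"],  # round 3
]
p = persistence(rounds)
```

Sequences that both grow in abundance and persist across rounds are the candidate aptamers worth carrying forward.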

  1. Salt-saturated concrete strength and permeability

    International Nuclear Information System (INIS)

    Pfeifle, T.W.; Hansen, F.D.; Knowles, M.K.

    1996-01-01

    Laboratory-scale experiments applicable to the use of salt-saturated concrete as a seal material for a transuranic waste repository have been completed. Nitrogen gas permeability measurements were made using a flexible-wall permeameter, a confining pressure of 1 MPa, and gas pressure gradients ranging from 0.3 MPa to 0.75 MPa. Results show that salt-saturated concrete has very low intrinsic permeability, with values ranging from 9.4 × 10⁻²² m² to 9.7 × 10⁻¹⁷ m². Strength and deformation characteristics were investigated under conditions of triaxial compression with confining pressures ranging from 0 to 15 MPa using either axial strain-rate or axial stress-rate control, and show that the failure strength of concrete increases with confining pressure, which can be adequately described through pressure-sensitive failure criteria. Axial, radial, and volumetric strains were also measured during each test and these data were used to determine elastic properties. Experimental results are applicable in the design and analysis of seal-related functions and apply to other concrete structures subjected to compressive loadings such as dams and prestressed structural members.
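Intrinsic permeability from a steady-state gas measurement is commonly derived from the compressible-flow form of Darcy's law, k = 2 μ L Q_a P_a / (A (P₁² − P₂²)), where the flow rate Q_a is measured at a reference pressure P_a. A sketch with illustrative numbers in the pressure range reported above (these are not the paper's data):

```python
def gas_permeability(mu, L, A, Q_a, P_a, P1, P2):
    """Intrinsic permeability (m^2) from steady-state gas flow through a
    specimen of length L (m) and cross-section A (m^2), with upstream and
    downstream absolute pressures P1, P2 (Pa), gas viscosity mu (Pa*s),
    and volumetric flow Q_a (m^3/s) measured at reference pressure P_a."""
    return (2.0 * mu * L * Q_a * P_a) / (A * (P1**2 - P2**2))

# Illustrative values: nitrogen viscosity ~1.76e-5 Pa*s, 0.1 m specimen,
# 1e-3 m^2 cross-section, 0.8 MPa upstream, 0.1 MPa (atmospheric) downstream.
k = gas_permeability(mu=1.76e-5, L=0.1, A=1e-3,
                     Q_a=1.8e-9, P_a=1.0e5, P1=8.0e5, P2=1.0e5)
```

With these inputs k comes out near 10⁻¹⁸ m², i.e. within the very low range quoted for salt-saturated concrete.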

  2. Serum albumin--a non-saturable carrier

    DEFF Research Database (Denmark)

    Brodersen, R; Honoré, B; Larsen, F G

    1984-01-01

    The shape of binding isotherms for sixteen ligands to human serum albumin showed no signs of approaching saturation at high ligand concentrations. It is suggested that ligand binding to serum albumin is essentially different from saturable binding of substrates to enzymes, of oxygen to haemoglobin…

  3. Saturation and forward jets at HERA

    International Nuclear Information System (INIS)

    Marquet, C.; Peschanski, R.; Royon, C.

    2004-01-01

    We analyse forward-jet production at HERA in the framework of the Golec-Biernat and Wüsthoff saturation models. We obtain a good description of the forward-jet cross-sections measured by the H1 and ZEUS Collaborations in the two-hard-scale region (k_T ~ Q >> Λ_QCD) with two different parametrizations with either significant or weak saturation effects. The weak saturation parametrization gives a scale compatible with the one found for the proton structure function F2. We argue that Mueller-Navelet jets at the Tevatron and the LHC could help distinguish between the two options.
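The Golec-Biernat and Wüsthoff model referred to here parametrizes the dipole-proton cross section with a single saturation scale (standard form of the model; the original fit gave parameters of order σ₀ ≈ 23 mb, λ ≈ 0.29, x₀ ≈ 3×10⁻⁴):

```latex
\sigma_{\mathrm{dip}}(x, r) = \sigma_0 \left[ 1 - \exp\!\left( -\frac{r^2 Q_s^2(x)}{4} \right) \right],
\qquad Q_s^2(x) = Q_0^2 \left( \frac{x_0}{x} \right)^{\lambda}
```

For small dipole sizes r the cross section grows like r²Q_s², while for r ≳ 2/Q_s it saturates at σ₀; the "significant" versus "weak" saturation parametrizations differ in how large Q_s(x) becomes in the kinematic range of the forward-jet data.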

  4. Correcting saturation of detectors for particle/droplet imaging methods

    International Nuclear Information System (INIS)

    Kalt, Peter A M

    2010-01-01

    Laser-based diagnostic methods are being applied to more and more flows of theoretical and practical interest and are revealing interesting new flow features. Imaging particles or droplets in nephelometry and laser sheet dropsizing methods requires maximizing the signal-to-noise ratio without over-saturating the detector. Droplet and particle imaging results in a lognormal distribution of pixel intensities, and a derived lognormal distribution can be fitted to the histogram of measured pixel intensities. If pixel intensities are clipped at the saturation value, the presumed probability density function (pdf) shape without the effects of saturation can be estimated from the lognormal fit to the unsaturated part of the histogram. Information about the presumed shape of the pixel intensity pdf is used to generate corrections that can be applied to data to account for saturation. The effects of even slight saturation are shown to be a significant source of error on the derived average, and the influence of saturation on the derived root mean square (rms) is even more pronounced: errors on the determined average exceed 5% when the number of saturated samples exceeds 3% of the total, and errors on the rms reach 20% for a similar saturation level. This study also attempts to delineate the limits within which detector saturation can be accurately corrected. It is demonstrated that a simple method for reshaping the clipped part of the pixel intensity histogram makes accurate corrections to account for saturated pixels. These outcomes can be used to correct a saturated signal, quantify the effect of saturation on a derived average, and correct the derived average in the case of slight to moderate saturation of pixels.
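The correction idea, fitting a lognormal to the unsaturated part of the intensity histogram and taking the implied distribution mean, can be sketched as follows. This is a naive moment fit to the unclipped samples only, ignoring truncation bias, so it is a simplified stand-in valid only for slight saturation; the sample data are synthetic:

```python
import math
import random
import statistics

def corrected_mean(samples, sat_level):
    """Estimate the true mean of a lognormally distributed signal whose
    samples clip at sat_level: fit mu/sigma to the logs of the unsaturated
    samples, then return the implied lognormal mean exp(mu + sigma^2/2).
    Naive fit -- ignores truncation bias, adequate for slight saturation."""
    unsaturated = [s for s in samples if s < sat_level]
    logs = [math.log(s) for s in unsaturated]
    mu = statistics.fmean(logs)
    sigma = statistics.pstdev(logs)
    return math.exp(mu + sigma ** 2 / 2)

random.seed(0)
samples = [random.lognormvariate(0.0, 0.5) for _ in range(20000)]
est = corrected_mean(samples, sat_level=1e9)   # no clipping in this check
true_mean = math.exp(0.5 ** 2 / 2)             # analytic lognormal mean
```

On unclipped data the estimate recovers the analytic mean; with mild clipping the fitted pdf supplies the correction that the plain (clipped) sample average would miss.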

  5. GiA Roots: software for the high throughput analysis of plant root system architecture

    Science.gov (United States)

    2012-01-01

    Background Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. Results We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically for the high-throughput analysis of root system images. GiA Roots includes user-assisted algorithms to distinguish root from background and a fully automated pipeline that extracts dozens of root system phenotypes. Quantitative information on each phenotype, along with intermediate steps for full reproducibility, is returned to the end-user for downstream analysis. GiA Roots has a GUI front end and a command-line interface for interweaving the software into large-scale workflows. GiA Roots can also be extended to estimate novel phenotypes specified by the end-user. Conclusions We demonstrate the use of GiA Roots on a set of 2393 images of rice roots representing 12 genotypes from the species Oryza sativa. We validate trait measurements against prior analyses of this image set that demonstrated that RSA traits are likely heritable and associated with genotypic differences. Moreover, we demonstrate that GiA Roots is extensible and an end-user can add functionality so that GiA Roots can estimate novel RSA traits. In summary, we show that the software can function as an efficient tool as part of a workflow to move from large numbers of root images to downstream analysis. PMID:22834569

  6. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high-content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of two enabling technologies, small-pore microchannel plate devices with very high time resolution and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration, which uses the multi-channel HPTDC timing ASIC.

  7. A high-throughput, multi-channel photon-counting detector with picosecond timing

    International Nuclear Information System (INIS)

    Lapington, J.S.; Fraser, G.W.; Miller, G.M.; Ashton, T.J.R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high-content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of two enabling technologies, small-pore microchannel plate devices with very high time resolution and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration, which uses the multi-channel HPTDC timing ASIC.

  8. Ultrafast THz Saturable Absorption in Semiconductors

    DEFF Research Database (Denmark)

    Turchinovich, Dmitry; Hoffmann, Matthias C.

    2011-01-01

    We demonstrate THz saturable absorption in the n-doped semiconductors GaAs, GaP, and Ge in a nonlinear THz time-domain spectroscopy experiment. The saturable absorption is caused by modulation of the sample conductivity due to electron heating and satellite-valley scattering in the field of a strong THz pulse.

  9. Dimensioning storage and computing clusters for efficient High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Scientific experiments are producing huge amounts of data, and they continue to increase the size of their datasets and the total volume of data. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of Scientific Data Centres has shifted from coping efficiently with PetaByte-scale storage to delivering quality data-processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both online (data acceptance) and offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to loss of CPU cycles and batch-job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.

  10. Crystal Symmetry Algorithms in a High-Throughput Framework for Materials

    Science.gov (United States)

    Taylor, Richard

    The high-throughput framework AFLOW that has been developed and used successfully over the last decade is improved to include fully-integrated software for crystallographic symmetry characterization. The standards used in the symmetry algorithms conform to the conventions and prescriptions given in the International Tables for Crystallography (ITC). A standard cell choice with standard origin is selected, and the space group, point group, Bravais lattice, crystal system, lattice system, and representative symmetry operations are determined. Following the conventions of the ITC, the Wyckoff sites are also determined and their labels and site symmetry are provided. The symmetry code makes no assumptions about the input cell orientation, origin, or reduction and has been integrated into the AFLOW high-throughput framework for materials discovery by adding to the existing code base and making use of existing classes and functions. The software is written in object-oriented C++ for flexibility and reuse. A performance analysis and an examination of the algorithms' scaling with cell size and symmetry are also reported.

  11. Dimensioning storage and computing clusters for efficient high throughput computing

    International Nuclear Information System (INIS)

    Accion, E; Bria, A; Bernabeu, G; Caubet, M; Delfino, M; Espinal, X; Merino, G; Lopez, F; Martinez, F; Planas, E

    2012-01-01

    Scientific experiments are producing huge amounts of data, and the size of their datasets and the total volume of data continue to increase. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of scientific data centers has shifted from efficiently coping with PetaByte-scale storage to delivering quality data-processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both online (data acceptance) and offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to loss of CPU cycles and batch-job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.

  12. High Throughput Determinations of Critical Dosing Parameters (IVIVE workshop)

    Science.gov (United States)

    High throughput toxicokinetics (HTTK) is an approach that allows for rapid estimation of TK for hundreds of environmental chemicals. HTTK-based reverse dosimetry (i.e., reverse toxicokinetics or RTK) is used in order to convert high throughput in vitro toxicity screening (HTS) data…

  13. Comparison of Atmospheric Pressure Chemical Ionization and Field Ionization Mass Spectrometry for the Analysis of Large Saturated Hydrocarbons.

    Science.gov (United States)

    Jin, Chunfen; Viidanoja, Jyrki; Li, Mingzhe; Zhang, Yuyang; Ikonen, Elias; Root, Andrew; Romanczyk, Mark; Manheim, Jeremy; Dziekonski, Eric; Kenttämaa, Hilkka I

    2016-11-01

    Direct infusion atmospheric pressure chemical ionization mass spectrometry (APCI-MS) was compared to field ionization mass spectrometry (FI-MS) for the determination of hydrocarbon class distributions in lubricant base oils. When positive ion mode APCI with oxygen as the ion source gas was employed to ionize saturated hydrocarbon model compounds (M) in hexane, only stable [M - H]+ ions were produced. Ion-molecule reaction studies performed in a linear quadrupole ion trap suggested that fragment ions of ionized hexane can ionize saturated hydrocarbons via hydride abstraction with minimal fragmentation. Hence, APCI-MS shows potential as an alternative to FI-MS in lubricant base oil analysis. Indeed, the APCI-MS method gave average molecular weights and hydrocarbon class distributions similar to those from FI-MS for three lubricant base oils. However, the reproducibility of the APCI-MS method was found to be substantially better than that of FI-MS. The paraffinic content determined using the APCI-MS and FI-MS methods for the base oils was similar, and the average number of carbons in paraffinic chains followed the same increasing trend from low-viscosity to high-viscosity base oils for the two methods.

  14. Experimental study on distributed optical fiber-based approach monitoring saturation line in levee engineering

    Science.gov (United States)

    Su, Huaizhi; Li, Hao; Kang, Yeyuan; Wen, Zhiping

    2018-02-01

    Seepage is one of the key factors affecting levee safety. Seepage danger that is not detected and addressed promptly may lead to severe accidents such as seepage failure, slope instability, and even levee breach; more than 90 percent of levee breaches are caused by seepage. Accurate determination of the saturation line is therefore essential for identifying seepage behavior, and the location of the saturation line also has a major impact on slope stability in levee engineering. Considering the structural characteristics and service conditions of levee engineering, distributed optical fiber sensing technology is introduced to implement real-time observation of the saturation line. The monitoring principle of the saturation line based on the distributed optical fiber temperature sensor system (DTS) is investigated. An experimental platform, consisting of the DTS, a heating system, a water-supply system, an auxiliary analysis system and a levee model, is designed and constructed, and a monitoring experiment of the saturation line in the levee model is carried out on this platform. From the experimental results, the numerical relationship between moisture content and thermal conductivity in a porous medium is identified, and a line-heat-source-based distributed optical fiber method for obtaining the thermal conductivity in a porous medium is developed. A DTS-based approach is proposed to monitor the saturation line in levee engineering, and the embedment pattern of the optical fiber for monitoring the saturation line is presented.

  15. Ontology-based meta-analysis of global collections of high-throughput public data.

    Directory of Open Access Journals (Sweden)

    Ilya Kupershmidt

    2010-09-01

    The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.
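
    One simple form of the rank-based enrichment statistics mentioned above is a rank-sum test of a gene set's scores against the remaining genes in a dataset. This is a hedged sketch of that single ingredient, not the framework itself (which combines thousands of datasets, meta-analyses, and ontologies); the scores and gene-set indices are invented.

```python
import numpy as np
from scipy import stats

def rank_enrichment_pvalue(scores, set_idx):
    """One-sided Mann-Whitney U test: are the gene set's scores
    ranked higher than those of all other genes in the dataset?"""
    mask = np.zeros(len(scores), dtype=bool)
    mask[set_idx] = True
    return stats.mannwhitneyu(scores[mask], scores[~mask],
                              alternative="greater").pvalue

rng = np.random.default_rng(1)
scores = rng.normal(size=1000)   # e.g. per-gene differential-expression scores
gene_set = np.arange(20)         # hypothetical gene set of 20 genes
scores[gene_set] += 2.0          # shift the set upward: strong enrichment
p = rank_enrichment_pvalue(scores, gene_set)
```

Because it uses only ranks, a statistic like this can be compared across heterogeneous platforms, which is the property the framework exploits.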

  16. ZebraZoom: an automated program for high-throughput behavioral analysis and categorization

    Science.gov (United States)

    Mirat, Olivier; Sternberg, Jenna R.; Severi, Kristen E.; Wyart, Claire

    2013-01-01

    The zebrafish larva stands out as an emergent model organism for translational studies involving gene or drug screening thanks to its size, genetics, and permeability. At the larval stage, locomotion occurs in short episodes punctuated by periods of rest. Although phenotyping behavior is a key component of large-scale screens, it has not yet been automated in this model system. We developed ZebraZoom, a program to automatically track larvae and identify maneuvers for many animals performing discrete movements. Our program detects each episodic movement and extracts large-scale statistics on motor patterns to produce a quantification of the locomotor repertoire. We used ZebraZoom to identify motor defects induced by a glycinergic receptor antagonist. The analysis of the blind mutant atoh7 revealed small locomotor defects associated with the mutation. Using multiclass supervised machine learning, ZebraZoom categorized all episodes of movement for each larva into one of three possible maneuvers: slow forward swim, routine turn, and escape. ZebraZoom reached 91% accuracy for categorization of stereotypical maneuvers that four independent experimenters unanimously identified. For all maneuvers in the data set, ZebraZoom agreed with the four experimenters in 73.2–82.5% of cases. We modeled the series of maneuvers performed by larvae as Markov chains and observed that larvae often repeated the same maneuvers within a group. When analyzing subsequent maneuvers performed by different larvae, we found that larva–larva interactions occurred as a series of escapes. Overall, ZebraZoom reached the level of precision found in manual analysis but accomplished the task in the high-throughput format necessary for large screens. PMID:23781175
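
    The Markov-chain modeling of maneuver sequences mentioned above can be sketched as a simple transition-count estimate. A hedged illustration with invented toy sequences, not ZebraZoom's code:

```python
import numpy as np

MANEUVERS = ["slow_forward", "routine_turn", "escape"]   # the three categories
IDX = {m: i for i, m in enumerate(MANEUVERS)}

def transition_matrix(sequences):
    """Row-normalized first-order Markov transition probabilities
    estimated from lists of per-larva maneuver labels."""
    counts = np.zeros((len(MANEUVERS), len(MANEUVERS)))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):       # consecutive maneuver pairs
            counts[IDX[a], IDX[b]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# toy maneuver sequences; every state has at least one outgoing transition
seqs = [
    ["slow_forward", "slow_forward", "routine_turn", "slow_forward"],
    ["escape", "escape", "slow_forward", "routine_turn", "routine_turn"],
]
P = transition_matrix(seqs)  # P[i, j] = P(next maneuver j | current maneuver i)
```

A diagonal-heavy P would reflect the paper's observation that larvae often repeat the same maneuver.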

  17. Ontology-based meta-analysis of global collections of high-throughput public data.

    Science.gov (United States)

    Kupershmidt, Ilya; Su, Qiaojuan Jane; Grewal, Anoop; Sundaresh, Suman; Halperin, Inbal; Flynn, James; Shekar, Mamatha; Wang, Helen; Park, Jenny; Cui, Wenwu; Wall, Gregory D; Wisotzkey, Robert; Alag, Satnam; Akhtari, Saeid; Ronaghi, Mostafa

    2010-09-29

    The investigation of the interconnections between the molecular and genetic events that govern biological systems is essential if we are to understand the development of disease and design effective novel treatments. Microarray and next-generation sequencing technologies have the potential to provide this information. However, taking full advantage of these approaches requires that biological connections be made across large quantities of highly heterogeneous genomic datasets. Leveraging the increasingly huge quantities of genomic data in the public domain is fast becoming one of the key challenges in the research community today. We have developed a novel data mining framework that enables researchers to use this growing collection of public high-throughput data to investigate any set of genes or proteins. The connectivity between molecular states across thousands of heterogeneous datasets from microarrays and other genomic platforms is determined through a combination of rank-based enrichment statistics, meta-analyses, and biomedical ontologies. We address data quality concerns through dataset replication and meta-analysis and ensure that the majority of the findings are derived using multiple lines of evidence. As an example of our strategy and the utility of this framework, we apply our data mining approach to explore the biology of brown fat within the context of the thousands of publicly available gene expression datasets. Our work presents a practical strategy for organizing, mining, and correlating global collections of large-scale genomic data to explore normal and disease biology. Using a hypothesis-free approach, we demonstrate how a data-driven analysis across very large collections of genomic data can reveal novel discoveries and evidence to support existing hypotheses.

  18. Comparative analysis of the apparent saturation hysteresis approach and the domain theory of hysteresis in respect of prediction of scanning curves and air entrapment

    Science.gov (United States)

    Beriozkin, A.; Mualem, Y.

    2018-05-01

    This study theoretically analyzes the concept of apparent saturation hysteresis, combined with the Scott et al. (1983) scaling approach, as suggested by Parker and Lenhard (1987) to account for the effect of air entrapment and release on soil water hysteresis. We found that the theory of Parker and Lenhard (1987) comprises some mutually canceling mathematical operations, and when cleared of the superfluous intermediate calculations, their model reduces to the original Scott et al. (1983) scaling method, supplemented with the requirement of closure of scanning loops. Our analysis reveals that their technique of accounting for entrapped air actually has no effect on the final prediction of the effective-saturation (or water-content) scanning curves. Our consideration indicates that the use of the Land (1968) formula for assessing the amount of entrapped air is inconsistent with the apparent saturation concept as introduced by Parker and Lenhard (1987). In this paper, a proper routine is suggested for predicting hysteretic scanning curves of any order, given the two measured main curves, in the complete hysteretic domain, and some verification tests are carried out against measured results. Accordingly, explicit closed-form formulae for the direct prediction (with no need of intermediate calculations) of scanning curves up to the third order are derived to sustain our analysis.

  19. Titration calorimetry of surfactant–drug interactions: Micelle formation and saturation studies

    International Nuclear Information System (INIS)

    Waters, Laura J.; Hussain, Talib; Parkes, Gareth M.B.

    2012-01-01

    Highlights: ► Isothermal titration calorimetry can be used to monitor the saturation of micelles with pharmaceutical compounds. ► The number of drug molecules per micelle varies depending on the drug used and the temperature of the calorimeter. ► The change in enthalpy for the saturation of micelles with drugs can be endothermic or exothermic. ► The critical micellar concentration of an anionic surfactant (SDS) does not appear to vary in the presence of drugs. - Abstract: Isothermal titration calorimetry (ITC) was employed to monitor the addition of five model drugs to anionic surfactant based micelles, composed of sodium dodecyl sulfate (SDS), through to the point at which they were saturated with drug. Analysis of the resultant data using this newly developed method has confirmed the suitability of the technique to acquire such data with saturation limits established in all cases. Values for the point at which saturation occurred ranged from 17 molecules of theophylline per micelle at T = 298 K up to 63 molecules of caffeine per micelle at 310 K. Micellar systems can be disrupted by the presence of additional chemicals, such as the drugs used in this study, therefore a separate investigation was undertaken to determine the critical micellar concentration (CMC) for SDS in the presence of each drug at T = 298 K and 310 K using ITC. In the majority of cases, there was no appreciable alteration to the CMC of SDS with drug present.

  20. High-Throughput Functional Screening of Steroid Substrates with Wild-Type and Chimeric P450 Enzymes

    Directory of Open Access Journals (Sweden)

    Philippe Urban

    2014-01-01

    The promiscuity of a collection of enzymes consisting of 31 wild-type and synthetic variants of CYP1A enzymes was evaluated using a series of 14 steroids and 2 steroid-like chemicals, namely nootkatone, a terpenoid, and mifepristone, a drug. For each enzyme-substrate couple, the initial steady-state velocity of metabolite formation was determined at a saturating substrate concentration. For this, a high-throughput approach was designed involving automated incubations in 96-well microplates, with sixteen 6-point kinetics per microplate, and data acquisition using an LC/MS system accepting 96-well microplates for injection. The resulting dataset was used for multivariate statistics aimed at sorting out the correlations existing between the tested enzyme variants and their ability to metabolize steroid substrates. Functional classifications of both the CYP1A enzyme variants and the steroid substrate structures were obtained, allowing the delineation of global structural features for both substrate recognition and regioselectivity of oxidation.

  1. Electrical conductivity modeling in fractal non-saturated porous media

    Science.gov (United States)

    Wei, W.; Cai, J.; Hu, X.; Han, Q.

    2016-12-01

    The variation of electrical conductivity with saturation is important for studying electrical conduction in natural sedimentary rocks. In completely saturated porous media, the electrical conductivity is a function of porosity that represents the complex connected behavior of the single conducting phase (the pore fluid). Under partially saturated conditions the electrical conductivity becomes even more complicated, since the connectedness of the pore fluid changes with saturation. Archie's second law is an empirical conductivity-porosity-saturation model that has been used to predict the formation factor of non-saturated porous rock. However, the physical interpretation of its parameters, e.g., the cementation exponent m and the saturation exponent n, remains questionable. Building on our previous work, we combine the pore-solid fractal (PSF) model to construct an electrical conductivity model for non-saturated porous media. Our theoretical porosity- and saturation-dependent models contain endmember properties, such as the fluid electrical conductivities, the pore fractal dimension and the tortuosity fractal dimension (representing the complexity of the electrical flow path). We find that the presented model, tested against saturation-dependent electrical conductivity datasets, shows an excellent match between theory and experiment. This means the values of the pore fractal dimension and the tortuosity fractal dimension change from medium to medium and depend not only on the geometrical properties of the pore structure but also on the characteristics of electrical current flow in the non-saturated porous media.
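
    For reference, Archie's second law discussed above takes a simple closed form; a minimal sketch with invented parameter values (the abstract's point is precisely that the exponents m and n lack a clear physical interpretation, which the fractal model addresses):

```python
def archie_conductivity(sigma_w, phi, s_w, m=2.0, n=2.0):
    """Archie's law: sigma = sigma_w * phi**m * s_w**n.

    sigma_w: pore-fluid conductivity (S/m); phi: porosity;
    s_w: water saturation (0..1); m: cementation exponent;
    n: saturation exponent. m and n are empirical, often near 2."""
    return sigma_w * phi**m * s_w**n

# hypothetical rock: pore fluid at 5 S/m, porosity 0.2
full = archie_conductivity(5.0, 0.2, 1.0)   # fully saturated
half = archie_conductivity(5.0, 0.2, 0.5)   # half saturated: 4x lower with n = 2
```
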

  2. Throughput Capacity of Ad Hoc Networks with Route Discovery

    Directory of Open Access Journals (Sweden)

    Blum Rick S

    2007-01-01

    Throughput capacity of large ad hoc networks has been shown to scale adversely with network size. However, the need for the nodes to find or repair routes has not been analyzed in this context. In this paper, we explicitly take route discovery into account and obtain the scaling law for the throughput capacity under general assumptions on the network environment, node behavior, and the quality of the route discovery algorithms. We also discuss a number of possible scenarios and show that the need for route discovery may change the scaling of the throughput capacity.
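
    For context, the classic random-network result underlying such analyses gives per-node throughput on the order of W/sqrt(n log n) for n nodes sharing bandwidth W; the paper's contribution is showing how route-discovery overhead can alter this scaling. A toy evaluation of the baseline trend (illustrative only; the constant factor is not specified by the theory):

```python
import math

def per_node_throughput_order(n, w=1.0):
    """Order-of-magnitude Gupta-Kumar trend: Theta(w / sqrt(n log n))."""
    return w / math.sqrt(n * math.log(n))

# per-node throughput shrinks as the network grows
rates = {n: per_node_throughput_order(n) for n in (10, 100, 1000, 10000)}
```
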

  3. High Throughput In vivo Analysis of Plant Leaf Chemical Properties Using Hyperspectral Imaging

    Directory of Open Access Journals (Sweden)

    Piyush Pandey

    2017-08-01

    Image-based high-throughput plant phenotyping in the greenhouse has the potential to relieve the bottleneck currently presented by phenotypic scoring, which limits the throughput of gene discovery and crop improvement efforts. Numerous studies have employed automated RGB imaging to characterize biomass and growth of agronomically important crops. The objective of this study was to investigate the utility of hyperspectral imaging for quantifying chemical properties of maize and soybean plants in vivo. These properties included leaf water content, as well as concentrations of the macronutrients nitrogen (N), phosphorus (P), potassium (K), magnesium (Mg), calcium (Ca), and sulfur (S), and the micronutrients sodium (Na), iron (Fe), manganese (Mn), boron (B), copper (Cu), and zinc (Zn). Hyperspectral images were collected from 60 maize and 60 soybean plants, each subjected to varying levels of either water deficit or nutrient limitation stress with the goal of creating a wide range of variation in the chemical properties of plant leaves. Plants were imaged on an automated conveyor belt system using a hyperspectral imager with a spectral range from 550 to 1,700 nm. Images were processed to extract a reflectance spectrum from each plant, and partial least squares regression models were developed to correlate spectral data with chemical data. Among all the chemical properties investigated, water content was predicted with the highest accuracy [R2 = 0.93 and RPD (Ratio of Performance to Deviation) = 3.8]. All macronutrients were also quantified satisfactorily (R2 from 0.69 to 0.92, RPD from 1.62 to 3.62), with N predicted best, followed by P, K, and S. The micronutrient group showed lower prediction accuracy (R2 from 0.19 to 0.86, RPD from 1.09 to 2.69) than the macronutrient group. Cu and Zn were best predicted, followed by Fe and Mn. Na and B were the only two properties that hyperspectral imaging was not able to quantify satisfactorily (R2 < 0.3 and RPD < 1.2). This study suggested…
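
    The two accuracy metrics quoted above can be computed directly from predictions; RPD is commonly defined as the standard deviation of the reference values divided by the prediction RMSE (conventions on degrees of freedom vary). A small sketch with invented numbers, not the study's data:

```python
import numpy as np

def r2_and_rpd(y_true, y_pred):
    """R^2 and RPD (ratio of performance to deviation = SD(y_true) / RMSEP)."""
    resid = y_true - y_pred
    ss_res = np.sum(resid**2)
    ss_tot = np.sum((y_true - y_true.mean())**2)
    r2 = 1.0 - ss_res / ss_tot
    rmsep = np.sqrt(np.mean(resid**2))          # root mean square error of prediction
    rpd = y_true.std(ddof=1) / rmsep            # sample SD convention assumed here
    return r2, rpd

# toy reference values and model predictions
y_true = np.array([0.0, 1.0, 2.0, 3.0])
y_pred = np.array([0.1, 0.9, 2.1, 2.9])
r2, rpd = r2_and_rpd(y_true, y_pred)
```

Higher RPD means the model predicts the property much better than simply predicting the mean, which is why the study reports it alongside R2.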

  4. Determination of diagnostic standards on saturated soil extracts for cut roses grown in greenhouses.

    Science.gov (United States)

    Franco-Hermida, John Jairo; Quintero, María Fernanda; Cabrera, Raúl Iskander; Guzman, José Miguel

    2017-01-01

    This work comprises the theoretical determination and validation of diagnostic standards for the analysis of saturated soil extracts for cut rose flower crops (Rosa spp.) growing on the Bogota Plateau, Colombia. The data included 684 plant tissue analyses and 684 corresponding analyses of saturated soil extracts, all collected between January 2009 and June 2013. The tissue and soil samples were selected from 13 rose farms and from cultivars grafted on the 'Natal Briar' rootstock. These concurrent samples of soil and plant tissues represented 251 production units (locations) of approximately 10,000 m2 distributed across the study area. The standards were conceived as a tool to improve the nutritional balance in the leaf tissue of rose plants and thereby define the norms for expressing optimum productive potential relative to nutritional conditions in the soil. To this end, previously determined diagnostic standards for rose leaf tissues were employed to obtain rates of foliar nutritional balance at each analyzed location and as criteria for determining the diagnostic norms for saturated soil extracts. Applying this methodology to foliar analysis showed higher significant correlations for the diagnostic indices; similar behavior was observed for the saturated soil extract analyses, making them a powerful tool for integrated nutritional diagnosis. Leaf analyses identify the nutrients most limiting for high yield, and analyses of saturated soil extracts make it possible to correct the fertigation formulations applied to soils or substrates. Recommendations are proposed to improve the balance in the soil-plant system and thereby raise the probability of yield increases. The main recommendations for increasing and improving rose flower yields are: continuously check the pH values of the saturated soil extracts, reduce the amounts of P, Fe, Zn and Cu in fertigation solutions, and carefully analyze the situation of Mn in the soil-plant system.

  5. Site-Scale Saturated Zone Flow Model

    International Nuclear Information System (INIS)

    G. Zyvoloski

    2003-01-01

    The purpose of this model report is to document the components of the site-scale saturated-zone flow model at Yucca Mountain, Nevada, in accordance with administrative procedure AP-SIII.10Q, ''Models''. This report provides validation and confidence in the flow model that was developed for site recommendation (SR) and will be used to provide flow fields in support of the Total Systems Performance Assessment (TSPA) for the License Application. The output from this report provides the flow model used in the ''Site-Scale Saturated Zone Transport'' report, MDL-NBS-HS-000010 Rev 01 (BSC 2003 [162419]). The Site-Scale Saturated Zone Transport model in turn provides output to the SZ Transport Abstraction Model (BSC 2003 [164870]). In particular, the output from the SZ site-scale flow model is used to simulate the groundwater flow pathways and radionuclide transport to the accessible environment for use in the TSPA calculations. Since the development and calibration of the saturated-zone flow model, more data have been gathered for use in model validation and confidence building, including new water-level data from Nye County wells, single- and multiple-well hydraulic testing data, and new hydrochemistry data. In addition, a new hydrogeologic framework model (HFM), which incorporates Nye County well lithology, also provides geologic data for corroboration and confidence in the flow model. The intended use of this work is to provide a flow model that generates flow fields to simulate radionuclide transport in saturated porous rock and alluvium under natural or forced-gradient flow conditions. The flow model simulations are completed using the three-dimensional (3-D), finite-element flow, heat, and transport computer code FEHM Version (V) 2.20 (software tracking number (STN): 10086-2.20-00; LANL 2003 [161725]). Concurrently, a process-level transport model and methodology for calculating radionuclide transport in the saturated zone at Yucca Mountain using FEHM V 2.20 are being…

  6. A bioimage informatics platform for high-throughput embryo phenotyping.

    Science.gov (United States)

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2018-01-01

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest. © The Author 2016. Published by Oxford University Press.

  7. Relating oxygen partial pressure, saturation and content: the haemoglobin–oxygen dissociation curve

    Directory of Open Access Journals (Sweden)

    Julie-Ann Collins

    2015-09-01

    The delivery of oxygen by arterial blood to the tissues of the body has a number of critical determinants including blood oxygen concentration (content), saturation (SO2) and partial pressure, haemoglobin concentration and cardiac output, including its distribution. The haemoglobin–oxygen dissociation curve, a graphical representation of the relationship between oxygen saturation and oxygen partial pressure, helps us to understand some of the principles underpinning this process. Historically this curve was derived from very limited data based on blood samples from small numbers of healthy subjects which were manipulated in vitro and ultimately determined by equations such as those described by Severinghaus in 1979. In a study of 3524 clinical specimens, we found that this equation estimated the SO2 in blood from patients with normal pH and SO2 >70% with remarkable accuracy and, to our knowledge, this is the first large-scale validation of this equation using clinical samples. Oxygen saturation by pulse oximetry (SpO2) is nowadays the standard clinical method for assessing arterial oxygen saturation, providing a convenient, pain-free means of continuously assessing oxygenation, provided the interpreting clinician is aware of important limitations. The use of pulse oximetry reduces the need for arterial blood gas analysis (SaO2), as many patients who are not at risk of hypercapnic respiratory failure or metabolic acidosis and have acceptable SpO2 do not necessarily require blood gas analysis. While arterial sampling remains the gold-standard method of assessing ventilation and oxygenation, in those patients in whom blood gas analysis is indicated, arterialised capillary samples also have a valuable role in patient care. The clinical role of venous blood gases however remains less well defined.
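The Severinghaus equation referred to above maps oxygen partial pressure to an estimated saturation. A minimal sketch of the 1979 empirical fit (the function name is ours; the constants are those of the published equation):

```python
def severinghaus_so2(po2_mmhg: float) -> float:
    """Estimate haemoglobin oxygen saturation (%) from oxygen partial
    pressure (mmHg) using the Severinghaus (1979) empirical equation:
    SO2 = 100 / (23400 / (PO2^3 + 150*PO2) + 1)."""
    x = po2_mmhg ** 3 + 150.0 * po2_mmhg
    return 100.0 / (23400.0 / x + 1.0)

# Normal arterial PO2 of ~100 mmHg gives ~97.7% saturation, and the
# P50 (~26.6 mmHg, the pressure at half-saturation) gives ~50%.
print(severinghaus_so2(100.0))
print(severinghaus_so2(26.6))
```

Consistent with the abstract's validation claim, this fit is accurate mainly for near-normal pH and saturations above roughly 70%; outside that range the in vivo curve shifts with pH, temperature and 2,3-DPG.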

  8. High-throughput fractionation of human plasma for fast enrichment of low- and high-abundance proteins.

    Science.gov (United States)

    Breen, Lucas; Cao, Lulu; Eom, Kirsten; Srajer Gajdosik, Martina; Camara, Lila; Giacometti, Jasminka; Dupuy, Damian E; Josic, Djuro

    2012-05-01

    Fast, cost-effective and reproducible isolation of IgM from plasma is invaluable to the study of IgM and subsequent understanding of the human immune system. Additionally, vast amounts of information regarding human physiology and disease can be derived from analysis of the low abundance proteome of the plasma. In this study, methods were optimized for both the high-throughput isolation of IgM from human plasma, and the high-throughput isolation and fractionation of low abundance plasma proteins. To optimize the chromatographic isolation of IgM from human plasma, many variables were examined including chromatography resin, mobile phases, and order of chromatographic separations. Purification of IgM was achieved most successfully through isolation of immunoglobulin from human plasma using Protein A chromatography with a specific resin followed by subsequent fractionation using QA strong anion exchange chromatography. Through these optimization experiments, an additional method was established to prepare plasma for analysis of low abundance proteins. This method involved chromatographic depletion of high-abundance plasma proteins and reduction of plasma proteome complexity through further chromatographic fractionation. Purification of IgM was achieved with high purity as confirmed by SDS-PAGE and IgM-specific immunoblot. Isolation and fractionation of low abundance protein was also performed successfully, as confirmed by SDS-PAGE and mass spectrometry analysis followed by label-free quantitative spectral analysis. The level of purity of the isolated IgM allows for further IgM-specific analysis of plasma samples. The developed fractionation scheme can be used for high throughput screening of human plasma in order to identify low and high abundance proteins as potential prognostic and diagnostic disease biomarkers.

  9. Observability of linear systems with saturated outputs

    NARCIS (Netherlands)

    Koplon, R.; Sontag, E.D.; Hautus, M.L.J.

    1994-01-01

    We present necessary and sufficient conditions for observability of the class of output-saturated systems. These are linear systems whose output passes through a saturation function before it can be measured.

  10. Sector-condition-based results for adaptive control and synchronization of chaotic systems under input saturation

    International Nuclear Information System (INIS)

    Iqbal, Muhammad; Rehan, Muhammad; Hong, Keum-Shik; Khaliq, Abdul; Saeed-ur-Rehman

    2015-01-01

    This paper addresses the design of adaptive feedback controllers for two problems (namely, stabilization and synchronization) of chaotic systems with unknown parameters by considering input saturation constraints. A novel generalized sector condition is developed to deal with the saturation nonlinearities for synthesizing the nonlinear and the adaptive controllers for the stabilization and synchronization control objectives. By application of the proposed sector condition and rigorous regional stability analysis, control and adaptation laws are formulated to guarantee local stabilization of a nonlinear system under actuator saturation. Further, simple control and adaptation laws are developed to synchronize two chaotic systems under uncertain parameters and input saturation nonlinearity. Numerical simulation results for Rössler and FitzHugh–Nagumo models are provided to demonstrate the effectiveness of the proposed adaptive stabilization and synchronization control methodologies
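The saturation nonlinearity at the heart of such sector conditions is simply a clamp, and the classic sector [0, 1] property it satisfies can be checked directly. A minimal sketch (names and the unit limit are ours, not the paper's notation):

```python
def sat(u: float, limit: float = 1.0) -> float:
    """Symmetric saturation (clamp) nonlinearity with the given limit."""
    return max(-limit, min(limit, u))

# The saturation function lies in the sector [0, 1]:
#     sat(u) * (u - sat(u)) >= 0   for every input u.
# Inequalities of this kind (and the generalized variants developed in
# the paper) are what allow saturation to be handled in stability proofs.
for u in (-3.0, -1.0, -0.2, 0.0, 0.5, 1.0, 2.5):
    assert sat(u) * (u - sat(u)) >= 0.0

print(sat(2.5), sat(-2.5), sat(0.3))
```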

  11. High Throughput In Situ XAFS Screening of Catalysts

    International Nuclear Information System (INIS)

    Tsapatsaris, Nikolaos; Beesley, Angela M.; Weiher, Norbert; Tatton, Helen; Schroeder, Sven L. M.; Dent, Andy J.; Mosselmans, Frederick J. W.; Tromp, Moniek; Russu, Sergio; Evans, John; Harvey, Ian; Hayama, Shu

    2007-01-01

    We outline and demonstrate the feasibility of high-throughput (HT) in situ XAFS for synchrotron radiation studies. An XAS data acquisition and control system for the analysis of dynamic materials libraries under control of temperature and gaseous environments has been developed. The system is compatible with the 96-well industry standard and coupled to multi-stream quadrupole mass spectrometry (QMS) analysis of reactor effluents. An automated analytical workflow generates data quickly compared to traditional individual spectrum acquisition and analyses them in quasi-real time using an HT data analysis tool based on IFEFFIT. The system was used for the automated characterization of a library of 91 catalyst precursors containing ternary combinations of Cu, Pt, and Au on γ-Al2O3, and for the in situ characterization of Au catalysts supported on Al2O3 and TiO2.

  12. SATURATION OF MAGNETOROTATIONAL INSTABILITY THROUGH MAGNETIC FIELD GENERATION

    International Nuclear Information System (INIS)

    Ebrahimi, F.; Prager, S. C.; Schnack, D. D.

    2009-01-01

    The saturation mechanism of magnetorotational instability (MRI) is examined through analytical quasi-linear theory and through nonlinear computation of a single mode in a rotating disk. We find that large-scale magnetic field is generated through the α-effect (the correlated product of velocity and magnetic field fluctuations) and causes the MRI mode to saturate. If the large-scale plasma flow is allowed to evolve, the mode can also saturate through its flow relaxation. In astrophysical plasmas, for which the flow cannot relax because of gravitational constraints, the mode saturates through field generation only.

  13. Geochip: A high throughput genomic tool for linking community structure to functions

    Energy Technology Data Exchange (ETDEWEB)

    Van Nostrand, Joy D.; Liang, Yuting; He, Zhili; Li, Guanghe; Zhou, Jizhong

    2009-01-30

    GeoChip is a comprehensive functional gene array that targets key functional genes involved in the geochemical cycling of N, C, and P, sulfate reduction, metal resistance and reduction, and contaminant degradation. Studies have shown the GeoChip to be a sensitive, specific, and high-throughput tool for microbial community analysis that has the power to link geochemical processes with microbial community structure. However, several challenges remain regarding the development and applications of microarrays for microbial community analysis.

  14. GxGrare: gene-gene interaction analysis method for rare variants from high-throughput sequencing data.

    Science.gov (United States)

    Kwon, Minseok; Leem, Sangseob; Yoon, Joon; Park, Taesung

    2018-03-19

    With the rapid advancement of array-based genotyping techniques, genome-wide association studies (GWAS) have successfully identified common genetic variants associated with common complex diseases. However, it has been shown that only a small proportion of the genetic etiology of complex diseases could be explained by the genetic factors identified from GWAS. This missing heritability could possibly be explained by gene-gene interaction (epistasis) and rare variants. There has been an exponential growth of gene-gene interaction analysis for common variants in terms of methodological developments and practical applications. Also, the recent advancement of high-throughput sequencing technologies makes it possible to conduct rare variant analysis. However, little progress has been made in gene-gene interaction analysis for rare variants. Here, we propose GxGrare, a new gene-gene interaction method for rare variants in the framework of multifactor dimensionality reduction (MDR) analysis. The proposed method consists of three steps: 1) collapsing the rare variants, 2) MDR analysis of the collapsed rare variants, and 3) detection of top candidate interaction pairs. GxGrare can be used for the detection of not only gene-gene interactions, but also interactions within a single gene. The proposed method is illustrated with 1080 whole exome sequencing samples from the Korean population in order to identify causal gene-gene interactions among rare variants for type 2 diabetes. The proposed GxGrare performs well for gene-gene interaction detection with collapsing of rare variants. GxGrare is available at http://bibs.snu.ac.kr/software/gxgrare which contains simulation data and documentation. Supported operating systems include Linux and OS X.
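The collapsing step (step 1 above) is commonly implemented as a burden-style indicator over a gene's rare sites; a minimal sketch under that assumption (the function and data names are ours, not GxGrare's API):

```python
def collapse_rare_variants(genotypes):
    """Collapse the rare-variant genotypes of one gene for one sample
    into a single 0/1 indicator: 1 if the sample carries a minor allele
    at any rare site, else 0.  `genotypes` is a list of per-site
    minor-allele counts (0, 1 or 2)."""
    return 1 if any(g > 0 for g in genotypes) else 0

# Sample s1 carries one rare allele at the third site, so it collapses
# to 1; sample s2 carries none and collapses to 0.
samples = {"s1": [0, 0, 1], "s2": [0, 0, 0]}
collapsed = {sid: collapse_rare_variants(g) for sid, g in samples.items()}
print(collapsed)  # {'s1': 1, 's2': 0}
```

The collapsed 0/1 variables can then be fed into a standard MDR analysis as if they were common-variant genotype factors.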

  15. CRISPR-Cas9-mediated saturated mutagenesis screen predicts clinical drug resistance with improved accuracy.

    Science.gov (United States)

    Ma, Leyuan; Boucher, Jeffrey I; Paulsen, Janet; Matuszewski, Sebastian; Eide, Christopher A; Ou, Jianhong; Eickelberg, Garrett; Press, Richard D; Zhu, Lihua Julie; Druker, Brian J; Branford, Susan; Wolfe, Scot A; Jensen, Jeffrey D; Schiffer, Celia A; Green, Michael R; Bolon, Daniel N

    2017-10-31

    Developing tools to accurately predict the clinical prevalence of drug-resistant mutations is a key step toward generating more effective therapeutics. Here we describe a high-throughput CRISPR-Cas9-based saturated mutagenesis approach to generate comprehensive libraries of point mutations at a defined genomic location and systematically study their effect on cell growth. As proof of concept, we mutagenized a selected region within the leukemic oncogene BCR-ABL1. Using bulk competitions with a deep-sequencing readout, we analyzed hundreds of mutations under multiple drug conditions and found that the effects of mutations on growth in the presence or absence of drug were critical for predicting clinically relevant resistant mutations, many of which were cancer adaptive in the absence of drug pressure. Using this approach, we identified all clinically isolated BCR-ABL1 mutations and achieved a prediction score that correlated highly with their clinical prevalence. The strategy described here can be broadly applied to a variety of oncogenes to predict patient mutations and evaluate resistance susceptibility in the development of new therapeutics. Published under the PNAS license.

  16. Analysis of a microscale 'Saturation Phase-change Internal Carnot Engine'

    Energy Technology Data Exchange (ETDEWEB)

    Lurie, Eli [School of Mechanical Engineering, Tel Aviv University, Tel Aviv 69978 (Israel); Kribus, Abraham, E-mail: kribus@eng.tau.ac.i [School of Mechanical Engineering, Tel Aviv University, Tel Aviv 69978 (Israel)

    2010-06-15

    A micro heat engine, based on a cavity filled with a stationary working fluid under liquid-vapor saturation conditions and encapsulated by two membranes, is described and analyzed. This engine design is easy to produce using MEMS technologies and is operated with external heating and cooling. The motion of the membranes is controlled such that the internal pressure and temperature are constant during the heat addition and removal processes, and thus the fluid executes a true internal Carnot cycle. A model of this Saturation Phase-change Internal Carnot Engine (SPICE) was developed including thermodynamic, mechanical and heat transfer aspects. The efficiency and maximum power of the engine are derived. The maximum power point is fixed in a three-parameter space, and operation at this point leads to maximum power density that scales with the inverse square of the engine dimension. Inclusion of the finite heat capacity of the engine wall leads to a strong dependence of performance on engine frequency, and the existence of an optimal frequency. Effects of transient reverse heat flow, and 'parasitic heat' that does not participate in the thermodynamic cycle are observed.
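Because the working fluid executes a true internal Carnot cycle, the ideal efficiency bound follows from the reservoir temperatures alone. A minimal sketch (the temperatures below are illustrative, not the SPICE engine's design values):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Ideal Carnot efficiency for heat addition at t_hot_k and heat
    rejection at t_cold_k (both in kelvin): eta = 1 - Tc/Th."""
    if t_cold_k <= 0 or t_hot_k <= t_cold_k:
        raise ValueError("require 0 < t_cold_k < t_hot_k")
    return 1.0 - t_cold_k / t_hot_k

# Illustrative reservoirs for a low-grade heat source:
print(carnot_efficiency(400.0, 300.0))  # 0.25
```

A real SPICE device would fall below this bound because of the wall heat capacity, transient reverse heat flow and parasitic heat losses discussed in the abstract.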

  17. Suppression of mode-beating in a saturated hole-coupled FEL oscillator

    International Nuclear Information System (INIS)

    Krishnagopal, S.; Xie, M.; Kim, K.J.

    1992-08-01

    In a hole-coupled resonator, either empty or loaded with a linear FEL gain medium, the phenomena of mode degeneracy and mode beating have been studied. When the magnitudes of the eigenvalues, derived from a linear analysis, are equal for two or more dominant eigenmodes, the system cannot achieve a stable beam profile. We investigate this phenomenon when a saturated FEL is present within the cavity, thus introducing non-linearity. We use a three-dimensional FEL oscillator code, based on the amplifier code TDA, and show that mode beating is completely suppressed in the nonlinear saturated regime. We suggest a simple, qualitative model for the mechanism responsible for this suppression.

  18. High-throughput screening (HTS) and modeling of the retinoid ...

    Science.gov (United States)

    Presentation at the Retinoids Review 2nd workshop in Brussels, Belgium, on the application of high-throughput screening and modeling to the retinoid system.

  19. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    International Nuclear Information System (INIS)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert; Gagnon, David; Gjoerup, Ole; Archambault, Jacques; Bullock, Peter A.

    2014-01-01

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF-1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitor of JCV DNA replication

  20. Analysis of JC virus DNA replication using a quantitative and high-throughput assay

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jong; Phelan, Paul J.; Chhum, Panharith; Bashkenova, Nazym; Yim, Sung; Parker, Robert [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States); Gagnon, David [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Gjoerup, Ole [Molecular Oncology Research Institute, Tufts Medical Center, Boston, MA 02111 (United States); Archambault, Jacques [Institut de Recherches Cliniques de Montreal (IRCM), 110 Pine Avenue West, Montreal, Quebec, Canada H2W 1R7 (Canada); Department of Biochemistry and Molecular Medicine, Université de Montréal, Montréal, Quebec (Canada); Bullock, Peter A., E-mail: Peter.Bullock@tufts.edu [Department of Developmental, Molecular and Chemical Biology, Tufts University School of Medicine, Boston, MA 02111 (United States)

    2014-11-15

    Progressive Multifocal Leukoencephalopathy (PML) is caused by lytic replication of JC virus (JCV) in specific cells of the central nervous system. Like other polyomaviruses, JCV encodes a large T-antigen helicase needed for replication of the viral DNA. Here, we report the development of a luciferase-based, quantitative and high-throughput assay of JCV DNA replication in C33A cells, which, unlike the glial cell lines Hs 683 and U87, accumulate high levels of nuclear T-ag needed for robust replication. Using this assay, we investigated the requirement for different domains of T-ag, and for specific sequences within and flanking the viral origin, in JCV DNA replication. Beyond providing validation of the assay, these studies revealed an important stimulatory role of the transcription factor NF1 in JCV DNA replication. Finally, we show that the assay can be used for inhibitor testing, highlighting its value for the identification of antiviral drugs targeting JCV DNA replication. - Highlights: • Development of a high-throughput screening assay for JCV DNA replication using C33A cells. • Evidence that T-ag fails to accumulate in the nuclei of established glioma cell lines. • Evidence that NF-1 directly promotes JCV DNA replication in C33A cells. • Proof-of-concept that the HTS assay can be used to identify pharmacological inhibitor of JCV DNA replication.

  1. The Danish tax on saturated fat

    DEFF Research Database (Denmark)

    Vallgårda, Signild; Holm, Lotte; Jensen, Jørgen Dejgård

    2015-01-01

    BACKGROUND/OBJECTIVES: Health promoters have repeatedly proposed using economic policy tools, taxes and subsidies, as a means of changing consumer behaviour. As the first country in the world, Denmark introduced a tax on saturated fat in 2011. It was repealed in 2012. In this paper, we present the arguments and themes involved in the debates surrounding the introduction and the repeal. SUBJECTS/METHODS: An analysis of parliamentary debates, expert reports and media coverage; key informant interviews; and a review of studies about the effects of the tax on consumer behaviour. RESULTS: A tax ... indicates that the tax was effective in changing consumer behaviour.

  2. PipeCraft: Flexible open-source toolkit for bioinformatics analysis of custom high-throughput amplicon sequencing data.

    Science.gov (United States)

    Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho

    2017-11-01

    High-throughput sequencing methods have become a routine analysis tool in environmental sciences as well as in the public and private sectors. These methods provide vast amounts of data, which need to be analysed in several steps. Although the bioinformatics steps may be performed using several public tools, many analytical pipelines allow too few options for optimal analysis of more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from Illumina, Pacific Biosciences, Ion Torrent and Roche 454 sequencing platforms. We described the design and options of PipeCraft and evaluated its performance by analysing data sets from three different sequencing platforms. We demonstrated that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.

  3. International business communications via Intelsat K-band transponders

    Science.gov (United States)

    Hagmann, W.; Rhodes, S.; Fang, R.

    This paper discusses how the transponder throughput and the required earth station HPA power in the Intelsat Business Services Network vary as a function of coding rate and required fade margin. The results indicate that transponder throughputs of 40 to 50 Mbit/s are achievable. A comparison of time domain simulation results with results based on a straightforward link analysis shows that the link analysis results may be fairly optimistic if the satellite traveling wave tube amplifier (TWTA) is operated near saturation; however, there is good agreement for large backoffs.
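The "straightforward link analysis" referred to above reduces, at its simplest, to multiplying symbol rate, modulation order and code rate to get information throughput. A minimal sketch with illustrative numbers (not Intelsat's actual carrier parameters):

```python
def throughput_bps(symbol_rate_baud: float, bits_per_symbol: int,
                   code_rate: float) -> float:
    """Information throughput of a coded digital carrier:
    symbol rate x modulation order x code rate."""
    return symbol_rate_baud * bits_per_symbol * code_rate

# Illustrative: QPSK (2 bits/symbol) at 30 Mbaud with rate-3/4 coding.
rate = throughput_bps(30e6, 2, 0.75)
print(rate / 1e6)  # 45.0 Mbit/s, within the 40-50 Mbit/s range cited
```

The time-domain simulations mentioned in the abstract capture what this simple product does not: intermodulation and spectral regrowth when the TWTA is driven near saturation, which is why the plain link-analysis figure can be optimistic at small backoffs.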

  4. Saturated linkage map construction in Rubus idaeus using genotyping by sequencing and genome-independent imputation

    Directory of Open Access Journals (Sweden)

    Ward Judson A

    2013-01-01

    Background: Rapid development of highly saturated genetic maps aids molecular breeding, which can accelerate gain per breeding cycle in woody perennial plants such as Rubus idaeus (red raspberry). Recently, robust genotyping methods based on high-throughput sequencing were developed, which provide high marker density, but result in some genotype errors and a large number of missing genotype values. Imputation can reduce the number of missing values and can correct genotyping errors, but current methods of imputation require a reference genome and thus are not an option for most species. Results: Genotyping by Sequencing (GBS) was used to produce highly saturated maps for an R. idaeus pseudo-testcross progeny. While low coverage and high variance in sequencing resulted in a large number of missing values for some individuals, a novel method of imputation based on maximum likelihood marker ordering from initial marker segregation overcame the challenge of missing values, and made map construction computationally tractable. The two resulting parental maps contained 4521 and 2391 molecular markers spanning 462.7 and 376.6 cM respectively over seven linkage groups. Detection of precise genomic regions with segregation distortion was possible because of map saturation. Microsatellites (SSRs) linked these results to published maps for cross-validation and map comparison. Conclusions: GBS together with genome-independent imputation provides a rapid method for genetic map construction in any pseudo-testcross progeny. Our method of imputation estimates the correct genotype call of missing values and corrects genotyping errors that lead to inflated map size and reduced precision in marker placement. Comparison of SSRs to published R. idaeus maps showed that the linkage maps constructed with GBS and our method of imputation were robust, and marker positioning reliable. The high marker density allowed identification of genomic regions with segregation
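The imputation idea can be illustrated with a drastically simplified sketch. The paper's method orders markers by maximum likelihood and infers missing calls from segregation patterns; the toy version below merely copies a missing call when both flanking markers agree, which captures the intuition that tightly linked neighbours are usually inherited together (all names are ours; this is illustrative only, not the published algorithm):

```python
def impute_missing(calls):
    """Fill missing genotype calls (None) in an ordered list of marker
    calls for one individual: copy the nearest flanking call when both
    flanking calls agree, otherwise leave the value missing.  A toy
    stand-in for likelihood-based imputation along a linkage map."""
    out = list(calls)
    for i, c in enumerate(out):
        if c is None:
            left = next((out[j] for j in range(i - 1, -1, -1)
                         if out[j] is not None), None)
            right = next((out[j] for j in range(i + 1, len(out))
                          if out[j] is not None), None)
            if left is not None and left == right:
                out[i] = left
    return out

# The gap between two 'A' calls is filled; the gap at a likely
# recombination breakpoint (A -> B) is left missing.
print(impute_missing(["A", None, "A", None, "B"]))
```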

  5. In vivo detection of hemoglobin oxygen saturation and carboxyhemoglobin saturation with multiwavelength photoacoustic microscopy.

    Science.gov (United States)

    Chen, Zhongjiang; Yang, Sihua; Xing, Da

    2012-08-15

    A method for noninvasively detecting hemoglobin oxygen saturation (SO2) and carboxyhemoglobin saturation (SCO) in subcutaneous microvasculature with multiwavelength photoacoustic microscopy is presented. Blood samples mixed with different concentrations of carboxyhemoglobin were used to test the feasibility and accuracy of photoacoustic microscopy compared with the blood-gas analyzer. Moreover, fixed-point detection of SO2 and SCO in mouse ear was obtained, and the changes from normoxia to carbon monoxide hypoxia were dynamically monitored in vivo. Experimental results demonstrate that multiwavelength photoacoustic microscopy can detect SO2 and SCO, which has future potential clinical applications.

  6. Oxygen Saturation in the Dental Pulp of Maxillary Premolars in Different Age Groups - Part 1.

    Science.gov (United States)

    Estrela, Carlos; Serpa, Giuliano C; Alencar, Ana Helena G; Bruno, Kely F; Barletta, Fernando B; Felippe, Wilson T; Estrela, Cyntia R A; Souza, João B

    2017-01-01

    The aim of this study was to determine oxygen saturation levels in the dental pulp of maxillary premolars in different age groups. A total of 120 human maxillary premolars with normal dental pulps were selected covering the following age groups: 20-24, 25-29, 30-34, 35-39 and 40-44 years (n=24 each group). Oxygen saturation was assessed using pulse oximetry. Analysis of variance was used to assess differences in oxygen saturation levels and Tukey's test was used to identify the age groups that differed from each other. Significance was set at 0.05. Mean oxygen saturation of 120 premolars was 86.20% considering all age groups. Significantly reduced levels were found in the oldest group compared to the other groups: 40 to 44 years - 80.00% vs. 89.71, 87.67, 88.71, and 84.80% for age groups 20-24, 25-29, 30-34, 35-39 years, respectively. The mean oxygen saturation levels were similar between 20 and 39 years of age (86.20%) in the whole sample, but reduced significantly in the 40-44-year age group, suggesting that older patients present lower oxygen saturation results even in the absence of pulp tissue injury.

  7. High-throughput, 384-well, LC-MS/MS CYP inhibition assay using automation, cassette-analysis technique, and streamlined data analysis.

    Science.gov (United States)

    Halladay, Jason S; Delarosa, Erlie Marie; Tran, Daniel; Wang, Leslie; Wong, Susan; Khojasteh, S Cyrus

    2011-08-01

    Here we describe a high-capacity, high-throughput, automated, 384-well CYP inhibition assay using well-known HLM-based MS probes. We provide consistently robust IC50 values at the lead optimization stage of the drug discovery process. Our method uses the Agilent Technologies/Velocity11 BioCel 1200 system, timesaving techniques for sample analysis, and streamlined data processing steps. For each experiment, we generate IC50 values for up to 344 compounds and positive controls for five major CYP isoforms (probe substrate): CYP1A2 (phenacetin), CYP2C9 ((S)-warfarin), CYP2C19 ((S)-mephenytoin), CYP2D6 (dextromethorphan), and CYP3A4/5 (testosterone and midazolam). Each compound is incubated separately at four concentrations with each CYP probe substrate under the optimized incubation condition. Each incubation is quenched with acetonitrile containing the deuterated internal standard of the respective metabolite for each probe substrate. To minimize the number of samples to be analyzed by LC-MS/MS and reduce the amount of valuable MS runtime, we utilize the timesaving techniques of cassette analysis (pooling the incubation samples at the end of each CYP probe incubation into one) and column switching (reducing the amount of MS runtime). Here we also report on the comparison of IC50 results for five major CYP isoforms using our method compared to values reported in the literature.

  8. Monitor hemoglobin concentration and oxygen saturation in living mouse tail using photoacoustic CT scanner

    Science.gov (United States)

    Liu, Bo; Kruger, Robert; Reinecke, Daniel; Stantz, Keith M.

    2010-02-01

    Purpose: The purpose of this study is to use a PCT spectroscopy scanner to monitor the hemoglobin concentration and oxygen saturation change of a living mouse by imaging the artery and veins in the mouse tail. Materials and Methods: One mouse tail was scanned using the PCT small animal scanner at the isosbestic wavelength (796nm) to obtain its hemoglobin concentration. Immediately after the scan, the mouse was euthanized and its blood was extracted from the heart. The true hemoglobin concentration was measured using a co-oximeter. A reconstruction correction algorithm to compensate for the acoustic signal loss due to the bone structure in the mouse tail was developed. After the correction, the hemoglobin concentration was calculated from the PCT images and compared with the co-oximeter result. Next, one mouse was immobilized in the PCT scanner. Gas with different concentrations of oxygen was given to the mouse to change the oxygen saturation. PCT tail vessel spectroscopy scans were performed 15 minutes after the introduction of gas. The oxygen saturation values were then calculated to monitor the oxygen saturation change of the mouse. Results: The systematic error for hemoglobin concentration measurement was less than 5% based on preliminary analysis. The same correction technique was used for the oxygen saturation calculation. After correction, the oxygen saturation level change matches the oxygen volume ratio change of the introduced gas. Conclusion: This living mouse tail experiment has shown that NIR PCT-spectroscopy can be used to monitor the oxygen saturation status in living small animals.
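The oxygen-saturation calculation behind such multiwavelength scans amounts to unmixing oxy- and deoxyhemoglobin from absorption measured at two wavelengths. A minimal sketch with made-up extinction coefficients (a real calculation uses tabulated molar extinction spectra at the chosen wavelengths):

```python
def so2_from_two_wavelengths(mu_a1, mu_a2, eps):
    """Estimate oxygen saturation by solving the 2x2 linear system
        mu_a(lambda_i) = eps_HbO2(lambda_i)*[HbO2] + eps_Hb(lambda_i)*[Hb]
    for the two chromophore concentrations, then forming
    sO2 = [HbO2] / ([HbO2] + [Hb]).  `eps` is ((e_o1, e_h1), (e_o2, e_h2));
    the values used below are illustrative placeholders."""
    (e_o1, e_h1), (e_o2, e_h2) = eps
    det = e_o1 * e_h2 - e_h1 * e_o2
    hbo2 = (mu_a1 * e_h2 - e_h1 * mu_a2) / det
    hb = (e_o1 * mu_a2 - mu_a1 * e_o2) / det
    return hbo2 / (hbo2 + hb)

# Forward-simulate a vessel at 80% saturation, then recover it.
eps = ((2.0, 1.0), (1.0, 3.0))  # hypothetical extinction coefficients
hbo2_true, hb_true = 0.8, 0.2
mu1 = eps[0][0] * hbo2_true + eps[0][1] * hb_true
mu2 = eps[1][0] * hbo2_true + eps[1][1] * hb_true
print(so2_from_two_wavelengths(mu1, mu2, eps))  # 0.8
```

Measuring a third species such as carboxyhemoglobin, as in the SCO work above, requires at least one additional wavelength and a 3x3 version of the same unmixing.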

  9. Genome-wide LORE1 retrotransposon mutagenesis and high-throughput insertion detection in Lotus japonicus

    DEFF Research Database (Denmark)

    Urbanski, Dorian Fabian; Malolepszy, Anna; Stougaard, Jens

    2012-01-01

    Insertion mutants facilitate functional analysis of genes, but for most plant species it has been difficult to identify a suitable mutagen and to establish large populations for reverse genetics. The main challenge is developing efficient high-throughput procedures for both mutagenesis and insertion detection. The identified insertions showed that the endogenous LORE1 retrotransposon is well suited for insertion mutagenesis due to its homogeneous gene targeting and exonic insertion preference. Since LORE1 transposition occurs in the germline, harvesting seeds from a single founder line and cultivating progeny generates a complete mutant population. This ease of LORE1 mutagenesis, combined with the efficient FSTpoolit protocol, which exploits 2D pooling, Illumina sequencing, and automated data analysis, allows highly cost-efficient development of a comprehensive reverse genetic resource.

  10. Saturation of bentonite dependent upon temperature

    International Nuclear Information System (INIS)

    Hausmannova, Lucie; Vasicek, Radek

    2010-01-01

    Document available in extended abstract form only. The fundamental idea behind the long-term safe operation of a deep repository is the use of the multi-barrier system principle. Barriers may well differ according to the type of host rock in which the repository is located. It is assumed that the buffer in a granitic host rock environment will consist of swelling clays, which boast the ideal properties for such a function, i.e. low permeability, high swelling pressure, self-healing ability, etc., all of which are affected primarily by mineralogy and dry density. Water content plays a crucial role in the activation of swelling pressure as well as, subsequently, in the potential self-healing of the various contact areas of the numerous buffer components made from bentonite. In the case of a deep repository, a change in water content is connected not only with the possible intake of water from the host rock, but also with its redistribution owing to changes in temperature after the insertion of the heat source (a disposal waste package containing spent fuel) into the repository 'nest'. The principal reason for the experimental testing of this high-dry-density material is the uncertainty with regard to its saturation ability (final water content or degree of saturation) at higher temperatures. The results of the Mock-Up-CZ experiment showed that when the barrier is constantly supplied with a saturation medium over a long time period, the water content in the barrier as well as the degree of saturation settle independently of temperature. The Mock-Up-CZ experiment was performed at temperatures of 30 deg. - 90 deg. C in the barrier; it was therefore decided to verify this behaviour experimentally by means of targeted laboratory tests. A temperature of 110 deg. C was added to the set of experimental temperatures, resulting in samples being tested at 25 deg. C, 95 deg. C and 110 deg. C. The degree of saturation is defined as the ratio of pore water volume to pore volume.
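
    The degree of saturation referred to above is the standard soil-mechanics ratio of pore water volume to pore volume. As a minimal illustrative sketch (the numeric inputs below are assumed, typical values for a compacted bentonite, not measurements from the experiment), it can be computed from the gravimetric water content, dry density and particle specific gravity:

```python
def degree_of_saturation(w, rho_d, G_s, rho_w=1.0):
    """Degree of saturation S_r = w * G_s / e, where the void ratio e
    follows from the dry density via rho_d = G_s * rho_w / (1 + e).

    w     : gravimetric water content (mass of water / mass of solids)
    rho_d : dry density (same units as rho_w, e.g. g/cm^3)
    G_s   : specific gravity of the solid particles
    """
    e = G_s * rho_w / rho_d - 1.0   # void ratio (pore volume / solids volume)
    return w * G_s / e              # fraction of pore volume filled with water

# Assumed illustrative values for a compacted bentonite specimen:
S_r = degree_of_saturation(w=0.25, rho_d=1.60, G_s=2.75)
```

    Setting S_r = 1 in the same relation gives the water content a specimen of a given dry density can ultimately hold, which is the "final water content" the abstract refers to.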

  11. Experimental and numerical approaches of the hydro-mechanical behaviour of a quasi-saturated compacted clayey soil

    Directory of Open Access Journals (Sweden)

    Li Zhong-Sen

    2016-01-01

    The present research is funded by the French National Project « TerreDurable », which is dedicated to the study of soils in quasi-saturated conditions (close to saturation) for the analysis of the stability and settlement of earth structures such as embankments and dams. A global presentation of the drying-wetting test shows the volume change, air entry and soil-water characteristics of the soil between slurry and oven-dried conditions. An unsaturated undrained triaxial test was carried out in order to investigate the variation of pore-water pressure from the quasi-saturated domain to saturation. The experimental results of the triaxial test are then modeled using a two-dimensional explicit finite difference program (Flac 2D). A constitutive law developed in the TerreDurable project allows a better understanding of the behaviour of quasi-saturated soils, using the water retention curve of the quasi-saturated domain proposed by Boutonnier (2007, 2010). A simple effective stress model (Cam-Clay) is used, taking into account both the suction and the compressibility of the equivalent fluid (water + air). The results from the numerical calculations and the experimental measurements are compared.

  12. A corresponding states treatment of the liquid-vapor saturation line

    International Nuclear Information System (INIS)

    Srinivasan, K.; Ng, K.C.; Velasco, S.; White, J.A.

    2012-01-01

    Highlights: → Correlations arising from the maxima of products of properties in the coexistence line. → Analysis of maxima along the vapor pressure curve. → Correlations for the maximum of the saturated vapor enthalpy curve. → Prediction of properties of the new low GWP refrigerants HFO 1234yf and HFO 1234ze (E). - Abstract: In this work we analyze correlations for the maxima of products of some liquid-vapor saturation properties. These points define new characteristic properties of each fluid that are shown to exhibit linear correlations with the critical properties. We also demonstrate that some of these properties are well correlated with the acentric factor. An application is made to predict the properties of two new low global warming potential (GWP) refrigerants.

  13. High throughput LC-MS/MS method for the simultaneous analysis of multiple vitamin D analytes in serum.

    Science.gov (United States)

    Jenkinson, Carl; Taylor, Angela E; Hassan-Smith, Zaki K; Adams, John S; Stewart, Paul M; Hewison, Martin; Keevil, Brian G

    2016-03-01

    Recent studies suggest that vitamin D deficiency is linked to increased risk of common human health problems. To define vitamin D 'status', most routine analytical methods quantify one particular vitamin D metabolite, 25-hydroxyvitamin D3 (25OHD3). However, vitamin D is characterized by complex metabolic pathways, and simultaneous measurement of multiple vitamin D metabolites may provide a more accurate interpretation of vitamin D status. To address this, we developed a high-throughput liquid chromatography-tandem mass spectrometry (LC-MS/MS) method to analyse multiple vitamin D analytes, with particular emphasis on the separation of epimer metabolites. A supported liquid extraction (SLE) and LC-MS/MS method was developed to quantify 10 vitamin D metabolites as well as to separate an interfering isobar, 7α-hydroxy-4-cholesten-3-one (7αC4, a precursor of bile acid), and was validated by analysis of human serum samples. In a cohort of 116 healthy subjects, circulating concentrations of 25-hydroxyvitamin D3 (25OHD3), 3-epi-25-hydroxyvitamin D3 (3-epi-25OHD3), 24,25-dihydroxyvitamin D3 (24R,25(OH)2D3), 1,25-dihydroxyvitamin D3 (1α,25(OH)2D3), and 25-hydroxyvitamin D2 (25OHD2) were quantifiable using 220 μL of serum, with 25OHD3 and 24R,25(OH)2D3 showing significant seasonal variations. This high-throughput LC-MS/MS method provides a novel strategy for assessing the impact of vitamin D on human health and disease. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Minimum K_2,3-saturated Graphs

    OpenAIRE

    Chen, Ya-Chen

    2010-01-01

    A graph is K_{2,3}-saturated if it has no subgraph isomorphic to K_{2,3}, but does contain a K_{2,3} after the addition of any new edge. We prove that the minimum number of edges in a K_{2,3}-saturated graph on n >= 5 vertices is sat(n, K_{2,3}) = 2n - 3.
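
    The stated minimum can be verified mechanically for the smallest case. Below is an illustrative brute-force check of the saturation definition (an exhaustive search, feasible only for tiny n; this is not the paper's proof technique):

```python
from itertools import combinations

def has_K23(adj, n):
    # True if the graph contains K_{2,3} as a subgraph, i.e. two
    # vertices with at least three common neighbours.
    for a, b in combinations(range(n), 2):
        common = sum(1 for v in range(n)
                     if v not in (a, b) and adj[a][v] and adj[b][v])
        if common >= 3:
            return True
    return False

def is_K23_saturated(edges, n):
    adj = [[False] * n for _ in range(n)]
    for u, v in edges:
        adj[u][v] = adj[v][u] = True
    if has_K23(adj, n):            # must be K_{2,3}-free ...
        return False
    for u, v in combinations(range(n), 2):
        if not adj[u][v]:          # ... but any added edge creates a K_{2,3}
            adj[u][v] = adj[v][u] = True
            created = has_K23(adj, n)
            adj[u][v] = adj[v][u] = False
            if not created:
                return False
    return True

def sat_K23(n):
    # Smallest edge count over all K_{2,3}-saturated graphs on n vertices.
    all_edges = list(combinations(range(n), 2))
    for k in range(len(all_edges) + 1):
        if any(is_K23_saturated(es, n) for es in combinations(all_edges, k)):
            return k
```

    For n = 5 the exhaustive search returns 7 = 2·5 − 3, in agreement with the formula; larger n quickly become infeasible by brute force.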

  15. OPERATIONS THROUGHPUT AS A DETERMINANT OF GOLDEN-HOUR IN MASS-GATHERING MEDICINE

    Directory of Open Access Journals (Sweden)

    I. D. Khan

    2017-07-01

    BACKGROUND Golden-hour, a time-tested concept for trauma care, involves a systems approach encompassing healthcare, logistics, geographical, environmental and temporal variables. The golden-hour paradigm in mass-gathering medicine, such as the Hajj pilgrimage, entwines healthcare availability, accessibility, efficiency and interoperability, expanding from a patient-centric to a public-health-centric approach. The realm of mass-gathering medicine invokes an opportunity for incorporating operations throughput as a determinant of golden-hour for overall capacity-building and interoperability. METHODS Golden-hour was evaluated during the Indian Medical Mission operations for Hajj-2016, which established, operated and coordinated a strategic network of round-the-clock medical operations. Throughput was evaluated as deliverables/time against established Standard Operating Procedures for various clinical, investigation, drug-dispensing and patient-transfer algorithms. Patient encounter time, waiting time and turnaround time were assessed throughout echeloned healthcare under a patient-centric healthcare-delivery model. Dynamic evaluation was carried out to cater for variation and heterogeneity. RESULTS A massive surge of 394,013 patients, comprising 225,103 males (57.1%) and 168,910 females (42.9%), overwhelmed the throughput capacities of outpatient attendance, pharmacy, laboratory, imaging, ambulance, referrals and documentation. There were delays in the attendance, suspicion, diagnosis and isolation of patients with communicable infections. The situational analysis of operations throughput highlights wasted turnaround time due to mobilization of the medical team, diverting critical healthcare resources away from emergency situations. CONCLUSION Time being a crucial factor in the complexity of medical care, operations throughput remains an important determinant of the interoperability of bottlenecks, thereby being a determinant of golden-hour in mass-gathering medicine.

  16. Laser-Induced Fluorescence Detection in High-Throughput Screening of Heterogeneous Catalysts and Single Cells Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Su, Hui [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    Laser-induced fluorescence detection is one of the most sensitive detection techniques and it has found enormous applications in various areas. The purpose of this research was to develop detection approaches based on laser-induced fluorescence in two different areas: heterogeneous catalyst screening and single-cell study. First, the author introduced laser-induced fluorescence imaging (LIFI) as a high-throughput screening technique for heterogeneous catalysts, to explore the use of this technique in the discovery and study of various heterogeneous catalyst systems. This scheme is based on the fact that the creation or destruction of chemical bonds alters the fluorescence properties of suitably designed molecules. By irradiating the region immediately above the catalytic surface with a laser, the fluorescence intensity of a selected product or reactant can be imaged by a charge-coupled device (CCD) camera to follow the catalytic activity as a function of time and space. By screening the catalytic activity of vanadium pentoxide catalysts in the oxidation of naphthalene, the author demonstrated that LIFI has good detection performance and the spatial and temporal resolution needed for high-throughput screening of heterogeneous catalysts. The sample packing density can reach up to 250 x 250 subunits/cm2 for 40-μm wells. This experimental set-up can also screen solid catalysts via near-infrared thermography detection. In the second part of this dissertation, the author used laser-induced native fluorescence coupled with capillary electrophoresis (LINF-CE) and microscope imaging to study single-cell degranulation. On the basis of good temporal correlation with events observed through an optical microscope, individual peaks in the fluorescence electropherograms were identified as serotonin released from the granular core on contact with the surrounding fluid.

  17. High-Throughput Image Analysis of Fibrillar Materials: A Case Study on Polymer Nanofiber Packing, Alignment, and Defects in Organic Field Effect Transistors.

    Science.gov (United States)

    Persson, Nils E; Rafshoon, Joshua; Naghshpour, Kaylie; Fast, Tony; Chu, Ping-Hsun; McBride, Michael; Risteen, Bailey; Grover, Martha; Reichmanis, Elsa

    2017-10-18

    High-throughput discovery of process-structure-property relationships in materials through an informatics-enabled empirical approach is an increasingly utilized technique in materials research due to the rapidly expanding availability of data. Here, process-structure-property relationships are extracted for the nucleation, growth, and deposition of semiconducting poly(3-hexylthiophene) (P3HT) nanofibers used in organic field effect transistors, via high-throughput image analysis. This study is performed using an automated image analysis pipeline combining existing open-source software and new algorithms, enabling the rapid evaluation of structural metrics for images of fibrillar materials, including local orientational order, fiber length density, and fiber length distributions. We observe that microfluidic processing leads to fibers that pack with unusually high density, while sonication yields fibers that pack sparsely with low alignment. This is attributed to differences in their crystallization mechanisms. P3HT nanofiber packing during thin film deposition exhibits behavior suggesting that fibers are confined to packing in two-dimensional layers. We find that fiber alignment, a feature correlated with charge carrier mobility, is driven by increasing fiber length, and that shorter fibers tend to segregate to the buried dielectric interface during deposition, creating potentially performance-limiting defects in alignment. Another barrier to perfect alignment is the curvature of P3HT fibers; we propose a mechanistic simulation of fiber growth that reconciles both this curvature and the log-normal distribution of fiber lengths inherent to the fiber populations under consideration.
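
    Of the structural metrics listed, local orientational order is commonly quantified by the 2D nematic order parameter computed from fiber orientation angles. A minimal sketch of that standard formula follows (this is the generic measure, not the authors' exact pipeline or software):

```python
import math

def orientational_order(angles_deg):
    """2D nematic order parameter for a set of fiber orientation angles.

    Returns 1.0 for perfectly aligned fibers and ~0 for an isotropic
    distribution. Doubling the angle makes the measure invariant under
    the 180-degree ambiguity of a fiber's direction.
    """
    n = len(angles_deg)
    c = sum(math.cos(2 * math.radians(a)) for a in angles_deg) / n
    s = sum(math.sin(2 * math.radians(a)) for a in angles_deg) / n
    return math.hypot(c, s)
```

    Evaluating this over angles measured within small image windows yields the kind of local order maps used to correlate alignment with charge carrier mobility.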

  18. Development of droplets‐based microfluidic systems for single­‐cell high‐throughput screening

    DEFF Research Database (Denmark)

    Chen, Jun; Jensen, Thomas Glasdam; Godina, Alexei

    2014-01-01

    High-throughput screening (HTS) plays an important role in the development of microbial cell factories. One of the most popular approaches is to use microplates combined with robotics, liquid handling and sophisticated detection methods. However, these workstations require a large investment, and the logarithmic growth in the size of the combinatorial libraries to be screened over recent decades has gradually pushed them beyond their capacity. Here, we are developing a feasible high-throughput system that uses microfluidics to compartmentalize single cells for propagation and analysis in monodisperse picoliter aqueous droplets surrounded by an immiscible fluorinated oil phase. Our aim is to use this system to facilitate the screening process for both the biotechnology and food industries.

  19. Seismic analysis for shroud facility in-pile tube and saturated temperature capsules

    International Nuclear Information System (INIS)

    Iimura, Koichi; Yamaura, Takayuki; Ogawa, Mitsuhiro

    2009-07-01

    At the Oarai Research and Development Center, Japan Atomic Energy Agency (JAEA), a plan for repairing and refurbishing the Japan Materials Testing Reactor (JMTR) has progressed in order to restart JMTR operation in fiscal 2011. As part of the effective use of JMTR, neutron irradiation tests of LWR fuels and materials have been planned in order to study their soundness. Using the Oarai Shroud Facility (OSF-1) and the Fuel Irradiation Facility with the He-3 gas control system for power ramping tests using Boiling Water Capsules (the BOCA Irradiation Facility), irradiation tests with power ramping will be carried out to study the soundness of fuel under LWR transient conditions. OSF-1 is an irradiation facility of the shroud type, into which capsules can be inserted and from which they can be ejected during reactor operation; it is composed of an in-pile tube, a cooling system and a capsule exchange system. The BOCA Irradiation Facility, which simulates the irradiation environment of an LWR, is composed of a boiling water capsule, a capsule control system and a He-3 power control system. Using saturated temperature capsules and the water environment control system, material irradiation tests under LWR water chemistry conditions will be carried out to clarify the mechanism of IASCC. In JMTR, these facilities are currently in service; however, the detailed design for their renewal or remodeling was carried out based on new design conditions corresponding to the irradiation test plan after the restart of JMTR operation. In this seismic analysis of the detailed design, each equipment classification and operating state was arranged in accordance with the 'Japanese technical standards of the structure on nuclear facility for test research' and the current 'Technical guidelines for seismic design of nuclear power plants'; stress calculation and evaluation were then carried out with the FEM piping analysis code 'SAP' and the structure analysis code 'ABAQUS'. About the stress of the seismic force, it was proven

  20. Fat-saturated post gadolinium T1 imaging of the brain in multiple sclerosis

    International Nuclear Information System (INIS)

    Al-Saeed, Osama; Sheikh, Mehraj; Ismail, Mohammed; Athyal, Reji

    2011-01-01

    Background: Magnetic resonance imaging (MRI) is of vital importance in the diagnosis and follow-up of patients with multiple sclerosis (MS). Imaging sequences that better demonstrate enhancing lesions can help in detecting active MS plaques. Purpose: To evaluate the role of fat-saturated gadolinium-enhanced T1-weighted (T1W) images of the brain in MS and to assess the benefit of performing this additional sequence in the detection of enhancing lesions. Material and Methods: In a prospective study over a six-month period, 70 consecutive patients with clinically diagnosed MS were enrolled. These comprised 14 male and 56 female patients between the ages of 21 and 44 years. All the patients underwent brain MRI on a 1.5 T magnet. Gadolinium-enhanced T1 images with and without fat saturation were compared, and the results were recorded and analyzed using a conspicuity score and the McNemar test. Results: A total of 157 lesions were detected in the 70 patients on post-contrast T1W fat-saturated images, compared with 139 lesions seen on the post-contrast T1W fast spin-echo (FSE) images, because 18 of the lesions (11.5%) were only seen on the fat-saturated images. In addition, 15 lesions were more conspicuous on the fat-saturation sequence (9.5%). The total conspicuity score obtained, including all the lesions, was 2.24 ± 0.60 (SD). Using the two-tailed McNemar test for quantitative analysis, the P value obtained was <0.0001. Conclusion: T1W fat-saturated gadolinium-enhanced images show better lesion enhancement than T1W images without fat saturation.

  1. Seismic Evaluation of Hydrocarbon Saturation in Deep-Water Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Michael Batzle

    2006-04-30

    During this last period of the 'Seismic Evaluation of Hydrocarbon Saturation in Deep-Water Reservoirs' project (Grant/Cooperative Agreement DE-FC26-02NT15342), we finalized the integration of rock physics, well log analysis, seismic processing, and forward modeling techniques. Most of the last quarter was spent combining the results from the principal investigators and coming to final conclusions about the project. Much of the effort was also directed towards technology transfer through the Direct Hydrocarbon Indicators mini-symposium at UH and through publications. As a result we have: (1) tested a new method to directly invert reservoir properties, water saturation (Sw), and porosity from seismic AVO attributes; (2) constrained the seismic response based on fluid and rock property correlations; (3) reprocessed seismic data from the Ursa field; (4) compared thin-layer property distributions and averaging on AVO response; (5) related pressure and sorting effects on porosity and their influence on DHIs; (6) examined and compared gas saturation effects for deep and shallow reservoirs; (7) performed forward modeling using geobodies from deepwater outcrops; (8) documented velocities for deepwater sediments; (9) continued incorporating outcrop descriptive models in seismic forward models; (10) held an open DHI symposium to present the final results of the project; (11) derived relations between Sw, porosity, and AVO attributes; (12) built models of complex, layered reservoirs; and (14) carried out technology transfer. Several factors can limit our ability to extract accurate hydrocarbon saturations in deep-water environments. Rock and fluid properties are one factor since, for example, hydrocarbon properties will differ considerably at great depth (high pressure) compared with shallow ones. Significant overpressure, on the other hand, will make the rocks behave as if they were shallower. In addition to the physical properties, the scale and

  2. Throughput analysis of the IEEE 802.4 token bus standard under heavy load

    Science.gov (United States)

    Pang, Joseph; Tobagi, Fouad

    1987-01-01

    It has become clear in the last few years that there is a trend towards integrated digital services. Parallel to the development of public Integrated Services Digital Network (ISDN) is service integration in the local area (e.g., a campus, a building, an aircraft). The types of services to be integrated depend very much on the specific local environment. However, applications tend to generate data traffic belonging to one of two classes. According to IEEE 802.4 terminology, the first major class of traffic is termed synchronous, such as packetized voice and data generated from other applications with real-time constraints, and the second class is called asynchronous which includes most computer data traffic such as file transfer or facsimile. The IEEE 802.4 token bus protocol which was designed to support both synchronous and asynchronous traffic is examined. The protocol is basically a timer-controlled token bus access scheme. By a suitable choice of the design parameters, it can be shown that access delay is bounded for synchronous traffic. As well, the bandwidth allocated to asynchronous traffic can be controlled. A throughput analysis of the protocol under heavy load with constant channel occupation of synchronous traffic and constant token-passing times is presented.
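
    The flavor of such a heavy-load analysis can be conveyed with a simplified cycle-time model (an illustrative sketch with assumed parameter values, far cruder than the paper's analysis):

```python
def heavy_load_efficiency(n_stations, token_pass_time,
                          sync_tx_time, target_rotation_time):
    """Per-cycle throughput efficiency of a timer-controlled token bus
    under heavy load.

    Each station spends its full synchronous allocation every token
    rotation; asynchronous traffic is confined to whatever remains of
    the target rotation time after synchronous traffic and the fixed
    token-passing overhead.
    """
    overhead = n_stations * token_pass_time
    sync = n_stations * sync_tx_time
    async_budget = max(0.0, target_rotation_time - sync - overhead)
    cycle = overhead + sync + async_budget
    return (sync + async_budget) / cycle   # fraction of time carrying data

# Assumed numbers: 10 stations, 0.1 ms token pass, 0.5 ms synchronous
# slot per station, 10 ms target token rotation time.
eff = heavy_load_efficiency(10, 0.0001, 0.0005, 0.010)
```

    With these assumed numbers the bus spends 10% of each rotation on token passing, giving an efficiency of 0.9; shrinking the target rotation time squeezes the asynchronous budget first, which is the bandwidth-control behavior the abstract describes.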

  3. MultiSense: A Multimodal Sensor Tool Enabling the High-Throughput Analysis of Respiration.

    Science.gov (United States)

    Keil, Peter; Liebsch, Gregor; Borisjuk, Ljudmilla; Rolletschek, Hardy

    2017-01-01

    The high-throughput analysis of respiratory activity has become an important component of many biological investigations. Here, a technological platform, denoted the "MultiSense tool," is described. The tool enables the parallel monitoring of respiration in 100 samples over an extended time period, by dynamically tracking the concentrations of oxygen (O2) and/or carbon dioxide (CO2) and/or pH within an airtight vial. Its flexible design supports the quantification of respiration based on either oxygen consumption or carbon dioxide release, thereby allowing for the determination of the physiologically significant respiratory quotient (the ratio between the quantity of CO2 released and the quantity of O2 consumed). It requires an LED light source mounted above the sample, together with a CCD camera system adjusted to capture analyte-specific wavelengths, and fluorescent sensor spots inserted into the sample vial. Here, a demonstration is given of the use of the MultiSense tool to quantify respiration in imbibing plant seeds, for which an appropriate step-by-step protocol is provided. The technology can be easily adapted for a wide range of applications, including the monitoring of gas exchange in any kind of liquid culture system (algae, embryo and tissue culture, cell suspensions, microbial cultures).

  4. High-throughput screening with micro-x-ray fluorescence

    International Nuclear Information System (INIS)

    Havrilla, George J.; Miller, Thomasin C.

    2005-01-01

    Micro-x-ray fluorescence (MXRF) is a useful characterization tool for high-throughput screening of combinatorial libraries. Due to the increasing threat of the use of chemical warfare (CW) agents, both in military actions and against civilians by terrorist extremists, there is a strong push to improve existing methods and to develop means for detecting a broad spectrum of CW agents in a minimal amount of time, to increase national security. This paper describes a combinatorial high-throughput screening technique for CW receptor discovery to aid in sensor development. MXRF can screen materials for elemental composition at the mesoscale level (tens to hundreds of micrometers). The key aspect of this work is the use of commercial MXRF instrumentation coupled with the inherent heteroatom elements within the target molecules of the combinatorial reaction to provide rapid and specific identification of lead species. The method is demonstrated by screening an 11-mer oligopeptide library for selective binding of the degradation products of the nerve agent VX. The identified oligopeptides can be used as selective molecular receptors for sensor development. The MXRF screening method is nondestructive, requires minimal sample preparation and no special tags for analysis, and the screening time depends on the desired sensitivity.

  5. An analysis of sodium, total fat and saturated fat contents of packaged food products advertised in Bronx-based supermarket circulars.

    Science.gov (United States)

    Samuel, L; Basch, C H; Ethan, D; Hammond, R; Chiazzese, K

    2014-08-01

    Americans' consumption of sodium, fat, and saturated fat exceeds federally recommended limits for these nutrients and has been identified as a preventable leading cause of hypertension and cardiovascular disease. More than 40% of the Bronx population comprises African-Americans, who have an increased risk and earlier onset of hypertension and are also genetically predisposed to salt-sensitive hypertension. This study analyzed nutrition information for packaged foods advertised in Bronx-based supermarket circulars. Federally recommended limits for sodium, saturated fat and total fat content were used to identify foods that were high in these nutrients. The proportion of these products with respect to the total number of packaged foods was calculated. More than a third (35%) and almost a quarter (24%) of the 898 advertised packaged foods were high in saturated fat and sodium, respectively. Such foods predominantly included processed meat and fish products, fast foods, meals, entrees and side dishes. Dairy and egg products were the greatest contributors of high saturated fat. Pork and beef products, fast foods, meals, entrees and side dishes had the highest median values for sodium, total fat and saturated fat content. The high proportion of packaged foods high in sodium and/or saturated fat promoted through supermarket circulars highlights the need for nutrition education among consumers, as well as for collaborative public health measures by the food industry, community and government agencies to reduce the amounts of sodium and saturated fat in these products and to limit the promotion of foods that are high in these nutrients.

  6. High Throughput Computing Impact on Meta Genomics (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    Energy Technology Data Exchange (ETDEWEB)

    Gore, Brooklin

    2011-10-12

    This presentation includes a brief background on High Throughput Computing, correlating gene transcription factors, optical mapping, genotype to phenotype mapping via QTL analysis, and current work on next gen sequencing.

  7. Scintillation probe with photomultiplier tube saturation indicator

    International Nuclear Information System (INIS)

    Ruch, J.F.; Urban, D.J.

    1996-01-01

    A photomultiplier tube saturation indicator is formed by supplying a supplemental light source, typically a light-emitting diode (LED), adjacent to the photomultiplier tube. A switch allows the light source to be activated. The light is forwarded to the photomultiplier tube by an optical fiber. If the probe is properly light tight, then a meter attached to the indicator will register the light from the LED. If the probe is no longer light tight and the photomultiplier tube is saturated, no signal will be registered when the LED is activated.

  8. A Reference Viral Database (RVDB) To Enhance Bioinformatics Analysis of High-Throughput Sequencing for Novel Virus Detection.

    Science.gov (United States)

    Goodacre, Norman; Aljanahi, Aisha; Nandakumar, Subhiksha; Mikailov, Mike; Khan, Arifa S

    2018-01-01

    Detection of distantly related viruses by high-throughput sequencing (HTS) is bioinformatically challenging because of the lack of a public database containing all viral sequences, without abundant nonviral sequences, which can extend runtime and obscure viral hits. Our reference viral database (RVDB) includes all viral, virus-related, and virus-like nucleotide sequences (excluding bacterial viruses), regardless of length, and with overall reduced cellular sequences. Semantic selection criteria (SEM-I) were used to select viral sequences from GenBank, resulting in a first-generation viral database (VDB). This database was manually and computationally reviewed, resulting in refined semantic selection criteria (SEM-R), which were applied to a new download of updated GenBank sequences to create a second-generation VDB. Viral entries in the latter were clustered at 98% by CD-HIT-EST to reduce redundancy while retaining high viral sequence diversity. The viral identity of the clustered representative sequences (creps) was confirmed by BLAST searches in NCBI databases and HMMER searches in PFAM and DFAM databases. The resulting RVDB contained a broad representation of viral families, sequence diversity, and reduced cellular content; it includes full-length and partial sequences, endogenous nonretroviral elements, endogenous retroviruses, and retrotransposons. Testing of RVDBv10.2 with an in-house HTS transcriptomic data set indicated a significantly faster run for virus detection than interrogating the entirety of the NCBI nonredundant nucleotide database, which contains all viral sequences but also nonviral sequences. RVDB is publicly available for facilitating HTS analysis, particularly for novel virus detection. It is meant to be updated on a regular basis to include new viral sequences added to GenBank. IMPORTANCE To facilitate bioinformatics analysis of high-throughput sequencing (HTS) data for the detection of both known and novel viruses, we have

  9. 20180311 - High Throughput Transcriptomics: From screening to pathways (SOT 2018)

    Science.gov (United States)

    The EPA ToxCast effort has screened thousands of chemicals across hundreds of high-throughput in vitro screening assays. The project is now leveraging high-throughput transcriptomic (HTTr) technologies to substantially expand its coverage of biological pathways. The first HTTr sc...

  10. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  11. A programmable, scalable-throughput interleaver

    NARCIS (Netherlands)

    Rijshouwer, E.J.C.; Berkel, van C.H.

    2010-01-01

    The interleaver stages of digital communication standards show a surprisingly large variation in throughput, state sizes, and permutation functions. Furthermore, data rates for 4G standards such as LTE-Advanced will exceed typical baseband clock frequencies of handheld devices. Multistream operation

  12. Searching saturation effects in inclusive and exclusive eA processes

    International Nuclear Information System (INIS)

    Goncalves, V.P.

    2016-01-01

    In this contribution we discuss the search for saturation effects in inclusive and exclusive eA processes. In particular, we present a comparison between the linear and non-linear predictions for the nuclear structure functions as well as for Deeply Virtual Compton Scattering (DVCS) and vector meson production in future eA colliders. These results demonstrate that although the inclusive observables are sensitive to saturation effects, it is not yet possible to draw any firm conclusion concerning the QCD dynamics from inclusive quantities, due to the large uncertainty present in the collinear predictions. In contrast, exclusive processes are promising observables in the search for saturation effects, due to their quadratic dependence on the forward scattering amplitude. In particular, the analysis of nuclear DVCS and vector meson production demonstrates that the energy dependence of the differential cross sections is strongly modified with increasing atomic mass number, and that the coherent cross section dominates at small t while the incoherent one dominates at large t. Moreover, the number of dips at small t increases with the atomic number, with the position of the dips being almost independent of the model used to treat the dipole-proton interaction.

  13. Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening

    Science.gov (United States)

    William R. Kenealy; Thomas W. Jeffries

    2003-01-01

    High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml⁻¹. The assay is linear for sugar...

  14. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    Directory of Open Access Journals (Sweden)

    Rok Gaber

    2013-11-01

    Full Text Available To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity.

  15. Noninvasive High-Throughput Single-Cell Analysis of HIV Protease Activity Using Ratiometric Flow Cytometry

    Science.gov (United States)

    Gaber, Rok; Majerle, Andreja; Jerala, Roman; Benčina, Mojca

    2013-01-01

    To effectively fight against the human immunodeficiency virus infection/acquired immunodeficiency syndrome (HIV/AIDS) epidemic, ongoing development of novel HIV protease inhibitors is required. Inexpensive high-throughput screening assays are needed to quickly scan large sets of chemicals for potential inhibitors. We have developed a Förster resonance energy transfer (FRET)-based, HIV protease-sensitive sensor using a combination of a fluorescent protein pair, namely mCerulean and mCitrine. Through extensive in vitro characterization, we show that the FRET-HIV sensor can be used in HIV protease screening assays. Furthermore, we have used the FRET-HIV sensor for intracellular quantitative detection of HIV protease activity in living cells, which more closely resembles an actual viral infection than an in vitro assay. We have developed a high-throughput method that employs ratiometric flow cytometry for analyzing large populations of cells that express the FRET-HIV sensor. The method enables FRET measurement of single cells with high sensitivity and speed and should be used when subpopulation-specific intracellular activity of HIV protease needs to be estimated. In addition, we have used a confocal microscopy sensitized emission FRET technique to evaluate the usefulness of the FRET-HIV sensor for spatiotemporal detection of intracellular HIV protease activity. PMID:24287545
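The ratiometric readout described in the two records above can be sketched as a per-cell ratio plus a gate. All intensities, channel names, and the gating threshold below are invented for illustration; they are not values from the paper.

```python
# Hypothetical per-cell intensities from two cytometer channels:
# (donor emission, acceptor/FRET emission), e.g. mCerulean and mCitrine.
cells = [(120.0, 300.0), (400.0, 180.0), (90.0, 260.0), (350.0, 140.0)]

# Per-cell ratiometric readout: protease cleavage separates the
# fluorophore pair, so cleaved (protease-active) cells show a lower
# FRET/donor ratio than cells carrying the intact sensor.
ratios = [fret / donor for donor, fret in cells]

# Illustrative gate: score cells below the threshold as protease-active.
GATE = 1.0
active = [r < GATE for r in ratios]
print(active)
```

Working per-cell rather than on population averages is what lets the method resolve subpopulation-specific protease activity.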

  16. Design of a High-Throughput Biological Crystallography Beamline for Superconducting Wiggler

    International Nuclear Information System (INIS)

    Tseng, P.C.; Chang, C.H.; Fung, H.S.; Ma, C.I.; Huang, L.J.; Jean, Y.C.; Song, Y.F.; Huang, Y.S.; Tsang, K.L.; Chen, C.T.

    2004-01-01

    We are constructing a high-throughput biological crystallography beamline, BL13B, which utilizes the radiation generated from a 3.2 Tesla, 32-pole superconducting multipole wiggler, for multi-wavelength anomalous diffraction (MAD), single-wavelength anomalous diffraction (SAD), and other related experiments. This beamline is a standard double crystal monochromator (DCM) x-ray beamline equipped with a collimating mirror (CM) and a focusing mirror (FM). Both the CM and FM are one meter long and made of Si substrate, and the CM is side-cooled by water. Based on detailed thermal analysis, liquid nitrogen (LN2) cooling for both crystals of the DCM has been adopted to optimize the energy resolution and photon beam throughput. This beamline will deliver, through a 100 μm diameter pinhole, a photon flux greater than 10¹¹ photons/sec in the energy range from 6.5 keV to 19 keV, which is comparable to existing protein crystallography beamlines using bending magnet sources at high-energy storage rings.

  17. Application of infrared thermography for temperature distributions in fluid-saturated porous media

    DEFF Research Database (Denmark)

    Imran, Muhammad; Nick, Hamid; Schotting, Ruud J.

    2016-01-01

    Infrared thermography has increasingly gained importance because of environmental and technological advancements of this method, and it is applied in a variety of disciplines related to non-isothermal flow. However, it has not been used so far for quantitative thermal analysis in saturated porous media. This article suggests an infrared thermographic approach to obtain the entire surface temperature distribution(s) in water-saturated porous media. For this purpose, infrared thermal analysis is applied with in situ calibration for a better understanding of the heat transfer processes in porous media. Calibration is achieved with a combination of invasive sensors, which are inserted into the medium, and non-invasive thermal sensors, which are not inserted but instead detect the infrared radiation emitted from the surface. Thermocouples of relatively thin diameter are used...

  18. Quantum saturation of the order parameter and the dynamical soft mode in quartz

    CERN Document Server

    Romero, F J

    2003-01-01

    The temperature evolution of the static order parameter of alpha-quartz and its soft-mode frequencies were determined at temperatures below 300 K. While these parameters follow classic Landau theory at higher temperatures, quantum saturation was found below room temperature with a characteristic quantum temperature of 187 K. A quantitative analysis gave good agreement with the predictions of a Φ⁶ model close to the displacive limit and a rather flat dispersion of the soft-mode branch. No indication of any effect of strong mode-mode coupling on the saturation behaviour was observed.
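The quantum-saturation behaviour summarized above is conventionally described by replacing the classical temperature in the Landau expression with a coth term (as in Salje's treatment of order-parameter saturation). The form below is an assumption inferred from the abstract, not an equation quoted from the paper:

```latex
% Classical Landau: Q^2 \propto (T_c - T). With quantum saturation,
% T is replaced by \Theta_s \coth(\Theta_s/T), where \Theta_s is the
% characteristic saturation temperature (187 K for quartz above):
Q^2(T) \;\propto\; T_c - \Theta_s \coth\!\left(\frac{\Theta_s}{T}\right)
% For T \gg \Theta_s, \coth(\Theta_s/T) \approx T/\Theta_s recovers the
% classical linear law; for T \to 0, \coth \to 1 and the order
% parameter saturates at Q^2 \propto T_c - \Theta_s.
```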

  19. Simulation of the saturation process of a radwaste storage cell

    International Nuclear Information System (INIS)

    Robbe, M.F.; Clouard, A.

    2001-01-01

    This paper presents a simulation of the saturation of the barrier and the plug of a storage cell by the surrounding host rock. Generally speaking, the unsaturated barrier and plug start saturating immediately in the vicinity of the quasi-saturated host rock. The saturation front then propagates towards the canisters and the symmetry axis. Apart from the part in contact with the plug, the barrier is saturated after about 30 years. The part of the barrier near the plug is saturated after around 80 years. While the top of the plug is saturated very soon, the part in the corner near the gallery and the symmetry axis is not completely saturated after 100 years. In the site, we observe a small desaturation during the first month at the interface with the plug and the barrier, and especially in the corner limited by both FoCa clay pieces. This transient phenomenon can be attributed to the time difference between the immediate suction of water by the unsaturated materials and the delayed water flows coming from the saturated host rock to compensate for the water suction. The purpose of this computation was both to estimate the time necessary for the saturation of the clay layers surrounding the radwaste canisters and to evaluate the hydro-mechanical behaviour of the storage cell during the saturation process. A mechanical simulation was therefore performed using the present hydraulic results to initiate the mechanical computation. (authors)

  20. Precision production: enabling deterministic throughput for precision aspheres with MRF

    Science.gov (United States)

    Maloney, Chris; Entezarian, Navid; Dumas, Paul

    2017-10-01

    Aspherical lenses offer advantages over spherical optics by improving image quality or reducing the number of elements necessary in an optical system. Aspheres are no longer used exclusively in high-end optical systems but are now replacing spherical optics in many applications. The need for a method of production-manufacturing of precision aspheres has emerged and is part of the reason that the optics industry is shifting away from artisan-based techniques towards more deterministic methods. Not only does Magnetorheological Finishing (MRF) empower deterministic figure correction for the most demanding aspheres, but it also enables deterministic and efficient throughput for series production of aspheres. The Q-flex MRF platform is designed to support batch production in a simple and user-friendly manner. Thorlabs routinely utilizes the advancements of this platform and has provided results from using MRF to finish a batch of aspheres as a case study. We have developed an analysis notebook to evaluate necessary specifications for implementing quality control metrics. MRF brings confidence to optical manufacturing by ensuring high throughput for batch processing of aspheres.

  1. Use of high-throughput mass spectrometry to elucidate host pathogen interactions in Salmonella

    Energy Technology Data Exchange (ETDEWEB)

    Rodland, Karin D.; Adkins, Joshua N.; Ansong, Charles; Chowdhury, Saiful M.; Manes, Nathan P.; Shi, Liang; Yoon, Hyunjin; Smith, Richard D.; Heffron, Fred

    2008-12-01

    Capabilities in mass spectrometry are evolving rapidly, with recent improvements in sensitivity, data analysis, and, most important from the standpoint of this review, much higher throughput, allowing analysis of many samples in a single day. This short review describes how these improvements in mass spectrometry can be used to dissect host-pathogen interactions using Salmonella as a model system. This approach enabled direct identification of the majority of annotated Salmonella proteins, quantitation of expression changes under various in vitro growth conditions, and new insights into virulence and expression of Salmonella proteins within host cells. One of the most significant findings is that a very high percentage of all annotated genes (>20%) in Salmonella are regulated post-transcriptionally. In addition, new and unexpected interactions have been identified for several Salmonella virulence regulators that involve protein-protein interactions, suggesting additional functions of these regulators in coordinating virulence expression. Overall, high-throughput mass spectrometry provides a new view of pathogen-host interactions, emphasizing the protein products and defining how protein interactions determine the outcome of infection.

  2. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes.

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop
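The CCGG sites that MSAP-Seq interrogates are the recognition sites of the methylation-sensitive restriction enzymes used in MSAP (the HpaII/MspI isoschizomer pair). A toy locator for such sites follows; the function name and example sequence are invented for illustration.

```python
import re

def ccgg_sites(seq):
    """Return 0-based start positions of CCGG motifs. In MSAP, MspI
    cuts CCGG regardless of internal CpG methylation while HpaII is
    blocked by it, so sites cut by MspI but not HpaII are scored as
    methylated."""
    return [m.start() for m in re.finditer("CCGG", seq)]

# Hypothetical fragment with three CCGG sites:
genome = "ATCCGGTTACCGGAATTCCGGC"
print(ccgg_sites(genome))  # → [2, 9, 17]
```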

  3. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    Directory of Open Access Journals (Sweden)

    Karolina Chwialkowska

    2017-11-01

    Full Text Available Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation

  4. Elastoplastic model for unsaturated, quasi-saturated and fully saturated fine soils

    Directory of Open Access Journals (Sweden)

    Lai Ba Tien

    2016-01-01

    Full Text Available In unsaturated soils, the gaseous phase is commonly assumed to be continuous. This assumption is no longer valid at high saturation ratios. In that case, air bubbles and pockets can be trapped in the porous network by the liquid phase and the gas phase becomes discontinuous. This trapped air reduces the apparent compressibility of the pore fluid and affects the mechanical behavior of the soil. Although the air is trapped in the pores, its dissolution can take place. Dissolved air can migrate through the pore space, either by following the flow of the fluid or by diffusion. In this context, this paper presents a hydro-mechanical model that separately considers the kinematics and the mechanical behavior of each fluid species (e.g., liquid water, dissolved air, gaseous air) and of the solid matrix. This new model was implemented in a C++ code. Numerical simulations are performed to demonstrate the ability of this model to reproduce a continuous transition from unsaturated to saturated states.

  5. Analysis of the laminar Newtonian fluid flow through a thin fracture modelled as a fluid-saturated sparsely packed porous medium

    Energy Technology Data Exchange (ETDEWEB)

    Pazanin, Igor [Zagreb Univ. (Croatia). Dept. of Mathematics; Siddheshwar, Pradeep G. [Bangalore Univ., Bengaluru (India). Dept. of Mathematics

    2017-06-01

    In this article we investigate the fluid flow through a thin fracture modelled as a fluid-saturated porous medium. We assume that the fracture has constrictions and that the flow is governed by the prescribed pressure drop between the edges of the fracture. The problem is described by the Darcy-Lapwood-Brinkman model acknowledging the Brinkman extension of the Darcy law as well as the flow inertia. Using asymptotic analysis with respect to the thickness of the fracture, we derive the explicit higher-order approximation for the velocity distribution. We make an error analysis to comment on the order of accuracy of the method used and also to provide rigorous justification for the model.
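The Darcy-Lapwood-Brinkman model named above combines Darcy drag, the Brinkman viscous correction, and flow inertia. The momentum balance below is written from the standard form of that model, not quoted from the article, so the exact notation is an assumption:

```latex
% Steady, incompressible Darcy--Lapwood--Brinkman momentum balance:
% Lapwood (inertia) term on the left; pressure gradient, Darcy drag
% \mu\mathbf{u}/K (K = permeability), and Brinkman viscous term
% \mu_{\mathrm{eff}}\,\Delta\mathbf{u} on the right.
\rho\,(\mathbf{u}\cdot\nabla)\mathbf{u}
  \;=\; -\,\nabla p \;-\; \frac{\mu}{K}\,\mathbf{u}
  \;+\; \mu_{\mathrm{eff}}\,\Delta\mathbf{u},
\qquad \nabla\cdot\mathbf{u} = 0 .
```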

  6. Saturation Detection-Based Blocking Scheme for Transformer Differential Protection

    Directory of Open Access Journals (Sweden)

    Byung Eun Lee

    2014-07-01

    Full Text Available This paper describes a current differential relay for transformer protection that operates in conjunction with a core saturation detection-based blocking algorithm. The differential current for the magnetic inrush or over-excitation has a point of inflection at the start and end of each saturation period of the transformer core. At these instants, discontinuities arise in the first-difference function of the differential current. The second- and third-difference functions convert the points of inflection into pulses, the magnitudes of which are large enough to detect core saturation. The blocking signal is activated if the third-difference of the differential current is larger than the threshold and is maintained for one cycle. In addition, a method to discriminate between transformer saturation and current transformer (CT saturation is included. The performance of the proposed blocking scheme was compared with that of a conventional harmonic blocking method. The test results indicate that the proposed scheme successfully discriminates internal faults even with CT saturation from the magnetic inrush, over-excitation, and external faults with CT saturation, and can significantly reduce the operating time delay of the relay.
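The blocking logic above rests on the differential current having points of inflection at the start and end of each saturation period, which successive difference functions convert into short pulses. A toy numeric sketch follows; the waveform and threshold are invented for illustration, not taken from the paper.

```python
def diff(x):
    """First-difference function of a sampled signal."""
    return [b - a for a, b in zip(x, x[1:])]

# Toy differential current: flat, then ramping after a kink (point of
# inflection) at sample 50, mimicking the onset of core saturation.
i_d = [0.0] * 50 + [0.5 * k for k in range(50)]

d1 = diff(i_d)        # a step appears at the kink
d3 = diff(diff(d1))   # third difference: the kink becomes short pulses

# Samples where the third difference exceeds the (illustrative)
# threshold would activate the blocking signal for one cycle.
THRESHOLD = 0.2
block = [k for k, v in enumerate(d3) if abs(v) > THRESHOLD]
print(block)  # → [48, 49]
```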

  7. Multiplex enrichment quantitative PCR (ME-qPCR): a high-throughput, highly sensitive detection method for GMO identification.

    Science.gov (United States)

    Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang

    2017-04-01

    Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible methodologies compared with the next-generation sequencing or ChIP-based methods. However, the PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by the multiplex primer interactions. The detection throughput cannot meet the demands of high-throughput detection, such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, which showed that ME-qPCR sensitivity is higher than the original qPCR. The absolute limit of detection for ME-qPCR could achieve levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract For the first-step amplification, four primers (A, B, C, and D) have been added into the reaction volume. In this manner, four kinds of amplicons have been generated. All of these four amplicons could be regarded as the target of second-step PCR. 
For the second-step amplification, three parallels have been taken for

  8. PRO-QUEST: a rapid assessment method based on progressive saturation for quantifying exchange rates using saturation times in CEST.

    Science.gov (United States)

    Demetriou, Eleni; Tachrount, Mohamed; Zaiss, Moritz; Shmueli, Karin; Golay, Xavier

    2018-03-05

    To develop a new MRI technique to rapidly measure exchange rates in CEST MRI. A novel pulse sequence for measuring chemical exchange rates through a progressive saturation recovery process, called PRO-QUEST (progressive saturation for quantifying exchange rates using saturation times), has been developed. Using this method, the water magnetization is sampled under non-steady-state conditions, and off-resonance saturation is interleaved with the acquisition of images obtained through a Look-Locker type of acquisition. A complete theoretical framework has been set up, and simple equations to obtain the exchange rates have been derived. A reduction of scan time from 58 to 16 minutes has been obtained using PRO-QUEST versus the standard QUEST. Maps of both the T1 of water and B1 can simply be obtained by repetition of the sequence without off-resonance saturation pulses. Simulations and calculated exchange rates from experimental data using amino acids such as glutamate, glutamine, taurine, and alanine were compared and found to be in good agreement. The PRO-QUEST sequence was also applied to healthy and infarcted rats after 24 hours, and revealed that imaging specificity to ischemic acidification during stroke was substantially increased relative to standard amide proton transfer-weighted imaging. Because of the reduced scan time and insensitivity to nonchemical exchange factors such as direct water saturation, PRO-QUEST can serve as an excellent alternative for researchers and clinicians interested in mapping pH changes in vivo. © 2018 International Society for Magnetic Resonance in Medicine.

  9. Thermo-economic analysis of recuperated Maisotsenko bottoming cycle using triplex air saturator: Comparative analyses

    International Nuclear Information System (INIS)

    Saghafifar, Mohammad; Omar, Amr; Erfanmoghaddam, Sepehr; Gadalla, Mohamed

    2017-01-01

    Highlights: • Proposing recuperated Maisotsenko bottoming cycle (RMBC) as a new combined cycle. • Introducing triplex air saturator for waste heat recovery application. • Conducting thermodynamic optimization to maximize RMBC thermal efficiency. • Conducting thermo-economic optimization to minimize RMBC cost of electricity. - Abstract: A recently recommended combined cycle power plant configuration employs a second gas turbine cycle for waste heat recovery as an air bottoming cycle (ABC). Several studies have sought to improve the ABC's thermodynamic performance using common power augmentation methods such as steam/water injection. In particular, it has been proposed to employ the Maisotsenko gas turbine cycle as a bottoming cycle, i.e. the Maisotsenko bottoming cycle (MBC). Due to the promising performance of the MBC configuration, we investigate a recuperated MBC (RMBC) configuration built around a proposed triplex air saturator. In this arrangement, the air saturator consists of three sections: the first is an indirect evaporative cooler, while the other two recover heat from the topping and bottoming cycle turbine exhausts. In this paper, thermodynamic and thermo-economic analyses are carried out to study the main merits and demerits of RMBC against the MBC configuration. Thermodynamic optimization results indicate that the maximum achievable efficiencies for MBC and RMBC incorporation in a simple gas turbine power plant are 39.40% and 44.73%, respectively. Finally, thermo-economic optimization shows that the optimum levelized costs of electricity for the MBC and RMBC power plants are 62.922 US$/MWh and 58.154 US$/MWh, respectively.

  10. TECHNIQUES OF EVALUATION OF HEMOGLOBIN OXYGEN SATURATION IN CLINICAL OPHTHALMOLOGY

    Directory of Open Access Journals (Sweden)

    S. Yu. Petrov

    2016-01-01

    Full Text Available Oxygen content in body fluids and tissues is an important indicator of life support functions. A number of ocular pathologies, e.g. glaucoma, are of presumably vascular origin, which implies altered blood supply and oxygen circulation. Most oxygen is transported in the blood in association with hemoglobin. When passing through the capillaries, hemoglobin releases oxygen, converting from its oxygenated form to its deoxygenated form. This process is accompanied by changes in the spectral characteristics of hemoglobin, which result in the different colors of arterial and venous blood. The photometric technique for measuring oxygen saturation in blood is based on the differences in light absorption by the different forms of hemoglobin. The measurement of saturation is called oximetry. Pulse oximetry with assessment of tissue oxygenation is the most commonly used method in medicine. The degree of hemoglobin oxygen saturation in the eye blood vessels is informative and the most accessible for noninvasive study during ophthalmoscopy. Numerous studies have shown the importance of this parameter for the diagnosis of retinopathy of various geneses, metabolic status analysis in hyperglycemia, and the diagnosis and treatment monitoring of glaucoma and other diseases involving alterations in the eye's blood supply. A specific method for evaluating oxygen concentration is the measurement of the pressure of oxygen dissolved in the blood, i.e. the partial pressure of oxygen. In ophthalmological practice, this parameter is measured in the anterior chamber fluid, evaluating oxygen levels for several ophthalmopathies including different forms of glaucoma, for instillations of hypotensive eye drops, as well as in the vitreous body near the optic disc under various levels of intraocular pressure. Currently, monitoring of oxygen saturation in retinal blood vessels, i.e. retinal oximetry, is well developed. This technique is based on the assessment of light absorption by blood depending on
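The record notes pulse oximetry as the most common clinical oximetry technique; its core computation is the ratio-of-ratios of pulsatile (AC) to static (DC) absorbance at red (~660 nm) and infrared (~940 nm) wavelengths. The sketch below uses the textbook linear approximation SpO2 ≈ 110 - 25R rather than any real device calibration, and all input values are invented for illustration.

```python
def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    """Ratio-of-ratios pulse-oximetry sketch: R compares the relative
    pulsatile absorbance at the two wavelengths; SpO2 is then read from
    an empirical calibration (here, the common linear approximation
    SpO2 ~ 110 - 25R, NOT a device-specific calibration curve)."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * r

# Hypothetical signal components:
print(round(spo2_estimate(0.02, 1.0, 0.04, 1.0), 1))  # → 97.5
```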

  11. COMPUTER APPROACHES TO WHEAT HIGH-THROUGHPUT PHENOTYPING

    Directory of Open Access Journals (Sweden)

    Afonnikov D.

    2012-08-01

    Full Text Available The growing need for rapid and accurate approaches to large-scale assessment of phenotypic characters in plants becomes more and more obvious in studies looking into relationships between genotype and phenotype. This need is due to the advent of high-throughput methods for the analysis of genomes. Nowadays, any genetic experiment involves data on thousands or dozens of thousands of plants. Traditional ways of assessing most phenotypic characteristics (those relying on the eye, the touch, the ruler) are of little effectiveness on samples of such sizes. Modern approaches seek to take advantage of automated phenotyping, which warrants much more rapid data acquisition, higher accuracy in the assessment of phenotypic features, measurement of new parameters of these features, and exclusion of human subjectivity from the process. Additionally, automation allows measurement data to be rapidly loaded into computer databases, which reduces data processing time. In this work, we present the WheatPGE information system designed to solve the problem of integration of genotypic and phenotypic data and parameters of the environment, as well as to analyze the relationships between genotype and phenotype in wheat. The system is used to consolidate miscellaneous data on a plant for storing and processing various morphological traits and genotypes of wheat plants as well as data on various environmental factors. The system is available at www.wheatdb.org. Its potential in genetic experiments has been demonstrated in high-throughput phenotyping of wheat leaf pubescence.

  12. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    Science.gov (United States)

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual low-throughput. Here, we present an open-source phenomics platform "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons" enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). 
    DIRT is a high-volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots.

  13. Evaluating High Throughput Toxicokinetics and Toxicodynamics for IVIVE (WC10)

    Science.gov (United States)

    High-throughput screening (HTS) generates in vitro data for characterizing potential chemical hazard. TK models are needed to allow in vitro to in vivo extrapolation (IVIVE) to real world situations. The U.S. EPA has created a public tool (R package “httk” for high throughput tox...

  14. Saturation of the turbulent dynamo.

    Science.gov (United States)

    Schober, J; Schleicher, D R G; Federrath, C; Bovino, S; Klessen, R S

    2015-08-01

    The origin of strong magnetic fields in the Universe can be explained by amplifying weak seed fields via turbulent motions on small spatial scales and subsequently transporting the magnetic energy to larger scales. This process is known as the turbulent dynamo and depends on the properties of turbulence, i.e., on the hydrodynamical Reynolds number and the compressibility of the gas, and on the magnetic diffusivity. While we know the growth rate of the magnetic energy in the linear regime, the saturation level, i.e., the ratio of magnetic energy to turbulent kinetic energy that can be reached, is not known from analytical calculations. In this paper we present a scale-dependent saturation model based on an effective turbulent resistivity which is determined by the turnover time scale of turbulent eddies and the magnetic energy density. The magnetic resistivity increases compared to the Spitzer value and the effective scale on which the magnetic energy spectrum is at its maximum moves to larger spatial scales. This process ends when the peak reaches a characteristic wave number k☆ which is determined by the critical magnetic Reynolds number. The saturation level of the dynamo also depends on the type of turbulence and differs for the limits of large and small magnetic Prandtl numbers Pm. With our model we find saturation levels between 43.8% and 1.3% for Pm≫1 and between 2.43% and 0.135% for Pm≪1, where the higher values refer to incompressible turbulence and the lower ones to highly compressible turbulence.

  15. Advances in High-Throughput Speed, Low-Latency Communication for Embedded Instrumentation (7th Annual SFAF Meeting, 2012)

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, Scott

    2012-06-01

    Scott Jordan on "Advances in high-throughput speed, low-latency communication for embedded instrumentation" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  16. Single-mode saturation of the bump-on-tail instability

    International Nuclear Information System (INIS)

    Simon, A.; Rosenbluth, M.N.

    1976-01-01

    A slightly unstable plasma with only one or a few linear modes unstable is considered. Nonlinear saturation at small amplitudes has been treated by time-asymptotic analysis, which is a generalization of the methods of Bogolyubov and co-workers. In this paper the method is applied to instability in a collisionless plasma governed by the Vlasov equation. The bump-on-tail instability is considered for a one-dimensional plasma

  17. Throughput Analysis of TCP Variants on a WiMAX Network Model

    Directory of Open Access Journals (Sweden)

    Medi Taruk

    2016-07-01

    Full Text Available Transmission Control Protocol (TCP) is a protocol that works at the transport layer of the OSI model. TCP was originally designed mainly for wired networks. However, to meet the needs of very fast network technologies driven by user demand, further development was required for the use of TCP on wireless devices. One implementation of a wireless network is the Worldwide Interoperability for Microwave Access (WiMAX) network model, which offers a variety of advantages, particularly in terms of access speed. In this case, NS-2 is used to observe the throughput of the TCP variants tested, namely TCP-Tahoe, TCP-Reno, TCP-Vegas, and TCP-SACK, over a WiMAX network model under several observation scenarios. The first observes the throughput of each TCP variant when only that particular variant works in the network. The second observes the throughput of all TCP variants running at the same time with equivalent QoS, but with the possibility of small congestion, the capacity of the link being made just sufficient. The third observes throughput under multiple congestion. The WiMAX network has the scheduling services UGS, rtPS and ertPS, which use the UDP protocol, and nrtPS and BE, which use the TCP protocol. Using the network simulator (NS-2), a performance comparison of TCP-based services on the WiMAX network is obtained with the QoS parameters throughput, packet loss, fairness and delay.
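As a small illustration of the QoS parameters named above (throughput, packet loss, fairness), the following sketch computes them from per-flow counters of the kind a simulator trace yields. The flow names match the TCP variants studied, but all counter values are invented, and Jain's index is used as one standard fairness measure:

```python
# Sketch: computing throughput, packet loss, and Jain's fairness index
# from per-flow counters. All counter values below are hypothetical,
# not taken from the paper's NS-2 traces.

def throughput_bps(bytes_received, duration_s):
    """Average throughput in bits per second."""
    return 8 * bytes_received / duration_s

def packet_loss_ratio(sent, received):
    return (sent - received) / sent

def jain_fairness(throughputs):
    """Jain's fairness index: 1.0 means perfectly fair sharing."""
    n = len(throughputs)
    return sum(throughputs) ** 2 / (n * sum(x * x for x in throughputs))

flows = {  # flow name -> (bytes received, packets sent, packets received)
    "TCP-Tahoe": (1_200_000, 900, 850),
    "TCP-Reno":  (1_500_000, 1000, 970),
    "TCP-Vegas": (1_400_000, 950, 940),
    "TCP-SACK":  (1_600_000, 1100, 1080),
}
duration = 60.0  # simulated seconds

tputs = [throughput_bps(b, duration) for b, _, _ in flows.values()]
for name, (b, sent, recv) in flows.items():
    print(name, round(throughput_bps(b, duration)), "b/s,",
          f"loss {packet_loss_ratio(sent, recv):.1%}")
print("Jain fairness:", round(jain_fairness(tputs), 3))
```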

  18. Next generation platforms for high-throughput bio-dosimetry

    International Nuclear Information System (INIS)

    Repin, Mikhail; Turner, Helen C.; Garty, Guy; Brenner, David J.

    2014-01-01

    Here the general concept of the combined use of plates and tubes in racks compatible with the American National Standards Institute/the Society for Laboratory Automation and Screening microplate formats as the next generation platforms for increasing the throughput of bio-dosimetry assays was described. These platforms can be used at different stages of bio-dosimetry assays starting from blood collection into micro-tubes organised in standardised racks and ending with the cytogenetic analysis of samples in standardised multi-well and multichannel plates. Robotically friendly platforms can be used for different bio-dosimetry assays in minimally equipped laboratories and on cost-effective automated universal biotech systems. (authors)

  19. THE SEARCH FOR SUPER-SATURATION IN CHROMOSPHERIC EMISSION

    International Nuclear Information System (INIS)

    Christian, Damian J.; Arias, Tersi; Mathioudakis, Mihalis; Jess, David B.; Jardine, Moira

    2011-01-01

    We investigate whether the super-saturation phenomenon observed at X-ray wavelengths for the corona exists in the chromosphere for rapidly rotating late-type stars. Moderate resolution optical spectra of fast-rotating EUV- and X-ray-selected late-type stars were obtained. Stars in α Per were observed in the northern hemisphere with the Isaac Newton 2.5 m telescope and Intermediate Dispersion Spectrograph. Selected objects from IC 2391 and IC 2602 were observed in the southern hemisphere with the Blanco 4 m telescope and R-C spectrograph at CTIO. Ca II H and K fluxes were measured for all stars in our sample. We find the saturation level for Ca II K at log(L_CaK/L_bol) = -4.08. The Ca II K flux does not show a decrease as a function of increased rotational velocity or smaller Rossby number as observed in the X-ray. This lack of 'super-saturation' supports the idea of coronal stripping as the cause of saturation and super-saturation in stellar chromospheres and coronae, but the detailed underlying mechanism is still under investigation.

  20. Cross-phase modulation instability in optical fibres with exponential saturable nonlinearity and high-order dispersion

    International Nuclear Information System (INIS)

    Xian-Qiong, Zhong; An-Ping, Xiang

    2010-01-01

    Utilizing linear-stability analysis, this paper analytically investigates and calculates the condition and gain spectra of cross-phase modulation instability in optical fibres in the case of exponential saturable nonlinearity and high-order dispersion. The results show that the modulation instability characteristics here are similar to those of conventional saturable nonlinearity and Kerr nonlinearity. That is to say, when the fourth-order dispersion has the same sign as the second-order one, a new gain spectral region, called the second one, may appear far away from the zero point. The existence of the exponential saturable nonlinearity makes the spectral width as well as the peak gain of every spectral region first increase and then decrease with the input power. Namely, for every spectral region, the same peak gain and spectral width may result from two different input powers. In comparison with the case of conventional saturable nonlinearity, however, when the other parameters are the same, the variations of the spectral width and the peak gain with the input power are faster in the case of exponential saturable nonlinearity. (classical areas of phenomenology)

  1. Recent Advances in Nanobiotechnology and High-Throughput Molecular Techniques for Systems Biomedicine

    Science.gov (United States)

    Kim, Eung-Sam; Ahn, Eun Hyun; Chung, Euiheon; Kim, Deok-Ho

    2013-01-01

    Nanotechnology-based tools are beginning to emerge as promising platforms for quantitative high-throughput analysis of live cells and tissues. Despite unprecedented progress made over the last decade, a challenge still lies in integrating emerging nanotechnology-based tools into macroscopic biomedical apparatuses for practical purposes in biomedical sciences. In this review, we discuss the recent advances and limitations in the analysis and control of mechanical, biochemical, fluidic, and optical interactions in the interface areas of nanotechnology-based materials and living cells in both in vitro and in vivo settings. PMID:24258011

  2. Lower early postnatal oxygen saturation target and risk of ductus arteriosus closure failure.

    Science.gov (United States)

    Inomata, Kei; Taniguchi, Shinji; Yonemoto, Hiroki; Inoue, Takeshi; Kawase, Akihiko; Kondo, Yuichi

    2016-11-01

    Early postnatal hyperoxia is a major risk factor for retinopathy of prematurity (ROP) in extremely premature infants. To reduce the occurrence of ROP, we adopted a lower early postnatal oxygen saturation (SpO 2 ) target range (85-92%) from April 2011. Lower SpO 2 target range, however, may lead to hypoxemia and an increase in the risk of ductus arteriosus (DA) closure failure. The aim of this study was therefore to determine whether a lower SpO 2 target range, during the early postnatal stage, increases the risk of DA closure failure. Infants born at closure failure in period 2 (21%) was significantly higher than that in period 1 (1%). On multivariate logistic regression analysis, the lower oxygen saturation target range was an independent risk factor for DA closure failure. Lower early postnatal oxygen saturation target range increases the risk of DA closure failure. © 2016 Japan Pediatric Society.

  3. Comparison of driver behaviour and saturation flow in China and the Netherlands

    NARCIS (Netherlands)

    Jie, L.; Zuylen, H.J. van; Chen, Y.S.; Lu, R.

    2012-01-01

    In Chinese cities, the poor performance of signalised intersections is one of the causes of urban congestion. The reasons for this have been investigated through a comparative study of the saturation flow characteristics at intersections in three Chinese and two Dutch cities. The analysis

  4. Magnetic field saturation in the Riga dynamo experiment.

    Science.gov (United States)

    Gailitis, A; Lielausis, O; Platacis, E; Dement'ev, S; Cifersons, A; Gerbeth, G; Gundrum, T; Stefani, F; Christen, M; Will, G

    2001-04-02

    After the dynamo experiment in November 1999 [A. Gailitis et al., Phys. Rev. Lett. 84, 4365 (2000)] had shown magnetic field self-excitation in a spiraling liquid metal flow, in a second series of experiments emphasis was placed on the magnetic field saturation regime as the next principal step in the dynamo process. The dependence of the strength of the magnetic field on the rotation rate is studied. Various features of the saturated magnetic field are outlined and possible saturation mechanisms are discussed.

  5. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer

    2017-07-31

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used by the central node for data transmission to any remote node in case of the failure of any one of FSO links. We develop a cross-layer Markov chain model to study the throughput from central node to a tagged remote node. Numerical examples are presented to compare the performance of the proposed P2MP hybrid FSO/RF network with that of a P2MP FSO-only network and show that the P2MP Hybrid FSO/RF network achieves considerable performance improvement over the P2MP FSO-only network.
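The paper's cross-layer Markov chain model is not reproduced in this record, but the flavor of such an analysis can be sketched with a deliberately simplified model: each remote node's FSO link is a two-state (up/down) Markov chain, and the single RF backup is shared among the currently failed nodes. All probabilities and data rates below are illustrative assumptions, not the paper's parameters:

```python
# Toy sketch of a hybrid FSO/RF throughput analysis (not the paper's
# cross-layer model): each node's FSO link alternates between "up" and
# "down"; when the tagged node's link is down, the shared RF backup is
# split equally among all simultaneously failed nodes.
from math import comb

def fso_up_probability(p_fail, p_recover):
    """Steady-state 'up' probability of a two-state Markov chain."""
    return p_recover / (p_fail + p_recover)

def tagged_node_throughput(n_nodes, p_fail, p_recover, r_fso, r_rf):
    p_up = fso_up_probability(p_fail, p_recover)
    p_down = 1.0 - p_up
    others = n_nodes - 1
    # Expected RF share when down: average r_rf/(k+1) over the binomial
    # number k of other nodes that are down at the same time.
    rf_share = sum(
        comb(others, k) * p_down**k * p_up**(others - k) * r_rf / (k + 1)
        for k in range(others + 1)
    )
    return p_up * r_fso + p_down * rf_share

# Hybrid FSO/RF vs FSO-only for a hypothetical 4-node network:
hybrid = tagged_node_throughput(4, p_fail=0.1, p_recover=0.6,
                                r_fso=1e9, r_rf=1e8)
fso_only = fso_up_probability(0.1, 0.6) * 1e9
print(f"hybrid: {hybrid:.3e} b/s, FSO-only: {fso_only:.3e} b/s")
```

The hybrid figure always exceeds the FSO-only one because the RF backup recovers part of the throughput lost during FSO outages, mirroring the qualitative conclusion of the paper.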

  6. Simulation of coupled flow and mechanical deformation using IMplicit Pressure-Displacement Explicit Saturation (IMPDES) scheme

    KAUST Repository

    El-Amin, Mohamed

    2012-01-01

    The problem of coupled structural deformation with two-phase flow in porous media is solved numerically using the cell-centered finite difference (CCFD) method. In order to solve the system of governing partial differential equations, the implicit pressure explicit saturation (IMPES) scheme for the flow equations is combined with an implicit displacement scheme. The combined scheme may be called IMplicit Pressure-Displacement Explicit Saturation (IMPDES). The pressure distribution for each cell along the entire domain is given by the implicit difference equation. The deformation equations are also discretized implicitly. Using the obtained pressure, the velocity is evaluated explicitly, while the saturation is obtained explicitly using the upwind scheme. Moreover, a stability analysis of the present scheme is introduced and the stability condition is determined.
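A minimal 1-D sketch of the IMPES part of the scheme (implicit pressure solve followed by an explicit upwind saturation update) may help make the splitting concrete. The mechanical-deformation equations of IMPDES are omitted, quadratic relative permeabilities and unit permeability/porosity are assumed, and all parameters are illustrative:

```python
# Minimal 1-D IMPES sketch: solve the pressure equation implicitly
# (tridiagonal system via the Thomas algorithm), then update water
# saturation explicitly with upwind fluxes.

def impes_step(S, dx, dt, mu_w=1.0, mu_o=2.0, p_left=1.0, p_right=0.0):
    n = len(S)
    # Phase mobilities per cell; total mobility averaged to faces.
    lam = [S[i] ** 2 / mu_w + (1.0 - S[i]) ** 2 / mu_o for i in range(n)]
    face = [0.5 * (lam[i] + lam[i + 1]) for i in range(n - 1)]

    # Implicit pressure: -d/dx(lam dp/dx) = 0, Dirichlet pressures
    # applied through half-cell boundary transmissibilities.
    a = [0.0] * n; b = [0.0] * n; c = [0.0] * n; d = [0.0] * n
    for i in range(n):
        tl = 2.0 * lam[0] if i == 0 else face[i - 1]
        tr = 2.0 * lam[-1] if i == n - 1 else face[i]
        b[i] = (tl + tr) / dx ** 2
        if i > 0:
            a[i] = -face[i - 1] / dx ** 2
        if i < n - 1:
            c[i] = -face[i] / dx ** 2
        if i == 0:
            d[i] += tl * p_left / dx ** 2
        if i == n - 1:
            d[i] += tr * p_right / dx ** 2
    for i in range(1, n):                      # Thomas forward sweep
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    p = [0.0] * n                              # back substitution
    p[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        p[i] = (d[i] - c[i] * p[i + 1]) / b[i]

    # Explicit saturation: upwind water flux from the fresh pressures
    # (outflow across the right boundary omitted for brevity).
    S_new = S[:]
    for i in range(n - 1):
        v = -face[i] * (p[i + 1] - p[i]) / dx   # total Darcy velocity
        up = i if v >= 0.0 else i + 1           # upwind cell
        fw = (S[up] ** 2 / mu_w) / lam[up]      # water fractional flow
        S_new[i] -= dt / dx * v * fw
        S_new[i + 1] += dt / dx * v * fw
    S_new[0] = 1.0                              # water injected at the left
    return p, [min(1.0, max(0.0, s)) for s in S_new]

S = [1.0] + [0.0] * 19                          # initial front at cell 0
for _ in range(50):
    p, S = impes_step(S, dx=1.0 / 20, dt=0.002)
print("cells reached by water:", sum(1 for s in S if s > 0.01))
```

The time step is chosen well inside the CFL limit of the explicit saturation update, which is precisely the kind of stability condition the paper derives for the combined scheme.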

  7. High-sensitivity HLA typing by Saturated Tiling Capture Sequencing (STC-Seq).

    Science.gov (United States)

    Jiao, Yang; Li, Ran; Wu, Chao; Ding, Yibin; Liu, Yanning; Jia, Danmei; Wang, Lifeng; Xu, Xiang; Zhu, Jing; Zheng, Min; Jia, Junling

    2018-01-15

    Highly polymorphic human leukocyte antigen (HLA) genes are responsible for fine-tuning the adaptive immune system. High-resolution HLA typing is important for the treatment of autoimmune and infectious diseases. Additionally, it is routinely performed for identifying matched donors in transplantation medicine. Although many HLA typing approaches have been developed, the complexity, low efficiency and high cost of current HLA-typing assays limit their application in population-based high-throughput HLA typing for donors, which is required for creating large-scale databases for transplantation and precision medicine. Here, we present a cost-efficient Saturated Tiling Capture Sequencing (STC-Seq) approach to capturing 14 HLA class I and II genes. The highly efficient capture (an approximately 23,000-fold enrichment) of these genes allows for simplified allele calling. Tests on five genes (HLA-A/B/C/DRB1/DQB1) from 31 human samples and 351 datasets using STC-Seq showed results that were 98% consistent with the known two-field (field1 and field2) genotypes. Additionally, STC can capture genomic DNA fragments longer than 3 kb from HLA loci, making the library compatible with third-generation sequencing. STC-Seq is a highly accurate and cost-efficient method for HLA typing which can be used to facilitate the establishment of population-based HLA databases for precision and transplantation medicine.

  8. The role of meson dynamics in nuclear matter saturation

    International Nuclear Information System (INIS)

    Goncalves, E.

    1988-01-01

    The problem of the saturation of nuclear matter in the non-relativistic limit of the model proposed by J.D. Walecka is studied. In the original context, nuclear matter saturation is obtained as a direct consequence of relativistic effects, and both scalar and vector mesons are treated statically. In the present work we investigate the effect of the meson dynamics on saturation using a Born-Oppenheimer approximation for the ground state. An upper limit for the saturation curve of nuclear matter is obtained, and we are able to decide how essential the relativistic treatment of the nucleons is for this problem. (author) [pt

  9. Micro-scaled high-throughput digestion of plant tissue samples for multi-elemental analysis

    Directory of Open Access Journals (Sweden)

    Husted Søren

    2009-09-01

    Full Text Available Abstract Background Quantitative multi-elemental analysis by inductively coupled plasma (ICP spectrometry depends on a complete digestion of solid samples. However, fast and thorough sample digestion is a challenging analytical task which constitutes a bottleneck in modern multi-elemental analysis. Additional obstacles may be that sample quantities are limited and elemental concentrations low. In such cases, digestion in small volumes with minimum dilution and contamination is required in order to obtain high accuracy data. Results We have developed a micro-scaled microwave digestion procedure and optimized it for accurate elemental profiling of plant materials (1-20 mg dry weight. A commercially available 64-position rotor with 5 ml disposable glass vials, originally designed for microwave-based parallel organic synthesis, was used as a platform for the digestion. The novel micro-scaled method was successfully validated by the use of various certified reference materials (CRM with matrices rich in starch, lipid or protein. When the micro-scaled digestion procedure was applied on single rice grains or small batches of Arabidopsis seeds (1 mg, corresponding to approximately 50 seeds, the obtained elemental profiles closely matched those obtained by conventional analysis using digestion in large volume vessels. Accumulated elemental contents derived from separate analyses of rice grain fractions (aleurone, embryo and endosperm closely matched the total content obtained by analysis of the whole rice grain. Conclusion A high-throughput micro-scaled method has been developed which enables digestion of small quantities of plant samples for subsequent elemental profiling by ICP-spectrometry. The method constitutes a valuable tool for screening of mutants and transformants. 
In addition, the method facilitates studies of the distribution of essential trace elements between and within plant organs which is relevant for, e.g., breeding programmes aiming at

  10. Optimization and high-throughput screening of antimicrobial peptides.

    Science.gov (United States)

    Blondelle, Sylvie E; Lohner, Karl

    2010-01-01

    While a well-established process for lead compound discovery in for-profit companies, high-throughput screening is becoming more popular in basic and applied research settings in academia. The development of combinatorial libraries combined with easy and less expensive access to new technologies has greatly contributed to the implementation of high-throughput screening in academic laboratories. While such techniques were earlier applied to simple assays involving single targets or based on binding affinity, they have now been extended to more complex systems such as whole cell-based assays. In particular, the urgent need for new antimicrobial compounds that would overcome the rapid rise of drug-resistant microorganisms, where multiple target assays or cell-based assays are often required, has forced scientists to focus on high-throughput technologies. Based on their existence in natural host defense systems and their different mode of action relative to commercial antibiotics, antimicrobial peptides represent a new hope in discovering novel antibiotics against multi-resistant bacteria. The ease of generating peptide libraries in different formats has allowed a rapid adaptation of high-throughput assays to the search for novel antimicrobial peptides. Similarly, the availability nowadays of high-quantity and high-quality antimicrobial peptide data has permitted the development of predictive algorithms to facilitate the optimization process. This review summarizes the various library formats that lead to de novo antimicrobial peptide sequences as well as the latest structural knowledge and optimization processes aimed at improving the peptides' selectivity.

  11. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)—A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    Science.gov (United States)

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop

  12. Integer multiplication with overflow detection or saturation

    Energy Technology Data Exchange (ETDEWEB)

    Schulte, M.J.; Balzola, P.I.; Akkas, A.; Brocato, R.W.

    2000-01-11

    High-speed multiplication is frequently used in general-purpose and application-specific computer systems. These systems often support integer multiplication, where two n-bit integers are multiplied to produce a 2n-bit product. To prevent growth in word length, processors typically return the n least significant bits of the product and a flag that indicates whether or not overflow has occurred. Alternatively, some processors saturate results that overflow to the most positive or most negative representable number. This paper presents efficient methods for performing unsigned or two's complement integer multiplication with overflow detection or saturation. These methods have significantly less area and delay than conventional methods for integer multiplication with overflow detection and saturation.
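The two result policies described in the abstract can be illustrated in a few lines. This is a behavioral sketch of n-bit two's complement multiplication with overflow detection or saturation, not the hardware methods of the paper:

```python
# Behavioral sketch of the two policies above for n-bit two's complement
# multiplication: (a) return the n least significant bits plus an
# overflow flag, or (b) saturate to the most positive/negative
# representable number.

def mul_overflow(x, y, n=16):
    """Return (wrapped n-bit product, overflow flag)."""
    lo, hi = -(1 << (n - 1)), (1 << (n - 1)) - 1
    full = x * y                                        # exact 2n-bit product
    # Reduce modulo 2^n and reinterpret as a signed n-bit value.
    wrapped = ((full + (1 << (n - 1))) & ((1 << n) - 1)) - (1 << (n - 1))
    return wrapped, not (lo <= full <= hi)

def mul_saturate(x, y, n=16):
    """Return the n-bit product, clamped on overflow."""
    lo, hi = -(1 << (n - 1)), (1 << (n - 1)) - 1
    return max(lo, min(hi, x * y))

print(mul_overflow(300, 300, n=16))   # 90000 does not fit in 16 bits
print(mul_saturate(300, 300, n=16))   # clamps to the 16-bit maximum
```

With n = 16 the representable range is [-32768, 32767], so 300 × 300 = 90000 overflows: the first call reports the wrapped low bits with the flag set, the second returns 32767.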

  13. The Role of Delay and Connectivity in Throughput Reduction of Cooperative Decentralized Wireless Networks

    Directory of Open Access Journals (Sweden)

    Ahmed Alkhayyat

    2015-01-01

    Full Text Available We propose a multiple relay selection protocol for decentralized wireless networks. The proposed relay selection protocol aims to address three issues: (1) selecting relays within the coverage area of the source and destination to ensure that the relays are positioned one hop away from the destination; (2) ensuring that the best node (the relay with the least distance and attenuation to the destination) accesses the channel first; and (3) ensuring that the proposed relay selection is collision-free. Our analysis also considers three important characteristics of decentralized wireless networks that are directly affected by cooperation: delay, connectivity, and throughput. The main goal of this paper is to demonstrate that improving connectivity and increasing the number of relays reduce the throughput of cooperative decentralized wireless networks; consequently, a trade-off equation has been derived.

  14. High-throughput tri-colour flow cytometry technique to assess Plasmodium falciparum parasitaemia in bioassays

    DEFF Research Database (Denmark)

    Tiendrebeogo, Regis W; Adu, Bright; Singh, Susheel K

    2014-01-01

    BACKGROUND: Unbiased flow cytometry-based methods have become the technique of choice in many laboratories for high-throughput, accurate assessments of malaria parasites in bioassays. A method to quantify live parasites based on mitotracker red CMXRos was recently described but consistent...... distinction of early ring stages of Plasmodium falciparum from uninfected red blood cells (uRBC) remains a challenge. METHODS: Here, a high-throughput, three-parameter (tri-colour) flow cytometry technique based on mitotracker red dye, the nucleic acid dye coriphosphine O (CPO) and the leucocyte marker CD45...... for enumerating live parasites in bioassays was developed. The technique was applied to estimate the specific growth inhibition index (SGI) in the antibody-dependent cellular inhibition (ADCI) assay and compared to parasite quantification by microscopy and mitotracker red staining. The Bland-Altman analysis...

  15. Multiplex High-Throughput Targeted Proteomic Assay To Identify Induced Pluripotent Stem Cells.

    Science.gov (United States)

    Baud, Anna; Wessely, Frank; Mazzacuva, Francesca; McCormick, James; Camuzeaux, Stephane; Heywood, Wendy E; Little, Daniel; Vowles, Jane; Tuefferd, Marianne; Mosaku, Olukunbi; Lako, Majlinda; Armstrong, Lyle; Webber, Caleb; Cader, M Zameel; Peeters, Pieter; Gissen, Paul; Cowley, Sally A; Mills, Kevin

    2017-02-21

    Induced pluripotent stem cells have great potential as a human model system in regenerative medicine, disease modeling, and drug screening. However, their use in medical research is hampered by laborious reprogramming procedures that yield low numbers of induced pluripotent stem cells. For further applications in research, only the best, competent clones should be used. The standard assays for pluripotency are based on genomic approaches, which take up to 1 week to perform and incur significant cost. Therefore, there is a need for a rapid and cost-effective assay able to distinguish between pluripotent and nonpluripotent cells. Here, we describe a novel multiplexed, high-throughput, and sensitive peptide-based multiple reaction monitoring mass spectrometry assay, allowing for the identification and absolute quantitation of multiple core transcription factors and pluripotency markers. This assay provides simple, high-throughput classification of cells as pluripotent or nonpluripotent in a 7 min analysis, while being more cost-effective than conventional genomic tests.

  16. Mobility Effect on Poroelastic Seismic Signatures in Partially Saturated Rocks With Applications in Time-Lapse Monitoring of a Heavy Oil Reservoir

    Science.gov (United States)

    Zhao, Luanxiao; Yuan, Hemin; Yang, Jingkang; Han, De-hua; Geng, Jianhua; Zhou, Rui; Li, Hui; Yao, Qiuliang

    2017-11-01

    Conventional seismic analysis in partially saturated rocks normally lays emphasis on estimating pore fluid content and saturation, typically ignoring the effect of mobility, which decides the ability of fluids moving in the porous rocks. Deformation resulting from a seismic wave in heterogeneous partially saturated media can cause pore fluid pressure relaxation at mesoscopic scale, thereby making the fluid mobility inherently associated with poroelastic reflectivity. For two typical gas-brine reservoir models, with the given rock and fluid properties, the numerical analysis suggests that variations of patchy fluid saturation, fluid compressibility contrast, and acoustic stiffness of rock frame collectively affect the seismic reflection dependence on mobility. In particular, the realistic compressibility contrast of fluid patches in shallow and deep reservoir environments plays an important role in determining the reflection sensitivity to mobility. We also use a time-lapse seismic data set from a Steam-Assisted Gravity Drainage producing heavy oil reservoir to demonstrate that mobility change coupled with patchy saturation possibly leads to seismic spectral energy shifting from the baseline to monitor line. Our workflow starts from performing seismic spectral analysis on the targeted reflectivity interface. Then, on the basis of mesoscopic fluid pressure diffusion between patches of steam and heavy oil, poroelastic reflectivity modeling is conducted to understand the shift of the central frequency toward low frequencies after the steam injection. The presented results open the possibility of monitoring mobility change of a partially saturated geological formation from dissipation-related seismic attributes.

  17. Delayed system control in presence of actuator saturation

    Directory of Open Access Journals (Sweden)

    A. Mahjoub

    2014-09-01

    Full Text Available The paper introduces a new design method for controllers of systems with input delay and actuator saturation, and focuses on how to force the system output to track a reference input that is not necessarily saturation-compatible. We propose a new norm based on the way we quantify tracking performance as a function of saturation errors found using the same norm. The newly defined norm is related to the signal average power, making it possible to account for the most common reference signals, e.g. step and periodic. It is formally shown that, whatever the reference shape and amplitude, the achievable tracking quality is determined by a well-defined reference tracking mismatch error. The latter depends on the reference rate and its compatibility with the actuator saturation constraint. In fact, asymptotic output-reference tracking is achieved in the presence of constraint-compatible step-like references.
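The core phenomenon, that references incompatible with the actuator saturation constraint leave an irreducible tracking mismatch, can be illustrated with a toy simulation (a proportional controller and a first-order plant, not the paper's delayed-system design; all gains and limits are illustrative):

```python
# Toy illustration of actuator saturation limiting tracking: a
# proportional controller drives the first-order plant x' = -a*x + sat(u).
# A reference the saturated actuator can sustain is tracked closely; one
# beyond the actuator's reach leaves a large persistent mismatch.

def sat(u, limit):
    """Symmetric actuator saturation."""
    return max(-limit, min(limit, u))

def track(reference, limit, kp=5.0, a=1.0, dt=0.01, steps=2000):
    """Simulate with forward Euler; return the final tracking error |r - x|."""
    x = 0.0
    for _ in range(steps):
        u = sat(kp * (reference - x), limit)
        x += dt * (-a * x + u)
    return abs(reference - x)

# Steady state of x' = -x + u is x = u, so |u| <= 1 caps the reachable
# output at 1: r = 0.5 is saturation-compatible, r = 3.0 is not.
print("r=0.5 error:", round(track(0.5, limit=1.0), 3))
print("r=3.0 error:", round(track(3.0, limit=1.0), 3))
```

The residual error for the compatible reference is the usual proportional-control offset, while the incompatible reference stalls at the saturation-imposed ceiling, the "reference tracking mismatch error" the abstract refers to.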

  18. High-throughput STR analysis for DNA database using direct PCR.

    Science.gov (United States)

    Sim, Jeong Eun; Park, Su Jeong; Lee, Han Chul; Kim, Se-Yong; Kim, Jong Yeol; Lee, Seung Hwan

    2013-07-01

    Since the Korean criminal DNA database was launched in 2010, we have focused on establishing an automated DNA database profiling system that analyzes short tandem repeat loci in a high-throughput and cost-effective manner. We established a DNA database profiling system without DNA purification using a direct PCR buffer system. The quality of the direct PCR procedure was compared with that of the conventional PCR system under their respective optimized conditions. The results revealed not only perfect concordance but also an excellent PCR success rate, good electropherogram quality, and an optimal intra-/inter-locus peak height ratio. In particular, the proportion of samples requiring DNA extraction due to direct PCR failure could be minimized to <3%. In conclusion, the newly developed direct PCR system can be adopted for automated DNA database profiling systems to replace or supplement the conventional PCR system in a time- and cost-saving manner. © 2013 American Academy of Forensic Sciences. Published 2013. This article is a U.S. Government work and is in the public domain in the U.S.A.

  19. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/5524--17-9751: High Performance Computing Modernization Program Kerberos Throughput Test Report, by Daniel G. Gdula* and

  20. High-throughput metabolic state analysis: The missing link in integrated functional genomics of yeasts

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Moxley, Joel. F; Åkesson, Mats Fredrik

    2005-01-01

    that achieve comparable throughput, effort and cost compared with DNA arrays. Our sample workup method enables simultaneous metabolite measurements throughout central carbon metabolism and amino acid biosynthesis, using a standard GC-MS platform that was optimized for this Purpose. As an implementation proof......-of-concept, we assayed metabolite levels in two yeast strains and two different environmental conditions in the context of metabolic pathway reconstruction. We demonstrate that these differential metabolite level data distinguish among sample types, such as typical metabolic fingerprinting or footprinting. More...

  1. Optimizing MRI Logistics: Prospective Analysis of Performance, Efficiency, and Patient Throughput.

    Science.gov (United States)

    Beker, Kevin; Garces-Descovich, Alejandro; Mangosing, Jason; Cabral-Goncalves, Ines; Hallett, Donna; Mortele, Koenraad J

    2017-10-01

    The objective of this study is to optimize MRI logistics through evaluation of MRI workflow and analysis of performance, efficiency, and patient throughput in a tertiary care academic center. For 2 weeks, workflow data from two outpatient MRI scanners were prospectively collected and stratified by value added to the process (i.e., value-added time, business value-added time, or non-value-added time). Two separate time cycles were measured: the actual MRI process cycle as well as the complete length of patient stay in the department. In addition, the impact and frequency of delays across all observations were measured. A total of 305 MRI examinations were evaluated, including body (34.1%), neurologic (28.9%), musculoskeletal (21.0%), and breast examinations (16.1%). The MRI process cycle lasted a mean of 50.97 ± 24.4 (SD) minutes per examination; the mean non-value-added time was 13.21 ± 18.77 minutes (25.87% of the total process cycle time). The mean length-of-stay cycle was 83.51 ± 33.63 minutes; the mean non-value-added time was 24.33 ± 24.84 minutes (29.14% of the total patient stay). The delay with the highest frequency (5.57%) was IV or port placement, which had a mean delay of 22.82 minutes. The delay with the greatest impact on time was MRI arthrography for which joint injection of contrast medium was necessary but was not accounted for in the schedule (mean delay, 42.2 minutes; frequency, 1.64%). Of 305 patients, 34 (11.15%) did not arrive at or before their scheduled time. Non-value-added time represents approximately one-third of the total MRI process cycle and patient length of stay. Identifying specific delays may expedite the application of targeted improvement strategies, potentially increasing revenue, efficiency, and overall patient satisfaction.
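
    As a quick arithmetic check of the reported shares, recomputing them from the rounded means quoted above (so the results differ slightly from the published 25.87% and 29.14%, which were presumably computed before rounding):

```python
# Means reported in the study (minutes), rounded as published.
process_cycle_min = 50.97   # MRI process cycle
process_nva_min = 13.21     # non-value-added time within the cycle
stay_min = 83.51            # total length of patient stay
stay_nva_min = 24.33        # non-value-added time within the stay

# Non-value-added share of each cycle, in percent.
nva_share_process = 100.0 * process_nva_min / process_cycle_min
nva_share_stay = 100.0 * stay_nva_min / stay_min
```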

  2. Analysis of products of thymine irradiated by 18O8+ ion beam in N2O saturated aqueous solution

    International Nuclear Information System (INIS)

    Cai Xichen; Wei Zengquan; Li Wenjian; Liang Jianping; Li Qiang

    1999-01-01

    Capillary gas chromatography methods, including GC, GC-MS, and GC-FT-IR, were used to analyze the products of thymine irradiated by an 18O8+ ion beam in N2O-saturated aqueous solution. From the GC-MS results the molecular weights of the products can be determined, and from the GC-FT-IR results some information on their molecular structures can be obtained. In this way the products 5,6-dihydrothymine, 5-hydroxy-5-methylhydantoin, 5-hydroxy-6-hydrothymine, 5-hydro-6-hydroxythymine, 5-hydroxymethyluracil, trans-thymine glycol, cis-thymine glycol, and dimers are identified without separating them from the samples. Though these products are the same as those of thymine irradiated by γ rays in N2O-saturated aqueous solution, the mechanism of thymine irradiation by a heavy ion beam in aqueous solution differs from that by γ rays. The main products of thymine irradiated by the 18O8+ ion beam in N2O-saturated aqueous solution are hydroxyl adducts at the 5,6-bond of thymine, while the main products of thymine irradiated by γ rays in N2O-saturated aqueous solution are dimers of thymine

  3. SNP-PHAGE – High throughput SNP discovery pipeline

    Directory of Open Access Journals (Sweden)

    Cregan Perry B

    2006-10-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) as defined here are single base sequence changes or short insertion/deletions between or within individuals of a given species. As a result of their abundance and the availability of high throughput analysis technologies SNP markers have begun to replace other traditional markers such as restriction fragment length polymorphisms (RFLPs), amplified fragment length polymorphisms (AFLPs) and simple sequence repeats (SSRs) or microsatellite markers for fine mapping and association studies in several species. For SNP discovery from chromatogram data, several bioinformatics programs have to be combined to generate an analysis pipeline. Results have to be stored in a relational database to facilitate interrogation through queries or to generate data for further analyses such as determination of linkage disequilibrium and identification of common haplotypes. Although these tasks are routinely performed by several groups, an integrated open source SNP discovery pipeline that can be easily adapted by new groups interested in SNP marker development is currently unavailable. Results We developed SNP-PHAGE (SNP discovery Pipeline with additional features for identification of common haplotypes within a sequence tagged site (Haplotype Analysis) and GenBank (dbSNP) submissions). This tool was applied for analyzing sequence traces from diverse soybean genotypes to discover over 10,000 SNPs. This package was developed on a UNIX/Linux platform, written in Perl, and uses a MySQL database. Scripts to generate a user-friendly web interface are also provided with common queries for preliminary data analysis. A machine learning tool developed by this group for increasing the efficiency of SNP discovery is integrated as a part of this package as an optional feature. The SNP-PHAGE package is being made available open source at http://bfgl.anri.barc.usda.gov/ML/snp-phage/. Conclusion SNP-PHAGE provides a bioinformatics

  4. DESIGN OF LOW EPI AND HIGH THROUGHPUT CORDIC CELL TO IMPROVE THE PERFORMANCE OF MOBILE ROBOT

    Directory of Open Access Journals (Sweden)

    P. VELRAJKUMAR

    2014-04-01

    Full Text Available This paper focuses on a pass-logic-based design that yields a low Energy Per Instruction (EPI) and high-throughput COordinate Rotation DIgital Computer (CORDIC) cell for robotic exploration applications. The basic components of the CORDIC cell, namely the register, multiplexer, and proposed adder, are designed using pass transistor logic (PTL). The proposed adder is implemented in a bit-parallel iterative CORDIC circuit, designed using the DSCH2 VLSI CAD tool; the layouts are generated by the Microwind 3 VLSI CAD tool. The propagation delay, area, and power dissipation are calculated from the simulated results for the proposed adder-based CORDIC cell. The EPI, throughput, and effect of temperature are calculated from the generated layout, whose output parameters are analyzed using the BSIM4 advanced analyzer. The simulated results of the proposed adder-based CORDIC circuit are compared with other adder-based CORDIC circuits. From this analysis, it was found that the proposed adder-based CORDIC circuit dissipates less power, responds faster, and achieves lower EPI and higher throughput.
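
    The record does not spell out the CORDIC recurrence itself; for background, here is a minimal floating-point sketch of the classic rotation-mode iteration (the hardware cell uses fixed-point shift-and-add, so this is only the algorithmic skeleton, not the paper's circuit):

```python
import math

def cordic_sin_cos(theta, iterations=32):
    """Rotation-mode CORDIC: returns (cos(theta), sin(theta)) for
    |theta| < ~1.74 rad, using only shifts, adds and a final gain fix."""
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    gain = 1.0
    for i in range(iterations):
        gain *= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, theta
    for i in range(iterations):
        d = 1.0 if z >= 0.0 else -1.0  # rotate toward the residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x / gain, y / gain
```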

  5. Arterial blood oxygen saturation during blood pressure cuff-induced hypoperfusion

    International Nuclear Information System (INIS)

    Kyriacou, P A; Shafqat, K; Pal, S K

    2007-01-01

    Pulse oximetry has been one of the most significant technological advances in clinical monitoring in the last two decades. Pulse oximetry is a non-invasive photometric technique that provides information about the arterial blood oxygen saturation (SpO2) and heart rate, and has widespread clinical applications. When peripheral perfusion is poor, as in states of hypovolaemia, hypothermia and vasoconstriction, oxygenation readings become unreliable or cease. The problem arises because conventional pulse oximetry sensors must be attached to the most peripheral parts of the body, such as finger, ear or toe, where pulsatile flow is most easily compromised. Pulse oximeters estimate arterial oxygen saturation by shining light at two different wavelengths, red and infrared, through vascular tissue. In this method the ac pulsatile photoplethysmographic (PPG) signal associated with cardiac contraction is assumed to be attributable solely to the arterial blood component. The amplitudes of the red and infrared ac PPG signals are sensitive to changes in arterial oxygen saturation because of differences in the light absorption of oxygenated and deoxygenated haemoglobin at these two wavelengths. From the ratios of these amplitudes, and the corresponding dc photoplethysmographic components, arterial blood oxygen saturation (SpO2) is estimated. Hence, the technique of pulse oximetry relies on the presence of adequate peripheral arterial pulsations, which are detected as photoplethysmographic (PPG) signals. The aim of this study was to investigate the effect of pressure cuff-induced hypoperfusion on photoplethysmographic signals and arterial blood oxygen saturation using a custom made finger blood oxygen saturation PPG/SpO2 sensor and a commercial finger pulse oximeter. Blood oxygen saturation values from the custom oxygen saturation sensor and a commercial finger oxygen saturation sensor were recorded from 14 healthy volunteers at various induced brachial pressures. Both pulse

  6. Arterial blood oxygen saturation during blood pressure cuff-induced hypoperfusion

    Science.gov (United States)

    Kyriacou, P. A.; Shafqat, K.; Pal, S. K.

    2007-10-01

    Pulse oximetry has been one of the most significant technological advances in clinical monitoring in the last two decades. Pulse oximetry is a non-invasive photometric technique that provides information about the arterial blood oxygen saturation (SpO2) and heart rate, and has widespread clinical applications. When peripheral perfusion is poor, as in states of hypovolaemia, hypothermia and vasoconstriction, oxygenation readings become unreliable or cease. The problem arises because conventional pulse oximetry sensors must be attached to the most peripheral parts of the body, such as finger, ear or toe, where pulsatile flow is most easily compromised. Pulse oximeters estimate arterial oxygen saturation by shining light at two different wavelengths, red and infrared, through vascular tissue. In this method the ac pulsatile photoplethysmographic (PPG) signal associated with cardiac contraction is assumed to be attributable solely to the arterial blood component. The amplitudes of the red and infrared ac PPG signals are sensitive to changes in arterial oxygen saturation because of differences in the light absorption of oxygenated and deoxygenated haemoglobin at these two wavelengths. From the ratios of these amplitudes, and the corresponding dc photoplethysmographic components, arterial blood oxygen saturation (SpO2) is estimated. Hence, the technique of pulse oximetry relies on the presence of adequate peripheral arterial pulsations, which are detected as photoplethysmographic (PPG) signals. The aim of this study was to investigate the effect of pressure cuff-induced hypoperfusion on photoplethysmographic signals and arterial blood oxygen saturation using a custom made finger blood oxygen saturation PPG/SpO2 sensor and a commercial finger pulse oximeter. Blood oxygen saturation values from the custom oxygen saturation sensor and a commercial finger oxygen saturation sensor were recorded from 14 healthy volunteers at various induced brachial pressures. Both pulse
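
    The ratio-of-ratios computation described above can be sketched as follows; the linear calibration used here is a commonly quoted textbook approximation, not the calibration of either sensor in the study:

```python
def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    """Ratio-of-ratios pulse oximetry estimate.

    R compares the normalized pulsatile (ac/dc) amplitudes at the red and
    infrared wavelengths. SpO2 ~ 110 - 25*R is a widely quoted empirical
    linearization (an assumption here, not the study's calibration curve).
    """
    ratio = (ac_red / dc_red) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * ratio

# Equal normalized amplitudes (R = 1) map to roughly 85% under this line.
```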

  7. Link Analysis of High Throughput Spacecraft Communication Systems for Future Science Missions

    Science.gov (United States)

    Simons, Rainee N.

    2015-01-01

    NASA's plan to launch several spacecraft into low Earth orbit (LEO) to support science missions in the next ten years and beyond requires downlink throughput on the order of several terabits per day. The ability to handle such a large volume of data far exceeds the capabilities of current systems. This paper proposes two solutions: first, a high data rate link between the LEO spacecraft and ground via relay satellites in geostationary orbit (GEO); second, a high data rate direct-to-ground link from LEO. The paper then presents results from computer simulations carried out for both types of links, taking into consideration the spacecraft transmitter frequency, EIRP, and waveform; the elevation-angle-dependent path loss through Earth's atmosphere; and the ground station receiver G/T.
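
    A link-budget calculation of the kind run in such simulations can be sketched as below; the frequency, slant range, EIRP, and G/T values are illustrative assumptions, not the paper's scenarios, and atmospheric loss is omitted:

```python
import math

BOLTZMANN_DBW_PER_HZ_K = -228.6  # 10*log10(Boltzmann constant)

def free_space_path_loss_db(distance_m, freq_hz):
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def cn0_dbhz(eirp_dbw, path_loss_db, gt_dbk):
    """Carrier-to-noise-density ratio: C/N0 = EIRP - Lp + G/T - 10log10(k)."""
    return eirp_dbw - path_loss_db + gt_dbk - BOLTZMANN_DBW_PER_HZ_K

# Assumed example: 26 GHz downlink, 1000 km slant range, 50 dBW EIRP,
# 30 dB/K ground station G/T.
loss = free_space_path_loss_db(1_000_000.0, 26e9)
link_cn0 = cn0_dbhz(50.0, loss, 30.0)
```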

  8. Droplet electrospray ionization mass spectrometry for high throughput screening for enzyme inhibitors.

    Science.gov (United States)

    Sun, Shuwen; Kennedy, Robert T

    2014-09-16

    High throughput screening (HTS) is important for identifying molecules with desired properties. Mass spectrometry (MS) is potentially powerful for label-free HTS due to its high sensitivity, speed, and resolution. Segmented flow, where samples are manipulated as droplets separated by an immiscible fluid, is an intriguing format for high throughput MS because it can be used to reliably and precisely manipulate nanoliter volumes and can be directly coupled to electrospray ionization (ESI) MS for rapid analysis. In this study, we describe a "MS Plate Reader" that couples standard multiwell plate HTS workflow to droplet ESI-MS. The MS plate reader can reformat 3072 samples from eight 384-well plates into nanoliter droplets segmented by an immiscible oil at 4.5 samples/s and sequentially analyze them by MS at 2 samples/s. Using the system, a label-free screen for cathepsin B modulators against 1280 chemicals was completed in 45 min with a high Z-factor (>0.72) and no false positives (24 of 24 hits confirmed). The assay revealed 11 structures not previously linked to cathepsin inhibition. For even larger scale screening, reformatting and analysis could be conducted simultaneously, which would enable more than 145,000 samples to be analyzed in 1 day.
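
    The Z-factor quoted above is a standard screening-window statistic; a minimal sketch of its computation follows (synthetic control values in the example, not the assay's data):

```python
import statistics

def z_factor(positives, negatives):
    """Screening-window coefficient: Z = 1 - 3*(sd_p + sd_n)/|mu_p - mu_n|.

    Values above ~0.5 are conventionally taken to indicate an excellent
    assay separation between positive and negative controls.
    """
    mu_p, sd_p = statistics.mean(positives), statistics.stdev(positives)
    mu_n, sd_n = statistics.mean(negatives), statistics.stdev(negatives)
    return 1.0 - 3.0 * (sd_p + sd_n) / abs(mu_p - mu_n)
```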

  9. High-Throughput Analysis With 96-Capillary Array Electrophoresis and Integrated Sample Preparation for DNA Sequencing Based on Laser Induced Fluorescence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Gang [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    The purpose of this research was to improve the fluorescence detection for the multiplexed capillary array electrophoresis, extend its use beyond the genomic analysis, and to develop an integrated micro-sample preparation system for high-throughput DNA sequencing. The authors first demonstrated multiplexed capillary zone electrophoresis (CZE) and micellar electrokinetic chromatography (MEKC) separations in a 96-capillary array system with laser-induced fluorescence detection. Migration times of four kinds of fluoresceins and six polyaromatic hydrocarbons (PAHs) are normalized to one of the capillaries using two internal standards. The relative standard deviations (RSD) after normalization are 0.6-1.4% for the fluoresceins and 0.1-1.5% for the PAHs. Quantitative calibration of the separations based on peak areas is also performed, again with substantial improvement over the raw data. This opens up the possibility of performing massively parallel separations for high-throughput chemical analysis for process monitoring, combinatorial synthesis, and clinical diagnosis. The authors further improved the fluorescence detection by step laser scanning. A computer-controlled galvanometer scanner is adapted for scanning a focused laser beam across a 96-capillary array for laser-induced fluorescence detection. The signal at a single photomultiplier tube is temporally sorted to distinguish among the capillaries. The limit of detection for fluorescein is 3 × 10⁻¹¹ M (S/N = 3) for 5 mW of total laser power scanned at 4 Hz. The observed cross-talk among capillaries is 0.2%. Advantages include the efficient utilization of light due to the high duty-cycle of step scan, good detection performance due to the reduction of stray light, ruggedness due to the small mass of the galvanometer mirror, low cost due to the simplicity of components, and flexibility due to the independent paths for excitation and emission.
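
    The two-internal-standard normalization and the RSD figure of merit can be sketched as below; the linear mapping and variable names are assumptions about the procedure, which the record does not detail:

```python
import statistics

def normalize_time(t, std1, std2, ref1, ref2):
    """Map a migration time onto a reference capillary's time axis using
    the linear transform fixed by two internal standards
    (std1 -> ref1, std2 -> ref2)."""
    slope = (ref2 - ref1) / (std2 - std1)
    return ref1 + slope * (t - std1)

def rsd_percent(values):
    """Relative standard deviation, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)
```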

  10. Nonlinear acoustics of water-saturated marine sediments

    DEFF Research Database (Denmark)

    Jensen, Leif Bjørnø

    1976-01-01

    Interest in the acoustic qualities of water-saturated marine sediments has increased considerably during recent years. The use of sources of high-intensity sound in oil prospecting, in geophysical and geological studies of bottom and subbottom materials and profiles, and recently in marine archaeology has emphasized the need for information about the nonlinear acoustic qualities of water-saturated marine sediments. While the acoustic experiments and theoretical investigations hitherto performed have concentrated on a determination of the linear acoustic qualities of water-saturated marine sediments, their parameters of nonlinear acoustics are still unexplored. The strong absorption, increasing about linearly with frequency, found in most marine sediments and the occurrence of velocity dispersion in some marine sediments restrict the number of nonlinear acoustic test methods traditionally

  11. Fast neutron (14 MeV) attenuation analysis in saturated core samples and its application in well logging

    International Nuclear Information System (INIS)

    Amin Attarzadeh; Mohammad Kamal Ghassem Al Askari; Tagy Bayat

    2009-01-01

    To introduce the application of nuclear logging, it is appropriate to provide a motivation for the use of nuclear measurement techniques in well logging. Important aspects of the geological sciences are, for instance, the grain and pore structure and porosity of the rocks, as well as the transport properties of a fluid in the porous medium. Nuclear measurements are, as a rule, non-intrusive: a measurement does not destroy the sample, and it does not interfere with the process to be measured. Non-intrusive measurements are also often much faster than alternative methods and can be applied in field measurements. A common type of nuclear measurement employs neutron irradiation, a powerful technique for geophysical analysis. In this research we describe this technique in detail and its applications to well logging and the oil industry. Experiments have been performed to investigate the possibility of using neutron attenuation measurements to determine the water and oil content of rock samples. A beam of 14 MeV neutrons produced by a 150 kV neutron generator was attenuated by different samples and subsequently detected with NE102 plastic scintillators (fast counters). Each sample was saturated with water and oil. The difference in neutron attenuation between dry and wet samples was compared with the fluid content determined by a mass balance of the sample. In this experiment we were able to determine a 3% moisture content in a standard sample model (SiO2) and to estimate porosity in geological samples saturated with different fluids. (Author)
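
    The measurement rests on narrow-beam exponential attenuation; a minimal sketch of the dry/wet comparison follows (the cross-section values are illustrative placeholders, not measured constants from the experiment):

```python
import math

def transmitted_fraction(sigma_per_cm, thickness_cm):
    """Narrow-beam attenuation: I/I0 = exp(-Sigma * x)."""
    return math.exp(-sigma_per_cm * thickness_cm)

def water_path_cm(wet_over_dry, sigma_water_per_cm):
    """Equivalent water path inferred from the wet/dry transmission ratio."""
    return -math.log(wet_over_dry) / sigma_water_per_cm

# Assumed, order-of-magnitude numbers: a rock matrix and an effective
# macroscopic cross section for water at 14 MeV.
SIGMA_WATER = 0.1                                   # 1/cm, placeholder
dry = transmitted_fraction(0.05, 10.0)              # dry core sample
wet = dry * transmitted_fraction(SIGMA_WATER, 2.0)  # plus 2 cm water path
```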

  12. Saturation and postsaturation phenomena of Rayleigh-Taylor instability with adjacent modes

    International Nuclear Information System (INIS)

    Ikegawa, Tadashi; Nishihara, Katsunobu

    2003-01-01

    A weakly nonlinear theory has been developed for the classical Rayleigh-Taylor instability with a finite bandwidth taken into account self-consistently. The theory includes up to third-order nonlinearity, which results in the saturation of linear growth and determines subsequent weakly nonlinear growth. Analytical results are shown to agree fairly well with two-dimensional hydrodynamic simulations. There are generally many local peaks of a perturbation with a finite bandwidth due to the interference of modes. Since a local amplitude is determined from phases among the modes as well as the bandwidth, we have investigated the onset of linear growth saturation and the subsequent weakly nonlinear growth for different bandwidths and phases. It is shown that the saturation of the linear growth occurs locally, i.e., each of the local maximum amplitudes (LMAs) grows exponentially until it reaches almost the same saturation amplitude. In the random phase case, the root mean square amplitude thus saturates with almost the same amplitude as the LMA, after most of the LMAs have saturated. The saturation amplitude of the LMA is found to be independent of the bandwidth and depends on the Atwood number. We derive a formula for the saturation amplitude of modes based on the results obtained, and discuss its relation with Haan's formula [Phys. Rev. A 39, 5812 (1989)]. The LMAs grow linearly in time after the saturation and their speeds are approximated by the product of the linear growth rate and the saturation amplitude. We investigate the Atwood number dependence of both the saturation amplitude and the weakly nonlinear growth.
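
    The post-saturation behavior described above can be written compactly; as a reading of the abstract (with notation assumed here, not taken from the paper), each local maximum amplitude grows as

```latex
\eta_{\mathrm{LMA}}(t) \;\approx\; \eta_s \;+\; \gamma\,\eta_s\,(t - t_s), \qquad t > t_s,
```

    where \(\gamma\) is the linear growth rate, \(\eta_s\) the saturation amplitude, and \(t_s\) the local saturation time, so the post-saturation speed is the product \(\gamma\eta_s\) stated in the text.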

  13. Ethoscopes: An open platform for high-throughput ethomics.

    Directory of Open Access Journals (Sweden)

    Quentin Geissmann

    2017-10-01

    Full Text Available Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  14. glbase: a framework for combining, analyzing and displaying heterogeneous genomic and high-throughput sequencing data

    Directory of Open Access Journals (Sweden)

    Andrew Paul Hutchins

    2014-01-01

    Full Text Available Genomic datasets and the tools to analyze them have proliferated at an astonishing rate. However, such tools are often poorly integrated with each other: each program typically produces its own custom output in a variety of non-standard file formats. Here we present glbase, a framework that uses a flexible set of descriptors that can quickly parse non-binary data files. glbase includes many functions to intersect two lists of data, including operations on genomic interval data and support for the efficient random access to huge genomic data files. Many glbase functions can produce graphical outputs, including scatter plots, heatmaps, boxplots and other common analytical displays of high-throughput data such as RNA-seq, ChIP-seq and microarray expression data. glbase is designed to rapidly bring biological data into a Python-based analytical environment to facilitate analysis and data processing. In summary, glbase is a flexible and multifunctional toolkit that allows the combination and analysis of high-throughput data (especially next-generation sequencing and genome-wide data), and which has been instrumental in the analysis of complex data sets. glbase is freely available at http://bitbucket.org/oaxiom/glbase/.

  15. glbase: a framework for combining, analyzing and displaying heterogeneous genomic and high-throughput sequencing data.

    Science.gov (United States)

    Hutchins, Andrew Paul; Jauch, Ralf; Dyla, Mateusz; Miranda-Saavedra, Diego

    2014-01-01

    Genomic datasets and the tools to analyze them have proliferated at an astonishing rate. However, such tools are often poorly integrated with each other: each program typically produces its own custom output in a variety of non-standard file formats. Here we present glbase, a framework that uses a flexible set of descriptors that can quickly parse non-binary data files. glbase includes many functions to intersect two lists of data, including operations on genomic interval data and support for the efficient random access to huge genomic data files. Many glbase functions can produce graphical outputs, including scatter plots, heatmaps, boxplots and other common analytical displays of high-throughput data such as RNA-seq, ChIP-seq and microarray expression data. glbase is designed to rapidly bring biological data into a Python-based analytical environment to facilitate analysis and data processing. In summary, glbase is a flexible and multifunctional toolkit that allows the combination and analysis of high-throughput data (especially next-generation sequencing and genome-wide data), and which has been instrumental in the analysis of complex data sets. glbase is freely available at http://bitbucket.org/oaxiom/glbase/.

  16. Comparison of empirical models and laboratory saturated hydraulic ...

    African Journals Online (AJOL)

    Numerous methods for estimating soil saturated hydraulic conductivity exist, which range from direct measurement in the laboratory to models that use only basic soil properties. A study was conducted to compare laboratory saturated hydraulic conductivity (Ksat) measurement and that estimated from empirical models.

  17. Comparative analysis of transcriptomes in aerial stems and roots of Ephedra sinica based on high-throughput mRNA sequencing

    Directory of Open Access Journals (Sweden)

    Taketo Okada

    2016-12-01

    Full Text Available Ephedra plants are taxonomically classified as gymnosperms, and are medicinally important as the botanical origin of crude drugs and as bioresources that contain pharmacologically active chemicals. Here we show a comparative analysis of the transcriptomes of aerial stems and roots of Ephedra sinica based on high-throughput mRNA sequencing by RNA-Seq. De novo assembly of short cDNA sequence reads generated 23,358, 13,373, and 28,579 contigs longer than 200 bases from aerial stems, roots, or both aerial stems and roots, respectively. The presumed functions encoded by these contig sequences were annotated by BLAST (blastx). Subsequently, these contigs were classified based on gene ontology slims, Enzyme Commission numbers, and the InterPro database. Furthermore, comparative gene expression analysis was performed between aerial stems and roots. These transcriptome analyses revealed differences and similarities between the transcriptomes of aerial stems and roots in E. sinica. Deep transcriptome sequencing of Ephedra should open the door to molecular biological studies based on the entire transcriptome, tissue- or organ-specific transcriptomes, or targeted genes of interest.

  18. Laterally orienting C. elegans using geometry at microscale for high-throughput visual screens in neurodegeneration and neuronal development studies.

    Directory of Open Access Journals (Sweden)

    Ivan de Carlos Cáceres

    Full Text Available C. elegans is an excellent model system for studying neuroscience using genetics because of its relatively simple nervous system, sequenced genome, and the availability of a large number of transgenic and mutant strains. Recently, microfluidic devices have been used for high-throughput genetic screens, replacing traditional methods of manually handling C. elegans. However, the orientation of nematodes within microfluidic devices is random and often not conducive to inspection, hindering visual analysis and overall throughput. In addition, while previous studies have utilized methods to bias head and tail orientation, none of the existing techniques allow for orientation along the dorso-ventral body axis. Here, we present the design of a simple and robust method for passively orienting worms into lateral body positions in microfluidic devices to facilitate inspection of morphological features with specific dorso-ventral alignments. Using this technique, we can position animals into lateral orientations with up to 84% efficiency, compared to 21% using existing methods. We isolated six mutants with neuronal development or neurodegenerative defects, showing that our technology can be used for on-chip analysis and high-throughput visual screens.

  19. Bulk elastic wave propagation in partially saturated porous solids

    International Nuclear Information System (INIS)

    Berryman, J.G.; Thigpen, L.; Chin, R.C.Y.

    1988-01-01

    The linear equations of motion that describe the behavior of small disturbances in a porous solid containing both liquid and gas are solved for bulk wave propagation. The equations have been simplified by neglecting effects due to changes in capillary pressure. With this simplifying assumption, the equations reduce to two coupled (vector) equations of the form found in Biot's equations (for full saturation) but with more complicated coefficients. As in fully saturated solids, two shear waves with the same speed but different polarizations exist, as do two compressional waves with distinct speeds. Attenuation effects can be enhanced in the partially saturated solid, depending on the distribution of gas in the pore space. Two models of the liquid/gas spatial distribution are considered: a segregated-fluids model and a mixed-fluids model. The two models predict comparable attenuation when the gas saturation is low, but the segregated-fluids model predicts a more rapid roll-off of attenuation as the gas saturation increases

  20. Shearing of saturated clays in rock joints at high confining pressures

    International Nuclear Information System (INIS)

    Wang, C.; Mao, N.

    1979-01-01

    Saturated clays are sheared between rock joints at various pore water pressures and at confining pressures up to 3 kb (300 MPa). Sliding on these joints is stable. For a given clay, the shear stress required to initiate sliding increases linearly with the effective normal stress across the sliding surface, with a slope of 0.08 ± 0.01 for joints filled with saturated montmorillonite, 0.12 ± 0.01 with saturated chlorite, 0.15 ± 0.01 with saturated kaolinite, and 0.22 ± 0.02 with saturated silty illite. Thus at high confining pressures the shear stress required to initiate sliding on joints filled with saturated clays is very much smaller than that required to initiate sliding on clean rock joints or on joints filled with dry gouge materials. In the crust, saturation of gouge materials along active faults would greatly lower the frictional resistance to faulting and would stabilize fault movement. Different fault behaviors, such as stable creep along some faults and intermittent but sudden slip along others, may reflect in part different degrees of saturation of fault zones at depth.
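The reported linear friction law can be applied directly; a minimal sketch using the slopes quoted in the abstract (the 100 MPa effective normal stress is an illustrative input):

```python
# Sketch of the linear friction law reported above: the shear stress to
# initiate sliding is tau = mu * sigma_eff, with the measured slopes (mu)
# for each saturated clay gouge taken from the abstract.
SLOPES = {
    "montmorillonite": 0.08,
    "chlorite": 0.12,
    "kaolinite": 0.15,
    "silty illite": 0.22,
}

def sliding_shear_stress(clay, sigma_eff_mpa):
    """Shear stress (MPa) to initiate sliding at effective normal stress sigma_eff (MPa)."""
    return SLOPES[clay] * sigma_eff_mpa

# e.g. at 100 MPa effective normal stress
for clay in SLOPES:
    print(clay, sliding_shear_stress(clay, 100.0), "MPa")
```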

  1. High throughput, low set-up time reconfigurable linear feedback shift registers

    NARCIS (Netherlands)

    Nas, R.J.M.; Berkel, van C.H.

    2010-01-01

    This paper presents a hardware design for a scalable, high throughput, configurable LFSR. High throughput is achieved by producing L consecutive outputs per clock cycle with a clock cycle period that, for practical cases, increases only logarithmically with the block size L and the length of the
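The block-output idea can be illustrated in software; this is a hedged sketch of a Fibonacci LFSR that emits L consecutive bits per call, not the paper's hardware architecture (the width and tap polynomial, x^16 + x^14 + x^13 + x^11 + 1, are assumptions chosen as a common maximal-length example):

```python
# Hedged sketch, not the paper's hardware design: a Fibonacci LFSR that
# emits L consecutive output bits per call, mimicking the L-outputs-per-
# clock-cycle idea described in the abstract.
def lfsr_block(state, taps=(16, 14, 13, 11), width=16, L=8):
    """Advance the LFSR by L steps; return (new_state, list of L output bits)."""
    out = []
    for _ in range(L):
        feedback = 0
        for t in taps:
            feedback ^= (state >> (width - t)) & 1
        out.append(state & 1)                      # current output bit
        state = (state >> 1) | (feedback << (width - 1))
    return state, out

state, bits = lfsr_block(0xACE1)
print(bits)
```

A hardware implementation would compute the L feedback bits in parallel with an XOR network rather than iterating, which is where the logarithmic growth in cycle time quoted above comes from.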

  2. A High-Throughput, Precipitating Colorimetric Sandwich ELISA Microarray for Shiga Toxins

    Directory of Open Access Journals (Sweden)

    Andrew Gehring

    2014-06-01

    Full Text Available Shiga toxins 1 and 2 (Stx1 and Stx2) from Shiga toxin-producing E. coli (STEC) bacteria were simultaneously detected with a newly developed, high-throughput antibody microarray platform. The proteinaceous toxins were immobilized and sandwiched between biorecognition elements (monoclonal antibodies) and pooled horseradish peroxidase (HRP)-conjugated monoclonal antibodies. Following the reaction of HRP with the precipitating chromogenic substrate (metal-enhanced 3,3′-diaminobenzidine tetrahydrochloride, or DAB), the formation of a colored product was quantitatively measured with an inexpensive flatbed page scanner. The colorimetric ELISA microarray was demonstrated to detect Stx1 and Stx2 at levels as low as ~4.5 ng/mL within ~2 h of total assay time, with a narrow linear dynamic range of ~1–2 orders of magnitude and saturation levels well above background. Stx1 and/or Stx2 produced by various strains of STEC were also detected following the treatment of cultured cells with mitomycin C (a toxin-inducing antibiotic) and/or B-PER (a cell-disrupting, protein extraction reagent). Semi-quantitative detection of Shiga toxins was sporadic among various STEC strains following incubation with mitomycin C; however, further treatment with B-PER generally resulted in detection of, or increased detection of, Stx1 relative to Stx2 produced by STECs inoculated into either axenic broth culture or culture broth containing ground beef.

  3. The transport behaviour of elemental mercury DNAPL in saturated porous media: analysis of field observations and two-phase flow modelling.

    Science.gov (United States)

    Sweijen, Thomas; Hartog, Niels; Marsman, Annemieke; Keijzer, Thomas J S

    2014-06-01

    Mercury is a contaminant of global concern. The use of elemental mercury in various (former) industrial processes, such as chlorine production at chlor-alkali plants, is known to have resulted in soil and groundwater contamination worldwide. However, the subsurface transport behaviour of elemental mercury as an immiscible dense non-aqueous phase liquid (DNAPL) in porous media has received minimal attention to date, even though such insight would aid the remediation of mercury-contaminated sites. Therefore, in this study a detailed field characterization of elemental mercury DNAPL distribution with depth was performed together with two-phase flow modelling, using STOMP, to evaluate the dynamics of mercury DNAPL migration and the controls on its distribution in saturated porous media. Using a CPT probe mounted with a digital camera, the in-situ mercury DNAPL depth distribution was obtained at a former chlor-alkali plant, down to 9 m below ground surface. Images revealing the presence of silvery mercury DNAPL droplets were used to quantify its distribution, characteristics and saturation, using an image analysis method. These field observations with depth were compared with results from a one-dimensional two-phase flow model simulation for the same transect. Considering the limitations of this approach, the simulations reasonably reflected the variability and range of the mercury DNAPL distribution. To further explore the impact of mercury's physical properties in comparison with more common DNAPLs, the migration of mercury and PCE DNAPL in several typical hydrological scenarios was simulated. Comparison of the simulations suggests that mercury's higher density is the dominant factor controlling its penetration into saturated porous media, despite its higher resistance to flow due to its higher viscosity.
Based on these results, the hazard of spilled mercury DNAPL to cause deep contamination of groundwater systems seems larger than for any other

  4. Fractal analysis of fracture increasing spontaneous imbibition in porous media with gas-saturated

    KAUST Repository

    Cai, Jianchao; Sun, Shuyu

    2013-01-01

    Spontaneous imbibition (SI) of wetting liquid into matrix blocks due to capillary pressure is regarded as an important recovery mechanism in low-permeability fractured reservoirs. In this paper, an analytical model is proposed for characterizing horizontal SI from a single plane fracture into gas-saturated matrix blocks. The presented model is based on the fractal character of pores in the porous matrix, with the gravity force included in the entire imbibition process. The accumulated mass of wetting liquid imbibed into matrix blocks is related to a number of factors such as contact area, pore fractal dimension, tortuosity, maximum pore size, porosity, liquid density and viscosity, surface tension, contact angle, as well as height and tilt angle of the fracture. The mechanism of fracture-enhanced SI is analyzed accordingly. Because of the effect of the fracture, the gravity force assists the imbibition process. Additionally, the farther a pore lies from the top of the fracture, the more influential the hydrostatic pressure is upon the imbibition. The presented fractal analysis of horizontal spontaneous imbibition from a single fracture could also shed light on the scaling study of the mass transfer function between the matrix and fracture systems of fractured reservoirs. © 2013 World Scientific Publishing Company.

  6. Quantitative chemical exchange saturation transfer (qCEST) MRI - omega plot analysis of RF-spillover-corrected inverse CEST ratio asymmetry for simultaneous determination of labile proton ratio and exchange rate.

    Science.gov (United States)

    Wu, Renhua; Xiao, Gang; Zhou, Iris Yuwen; Ran, Chongzhao; Sun, Phillip Zhe

    2015-03-01

    Chemical exchange saturation transfer (CEST) MRI is sensitive to labile proton concentration and exchange rate, thus allowing measurement of dilute CEST agents and microenvironmental properties. However, CEST measurement depends not only on the CEST agent properties but also on the experimental conditions. Quantitative CEST (qCEST) analysis has been proposed to address the limitations of the commonly used simplistic CEST-weighted calculation. Recent research has shown that the concomitant direct RF saturation (spillover) effect can be corrected using an inverse CEST ratio calculation. We postulated that a simplified qCEST analysis is feasible with omega plot analysis of the inverse CEST asymmetry calculation. Specifically, simulations showed that the numerically derived labile proton ratio and exchange rate were in good agreement with input values. In addition, the qCEST analysis was confirmed experimentally in a phantom with concurrent variation in CEST agent concentration and pH. We also demonstrated that the derived labile proton ratio increased linearly with creatine concentration, showing that the omega plot analysis can simultaneously determine labile proton ratio and exchange rate in a relatively complex in vitro CEST system. Copyright © 2015 John Wiley & Sons, Ltd.
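Omega plot analysis linearizes the inverse CEST ratio against 1/ω1²; the sketch below follows the widely used Dixon-style linearization, where under a simple two-pool model CESTR = f·k·T1w·ω1²/(ω1² + k²), so a straight-line fit of 1/CESTR versus 1/ω1² recovers both the exchange rate k and the labile proton ratio f. The forward model and parameter values are illustrative assumptions, not the authors' exact pipeline:

```python
# Hedged sketch of an omega-plot style qCEST analysis. With
# CESTR = f*k*T1w*w1^2/(w1^2 + k^2), the inverse CEST ratio obeys
# 1/CESTR = a + b/w1^2 with a = 1/(f*k*T1w) and b = k/(f*T1w),
# so k = sqrt(b/a) and f = k/(b*T1w). Numbers are illustrative.
import numpy as np

def omega_plot_fit(w1, cestr, t1w):
    """Fit 1/CESTR = a + b/w1^2; return (f, k)."""
    slope, intercept = np.polyfit(1.0 / w1**2, 1.0 / cestr, 1)
    k = np.sqrt(slope / intercept)
    f = k / (slope * t1w)
    return f, k

# synthetic data generated from the same forward model
f_true, k_true, t1w = 1e-3, 500.0, 2.0          # ratio, s^-1, s
w1 = np.array([200.0, 400.0, 600.0, 800.0])     # saturation amplitudes, rad/s
cestr = f_true * k_true * t1w * w1**2 / (w1**2 + k_true**2)
f_est, k_est = omega_plot_fit(w1, cestr, t1w)
print(f_est, k_est)
```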

  7. A demonstration experiment for studying the properties of saturated vapor

    Science.gov (United States)

    Grebenev, Igor V.; Lebedeva, Olga V.; Polushkina, Svetlana V.

    2017-11-01

    The paper proposes an important demonstration experiment that can be used in secondary-school physics. The described experiment helps students learn the main concepts of the topic ‘saturated vapor’, namely evaporation, condensation, dynamic equilibrium, saturated vapor, partial pressure, and the dependence of saturated vapor pressure on temperature.

  8. A new theoretical interpretation of Archie's saturation exponent

    Directory of Open Access Journals (Sweden)

    P. W. J. Glover

    2017-07-01

    Full Text Available This paper describes the extension of the concepts of connectedness and conservation of connectedness that underlie the generalized Archie's law for n phases to the interpretation of the saturation exponent. It is shown that the saturation exponent as defined originally by Archie arises naturally from the generalized Archie's law. In the generalized Archie's law the saturation exponent of any given phase can be thought of as formally the same as the phase (i.e. cementation) exponent, but with respect to a reference subset of phases in a larger n-phase medium. Furthermore, the connectedness of each of the phases occupying a reference subset of an n-phase medium can be related to the connectedness of the subset itself by G_i = G_ref S_i^(n_i). This leads naturally to the idea of the term S_i^(n_i) for each phase i being a fractional connectedness, where the fractional connectednesses of any given reference subset sum to unity in the same way that the connectednesses sum to unity for the whole medium. One of the implications of this theory is that the saturation exponent of any phase can now be interpreted as the rate of change of the fractional connectedness with saturation and connectivity within the reference subset.

  9. High speed drying of saturated steam

    International Nuclear Information System (INIS)

    Marty, C.; Peyrelongue, J.P.

    1993-01-01

    This paper describes the development of the drying process for the saturated steam used in PWR nuclear plant turbines, in order to prevent the negative effects of water on turbine efficiency, maintenance costs and equipment lifetime. The high speed drying concept is based on rotating the incoming saturated steam in order to separate the water, which is denser than the steam; the water film is then extracted through an annular slot. Multicellular modular equipment has been tested. Applications to high- and low-pressure extraction in various PWR plants are described (Bugey, Loviisa).

  10. Automation in Cytomics: A Modern RDBMS Based Platform for Image Analysis and Management in High-Throughput Screening Experiments

    NARCIS (Netherlands)

    E. Larios (Enrique); Y. Zhang (Ying); K. Yan (Kuan); Z. Di; S. LeDévédec (Sylvia); F.E. Groffen (Fabian); F.J. Verbeek

    2012-01-01

    In cytomics, bookkeeping of the data generated during lab experiments is crucial. The current approach in cytomics is to conduct High-Throughput Screening (HTS) experiments so that cells can be tested under many different experimental conditions. Given the large amount of different

  11. Molybdenite saturation in silicic magmas: Occurrence and petrological implications

    Science.gov (United States)

    Audetat, A.; Dolejs, D.; Lowenstern, J. B.

    2011-01-01

    We identified molybdenite (MoS2) as an accessory magmatic phase in 13 out of 27 felsic magma systems examined worldwide. The molybdenite occurs as small grains; molybdenite-saturated samples reveal 1-13 ppm Mo in the melt and geochemical signatures that imply a strong link to continental rift basalt-rhyolite associations. In contrast, arc-associated rhyolites are rarely molybdenite-saturated, despite similar Mo concentrations. This systematic dependence on tectonic setting seems to reflect the higher oxidation state of arc magmas compared with within-plate magmas. A thermodynamic model devised to investigate the effects of T, f O2 and f S2 on molybdenite solubility reliably predicts measured Mo concentrations in molybdenite-saturated samples if the magmas are assumed to have been saturated also in pyrrhotite. Whereas pyrrhotite microphenocrysts have been observed in some of these samples, they have not been observed in other molybdenite-bearing magmas. Based on the strong influence of f S2 on molybdenite solubility, we calculate that these latter magmas must also have been at (or very close to) pyrrhotite saturation. In this case the Mo concentration of molybdenite-saturated melts can be used to constrain both magmatic f O2 and f S2 if temperature is known independently (e.g. by zircon saturation thermometry). Our model thus permits evaluation of magmatic f S2, which is an important variable but is difficult to estimate otherwise, particularly in slowly cooled rocks. © The Author 2011. Published by Oxford University Press. All rights reserved.

  12. Nuclear determination of saturation profiles in core plugs

    International Nuclear Information System (INIS)

    Sletsgaard, J.; Oelgaard, P.L.

    1997-01-01

    A method to determine liquid saturations in core plugs during flooding is of importance when the relative permeability and capillary pressure function are to be determined. This part of the EFP-95 project uses transmission of γ-radiation to determine these saturations. In γ-transmission measurements, the electron density of the given substance is measured. This is an advantage compared with methods that use electric conductivity, since neither oil nor gas conducts electricity. At the moment a single 137Cs source is used, but a theoretical investigation has been performed of whether it is possible to determine three saturations using two radioactive sources with different γ-energies. Measurements were made on three core plugs. To make sure that the measurements could be reproduced, all the plugs had a point of reference, i.e. a mark, so that it was possible to place the plug the same way every time. Two computer programs for calculation of saturation and porosity, and the experimental setup, are listed. (EG)
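The mooted two-energy measurement reduces to a small linear system once Beer-Lambert attenuation is assumed; a hedged sketch follows (all attenuation coefficients, the plug thickness, and the porosity are illustrative assumptions, not measured values from the project):

```python
# Hedged sketch of the two-source idea mentioned above: with Beer-Lambert
# attenuation ln(I0/I) measured at two gamma energies, plus the closure
# Sw + So + Sg = 1, the three fluid saturations follow from a 3x3 linear
# solve. All coefficients (1/m), thickness d (m) and porosity phi are
# illustrative, not measured values.
import numpy as np

def solve_saturations(log_atten, mu_fluids, mu_rock, d, phi):
    """log_atten: ln(I0/I) at two energies; mu_fluids: 2x3 coefficients
    for (water, oil, gas) at those energies. Returns [Sw, So, Sg]."""
    A = np.vstack([d * phi * np.asarray(mu_fluids), np.ones(3)])
    b = np.array([log_atten[0] - d * mu_rock[0],
                  log_atten[1] - d * mu_rock[1],
                  1.0])
    return np.linalg.solve(A, b)

mu_fluids = [[8.0, 7.0, 0.1],    # energy 1: water, oil, gas
             [5.0, 3.0, 0.05]]   # energy 2
mu_rock, d, phi = (50.0, 30.0), 0.05, 0.25
s_true = np.array([0.5, 0.3, 0.2])
log_atten = [d * mu_rock[i] + d * phi * np.dot(mu_fluids[i], s_true)
             for i in range(2)]
print(solve_saturations(log_atten, mu_fluids, mu_rock, d, phi))
```

The system is only well conditioned when the fluids' attenuation coefficients differ enough between the two energies, which is presumably what the theoretical investigation mentioned above had to assess.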

  13. Continuous-wave to pulse regimes for a family of passively mode-locked lasers with saturable nonlinearity

    Science.gov (United States)

    Dikandé, Alain M.; Voma Titafan, J.; Essimbi, B. Z.

    2017-10-01

    The transition dynamics from continuous-wave to pulse regimes of operation for a generic model of passively mode-locked lasers with saturable absorbers, characterized by an active medium with non-Kerr nonlinearity, are investigated analytically and numerically. The system is described by a complex Ginzburg-Landau equation with a general m:n saturable nonlinearity (i.e. I^m/(1+ΓI)^n, where I is the field intensity and m and n are two positive numbers), coupled to a two-level gain equation. An analysis of stability of continuous waves, following the modulational instability approach, provides a global picture of the self-starting dynamics in the system. The analysis reveals two distinct routes depending on the values of the couple (m, n) and on the dispersion regime: in the normal dispersion regime, when m = 2 and n is arbitrary, self-starting requires positive values of the fast saturable absorber and nonlinearity coefficients, but negative values of these two parameters for the family with m = 0. However, when the spectral filter is negative, the laser can self-start for certain values of the input field and the nonlinearity saturation coefficient Γ. The present work provides a general map for the self-starting mechanisms of rare-earth doped figure-eight fiber lasers, as well as Kerr-lens mode-locked solid-state lasers.

  14. Estimating the cardiovascular mortality burden attributable to the European Common Agricultural Policy on dietary saturated fats.

    Science.gov (United States)

    Lloyd-Williams, Ffion; O'Flaherty, Martin; Mwatsama, Modi; Birt, Christopher; Ireland, Robin; Capewell, Simon

    2008-07-01

    To estimate the burden of cardiovascular disease within 15 European Union countries (before the 2004 enlargement) as a result of excess dietary saturated fats attributable to the Common Agricultural Policy (CAP). A spreadsheet model was developed to synthesize data on population, diet, cholesterol levels and mortality rates. A conservative estimate of a reduction in saturated fat consumption of just 2.2 g was chosen, representing 1% of daily energy intake. The fall in serum cholesterol concentration was then calculated, assuming that this 1% reduction in saturated fat consumption was replaced with 0.5% monounsaturated and 0.5% polyunsaturated fats. The resulting reduction in cardiovascular and stroke deaths was then estimated, and a sensitivity analysis conducted. Reducing saturated fat consumption by 1% and increasing monounsaturated and polyunsaturated fat by 0.5% each would lower blood cholesterol levels by approximately 0.06 mmol/l, resulting in approximately 9800 fewer coronary heart disease deaths and 3000 fewer stroke deaths each year. The cardiovascular disease burden attributable to CAP appears substantial. Furthermore, these calculations were conservative estimates, and the true mortality burden may be higher. The analysis contributes to the current wider debate concerning the relationship between CAP, health and chronic disease across Europe, together with recent international developments and commitments to reduce chronic diseases. The reported mortality estimates should be considered in relation to the current CAP and any future reforms.

  15. Arterial blood oxygen saturation during blood pressure cuff-induced hypoperfusion

    Energy Technology Data Exchange (ETDEWEB)

    Kyriacou, P A [School of Engineering and Mathematical Sciences, City University, London EC1V 0HB (United Kingdom); Shafqat, K [School of Engineering and Mathematical Sciences, City University, London EC1V 0HB (United Kingdom); Pal, S K [St Andrew's Centre for Plastic Surgery and Burns, Broomfield Hospital, Chelmsford, CM1 7ET (United Kingdom)]

    2007-10-15

    Pulse oximetry has been one of the most significant technological advances in clinical monitoring in the last two decades. Pulse oximetry is a non-invasive photometric technique that provides information about the arterial blood oxygen saturation (SpO2) and heart rate, and has widespread clinical applications. When peripheral perfusion is poor, as in states of hypovolaemia, hypothermia and vasoconstriction, oxygenation readings become unreliable or cease. The problem arises because conventional pulse oximetry sensors must be attached to the most peripheral parts of the body, such as the finger, ear or toe, where pulsatile flow is most easily compromised. Pulse oximeters estimate arterial oxygen saturation by shining light at two different wavelengths, red and infrared, through vascular tissue. In this method the ac pulsatile photoplethysmographic (PPG) signal associated with cardiac contraction is assumed to be attributable solely to the arterial blood component. The amplitudes of the red and infrared ac PPG signals are sensitive to changes in arterial oxygen saturation because of differences in the light absorption of oxygenated and deoxygenated haemoglobin at these two wavelengths. From the ratios of these amplitudes and the corresponding dc photoplethysmographic components, arterial blood oxygen saturation (SpO2) is estimated. Hence, the technique of pulse oximetry relies on the presence of adequate peripheral arterial pulsations, which are detected as photoplethysmographic (PPG) signals. The aim of this study was to investigate the effect of pressure cuff-induced hypoperfusion on photoplethysmographic signals and arterial blood oxygen saturation using a custom-made finger blood oxygen saturation PPG/SpO2 sensor and a commercial finger pulse oximeter.
Blood oxygen saturation values from the custom oxygen saturation sensor and a commercial finger oxygen saturation sensor were recorded from 14 healthy volunteers at various induced brachial pressures.
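The ratio-of-ratios computation described above can be sketched as follows; the linear calibration SpO2 ≈ 110 − 25R is a commonly quoted empirical approximation, not the calibration of either sensor in the study:

```python
# Hedged sketch of the ratio-of-ratios computation described above. The
# linear calibration SpO2 ~ 110 - 25*R is a widely quoted empirical
# approximation; real oximeters use device-specific calibration curves.
def spo2_ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate SpO2 (%) from red and infrared AC/DC PPG components."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * r

# illustrative normalized amplitudes giving R = 0.5
print(spo2_ratio_of_ratios(0.01, 1.0, 0.02, 1.0))
```

Because both AC amplitudes are normalized by their DC components, the estimate is insensitive to overall light intensity, but it still fails when the AC pulsatile component itself vanishes, which is exactly the hypoperfusion regime this study investigates.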

  16. Analysis of grain growth process in melt spun Fe-B alloys under the initial saturated grain boundary segregation condition

    International Nuclear Information System (INIS)

    Chen, Z.; Liu, F.; Yang, X.Q.; Fan, Y.; Shen, C.J.

    2012-01-01

    Highlights: → We compared pure kinetic, pure thermodynamic and extended thermo-kinetic models. → An initial saturated GB segregation condition of nanoscale Fe-B alloys was determined. → The controlling mechanism was proposed using two characteristic times (t1 and t2). - Abstract: A grain growth process in melt spun low-solid-solubility Fe-B alloys was analyzed under the initial saturated grain boundary (GB) segregation condition. Applying the melt spinning technique, single-phase supersaturated nanograins were prepared. Grain growth behavior of the single-phase supersaturated nanograins was investigated by performing isothermal annealing at 700 °C. Combined with the effect of GB segregation on the initial GB excess amount, the thermo-kinetic model [Chen et al., Acta Mater. 57 (2009) 1466] was extended to describe the initial GB segregation condition of nanoscale Fe-B alloys. By comparing the pure kinetic model, the pure thermodynamic model and the extended thermo-kinetic model, an initial saturated GB segregation condition was determined. The controlling mechanism of grain growth under the initial saturated GB segregation condition was proposed using two characteristic annealing times (t1 and t2): a mainly kinetic-controlled process (t ≤ t1), a transition from kinetic to thermodynamic mechanism (t1 < t < t2), and a purely thermodynamic-controlled process (t ≥ t2).

  17. A high-throughput fluorescence resonance energy transfer (FRET)-based endothelial cell apoptosis assay and its application for screening vascular disrupting agents

    International Nuclear Information System (INIS)

    Zhu, Xiaoming; Fu, Afu; Luo, Kathy Qian

    2012-01-01

    Highlights: ► An endothelial cell apoptosis assay using a FRET-based biosensor was developed. ► The fluorescence of the cells changed from green to blue during apoptosis. ► This method was developed into a high-throughput assay in 96-well plates. ► This assay was applied to screen vascular disrupting agents. -- Abstract: In this study, we developed a high-throughput endothelial cell apoptosis assay using a fluorescence resonance energy transfer (FRET)-based biosensor. After exposure to the apoptotic inducer UV-irradiation or anticancer drugs such as paclitaxel, the fluorescence of the cells changed from green to blue. We developed this method into a high-throughput assay in 96-well plates by measuring the emission ratio of yellow fluorescent protein (YFP) to cyan fluorescent protein (CFP) to monitor the activation of a key protease, caspase-3, during apoptosis. The Z′ factor for this assay was above 0.5, which indicates that the assay is suitable for high-throughput analysis. Finally, we applied this functional high-throughput assay to screen for vascular disrupting agents (VDAs), which could induce endothelial cell apoptosis, from our in-house compound library, and dioscin was identified as a hit. As this assay allows real-time and sensitive detection of cell apoptosis, it will be a useful tool for monitoring endothelial cell apoptosis in living cells and for identifying new VDA candidates via high-throughput screening.
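The Z′ factor quoted above is the standard assay-quality statistic of Zhang, Chung and Oldenburg (1999); a minimal sketch with illustrative control values:

```python
# Hedged sketch: the Z' factor is the standard assay-quality statistic
#   Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|
# (Zhang, Chung and Oldenburg, 1999). Control values are illustrative,
# not data from the study.
import statistics

def z_prime(positives, negatives):
    sp, sn = statistics.stdev(positives), statistics.stdev(negatives)
    mp, mn = statistics.mean(positives), statistics.mean(negatives)
    return 1.0 - 3.0 * (sp + sn) / abs(mp - mn)

pos = [0.95, 1.02, 0.98, 1.05, 1.00]  # e.g. YFP/CFP ratios, apoptotic controls
neg = [2.01, 1.95, 2.08, 1.99, 2.02]  # untreated controls
print(round(z_prime(pos, neg), 2))    # > 0.5 indicates a screen-ready assay
```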

  18. High-throughput screening of small molecule libraries using SAMDI mass spectrometry.

    Science.gov (United States)

    Gurard-Levin, Zachary A; Scholle, Michael D; Eisenberg, Adam H; Mrksich, Milan

    2011-07-11

    High-throughput screening is a common strategy used to identify compounds that modulate biochemical activities, but many approaches depend on cumbersome fluorescent reporters or antibodies and often produce false-positive hits. The development of "label-free" assays addresses many of these limitations, but current approaches still lack the throughput needed for applications in drug discovery. This paper describes a high-throughput, label-free assay that combines self-assembled monolayers with mass spectrometry, in a technique called SAMDI, as a tool for screening libraries of 100,000 compounds in one day. This method is fast, has high discrimination, and is amenable to a broad range of chemical and biological applications.

  19. High throughput screening of phenoxy carboxylic acids with dispersive solid phase extraction followed by direct analysis in real time mass spectrometry.

    Science.gov (United States)

    Wang, Jiaqin; Zhu, Jun; Si, Ling; Du, Qi; Li, Hongli; Bi, Wentao; Chen, David Da Yong

    2017-12-15

    A high-throughput, low-environmental-impact methodology for rapid determination of phenoxy carboxylic acids (PCAs) in water samples was developed by combining dispersive solid phase extraction (DSPE) using velvet-like graphitic carbon nitride (V-g-C3N4) and direct analysis in real time mass spectrometry (DART-MS). Due to the large surface area and good dispersity of V-g-C3N4, the DSPE of PCAs in water was completed within 20 s, and the elution of PCAs with methanol was accomplished in 20 s as well. The eluents were then analyzed and quantified using a DART ionization source coupled to a high resolution mass spectrometer, with an internal standard added to the samples. The limit of detection ranged from 0.5 to 2 ng/L on the basis of a 50 mL water sample; recoveries were 79.9-119.1%, and relative standard deviations 0.23-9.82% (≥5 replicates). With the ease of use and speed of DART-MS, the whole protocol can be completed within minutes, including sample preparation, extraction, elution, detection and quantitation. The methodology developed here is simple, fast, sensitive and quantitative, requires little sample preparation, and consumes significantly less toxic organic solvent, making it suitable for high-throughput screening of PCAs and potentially other contaminants in water. Copyright © 2017 Elsevier B.V. All rights reserved.
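Internal-standard quantitation of the kind described can be sketched as follows (all peak areas, concentrations and the response factor are illustrative, not values from the study):

```python
# Hedged sketch of internal-standard quantitation as used alongside the
# DART-MS step: the analyte/IS signal ratio is converted to concentration
# via a response factor obtained from one calibration point. All numbers
# are illustrative.
def quantify(area_analyte, area_is, conc_is, rf):
    """C_analyte = (A_analyte / A_IS) * C_IS / RF."""
    return (area_analyte / area_is) * conc_is / rf

# calibrate RF with a known 10 ng/L standard carrying 100 ng/L of IS
rf = (5.0e4 / 1.0e5) * 100.0 / 10.0   # response factor from the standard
print(quantify(2.0e4, 1.0e5, 100.0, rf), "ng/L")  # unknown sample
```

Ratioing against a co-analyzed internal standard compensates for the shot-to-shot signal variability that ambient ionization sources such as DART are prone to, which is what makes the method quantitative rather than merely qualitative.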

  20. On the propagation of a coupled saturation and pressure front

    Energy Technology Data Exchange (ETDEWEB)

    Vasco, D. W.

    2010-12-01

    Using an asymptotic technique, valid for a medium with smoothly varying heterogeneity, I derive an expression for the velocity of a propagating, coupled saturation and pressure front. Due to the nonlinearity of the governing equations, the velocity of the propagating front depends upon the magnitude of the saturation and pressure changes across the front in addition to the properties of the medium. Thus, the expression must be evaluated in conjunction with numerical reservoir simulation. The propagation of the two-phase front is governed by the background saturation distribution, the saturation-dependent component of the fluid mobility, the porosity, the permeability, the capillary pressure function, the medium compressibility, and the ratio of the slopes of the relative permeability curves. Numerical simulation of water injection into a porous layer saturated with a nonaqueous phase liquid indicates that two modes of propagation are important. The fastest mode of propagation is a pressure-dominated disturbance that travels through the saturated layer. This is followed, much later, by a coupled mode with a large saturation change. These two modes are also observed in a simulation using a heterogeneous porous layer. A comparison between the propagation times estimated from the results of the numerical simulation and predictions from the asymptotic expression indicates overall agreement.

  1. Semiconductor saturable absorbers for ultrafast THz signals

    DEFF Research Database (Denmark)

    Hoffmann, Matthias C.; Turchinovich, Dmitry

    We demonstrate saturable absorber behavior of n-type semiconductors in the THz frequency range using nonlinear THz spectroscopy. Further, we observe THz pulse shortening and increase of the group refractive index at high field strengths.

  2. High-throughput SHAPE analysis reveals structures in HIV-1 genomic RNA strongly conserved across distinct biological states.

    Directory of Open Access Journals (Sweden)

    Kevin A Wilkinson

    2008-04-01

    Full Text Available Replication and pathogenesis of the human immunodeficiency virus (HIV) is tightly linked to the structure of its RNA genome, but genome structure in infectious virions is poorly understood. We invent high-throughput SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension) technology, which uses many of the same tools as DNA sequencing, to quantify RNA backbone flexibility at single-nucleotide resolution and from which robust structural information can be immediately derived. We analyze the structure of HIV-1 genomic RNA in four biologically instructive states, including the authentic viral genome inside native particles. Remarkably, given the large number of plausible local structures, the first 10% of the HIV-1 genome exists in a single, predominant conformation in all four states. We also discover that noncoding regions functioning in a regulatory role have significantly lower (p-value < 0.0001) SHAPE reactivities, and hence more structure, than do viral coding regions that function as the template for protein synthesis. By directly monitoring protein binding inside virions, we identify the RNA recognition motif for the viral nucleocapsid protein. Seven structurally homologous binding sites occur in a well-defined domain in the genome, consistent with a role in directing specific packaging of genomic RNA into nascent virions. In addition, we identify two distinct motifs that are targets for the duplex destabilizing activity of this same protein. The nucleocapsid protein destabilizes local HIV-1 RNA structure in ways likely to facilitate initial movement both of the retroviral reverse transcriptase from its tRNA primer and of the ribosome in coding regions. Each of the three nucleocapsid interaction motifs falls in a specific genome domain, indicating that local protein interactions can be organized by the long-range architecture of an RNA. High-throughput SHAPE reveals a comprehensive view of HIV-1 RNA genome structure, and further

  3. A nanofluidic bioarray chip for fast and high-throughput detection of antibodies in biological fluids

    Science.gov (United States)

    Lee, Jonathan; Gulzar, Naveed; Scott, Jamie K.; Li, Paul C. H.

    2012-10-01

    Immunoassays have become a standard tool for secretome analysis in clinical and research settings. In this field there is a need for a high-throughput method that uses small sample volumes. Microfluidics and nanofluidics have been developed for this purpose. Our lab has developed a nanofluidic bioarray (NBA) chip, with the goal of a high-throughput system that assays small sample volumes against multiple probes. A combination of horizontal and vertical channels creates an array of antigens on the surface of the NBA chip in one dimension; this array is then probed by flowing antibodies from biological fluids in the other dimension. We have tested the NBA chip by immobilizing streptavidin and then biotinylated peptide to detect the presence of a mouse monoclonal antibody (MAb) that is specific for the peptide. Bound antibody is detected by an AlexaFluor 647-labeled goat (anti-mouse IgG) polyclonal antibody. Using the NBA chip, we have successfully detected peptide binding by small-volume (0.5 μl) samples containing 50 attomoles (100 pM) of MAb.

  4. Development of high-throughput analysis system using highly-functional organic polymer monoliths

    International Nuclear Information System (INIS)

    Umemura, Tomonari; Kojima, Norihisa; Ueki, Yuji

    2008-01-01

    The growing demand for high-throughput analysis in the current competitive life sciences and industries has promoted the development of high-speed HPLC techniques and tools. As one such tool, monolithic columns have attracted increasing attention and interest in the last decade due to their low flow resistance and excellent mass transfer, allowing for rapid separations and reactions at high flow rates with minimal loss of column efficiency. Monolithic materials are classified into two main groups: silica- and organic polymer-based monoliths, each with their own advantages and disadvantages. Organic polymer monoliths have several distinct advantages in life-science research, including wide pH stability, less irreversible adsorption, and facile preparation and modification. Thus, we have so far developed organic polymer monoliths for various chemical operations, such as separation, extraction, preconcentration, and reaction. In the present paper, recent progress in the development of organic polymer monoliths is discussed. In particular, the procedure for the preparation of methacrylate-based monoliths with various functional groups is described, and the influence of different compositional and processing parameters on the monolithic structure is also addressed. Furthermore, the performance of the produced monoliths is demonstrated through the results for (1) rapid separations of alkylbenzenes at high flow rates, (2) flow-through enzymatic digestion of cytochrome c on a trypsin-immobilized monolithic column, and (3) separation of the tryptic digest on a reversed-phase monolithic column. The flexibility and versatility of organic polymer monoliths will be beneficial for further enhancing analytical performance, and will open the way for new applications and opportunities in both scientific and industrial research. (author)

  5. High throughput proteomic analysis of the secretome in an explant model of articular cartilage inflammation

    Science.gov (United States)

    Clutterbuck, Abigail L.; Smith, Julia R.; Allaway, David; Harris, Pat; Liddell, Susan; Mobasheri, Ali

    2011-01-01

    This study employed a targeted high-throughput proteomic approach to identify the major proteins present in the secretome of articular cartilage. Explants from equine metacarpophalangeal joints were incubated alone or with interleukin-1beta (IL-1β, 10 ng/ml), with or without carprofen, a non-steroidal anti-inflammatory drug, for six days. After tryptic digestion of culture medium supernatants, resulting peptides were separated by HPLC and detected in a Bruker amaZon ion trap instrument. The five most abundant peptides in each MS scan were fragmented and the fragmentation patterns compared to mammalian entries in the Swiss-Prot database, using the Mascot search engine. Tryptic peptides originating from aggrecan core protein, cartilage oligomeric matrix protein (COMP), fibronectin, fibromodulin, thrombospondin-1 (TSP-1), clusterin (CLU), cartilage intermediate layer protein-1 (CILP-1), chondroadherin (CHAD) and matrix metalloproteinases MMP-1 and MMP-3 were detected. Quantitative western blotting confirmed the presence of CILP-1, CLU, MMP-1, MMP-3 and TSP-1. Treatment with IL-1β increased MMP-1, MMP-3 and TSP-1 and decreased the CLU precursor but did not affect CILP-1 and CLU levels. Many of the proteins identified have well-established extracellular matrix functions and are involved in early repair/stress responses in cartilage. This high throughput approach may be used to study the changes that occur in the early stages of osteoarthritis. PMID:21354348

  6. A primer on high-throughput computing for genomic selection.

    Science.gov (United States)

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful for devising pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
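
The pipelining idea described above can be sketched with chained generator stages, where each record streams through load, fit, and predict steps as soon as it is ready instead of waiting for the whole batch (stage names and the dummy payloads are illustrative, not the authors' implementation):

```python
def load(batches):
    # Stage 1: stream raw data in, one batch at a time
    for b in batches:
        yield f"data:{b}"

def fit(records):
    # Stage 2: train a (dummy) model on each record as it arrives
    for r in records:
        yield f"model({r})"

def predict(models):
    # Stage 3: produce a prediction as soon as each model is ready
    for m in models:
        yield f"pred({m})"

# Stages are chained; items flow through the pipeline one at a time
results = list(predict(fit(load(range(3)))))
print(results)  # ['pred(model(data:0))', 'pred(model(data:1))', 'pred(model(data:2))']
```

In a real HTC setting each stage would be a separate job or cluster node; the generator chain only illustrates how overlapping stages raise throughput compared with running each stage to completion before starting the next.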

  7. Determining the optimal size of small molecule mixtures for high throughput NMR screening

    International Nuclear Information System (INIS)

    Mercier, Kelly A.; Powers, Robert

    2005-01-01

    High-throughput screening (HTS) using NMR spectroscopy has become a common component of the drug discovery effort and is widely used throughout the pharmaceutical industry. NMR provides additional information about the nature of small molecule-protein interactions compared to traditional HTS methods. In order to achieve comparable efficiency, small molecules are often screened as mixtures in NMR-based assays. Nevertheless, an analysis of the efficiency of mixtures and a corresponding determination of the optimum mixture size (OMS) that minimizes the amount of material and instrumentation time required for an NMR screen have been lacking. A model for calculating OMS based on the application of the hypergeometric distribution function to determine the probability of a 'hit' for various mixture sizes and hit rates is presented. An alternative method for the deconvolution of large screening mixtures is also discussed. These methods have been applied in a high-throughput NMR screening assay using a small, directed library
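
The hypergeometric argument can be sketched directly: for a library of N compounds containing K true binders, the chance that a mixture of n compounds contains at least one hit is one minus the probability of drawing zero hits (a minimal sketch; the library size and hit rate below are invented numbers, not values from the study):

```python
from math import comb

def prob_at_least_one_hit(N, K, n):
    """P(a random mixture of size n from N compounds with K hits contains >= 1 hit),
    via the hypergeometric probability of drawing zero hits."""
    p_zero = comb(N - K, n) / comb(N, n)
    return 1.0 - p_zero

# Hypothetical library: 10,000 compounds, 1% hit rate
p10 = prob_at_least_one_hit(10_000, 100, 10)  # mixtures of 10
p20 = prob_at_least_one_hit(10_000, 100, 20)  # mixtures of 20
```

Larger mixtures raise the per-sample hit probability but complicate deconvolution; the OMS calculation balances exactly this trade-off against material and instrument time.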

  8. Use of high-throughput mass spectrometry to elucidate host-pathogen interactions in Salmonella

    Energy Technology Data Exchange (ETDEWEB)

    Rodland, Karin D.; Adkins, Joshua N.; Ansong, Charles; Chowdhury, Saiful M.; Manes, Nathan P.; Shi, Liang; Yoon, Hyunjin; Smith, Richard D.; Heffron, Fred

    2008-12-01

    New improvements to mass spectrometry include increased sensitivity, better analysis of the collected data, and, most important from the standpoint of this review, much higher throughput, allowing analysis of many samples in a single day. This short review describes how host-pathogen interactions can be dissected by mass spectrometry using Salmonella as a model system. The approach allowed direct identification of the majority of annotated Salmonella proteins, how expression changed under various in vitro growth conditions, and how this relates to virulence and expression within host cells. One of the most significant findings is that a very high percentage of all annotated genes (>20%) are regulated post-transcriptionally. In addition, new and unexpected interactions have been identified for several Salmonella virulence regulators that involve protein-protein interactions, suggesting additional functions of the regulators in coordinating virulence expression. Overall, high-throughput mass spectrometry provides a new view of pathogen-host interactions, emphasizing the protein products and defining how protein interactions determine the outcome of infection.

  9. Investigation of Slow-wave Activity Saturation during Surgical Anesthesia Reveals a Signature of Neural Inertia in Humans.

    Science.gov (United States)

    Warnaby, Catherine E; Sleigh, Jamie W; Hight, Darren; Jbabdi, Saad; Tracey, Irene

    2017-10-01

    Previously, we showed experimentally that saturation of slow-wave activity provides a potentially individualized neurophysiologic endpoint for perception loss during anesthesia. Furthermore, it is clear that induction and emergence from anesthesia are not symmetrically reversible processes. The observed hysteresis is potentially underpinned by a neural inertia mechanism as proposed in animal studies. In an advanced secondary analysis of 393 individual electroencephalographic data sets, we used slow-wave activity dose-response relationships to parameterize slow-wave activity saturation during induction and emergence from surgical anesthesia. We determined whether neural inertia exists in humans by comparing slow-wave activity dose responses on induction and emergence. Slow-wave activity saturation occurs for different anesthetics and when opioids and muscle relaxants are used during surgery. There was wide interpatient variability in the hypnotic concentrations required to achieve slow-wave activity saturation. Age negatively correlated with power at slow-wave activity saturation. On emergence, we observed abrupt decreases in slow-wave activity dose responses coincident with recovery of behavioral responsiveness in ~33% of individuals. These patients are more likely to have lower power at slow-wave activity saturation, be older, and suffer from short-term confusion on emergence. Slow-wave activity saturation during surgical anesthesia implies that large variability in dosing is required to achieve a targeted potential loss of perception in individual patients. A signature for neural inertia in humans is the maintenance of slow-wave activity even in the presence of very low hypnotic concentrations during emergence from anesthesia.

  10. High throughput nanoimprint lithography for semiconductor memory applications

    Science.gov (United States)

    Ye, Zhengmao; Zhang, Wei; Khusnatdinov, Niyaz; Stachowiak, Tim; Irving, J. W.; Longsine, Whitney; Traub, Matthew; Fletcher, Brian; Liu, Weijun

    2017-03-01

    Imprint lithography is a promising technology for replication of nano-scale features. For semiconductor device applications, Canon deposits a low-viscosity resist on a field-by-field basis using jetting technology. A patterned mask is lowered into the resist fluid, which then quickly flows into the relief patterns in the mask by capillary action. Following this filling step, the resist is crosslinked under UV radiation, and then the mask is removed, leaving a patterned resist on the substrate. There are two critical components to meeting throughput requirements for imprint lithography. Using an approach similar to what is already done for many deposition and etch processes, imprint stations can be clustered to enhance throughput. The FPA-1200NZ2C is a four-station cluster system designed for high-volume manufacturing. For a single station, throughput includes overhead, resist dispense, resist fill time (or spread time), exposure and separation. Resist exposure time and mask/wafer separation are well understood processing steps with typical durations on the order of 0.10 to 0.20 seconds. To achieve a total process throughput of 17 wafers per hour (wph) for a single station, it is necessary to complete the fluid fill step in 1.2 seconds. For a throughput of 20 wph, fill time must be reduced to only 1.1 seconds. There are several parameters that can impact resist filling. Key parameters include resist drop volume (smaller is better), system controls (which address drop spreading after jetting), Design for Imprint or DFI (to accelerate drop spreading) and material engineering (to promote wetting between the resist and underlying adhesion layer). In addition, it is mandatory to maintain fast filling, even for edge field imprinting. In this paper, we address the improvements made in all of these parameters to first enable a 1.20-second filling process for a device-like pattern and have demonstrated this capability for both full fields and edge fields. Non
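
The throughput arithmetic implied above can be sketched as follows. The field count, non-fill step times, and per-wafer overhead are assumptions chosen for illustration; only the 1.2 s and 1.1 s fill times come from the text:

```python
def wafers_per_hour(n_fields, fill_s, other_per_field_s, overhead_s):
    """Single-station throughput given per-field fill time, the remaining
    per-field step time (dispense + expose + separate), and per-wafer overhead.
    All inputs except the fill times discussed in the text are assumptions."""
    per_wafer_s = n_fields * (fill_s + other_per_field_s) + overhead_s
    return 3600.0 / per_wafer_s

# With these assumed values, shaving fill time from 1.2 s to 1.1 s per field
# lifts single-station throughput by roughly one wafer per hour
slow = wafers_per_hour(n_fields=100, fill_s=1.2, other_per_field_s=0.8, overhead_s=12.0)
fast = wafers_per_hour(n_fields=100, fill_s=1.1, other_per_field_s=0.8, overhead_s=12.0)
```

Because the fill step is repeated once per field, a 0.1 s saving is multiplied by the field count, which is why fill time dominates the throughput budget.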

  11. Patient access in plastic surgery: an operational and financial analysis of service-based interventions to improve ambulatory throughput in an academic surgery practice.

    Science.gov (United States)

    Hultman, Charles Scott; Gilland, Wendell G; Weir, Samuel

    2015-06-01

    Inefficient patient throughput in a surgery practice can result in extended new patient backlogs, excessively long cycle times in the outpatient clinics, poor patient satisfaction, decreased physician productivity, and loss of potential revenue. This project assesses the efficacy of multiple throughput interventions in an academic plastic surgery practice at a public university. We implemented a Patient Access and Efficiency (PAcE) initiative, funded and sponsored by our health care system, to improve patient throughput in the outpatient surgery clinic. Interventions included: (1) creation of a multidisciplinary team, led by a project redesign manager, that met weekly; (2) definition of goals, metrics, and target outcomes; (3) revision of clinic templates to reflect actual demand; (4) working down patient backlog through group visits; (5) booking new patients across the entire practice; (6) assigning a physician's assistant to the preoperative clinic; and (7) designating a central scheduler to coordinate flow of information. Main outcome measures included: patient satisfaction using Press-Ganey surveys; complaints reported to patient relations; time to third available appointment; size of patient backlog; monthly clinic volumes with utilization rates and supply/demand curves; "chaos" rate (cancellations plus reschedules, divided by supply, within 48 hours of booked clinic date); patient cycle times with bottleneck analysis; physician productivity measured by work Relative Value Units (wRVUs); and downstream financial effects on billing, collection, accounts receivable (A/R), and payer mix. We collected, managed, and analyzed the data prospectively, comparing the pre-PAcE period (6 months) with the PAcE period (6 months). The PAcE initiative resulted in multiple improvements across the entire plastic surgery practice. Patient satisfaction increased only slightly from 88.5% to 90.0%, but the quarterly number of complaints notably declined from 17 to 9. Time to third
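
The "chaos" rate metric defined above is simple enough to state as code (a one-line sketch of the stated formula; the example counts are invented, not figures from the study):

```python
def chaos_rate(cancellations, reschedules, supply):
    """'Chaos' rate: (cancellations + reschedules) / supply,
    counted within 48 hours of the booked clinic date."""
    if supply <= 0:
        raise ValueError("supply must be positive")
    return (cancellations + reschedules) / supply

# Invented example: 6 cancellations and 9 reschedules against 100 bookable slots
rate = chaos_rate(6, 9, 100)
print(rate)  # 0.15
```

Tracking this ratio over time is what lets a practice see whether template revisions actually reduce last-minute schedule churn.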

  12. Noise and non-linearities in high-throughput data

    International Nuclear Information System (INIS)

    Nguyen, Viet-Anh; Lió, Pietro; Koukolíková-Nicola, Zdena; Bagnoli, Franco

    2009-01-01

    High-throughput data analyses are becoming common in biology, communications, economics and sociology. The vast amounts of data are usually represented in the form of matrices and can be considered as knowledge networks. Spectra-based approaches have proved useful in extracting hidden information within such networks and for estimating missing data, but these methods are based essentially on linear assumptions. The physical models of matching, when applicable, often suggest non-linear mechanisms, that may sometimes be identified as noise. The use of non-linear models in data analysis, however, may require the introduction of many parameters, which lowers the statistical weight of the model. According to the quality of data, a simpler linear analysis may be more convenient than more complex approaches. In this paper, we show how a simple non-parametric Bayesian model may be used to explore the role of non-linearities and noise in synthetic and experimental data sets

  13. High-Throughput Block Optical DNA Sequence Identification.

    Science.gov (United States)

    Sagar, Dodderi Manjunatha; Korshoj, Lee Erik; Hanson, Katrina Bethany; Chowdhury, Partha Pratim; Otoupal, Peter Britton; Chatterjee, Anushree; Nagpal, Prashant

    2018-01-01

    Optical techniques for molecular diagnostics or DNA sequencing generally rely on small-molecule fluorescent labels, which utilize light with a wavelength of several hundred nanometers for detection. Developing a label-free optical DNA sequencing technique will require nanoscale focusing of light, a high-throughput and multiplexed identification method, and a data compression technique to rapidly identify sequences and analyze genomic heterogeneity for big datasets. Such a method should identify characteristic molecular vibrations using optical spectroscopy, especially in the "fingerprinting region" from ≈400–1400 cm⁻¹. Here, surface-enhanced Raman spectroscopy is used to demonstrate label-free identification of DNA nucleobases with multiplexed 3D plasmonic nanofocusing. While nanometer-scale mode volumes prevent identification of single nucleobases within a DNA sequence, the block optical technique can identify A, T, G, and C content in DNA k-mers. The content of each nucleotide in a DNA block can be a unique and high-throughput method for identifying sequences, genes, and other biomarkers as an alternative to single-letter sequencing. Additionally, coupling two complementary vibrational spectroscopy techniques (infrared and Raman) can improve block characterization. These results pave the way for developing a novel, high-throughput block optical sequencing method with lossy genomic data compression using k-mer identification from multiplexed optical data acquisition. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
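
The block idea, identifying a k-mer by its nucleotide content rather than its letter order, can be sketched as follows (function name and example sequences are illustrative, not from the paper):

```python
from collections import Counter

def block_signature(kmer):
    """Nucleotide-content signature (counts of A, T, G, C) of a DNA k-mer.
    Order-insensitive: this is the lossy compression behind block identification."""
    counts = Counter(kmer.upper())
    return tuple(counts.get(base, 0) for base in "ATGC")

# Two different 6-mers with the same content share one block signature
sig1 = block_signature("ATGGCA")
sig2 = block_signature("GCATGA")
print(sig1, sig1 == sig2)  # (2, 1, 2, 1) True
```

The compression is deliberately lossy: many k-mers collapse onto one signature, which is what makes the readout high-throughput while still discriminating sequences and biomarkers at the block level.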

  14. Analysis of high-throughput biological data using their rank values.

    Science.gov (United States)

    Dembélé, Doulaye

    2018-01-01

    High-throughput biological technologies are routinely used to generate gene expression profiling or cytogenetics data. To achieve high performance, methods available in the literature have become more specialized and often require high computational resources. Here, we propose a new versatile method based on the data-ordering rank values. We use linear algebra and the Perron-Frobenius theorem, and also extend a method presented earlier for searching differentially expressed genes to the detection of recurrent copy number aberrations. A result derived from the proposed method is a one-sample Student's t-test based on rank values. The proposed method is, to our knowledge, the only one that applies both to gene expression profiling and to cytogenetics data sets. This new method is fast, deterministic, and requires a low computational load. Probabilities are associated with genes to allow a statistically significant subset selection in the data set. Stability scores are also introduced as quality parameters. The performance and comparative analyses were carried out using real data sets. The proposed method can be accessed through an R package available from the CRAN (Comprehensive R Archive Network) website: https://cran.r-project.org/web/packages/fcros .
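
The rank-based one-sample t-test mentioned above can be sketched as follows: replace the data with their ranks, then apply the standard one-sample t statistic against the mean rank expected under the null (a minimal illustration of the idea, not the fcros implementation; ties are ignored for brevity):

```python
from statistics import mean, stdev

def rank_values(xs):
    """Ranks of the data (1 = smallest); ties are ignored for brevity."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def one_sample_t(xs, mu0):
    """Standard one-sample t statistic, here applied to rank values."""
    n = len(xs)
    return (mean(xs) - mu0) * (n ** 0.5) / stdev(xs)

ranks = rank_values([0.2, 1.5, 0.9, 2.4])          # [1, 3, 2, 4]
t = one_sample_t(ranks, mu0=(len(ranks) + 1) / 2)  # mean rank under H0 is (n+1)/2
```

Working on ranks rather than raw values is what makes the statistic deterministic and cheap, at the cost of discarding the magnitudes of the measurements.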

  15. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale L

    2005-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  16. High Throughput Synthesis and Screening for Agents Inhibiting Androgen Receptor Mediated Gene Transcription

    National Research Council Canada - National Science Library

    Boger, Dale

    2004-01-01

    .... This entails the high throughput synthesis of DNA binding agents related to distamycin, their screening for binding to androgen response elements using a new high throughput DNA binding screen...

  17. BioVLAB-MMIA-NGS: microRNA-mRNA integrated analysis using high-throughput sequencing data.

    Science.gov (United States)

    Chae, Heejoon; Rhee, Sungmin; Nephew, Kenneth P; Kim, Sun

    2015-01-15

    It is now well established that microRNAs (miRNAs) play a critical role in regulating gene expression in a sequence-specific manner, and genome-wide efforts are underway to predict known and novel miRNA targets. However, integrated miRNA-mRNA analysis remains a major computational challenge, requiring powerful informatics systems and bioinformatics expertise. The objective of this study was to modify our widely recognized Web server for integrated mRNA-miRNA analysis (MMIA) and its subsequent deployment on the Amazon cloud (BioVLAB-MMIA) to be compatible with high-throughput platforms, including next-generation sequencing (NGS) data (e.g. RNA-seq). We developed a new version called BioVLAB-MMIA-NGS, deployed both on the Amazon cloud and on a high-performance publicly available server called MAHA. By using NGS data and integrating various bioinformatics tools and databases, BioVLAB-MMIA-NGS offers several advantages. First, sequencing data is more accurate than array-based methods for determining miRNA expression levels. Second, potential novel miRNAs can be detected by using various computational methods for characterizing miRNAs. Third, because miRNA-mediated gene regulation is due to hybridization of an miRNA to its target mRNA, sequencing data can be used to identify many-to-many relationships between miRNAs and target genes with high accuracy. http://epigenomics.snu.ac.kr/biovlab_mmia_ngs/. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. Adaptive Neural Output Feedback Control for Uncertain Robot Manipulators with Input Saturation

    Directory of Open Access Journals (Sweden)

    Rong Mei

    2017-01-01

    Full Text Available This paper presents an adaptive neural output feedback control scheme for uncertain robot manipulators with input saturation using the radial basis function neural network (RBFNN and disturbance observer. First, the RBFNN is used to approximate the system uncertainty, and the unknown approximation error of the RBFNN and the time-varying unknown external disturbance of robot manipulators are integrated as a compounded disturbance. Then, the state observer and the disturbance observer are proposed to estimate the unmeasured system state and the unknown compounded disturbance based on RBFNN. At the same time, the adaptation technique is employed to tackle the control input saturation problem. Utilizing the estimate outputs of the RBFNN, the state observer, and the disturbance observer, the adaptive neural output feedback control scheme is developed for robot manipulators using the backstepping technique. The convergence of all closed-loop signals is rigorously proved via Lyapunov analysis and the asymptotically convergent tracking error is obtained under the integrated effect of the system uncertainty, the unmeasured system state, the unknown external disturbance, and the input saturation. Finally, numerical simulation results are presented to illustrate the effectiveness of the proposed adaptive neural output feedback control scheme for uncertain robot manipulators.

  19. Ultrafast THz Saturable Absorption in Doped Semiconductors

    DEFF Research Database (Denmark)

    Turchinovich, Dmitry; Hoffmann, Matthias C.

    2011-01-01

    We demonstrate ultrafast THz saturable absorption in n-doped semiconductors by nonlinear THz time-domain spectroscopy. This effect is caused by the semiconductor conductivity modulation due to electron heating and satellite-valley scattering in strong THz fields.

  20. Saturable absorption in detonation nanodiamond dispersions

    Science.gov (United States)

    Vanyukov, Viatcheslav; Mikheev, Gennady; Mogileva, Tatyana; Puzyr, Alexey; Bondar, Vladimir; Lyashenko, Dmitry; Chuvilin, Andrey

    2017-07-01

    We report on a saturable absorption in aqueous dispersions of nanodiamonds with femtosecond laser pulse excitation at a wavelength of 795 nm. The open aperture Z-scan experiments reveal that in a wide range of nanodiamond particle sizes and concentrations, a light-induced increase of transmittance occurs. The transmittance increase originates from the saturation of light absorption and is associated with a light absorption at 1.5 eV by graphite and dimer chains (Pandey dimer chains). The obtained key nonlinear parameters of nanodiamond dispersions are compared with those of graphene and carbon nanotubes, which are widely used for the mode-locking.

  1. On the saturation of astrophysical dynamos

    DEFF Research Database (Denmark)

    Dorch, Bertil; Archontis, Vasilis

    2004-01-01

    In the context of astrophysical dynamos we illustrate that the no-cosines flow, with zero mean helicity, can drive fast dynamo action and we study the dynamo's mode of operation during both the linear and non-linear saturation regimes. It turns out that in addition to a high growth rate in the li...

  2. High-throughput optical system for HDES hyperspectral imager

    Science.gov (United States)

    Václavík, Jan; Melich, Radek; Pintr, Pavel; Pleštil, Jan

    2015-01-01

    Affordable, long-wave infrared hyperspectral imaging calls for the use of an uncooled FPA with high-throughput optics. This paper describes the design of the optical part of a stationary hyperspectral imager in a spectral range of 7–14 μm with a field of view of 20°×10°. The imager employs a push-broom method implemented with a scanning mirror. High throughput and a demand for simplicity and rigidity led to a fully refractive design with highly aspheric surfaces and off-axis positioning of the detector array. The design was optimized to exploit the machinability of infrared materials by the SPDT method and simple assembly.

  3. Automated high-throughput measurement of body movements and cardiac activity of Xenopus tropicalis tadpoles

    Directory of Open Access Journals (Sweden)

    Kay Eckelt

    2014-07-01

    Full Text Available Xenopus tadpoles are an emerging model for developmental, genetic and behavioral studies. Their small size and the optical accessibility of most of their organs, together with a close genetic and structural relationship to humans, make them a convenient experimental model. However, there is only a limited toolset available to measure behavior and organ function of these animals at medium or high throughput. Herein, we describe an imaging-based platform to quantify body and autonomic movements of Xenopus tropicalis tadpoles of advanced developmental stages. Animals alternate periods of quiescence and locomotor movements, and display buccal pumping for oxygen uptake from water and rhythmic cardiac movements. We imaged up to 24 animals in parallel and automatically tracked and quantified their movements by using image analysis software. Animal trajectories, moved distances, activity time, buccal pumping rates and heart beat rates were calculated and used to characterize the effects of test compounds. We evaluated the effects of propranolol and atropine, observing a dose-dependent bradycardia and tachycardia, respectively. This imaging and analysis platform is a simple, cost-effective high-throughput in vivo assay system for genetic, toxicological or pharmacological characterizations.
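
The movement metrics described, moved distance and activity time, reduce to simple computations over a tracked (x, y) trajectory. A sketch with invented coordinates; the frame duration and movement threshold are assumptions, not values from the platform:

```python
from math import hypot

def moved_distance(track):
    """Total path length of an (x, y) trajectory (units follow the input)."""
    return sum(hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(track, track[1:]))

def activity_time(track, frame_s, min_step=1.0):
    """Seconds during which the animal moved more than min_step per frame."""
    steps = (hypot(x2 - x1, y2 - y1)
             for (x1, y1), (x2, y2) in zip(track, track[1:]))
    return frame_s * sum(1 for s in steps if s > min_step)

track = [(0, 0), (3, 4), (3, 4), (6, 8)]   # invented 4-frame trajectory
print(moved_distance(track))               # 10.0
print(activity_time(track, frame_s=0.1))   # two active frames at 0.1 s each
```

Per-animal rates such as buccal pumping or heart beat would come from the same frame stream, by counting oscillation peaks in a region of interest rather than centroid displacement.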

  4. Aspects of the use of saturated fluorocarbon fluids in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Hallewell, G., E-mail: Gregory.Hallewell@cern.c [Centre de Physique des Particules de Marseille, 163 Avenue de Luminy, Case 907, 13288 Marseille Cedex 09 (France)

    2011-05-21

    The excellent dielectric properties of saturated fluorocarbons have allowed their use in direct immersion liquid cooling of electronics, including supercomputers, and as heat transfer media in vapour phase soldering and burn-in testing of electronics. Their high density, UV transparency, non-flammability, non-toxicity and radiation tolerance have led to their use as liquid and gas radiator media for RICH detectors in numerous particle physics experiments. Systems to circulate and purify saturated fluorocarbon Cherenkov radiator vapours often rely on thermodynamic evaporation-condensation cycles similar to those used in refrigeration. Their use as evaporative refrigerants was pioneered for the ATLAS silicon tracker, and they are now also used as evaporative coolants in ALICE and TOTEM and as liquid coolants in ATLAS and CMS. Ultrasonic techniques for vapour phase analysis of fluorocarbon mixtures, developed for the SLAC SLD barrel CRID radiator during the 1980s as an alternative to UV refractometry, are again under development for the ATLAS tracker evaporative cooling system. Examples of fluorocarbon circulation systems, together with purification and analysis techniques for these versatile fluids, are mentioned.

  5. Aspects of the use of saturated fluorocarbon fluids in high energy physics

    International Nuclear Information System (INIS)

    Hallewell, G.

    2011-01-01

    The excellent dielectric properties of saturated fluorocarbons have allowed their use in direct immersion liquid cooling of electronics, including supercomputers, and as heat transfer media in vapour phase soldering and burn-in testing of electronics. Their high density, UV transparency, non-flammability, non-toxicity and radiation tolerance have led to their use as liquid and gas radiator media for RICH detectors in numerous particle physics experiments. Systems to circulate and purify saturated fluorocarbon Cherenkov radiator vapours often rely on thermodynamic evaporation-condensation cycles similar to those used in refrigeration. Their use as evaporative refrigerants was pioneered for the ATLAS silicon tracker, and they are now also used as evaporative coolants in ALICE and TOTEM and as liquid coolants in ATLAS and CMS. Ultrasonic techniques for vapour phase analysis of fluorocarbon mixtures, developed for the SLAC SLD barrel CRID radiator during the 1980s as an alternative to UV refractometry, are again under development for the ATLAS tracker evaporative cooling system. Examples of fluorocarbon circulation systems, together with purification and analysis techniques for these versatile fluids, are mentioned.

  6. The viscosity of the refrigerant 1,1-difluoroethane along the saturation line

    Science.gov (United States)

    van der Gulik, P. S.

    1993-07-01

    The viscosity coefficient of the refrigerant R152a (1,1-difluoroethane) has been measured along the saturation line, both in the saturated liquid and in the saturated vapor. The data were obtained every 10 K from 243 up to 393 K by means of a vibrating-wire viscometer using the free damped oscillation method. The density along the saturation line was calculated from the equation of state given by Tamatsu et al., with application of the saturated vapor-pressure correlation given by Higashi et al. An interesting result is that in the neighborhood of the critical point, the kinematic viscosity of the saturated liquid seems to coincide with that of the saturated vapor. The results for the saturated liquid are in satisfactory agreement with those of Kumagai and Takahashi and of Phillips and Murphy. A comparison of the saturated-vapor data with the unsaturated-vapor data of Takahashi et al. shows some discrepancies.

  7. Statistical Methods for Comparative Phenomics Using High-Throughput Phenotype Microarrays

    KAUST Repository

    Sturino, Joseph

    2010-01-24

    We propose statistical methods for comparing phenomics data generated by the Biolog Phenotype Microarray (PM) platform for high-throughput phenotyping. Instead of the routinely used visual inspection of data with no sound inferential basis, we develop two approaches. The first approach is based on quantifying the distance between mean or median curves from two treatments and then applying a permutation test; we also consider a permutation test applied to areas under mean curves. The second approach employs functional principal component analysis. Properties of the proposed methods are investigated on both simulated data and data sets from the PM platform.
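
The first approach described above, a distance between mean curves assessed by a permutation test, can be sketched as follows. The curve data, group sizes, and the L2 distance choice below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def perm_test_mean_curves(a, b, n_perm=2000, seed=0):
    """Permutation test for the L2 distance between mean curves.

    a, b : 2-D arrays, one row per replicate curve sampled on a common grid.
    Returns the observed distance and a permutation p-value.
    """
    rng = np.random.default_rng(seed)
    pooled = np.vstack([a, b])
    n_a = len(a)

    def stat(x, y):
        return np.sqrt(np.sum((x.mean(axis=0) - y.mean(axis=0)) ** 2))

    observed = stat(a, b)
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        if stat(pooled[idx[:n_a]], pooled[idx[n_a:]]) >= observed:
            count += 1
    return observed, (count + 1) / (n_perm + 1)

# Synthetic growth-like curves: treatment B is a uniform upward shift of A.
t = np.linspace(0, 48, 25)
rng = np.random.default_rng(1)
group_a = np.array([1 / (1 + np.exp(-(t - 20) / 4)) + rng.normal(0, 0.05, t.size)
                    for _ in range(8)])
group_b = group_a + 0.3
d, p = perm_test_mean_curves(group_a, group_b)
```

The same scaffold accepts any curve distance (e.g., area under the mean curve) in place of `stat`.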

  8. High-throughput GPU-based LDPC decoding

    Science.gov (United States)

    Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin

    2010-08-01

    Low-density parity-check (LDPC) code is a linear block code known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems, such as DVB-S2, WiMAX, Wi-Fi and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphics-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize this practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared to its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.
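
The iterative decoding idea can be illustrated with a hard-decision bit-flipping decoder, a simpler relative of the sum-product algorithm, on a toy (7,4) Hamming parity-check matrix. The matrix and received word are illustrative stand-ins, not the paper's DVB-S2/WiMAX codes:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code (toy stand-in for an LDPC H).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def bit_flip_decode(r, H, max_iter=10):
    """Hard-decision bit flipping: repeatedly flip the bit involved in the
    most unsatisfied parity checks until the syndrome is zero."""
    r = r.copy()
    for _ in range(max_iter):
        syndrome = H @ r % 2
        if not syndrome.any():
            return r  # valid codeword reached
        # Count, for each bit, how many failing checks it participates in.
        votes = H.T @ syndrome
        r[np.argmax(votes)] ^= 1
    return r

received = np.array([0, 0, 1, 0, 0, 0, 0])  # all-zero codeword with bit 2 flipped
decoded = bit_flip_decode(received, H)
```

A GPU implementation parallelizes exactly these per-check and per-bit updates across thousands of threads, which is where the reported speedup comes from.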

  9. Synthetic Biomaterials to Rival Nature's Complexity-a Path Forward with Combinatorics, High-Throughput Discovery, and High-Content Analysis.

    Science.gov (United States)

    Zhang, Douglas; Lee, Junmin; Kilian, Kristopher A

    2017-10-01

    Cells in tissue receive a host of soluble and insoluble signals in a context-dependent fashion, where integration of these cues through a complex network of signal transduction cascades will define a particular outcome. Biomaterials scientists and engineers are tasked with designing materials that can at least partially recreate this complex signaling milieu towards new materials for biomedical applications. In this progress report, recent advances in high throughput techniques and high content imaging approaches that are facilitating the discovery of efficacious biomaterials are described. From microarrays of synthetic polymers, peptides and full-length proteins, to designer cell culture systems that present multiple biophysical and biochemical cues in tandem, it is discussed how the integration of combinatorics with high content imaging and analysis is essential to extracting biologically meaningful information from large scale cellular screens to inform the design of next generation biomaterials. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Low-cost guaranteed-throughput communication ring for real-time streaming MPSoCs

    NARCIS (Netherlands)

    Dekens, B.H.J.; Kurtin, Philip Sebastian; Bekooij, Marco Jan Gerrit; Smit, Gerardus Johannes Maria

    2013-01-01

    Connection-oriented guaranteed-throughput mesh-based networks on chip have been proposed as a replacement for buses in real-time embedded multiprocessor systems such as software defined radios. Even with attractive features like throughput and latency guarantees they are not always used because

  11. Experimental and numerical study on thermal conductivity of partially saturated unconsolidated sands

    Science.gov (United States)

    Lee, Youngmin; Keehm, Youngseuk; Kim, Seong-Kyun; Shin, Sang Ho

    2016-04-01

    A class of problems in heat flow applications requires an understanding of how water saturation affects thermal conductivity in the shallow subsurface. We conducted a series of experiments using a sand box to evaluate the thermal conductivity (TC) of partially saturated unconsolidated sands under varying water saturation (Sw). We first saturated the sands fully with water and then varied water saturation by drainage through the bottom of the sand box. Five water-content sensors were integrated vertically into the sand box to monitor water saturation changes, and a needle probe was embedded to measure the thermal conductivity of the partially saturated sands. The experiment showed that thermal conductivity decreases from 2.5 W/mK for fully saturated sands to 0.7 W/mK when water saturation is 5%. We found that the decreasing trend is quite non-linear: highly sensitive at very high and very low water saturations. However, boundary effects at the top and bottom of the sand box seemed to be responsible for this high nonlinearity. We also found that the determination of water saturation matters: the saturation obtained by averaging values from all five sensors and that from the sensor at the center position showed quite different trends in the TC-Sw domain. In parallel, we conducted pore-scale numerical modeling, consisting of a steady-state two-phase Lattice-Boltzmann simulator and an FEM thermal conduction simulator run on a digital pore geometry of a sand aggregate. The simulation results showed a monotonic decreasing trend and are reasonably well matched with the experimental data when using average water saturations. We concluded that thermal conductivity would decrease smoothly as water saturation decreases if boundary effects could be excluded. However, in dynamic conditions, i.e. imbibition or drainage, the thermal conductivity might show hysteresis, which can be investigated with pore-scale numerical modeling with unsteady-state two-phase flow simulators in our future work.
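
The measured endpoints can be compared against a simple geometric-mean mixing model, a common first-order estimate for partially saturated soils. The mineral, water, and air conductivities and the porosity below are textbook-style assumptions, not the study's Lattice-Boltzmann/FEM workflow:

```python
import math  # not strictly needed; ** handles the powers

def geometric_mean_tc(sw, porosity=0.4, k_solid=7.7, k_water=0.6, k_air=0.026):
    """Effective thermal conductivity (W/mK) of a partially saturated sand,
    via a volume-fraction-weighted geometric mean of solid, water and air."""
    return (k_solid ** (1 - porosity)
            * k_water ** (porosity * sw)
            * k_air ** (porosity * (1 - sw)))

k_wet = geometric_mean_tc(1.0)   # fully saturated
k_dry = geometric_mean_tc(0.05)  # Sw = 5%
```

With these assumed inputs the model gives roughly 2.8 and 0.84 W/mK, the same order as the 2.5 and 0.7 W/mK reported above, though it cannot reproduce hysteresis.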

  12. High-throughput single nucleotide polymorphism genotyping using nanofluidic Dynamic Arrays

    Directory of Open Access Journals (Sweden)

    Crenshaw Andrew

    2009-01-01

    Full Text Available Abstract Background Single nucleotide polymorphisms (SNPs) have emerged as the genetic marker of choice for mapping disease loci and candidate gene association studies, because of their high density and relatively even distribution in the human genome. There is a need for systems allowing medium multiplexing (ten to hundreds of SNPs) with high throughput, which can efficiently and cost-effectively generate genotypes for a very large sample set (thousands of individuals). Methods that are flexible, fast, accurate and cost-effective are urgently needed. This is also important for those who work on high-throughput genotyping in non-model systems, where off-the-shelf assays are not available and a flexible platform is needed. Results We demonstrate the use of a nanofluidic Integrated Fluidic Circuit (IFC)-based genotyping system for medium-throughput multiplexing, known as the Dynamic Array, by genotyping 994 individual human DNA samples on 47 different SNP assays, using nanoliter volumes of reagents. Call rates of greater than 99.5% and call accuracies of greater than 99.8% were achieved in our study, which demonstrates that this is a formidable genotyping platform. The experimental setup is very simple, with a time-to-result for each sample of about 3 hours. Conclusion Our results demonstrate that the Dynamic Array is an excellent genotyping system for medium-throughput multiplexing (30-300 SNPs), which is simple to use and combines rapid throughput with excellent call rates, high concordance and low cost. The exceptional call rates and call accuracy obtained may be of particular interest to those working on validation and replication of genome-wide association (GWA) studies.
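
The reported call-rate and concordance figures are simple ratios; a minimal sketch of how they might be computed from a genotype matrix follows. The encoding and the toy data are invented for illustration, not the study's pipeline:

```python
import numpy as np

NO_CALL = -1  # missing genotype; 0/1/2 encode alternate-allele counts

def call_rate(genotypes):
    """Fraction of attempted genotype calls that returned a result."""
    return np.mean(genotypes != NO_CALL)

def concordance(run1, run2):
    """Agreement between two runs, over positions called in both."""
    both = (run1 != NO_CALL) & (run2 != NO_CALL)
    return np.mean(run1[both] == run2[both])

# 4 samples x 5 SNPs; duplicate runs with one no-call and one discordant call.
run1 = np.array([[0, 1, 2, 1, 0],
                 [2, 2, 1, 0, 1],
                 [1, 0, 0, 2, 2],
                 [0, 1, 1, 1, NO_CALL]])
run2 = run1.copy()
run2[2, 3] = 1  # a single discordant call between the two runs
```

Here `call_rate(run1)` is 19/20 = 95% and `concordance(run1, run2)` is 18/19; the study's >99.5% and >99.8% figures are the same quantities computed over 994 samples x 47 assays.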

  13. Improving tokamak vertical position control in the presence of power supply voltage saturation

    International Nuclear Information System (INIS)

    Favez, J-Y; Lister, J B; Muellhaupt, Ph; Srinivasan, B

    2005-01-01

    The control of the current, position and shape of an elongated cross-section tokamak plasma is complicated by the so-called vertical position instability. Linearized models all share the feature of a single unstable eigenmode, attributable to this vertical instability of the plasma equilibrium movement, and a large number of stable or marginally stable eigenmodes, attributable to zero or positive resistance in all other model circuit equations. Due to the size and therefore cost of the ITER tokamak, there will naturally be smaller margins in the poloidal field coil power supplies, implying that the feedback control will experience actuator saturation during large transients due to a variety of plasma disturbances. Current saturation is relatively benign, due to the integrating nature of the tokamak, resulting in a reasonable time horizon for strategically handling the approach to saturation, which leads to the loss of one degree of freedom in the feedback control for each saturated coil. On the other hand, voltage saturation is produced by the feedback controller itself, with no intrinsic delay. This paper presents a feedback controller design approach which explicitly takes saturation of the power supply voltage into account when producing the power supply demand signals. We consider the vertically stabilizing part of the ITER controller (fast controller) with one power supply and therefore a single saturated input. We consider an existing ITER controller and enlarge its region of attraction to the full null controllable region by adding a continuous nonlinearity into the control. For a system with a single unstable eigenmode and a single stable eigenmode we have already provided a proof of the asymptotic stability of the closed-loop system, and we have examined the performance of this new continuous nonlinear controller. We have subsequently extended this analysis to a system with a single unstable eigenmode and multiple stable eigenmodes. The method
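
The role of the null controllable region can be seen in a scalar toy model: an unstable mode x' = x + u with |u| <= 1 can be driven to zero only from |x| < 1, whatever feedback law is used. The gains and numbers below are illustrative, not ITER parameters:

```python
import numpy as np

def simulate(x0, gain=5.0, u_max=1.0, dt=0.01, t_end=5.0):
    """Euler simulation of the unstable mode x' = x + u under the
    saturated proportional law u = clip(-gain * x, -u_max, u_max)."""
    x = x0
    for _ in range(int(t_end / dt)):
        u = np.clip(-gain * x, -u_max, u_max)
        x += dt * (x + u)
    return x

inside = simulate(0.5)   # |x0| < u_max: saturated actuator still wins
outside = simulate(1.5)  # outside the null controllable region: diverges
```

From x0 = 0.5 the state settles to zero; from x0 = 1.5 the drift term outruns the saturated actuator and the state grows without bound, which is why the paper's controller design targets the full null controllable region.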

  14. Effects of hue, saturation, and brightness on preference: a study on Goethe's color circle with RGB color space

    Science.gov (United States)

    Camgoz, Nilgun; Yener, Cengiz

    2002-06-01

    In order to investigate preference responses for foreground-background color relationships, 85 university undergraduates in Ankara, Turkey, viewed 6 background colors (red, yellow, green, cyan, blue, and magenta) on which color squares of differing hues, saturations, and brightnesses were presented. All the background colors had maximum brightness (100%) and maximum saturation (100%). Subjects were asked to indicate the color square they preferred on the presented background color, viewed through a computer monitor. The experimental setup consisted of a computer monitor located in a windowless room illuminated with cove lighting. The findings of the experiment show that the brightness 100%-saturation 100% range is significantly preferred the most (p-value < 0.03). Thus, color squares that are most saturated and brightest are preferred on backgrounds of the most saturated and brightest colors. Regardless of the background colors viewed, the subjects preferred blue the most (p-value < 0.01). Findings of the study are also discussed in relation to pertinent research in the field. Through this analysis, an understanding of foreground-background color relationships in terms of preference is sought.

  15. Laboratory analysis of fluid flow and solute transport through a variably saturated fracture embedded in porous tuff

    International Nuclear Information System (INIS)

    Chuang, Y.; Haldeman, W.R.; Rasmussen, T.C.; Evans, D.D.

    1990-02-01

    Laboratory techniques are developed that allow concurrent measurement of unsaturated matrix hydraulic conductivity and fracture transmissivity of fractured rock blocks. Two Apache Leap tuff blocks with natural fractures were removed from near Superior, Arizona, shaped into rectangular prisms, and instrumented in the laboratory. Porous ceramic plates supplied solution to the block tops at regulated pressures. Infiltration tests were performed on both test blocks. Steady flow testing of the saturated first block provided estimates of matrix hydraulic conductivity and fracture transmissivity. Fifteen centimeters of suction applied to the second block top showed that fracture flow was minimal and that matrix hydraulic conductivity was an order of magnitude less than the first block's saturated matrix conductivity. Coated-wire ion-selective electrodes monitored aqueous chloride breakthrough concentrations. Minute samples of tracer solution were collected with filter paper. The techniques worked well for studying transport behavior at near-saturated flow conditions and also appear promising for unsaturated conditions. Breakthrough curves in the fracture and matrix, and a map of chloride concentrations within the fracture, suggest preferential flow paths in the fracture and substantial diffusion into the matrix. Average travel velocity, dispersion coefficient and longitudinal dispersivity in the fracture are obtained. 67 refs., 54 figs., 23 tabs

  16. Genetic high throughput screening in Retinitis Pigmentosa based on high resolution melting (HRM) analysis.

    Science.gov (United States)

    Anasagasti, Ander; Barandika, Olatz; Irigoyen, Cristina; Benitez, Bruno A; Cooper, Breanna; Cruchaga, Carlos; López de Munain, Adolfo; Ruiz-Ederra, Javier

    2013-11-01

    Retinitis Pigmentosa (RP) involves a group of genetically determined retinal diseases caused by a large number of mutations that result in rod photoreceptor cell death followed by gradual death of cone cells. Most cases of RP are monogenic, with more than 80 associated genes identified so far. The high number of genes and variants involved in RP, among other factors, is making the molecular characterization of RP a real challenge for many patients. Although HRM has been used for the analysis of isolated variants or single RP genes, to the best of our knowledge this is the first study that uses HRM analysis for a high-throughput screening of several RP genes. Our main goal was to test the suitability of HRM analysis as a genetic screening technique in RP, and to compare its performance with two of the most widely used NGS platforms, Illumina and PGM-Ion Torrent technologies. RP patients (n = 96) were clinically diagnosed at the Ophthalmology Department of Donostia University Hospital, Spain. We analyzed a total of 16 RP genes that meet the following inclusion criteria: 1) size: genes with transcripts of less than 4 kb; 2) number of exons: genes with up to 22 exons; and 3) prevalence: genes reported to account for at least 0.4% of total RP cases worldwide. For comparison purposes, the RHO gene was also sequenced with Illumina (GAII; Illumina), Ion semiconductor technology (PGM; Life Technologies) and Sanger sequencing (ABI 3130xl platform; Applied Biosystems). Detected variants were confirmed in all cases by Sanger sequencing and tested for co-segregation in the families of affected probands. We identified a total of 65 genetic variants, 15 of which (23%) were novel, in 49 out of 96 patients. Among them, 14 (4 novel) are probable disease-causing genetic variants in 7 RP genes, affecting 15 patients. Our HRM analysis-based study proved to be a cost-effective and rapid method that provides an accurate identification of genetic RP variants. 
    This approach is effective for

  17. The effect of rock electrical parameters on the calculation of reservoir saturation

    International Nuclear Information System (INIS)

    Li, Xiongyan; Qin, Ruibao; Liu, Chuncheng; Mao, Zhiqiang

    2013-01-01

    The error in calculating reservoir saturation caused by errors in the cementation exponent, m, and the saturation exponent, n, should be analysed, and the influence of m and n on the reservoir saturation should be discussed. Based on the Archie formula, the effect of m and n on the reservoir saturation is analysed, the formula for the error in the calculated reservoir saturation caused by errors in m and n is derived, and the main factors affecting this error are identified. According to the physical meaning of m and n, they can be interpreted as two independent parameters, i.e., there is no connection between m and n. When m and n have the same error, their impacts on the calculated reservoir saturation should be compared. Therefore, when the errors in m and n are respectively equal to 0.2, 0.4 and 0.6, the distribution range of the resulting error in the calculated reservoir saturation is analysed. In most cases, however, the error in m and n is about 0.2. When the error in m is 0.2, the error in the calculated reservoir saturation ranges from 0% to 35%; when the error in n is 0.2, it is almost always below 5%. On the basis of loose sandstone, medium sandstone, tight sandstone, conglomerate, tuff, breccia, basalt, andesite, dacite and rhyolite, this paper first analyses the distribution range and variation of m and n, and then elaborates upon their impact on the calculation of reservoir saturation. For each lithology, the distribution range and variation of m are greater than those of n. Therefore, compared with n, the effect of m on the reservoir saturation is stronger. The influence of m and n on the reservoir saturation is determined, and the error in the calculated reservoir saturation caused by errors in m and n is quantified. 
    This is theoretically and practically significant for
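
Archie's relation S_w = (a R_w / (phi^m R_t))^(1/n) makes the comparison easy to reproduce numerically. The petrophysical inputs below are invented but representative, not the paper's field data:

```python
def archie_sw(phi, rt, rw=0.05, a=1.0, m=2.0, n=2.0):
    """Water saturation from Archie's equation Sw = (a*Rw/(phi^m * Rt))^(1/n)."""
    return (a * rw / (phi ** m * rt)) ** (1.0 / n)

phi, rt = 0.25, 1.25                                   # porosity, true resistivity
base = archie_sw(phi, rt)                              # = 0.80 with these inputs
err_m = abs(archie_sw(phi, rt, m=2.2) - base) / base   # effect of a 0.2 error in m
err_n = abs(archie_sw(phi, rt, n=2.2) - base) / base   # effect of a 0.2 error in n
```

With these inputs `err_m` is about 15% while `err_n` is about 2%, mirroring the abstract's conclusion that a 0.2 error in m distorts the calculated saturation far more than the same error in n (the n-error is small whenever S_w is not far below 1, since the relative change scales with ln S_w).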

  18. High-throughput epitope identification for snakebite antivenom

    DEFF Research Database (Denmark)

    Engmark, Mikael; De Masi, Federico; Laustsen, Andreas Hougaard

    Insight into the epitopic recognition pattern of polyclonal antivenoms is a strong tool for accurate prediction of antivenom cross-reactivity and provides a basis for the design of novel antivenoms. In this work, a high-throughput approach was applied to characterize linear epitopes in 966 individual toxins from pit vipers (Crotalidae) using the ICP Crotalidae antivenom. Due to an abundance of snake venom metalloproteinases and phospholipase A2s in the venoms used for production of the investigated antivenom, this study focuses on these toxin families.

  19. Soil aquifer treatment of artificial wastewater under saturated conditions

    KAUST Repository

    Essandoh, H. M K; Tizaoui, Chedly; Mohamed, Mostafa H A; Amy, Gary L.; Brdjanovic, Damir

    2011-01-01

    A 2000 mm long saturated laboratory soil column was used to simulate soil aquifer treatment under saturated conditions to assess the removal of chemical and biochemical oxygen demand (COD and BOD), dissolved organic carbon (DOC), nitrogen

  20. Robust high-throughput batch screening method in 384-well format with optical in-line resin quantification.

    Science.gov (United States)

    Kittelmann, Jörg; Ottens, Marcel; Hubbuch, Jürgen

    2015-04-15

    High-throughput batch screening technologies have become an important tool in downstream process development. Although continued miniaturization saves time and sample consumption, no screening process has yet been described in the 384-well microplate format. Several processes are established in the 96-well dimension to investigate protein-adsorbent interactions, utilizing between 6.8 and 50 μL of resin per well. However, because sample consumption scales with resin volume and throughput with the number of experiments per microplate, these formats are limited in the costs and time they can save. In this work, a new method for in-well resin quantification by optical means, applicable in the 384-well format and to resin volumes as small as 0.1 μL, is introduced. A high-throughput batch isotherm process is described that utilizes this new method, in combination with optical sample volume quantification, for screening of isotherm parameters in 384-well microplates. Results are qualified by confidence bounds determined by bootstrap analysis and a comprehensive Monte Carlo study of error propagation. This new approach opens the door to a variety of screening processes in the 384-well format on HTS stations, higher-quality screening data and an increase in throughput. Copyright © 2015 Elsevier B.V. All rights reserved.
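
Isotherm-parameter screening with bootstrap confidence bounds can be sketched as follows, here fitting a Langmuir isotherm through its standard linearization and bootstrapping the residuals. The model choice, synthetic data, and noise level are all illustrative assumptions, not the paper's workflow:

```python
import numpy as np

def fit_langmuir(c, q):
    """Fit q = qmax*c/(K + c) via the linearization 1/q = (K/qmax)(1/c) + 1/qmax."""
    slope, intercept = np.polyfit(1.0 / c, 1.0 / q, 1)
    qmax = 1.0 / intercept
    return qmax, slope * qmax  # (qmax, K)

rng = np.random.default_rng(0)
c = np.linspace(0.2, 5.0, 12)                 # liquid-phase concentrations
q_true = 2.0 * c / (0.5 + c)                  # true qmax = 2.0, K = 0.5
q = q_true + rng.normal(0.0, 0.01, c.size)    # noisy "measured" loadings

qmax_hat, K_hat = fit_langmuir(c, q)

# Residual bootstrap: refit on the fitted curve plus resampled residuals
# to obtain 95% confidence bounds on qmax.
resid = q - qmax_hat * c / (K_hat + c)
boot = [fit_langmuir(c, qmax_hat * c / (K_hat + c)
                     + rng.choice(resid, resid.size))[0]
        for _ in range(500)]
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
```

A nonlinear least-squares fit would avoid the linearization's noise weighting at low concentrations; the bootstrap loop is unchanged either way.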